
Unexpected Google Maps API charges

How it happened, how to solve it, lessons learned

Ferdy Christant
Jul 28, 2023


Introduction

A while ago I was reviewing my credit card statement. This is not something I regularly do, as here in the Netherlands credit cards are an uncommon payment method. The only recurring charge I expect is from Amazon Web Services (AWS), but since I get monthly emails listing the AWS charges anyway, I've neglected to do regular due diligence. I sometimes forget that I even have a credit card at all.

Hence, I was mildly surprised to learn about a $10 charge in May from Google, based on API usage. I do use a few Google APIs; it's just that my low usage typically falls within the free tier. I figured that perhaps there had been a minor traffic surge on my website or a change in the pricing structure of the API service. No big deal.

Anxiety set in when I then learned about a $100 charge for June. Followed by full panic mode after I logged into the Google Cloud Console. It displayed a $150 charge for the first week of July, followed by a cost projection of $500 for the whole month of July. Which one day later was recalculated to $800.

Something is deeply wrong here; this thing is going parabolic.

What happened?

In short, my API keys were not properly protected. Somebody abused a key to make calls to the Maps API and, more importantly, the stunningly expensive Places API.

I don't even know what the Places API is, so I was expecting a dashboard with hundreds of thousands of daily requests. Instead, it showed a series of small peaks of a few hundred requests. I would consider that minor usage, but apparently these days it quickly gets you into $1,000 territory.

But let's first own my mistake: it was me who failed to properly secure the API keys in the first place. Not out of ignorance; it was more a case of procrastination. I had secured the keys before, but this broke an important feature on my website, after which I disabled the protection again. I then failed to prioritize getting to the bottom of that problem.

This article serves as a warning not to make the same mistake, and documents the process of fully securing my keys, along with the several gotchas I experienced along the way.

Analysis

Google Cloud Console navigation is a true labyrinth. There seem to be 17 different ways to get to data, none particularly useful or intuitive. Ultimately, the most useful overview I could find was the metrics explorer with a custom date range:

Metrics explorer

This does reveal the problem: unexpected usage of the Places API, which I do not use at all in my web application. Still, this overview can be confusing and incomplete.

The overview shows costs, not usage. The above chart may mislead you into thinking that, for example, on June 15 there was no usage. In reality, usage was significant on that day, but its costs were covered by discounts. Those are complicated in themselves: the free tier, spending-based discounts, and cloud credit all play a role. The correlation between usage and costs is very weak, making analysis confusing.

For a deeper understanding, let’s inspect July 7, the highest cost day:

July 7 cost

€65 charged for just the Places API with discounts disabled. This corresponds to the following usage:

July 7 usage

0.634K amounts to 634 requests, which is roughly $0.10 per request. That's one expensive API! As if that weren't complicated enough, the Places API is apparently subdivided into a dozen or so "SKUs". Whatever.
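
A quick back-of-envelope check of those July 7 numbers (the €65 and 634 requests come from the charts above; this is only a rough per-request approximation, not an official SKU price):

```python
# Back-of-envelope check of the July 7 figures from the charts above.
requests = 634   # "0.634K" Places API requests that day
charged = 65.0   # amount charged with discounts disabled

cost_per_request = charged / requests
print(f"{cost_per_request:.3f}")  # ≈ 0.103, i.e. roughly $0.10 per request
```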

Here’s another gotcha:

Long term usage

I secured my API keys around July 9, which should have brought usage of the Places API to zero, yet per the above chart the abuser seems to happily continue using it. However, costs have stopped as of that date:

Long term costs

No more blue bars. The explanation: the continued usage consists of calls returning an error. The abuser continues to use my API key, but it no longer works. These errors are shown as usage, yet errors are not charged. I confirmed this with the support team.

Why do the orange cost bars (Google Maps calls) continue after July 9, when my own usage is normally charged as zero? Because the free tier/discount for the month had already been fully used.

As you can see, analysis is not trivial if you're not a veteran Google Cloud engineer. And I haven't even mentioned that I did not find a way to discover the source of the abuse. I suspect it may be possible to find out by enabling more advanced monitoring, which itself has a cost. I didn't look into it, as my priority was making the abuse stop.

Disable unused APIs and services

Let’s start the journey of properly protecting API keys:

Project dashboard

From my project (named "JungleDragon"), I now have a short list of only the Google API services that the application actually uses. Every other API is disabled at project level.

That list was a lot longer before, and I don't remember enabling any of the other services, so I suspect most APIs are enabled by default and need to be disabled manually. Regardless of whether it is opt-in or opt-out, this is the main method of blocking usage of APIs you don't even use yourself, and it is how I was able to instantly stop the Places API abuse. As explained in the previous section, traffic to disabled APIs can still occur, yet those calls error out and are not charged.

Host protection (front-end)

After only allowing particular APIs at project level, you still need to protect the enabled APIs so that your key can only be used by your application and nobody else. The typical way to do that is with key restrictions:

Key restrictions

See above. The key can only be used from my host (https://*.jungledragon.com). Further, I've applied API restrictions, which limit the key to the applicable APIs only.
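
To illustrate what the host restriction does conceptually, here is a Python sketch. Google's actual referrer matching is more involved than this; I'm merely approximating it with fnmatch-style wildcards (and appending `/*` so that any path under the host matches):

```python
from fnmatch import fnmatch

# Conceptual sketch only: approximates the behavior of a referrer
# restriction like "https://*.jungledragon.com" using wildcard matching.
ALLOWED_PATTERN = "https://*.jungledragon.com/*"

def referrer_allowed(referrer: str) -> bool:
    """Return True if the request's referrer matches the allowed host."""
    return fnmatch(referrer, ALLOWED_PATTERN)

print(referrer_allowed("https://www.jungledragon.com/map"))  # True
print(referrer_allowed("https://evil.example.com/steal"))    # False
```

The key point: a stolen key is useless to an attacker whose requests don't originate from a page on your host.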

Host protection (back-end)

The reason my earlier attempt to secure my API keys failed is that my application triggers one API call from the back-end (PHP), specifically to the Geocoding API.

Host-based protection does not work for back-end calls; they require IP protection instead. This forced me to split my keys into a front-end key and a back-end key. The front-end key has host protection, the back-end key has IP protection. I obviously needed to change my application to make use of two keys instead of one.
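
My application is PHP, but the split looks roughly like this Python sketch (the key values are placeholders, not real keys): the front-end key is the one embedded in the page for the map widget, while the back-end key is used only for server-to-server calls such as geocoding.

```python
from urllib.parse import urlencode

# Placeholder values; never commit real keys to source control.
FRONTEND_KEY = "FRONTEND_KEY_PLACEHOLDER"  # host-restricted; shipped to the browser
BACKEND_KEY = "BACKEND_KEY_PLACEHOLDER"    # IP-restricted; lives in server config only

def geocode_url(address: str) -> str:
    """Build a server-side Geocoding API request using the back-end key."""
    params = urlencode({"address": address, "key": BACKEND_KEY})
    return f"https://maps.googleapis.com/maps/api/geocode/json?{params}"

print(geocode_url("Amsterdam"))
```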

As a side note, there seems to be a distinction between front-end keys and so-called "service keys" that are meant for back-end calls. I've been unable to discover a meaningful difference between the two. Google's documentation for service keys/accounts seems identical to the conventional documentation. There are helper libraries for some back-end languages, but in the end, all they do is make network calls to the same end-point as the front-end API. There's nothing "special" about them.

Discovering the correct IP address to use for protection was more difficult than I expected. I started by simply pinging my website (which runs on a single server), but this public IP did not work. I then output the IP as known to the PHP host, which also didn't work.

Ultimately, I did a PHP var_dump on the return value of these failing requests and discovered useful diagnostics in the error info. Something like "request rejected for IP ……", revealing what Google considers to be the IP. The gotcha was that it was an IPv6 address, not an IPv4 address.
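
The Python equivalent of that var_dump is simply inspecting the response body. The payload below is a made-up example of the kind of error body a rejected request returns (the real wording differs), but the useful detail, the IP address Google actually saw, really was buried in the message like this:

```python
import json

# Made-up example of a rejected request's response body; the real
# error_message is worded differently, but it contains the IP Google saw.
body = json.loads("""
{
  "status": "REQUEST_DENIED",
  "error_message": "Request received from IP address 2001:db8::1, with referer: none"
}
""")

if body["status"] == "REQUEST_DENIED":
    # The diagnostic revealed an IPv6 address, not the IPv4 one I expected.
    print(body["error_message"])
```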

In the IP restriction section of my back-end API key, I entered both address formats. This doesn't work, which I would consider a bug. If you're allowed to enter multiple IPs to protect a key, it would make sense that a request is allowed as long as it matches at least one IP in the list; otherwise, what is the point of multiple IP inputs? Somehow, it doesn't work that way. After keeping only the IPv6 address, and not the IPv4 address as well, it finally worked properly.

Enable quotas

We've now locked down the project to only allow specific APIs, and each enabled API in turn is configured to only be usable from your application via restricted keys. This eradicates the problem of key leakage, where somebody else uses your API key.

Hence, we're now dealing with valid usage only. However, valid usage does not fully protect you from unexpectedly high costs. There could be a traffic surge on your application, either a genuine one (popularity) or a malicious one (DDoS, AI bots, etc.).

If popularity at any cost is your goal, you don't need quotas (or only very high ones). Yet if you're running a small non-commercial website and need to protect your budget, quotas are strongly recommended.

Example of quotas

Above is an example of quotas applied in the Google Cloud Console. You need to configure these for each individual API service.

Gotcha: I initially set one of these values too low, which led to the quota actually being reached. I guess that does prove it works, as my website showed a big warning message on the embedded maps. The gotcha is that raising the quota has no immediate effect: quotas are reset only once per day.
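
Google enforces the quota on its side, but the reset-once-per-day behavior can be mimicked with a small application-side counter. This is purely an illustrative sketch of the semantics (my actual app just relies on Google's quota and degrades gracefully when the map fails):

```python
from datetime import date

class DailyQuota:
    """Illustrative daily request counter: resets once per day, like Google's quotas."""

    def __init__(self, limit: int):
        self.limit = limit
        self.day = date.today()
        self.used = 0

    def allow(self) -> bool:
        today = date.today()
        if today != self.day:   # new day: the counter resets
            self.day = today
            self.used = 0
        if self.used >= self.limit:
            return False        # over quota: skip the call, degrade gracefully
        self.used += 1
        return True

quota = DailyQuota(limit=2)
print([quota.allow() for _ in range(3)])  # [True, True, False]
```

Note the implication of the reset behavior: if you hit the limit, raising it won't help until the next daily reset.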

I love this feature, as it lets me sleep at night. On other cloud services it's typically far more difficult to apply a budget cap. I'm further reassured by the fact that when the map breaks due to a quota breach, this does not make the application as a whole dysfunctional; it is largely supplemental functionality.

Budget alerts

The final layer of protection is budget alerts, which apply at billing-account level:

Budget alerts

I've set up a maximum monthly budget of €50, which is low but generous when you consider that typical costs are zero thanks to discounts and the free tier. At various progress levels towards this maximum budget, I automatically get an email. I can then analyze the high usage and decide whether to change quotas. Another great feature that I strongly recommend enabling.

Credit to Google support

While the Google Cloud Console is lacking in some ways, I can't say the same of my support experience.

I opened a ticket to ask why API usage continued despite me disabling the APIs, which, as explained earlier, was due to errors that are not charged. As part of this interaction, the support engineer offered to nullify my May and June bills without me even asking. One day later it was done and I was fully compensated, except for July, as it was still ongoing. Note that the July projection was recalculated downwards from $800 to a little over $100 after my changes. I suppose I could try to get that compensated as well, but instead I'll consider it the cost of being a fool.

That’s excellent service. It was fast and went above what I asked for. Even more so because the mistake was on my end and I’m not exactly a customer producing revenue.

I do believe Google could take more preventative measures to avoid this situation altogether, specifically by enabling budget alerts by default when usage goes parabolic. Still, I'm glad this disaster in the making had a happy ending.

Lessons learned

  • Don’t make my mistake of being neglectful in reviewing charges and not properly protecting API keys. The potential for financial disaster is real.
  • Disable all APIs that you do not use at project level and regularly come back to verify it (as a new service might be enabled by default).
  • Per API key, restrict usage by host or IP and do not stop until this works correctly. Further, restrict each API key to services it actually uses.
  • Enable quotas on each API you use. As they are automated, this is how you sleep at night or go on a holiday without worrying about unexpected costs.
  • Enable budget alerts so that you’re in the know early when even valid usage of the keys is higher than usual.
  • Verify that every measure you take actually works, as I’ve come across a few gotchas along the way.
  • Don't be afraid to make use of the support channel when stuck; in my (limited) experience, they are responsive and helpful.

Bonus chapter: pricing history

As part of my introspection about being so neglectful, I figured it may be due to my long usage of Google Maps (since 2011), combined with favorable memories of better times. So I did some digging:

  • Before 2016, the service could be used without any API key at all. Usage was anonymous and practically limitless. The golden age of maps.
  • In June 2016, it was announced that API keys would become mandatory, which was actually enforced in June 2018.
  • In May 2018 came the biggest course correction. Any usage from then on required a billing account with a working credit card, removing anonymity. Further, prices beyond the free tier were increased by as much as 1,400%. For small users this price hike was obscured by a monthly $200 credit.

Small users like me have always experienced the service as free; it's just that the administrative requirements changed over time. However, this perception is misleading, because the $200 credit obscures very high raw prices underneath. As soon as you exceed that credit, the true nature of the pricing becomes painfully visible, and that is how even a few extra hundred requests per day amount to an eye-watering bill.
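
To make that concrete, here's the rough math at the approximately $0.10 per request I observed for the Places API (an approximation from my own bill; real SKU prices vary):

```python
# How far the $200 monthly credit stretches at Places-like prices.
monthly_credit = 200.0     # monthly credit introduced in May 2018
price_per_request = 0.10   # approximate per-request cost I observed

free_requests = monthly_credit / price_per_request
print(free_requests)              # 2000.0 requests per month covered
print(round(free_requests / 30))  # ~67 requests per day before charges begin
```

A few hundred extra requests per day blows straight past that buffer, which is exactly what happened to me.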

This is no excuse for my neglecting to protect the keys; it's an explanation for why the consequences are so big even with minor additional usage.

For the record, I do not believe Google APIs should be free. Being free is what created the monopoly and dependency in the first place, and no other service can compete with free. I strongly believe that digital services with significant running costs should charge a fair and sustainable price from the very beginning.

Thanks for reading, I hope this was of some use to you.
