At the same time, this quota reduces the potential impact of attacks, the misuse of services, and the inherent costs caused by careless users who forget to stop their resources…

However, I dislike that solution because it requires us to create a service account key file, with all the security issues that this can imply. And despite my request to the Google Cloud team, that dangerous piece of documentation is still there.


You can even automate these quota limits, per environment type, per folder, or however you want, with a Terraform module.
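As a minimal sketch of such automation (the module path, the variable name, and the 8-vCPU value are assumptions of mine; the module would simply wrap the quota override resource shown later):

```hcl
# Hypothetical wrapper module: apply the same CPU cap to every sandbox project.
# The project list could come from a folder lookup or an environment map.
module "sandbox_quotas" {
  source   = "./modules/quota-override" # hypothetical local module
  for_each = toset(var.sandbox_project_ids)

  project        = each.value
  service        = "compute.googleapis.com"
  metric         = urlencode("compute.googleapis.com/cpus")
  limit          = urlencode("/project/region") # confirm with the GET discovery call
  override_value = "8"
}
```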

Then, POST this body to the Service Usage API with the quota name (picked in the previous part). Use the gcurl command for that:
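A sketch of that call; PROJECT_NUMBER and the override value are placeholders to adapt, and the metric and limit names are the ones discovered for the BigQuery query usage quota:

```bash
# POST a consumer override on the fully qualified quota limit.
gcurl -X POST \
  -d '{"overrideValue": "10000", "dimensions": {"region": "us-east1"}}' \
  "https://serviceusage.googleapis.com/v1beta1/projects/PROJECT_NUMBER/services/bigquery.googleapis.com/consumerQuotaMetrics/bigquery.googleapis.com%2Fquota%2Fquery%2Fusage/limits/%2Fd%2Fproject/consumerOverrides"
```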

Cloud platforms are wonderful sandboxes where you can spend hours experimenting and trying things out. However, resources aren’t free!

Fortunately, Google Cloud has automatic monitoring and alerting to contact the project owner in case of suspicious activity. The cost impact was low, but it could have been worse!

You can achieve this by API call or with Terraform. I will present both, to help you choose the best fit for your use cases and your existing project creation/update processes.

Because I use my own credentials, I don’t need a service account key file with Terraform either. On your workstation, run gcloud auth application-default login to set your credentials in your runtime context.

You can even use a service account through impersonation (make sure that your currently authenticated user has the right roles to impersonate the service account):
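For instance, a minimal provider block; the service account name is hypothetical, and your user needs the roles/iam.serviceAccountTokenCreator role on it:

```hcl
provider "google" {
  # Generate short-lived tokens for this service account instead of
  # relying on a downloaded key file.
  impersonate_service_account = "quota-admin@my-project.iam.gserviceaccount.com"
}
```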

It’s wonderful, and because you pay for the volume of data that you scan with each query, the cost at the end of the day can quickly skyrocket (I have already seen $100k+ in one afternoon!).

Before going deeper: you will see that I use the gcurl command, which is an alias of curl with the authorization header populated automatically (for ease of use). You can find how to create it in the Quotas API documentation.


After a successful call (don’t forget the force query parameter), you can perform another GET on the API to check the update. Only the us-east1 region has been changed.

Compute Engine isn’t as severe as BigQuery in terms of cost. It’s more progressive, and costs accumulate over time. However, I have already seen bad actors create dozens of large Compute Engine instances to mine Bitcoin.

As you can see, there isn’t a defined limit, and thus the default limits are enforced, per region (look at the unit field): up to 2,400 N1 vCPUs in many regions!
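An abridged, illustrative excerpt of such an entry (structure per the Service Usage API; values simplified):

```json
{
  "metric": "compute.googleapis.com/cpus",
  "displayName": "CPUs",
  "consumerQuotaLimits": [{
    "unit": "1/{project}/{region}",
    "quotaBuckets": [{
      "effectiveLimit": "2400",
      "defaultLimit": "2400"
    }]
  }]
}
```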

Terraform is a common tool used by DevOps teams to automate the infrastructure of a project (IaC: Infrastructure as Code). You can also create projects and configure the environment with it.

Even if the free tiers are generous on Google Cloud, some cases can lead to expensive bills. This situation frightens companies and top management and, for them, the easiest solution is to restrict users’ authorizations to only a subset of the platform. The downside is that you limit your teams’ freedom and capacity to innovate.

As you can see, you need to know the metric and limit names. You can’t guess them, and a discovery phase with the GET API is required to be sure of the quota’s fully qualified name.

BigQuery is an awesome toy: you can process petabytes of data in seconds!! Data scientists love it because they can now run queries that they could never have achieved in their previous on-premises environment.

So, when you can, don’t use service account key files (which contain a private RSA key), except when you reach the limits of the IAM service.

Before setting quotas on BigQuery, it’s important to know which quotas exist. For this, you can list the current quotas available through the Quota API.

With the Quotas API, you can now limit the amount of resources usable per service and per project, without restricting authorizations to just a few services because you fear their possible cost! If your data scientists need GPUs, no problem, but they will only be able to get a few, in only one region, not several dozen per region!

The bad actors got hold of a poorly protected service account key file from a development project and used it to create the instances. That’s why I strongly recommend not generating these sensitive files when you can avoid it!

Education is the right path, but mistakes can still occur. And there is no room for mistakes: you pay for what you use. That’s why quotas are very important to prevent this kind of situation!

To enforce quota definitions automatically on your projects, especially sandbox projects, you need to automate the quota setup when you create or update the projects.

You can achieve the same thing with Terraform. Here again, you need to know exactly the service, metric, and limit names that you want to override. The first discovery step, via a GET API request, isn’t optional!
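A sketch using the google-beta provider’s google_service_usage_consumer_quota_override resource, reusing the names discovered for the BigQuery query usage quota (the project name and override value are placeholders of mine):

```hcl
resource "google_service_usage_consumer_quota_override" "bq_query_usage" {
  provider       = google-beta
  project        = "my-project" # placeholder
  service        = "bigquery.googleapis.com"
  metric         = urlencode("bigquery.googleapis.com/quota/query/usage")
  limit          = urlencode("/d/project")
  override_value = "10000" # arbitrary example, in the quota's unit
  force          = true    # required when reducing by more than 10%
  dimensions = {
    region = "us-east1"
  }
}
```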


We will start slowly, by simply listing the consumer quotas of the BigQuery service. I have to admit that this API is not so easy to understand at first, due to its generic nature and the need to accommodate the quotas of every product:
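The discovery call is a simple GET (PROJECT_NUMBER is a placeholder):

```bash
# List all consumer quota metrics of the BigQuery service for this project.
gcurl "https://serviceusage.googleapis.com/v1beta1/projects/PROJECT_NUMBER/services/bigquery.googleapis.com/consumerQuotaMetrics"
```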

You can see many quotas that you can increase or decrease. For our use case, the Query usage quota is the right one.
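The entry to look for resembles this abridged, illustrative excerpt:

```json
{
  "metric": "bigquery.googleapis.com/quota/query/usage",
  "displayName": "Query usage",
  "consumerQuotaLimits": [{
    "name": "projects/PROJECT_NUMBER/services/bigquery.googleapis.com/consumerQuotaMetrics/bigquery.googleapis.com%2Fquota%2Fquery%2Fusage/limits/%2Fd%2Fproject",
    "unit": "1/d/{project}"
  }]
}
```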

Once the query is accepted, the ID of the operation is returned. Perform a GET query, as before, and see the change.
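If you also want to inspect the operation itself, a sketch (OPERATION_ID comes from the POST response; this is the standard long-running-operations endpoint):

```bash
# Check the status of the returned long-running operation.
gcurl "https://serviceusage.googleapis.com/v1beta1/operations/OPERATION_ID"
```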

You can also enforce the quota in all regions. For that, remove the dimensions from the body to set a uniform quota across all regions. It’s very handy when you want to allow the use of only one region:
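For example, a body with a uniform value of 0 (my illustration): combined with a higher per-region override like the one above, it effectively restricts usage to that single region:

```json
{
  "overrideValue": "0"
}
```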

If you have never set this quota on your project in BigQuery, the call won’t work, because you are changing the quota by more than 10%. You need to add a force Boolean set to true as a query parameter of your URL, like this:
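The same illustrative POST with the force parameter appended:

```bash
# force=true acknowledges a quota change of more than 10%.
gcurl -X POST \
  -d '{"overrideValue": "10000", "dimensions": {"region": "us-east1"}}' \
  "https://serviceusage.googleapis.com/v1beta1/projects/PROJECT_NUMBER/services/bigquery.googleapis.com/consumerQuotaMetrics/bigquery.googleapis.com%2Fquota%2Fquery%2Fusage/limits/%2Fd%2Fproject/consumerOverrides?force=true"
```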


You can now see the effective limit, and the new name of the consumerOverride. You will need this full name to update (PATCH) or delete this override.

The cloud has many benefits, and one of them is innovation speed, with the motto “Fail fast, iterate faster”. Indeed, cloud providers offer tons of services to test and experiment with easily, where the same would be expensive, or impossible, in an on-premises environment.


If you take a closer look at the Compute Engine CPU default quotas, a corporate account can use up to 2,400 vCPUs (N1 family) in several regions! For testing purposes, a few CPUs in a single region are enough. You can set a quota to enforce that.

The second one allows you to limit a requester on a project without locking out the others or the whole project itself. Interesting when you have a shared project where some users watch their spending and others don’t!

As with BigQuery, to start with quotas on Compute Engine, a GET query to view and discover the consumer quotas is a good starting point:
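The same discovery pattern, pointed at the Compute Engine service (PROJECT_NUMBER is a placeholder):

```bash
# List all consumer quota metrics of the Compute Engine service.
gcurl "https://serviceusage.googleapis.com/v1beta1/projects/PROJECT_NUMBER/services/compute.googleapis.com/consumerQuotaMetrics"
```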


So, I propose to use your user account like this (you need to be authenticated beforehand with gcloud init or gcloud auth login):
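The alias from the Quotas API documentation builds the authorization header from your user credentials:

```bash
# curl with an OAuth access token generated from the authenticated account.
alias gcurl='curl -H "Authorization: Bearer $(gcloud auth print-access-token)" -H "Content-Type: application/json"'
```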


Let’s reduce that! Grab a copy of the name field and define this new override body, this time with a dimensions field containing the region:
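For example (the override value is an arbitrary illustration, expressed in the unit reported by the GET call):

```json
{
  "overrideValue": "10000",
  "dimensions": {
    "region": "us-east1"
  }
}
```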

There are tons of quotas: by region, by CPU family, by network, by GPU family… This way, you can, for example, forbid the use of certain CPU or GPU families on test projects.

Instead of limiting the set of usable services with roles and permissions, the other solution is to limit the spending on each service. That’s the purpose of the Quota API: to set an upper bound on the usable resources of each service.