BigQuery cost controls now let you set a daily maximum for query costs
Tuesday, December 15, 2015
Today we’re giving you better cost controls in BigQuery to help you manage your spend, along with improvements to the streaming API, a performance diagnostic tool, and a new way to capture detailed usage logs.
BigQuery is a Google-powered supercomputer that lets you derive meaningful analytics in SQL while paying only for what you use. This makes BigQuery an analytics data warehouse that’s both powerful and flexible. Those accustomed to a traditional fixed-size cluster – where cost is fixed, performance degrades with increased load, and scaling is complex – may find granular cost controls helpful in budgeting their BigQuery usage.
In addition, we’re announcing the availability of BigQuery access logs in Audit Logs Beta, improvements to the Streaming API, and a number of UI enhancements. We’re also launching Query Explain to provide insight into how BigQuery executes your queries, so you can optimize and troubleshoot them.
Custom Quotas: no surprises when the bill comes
Custom quotas allow you to set daily limits that help prevent runaway query costs. There are two ways you can set the quota:
Project-wide: an entire BigQuery project cannot exceed the daily custom quota.
Per user: each individual user within a BigQuery project is subject to the daily custom quota.
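Custom quotas themselves are configured in the Google Cloud Console rather than in code, but you can also bound the cost of a single query before it runs. Below is a minimal sketch, assuming the google-cloud-bigquery Python client (a later library than the clients available at the time of this post) and a hypothetical project ID: it dry-runs a query to estimate bytes processed, then sets maximum_bytes_billed so the job fails rather than exceed the cap.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

sql = """
SELECT corpus, COUNT(*) AS n
FROM `bigquery-public-data.samples.shakespeare`
GROUP BY corpus
"""

# Dry run: estimate bytes processed without running (or billing for) the query.
dry_cfg = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
estimate = client.query(sql, job_config=dry_cfg)
print(f"Would process {estimate.total_bytes_processed} bytes")

# Hard cap: the job errors out instead of billing more than this many bytes.
capped_cfg = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 1024**3)  # 10 GB
rows = client.query(sql, job_config=capped_cfg).result()
```

The dry run estimate is what counts against your daily quota when the query actually runs, so it pairs naturally with a project-wide or per-user limit.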
Query Explain: understand and optimize your queries
Query Explain shows, stage by stage, how BigQuery executes your queries. You can now see whether your queries are write-, read- or compute-heavy, and where any performance bottlenecks might be. You can use Query Explain to optimize queries, troubleshoot errors, or understand whether BigQuery Slots might benefit you.
In the BigQuery Web UI, use the “Explanation” button next to “Results” to see this information.
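The same stage statistics shown in the UI are also exposed on completed query jobs through the API. A minimal sketch, assuming the google-cloud-bigquery Python client; the ratio fields report each stage's average read, compute and write time relative to the longest-running stage.

```python
from google.cloud import bigquery

client = bigquery.Client()
job = client.query(
    "SELECT word, COUNT(*) AS c "
    "FROM `bigquery-public-data.samples.shakespeare` GROUP BY word"
)
job.result()  # wait for completion so the query plan is populated

# Each entry is one execution stage of the query plan.
for stage in job.query_plan:
    print(stage.name,
          f"read={stage.read_ratio_avg or 0:.2f}",
          f"compute={stage.compute_ratio_avg or 0:.2f}",
          f"write={stage.write_ratio_avg or 0:.2f}")
```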
Improvements to the Streaming API
Data is most valuable when it’s fresh, but loading data into an analytics data warehouse usually takes time. BigQuery is unique among warehouses in that it can easily ingest a stream of up to 100,000 rows per second per table, available for immediate analysis. Some customers even stream 4.5 million rows per second by sharding ingest across tables. Today we’re bringing several improvements to the BigQuery Streaming API.
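As a point of reference, a single streamed batch looks like the sketch below, assuming the google-cloud-bigquery Python client and a hypothetical events table that already exists; insert_rows_json issues one tabledata.insertAll request.

```python
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.my_dataset.events"  # hypothetical table

rows = [
    {"user_id": "u1", "action": "click", "ts": "2015-12-15T12:00:00"},
    {"user_id": "u2", "action": "view",  "ts": "2015-12-15T12:00:01"},
]

# An empty error list means every row was accepted for streaming ingest.
errors = client.insert_rows_json(table_id, rows)
if errors:
    print("Insert errors:", errors)
```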
Streaming API in EU locations. It’s not just for the US anymore: you can now use the Streaming API to load data into your BigQuery datasets residing in the EU.
Template tables are a new way to manage related tables used for streaming. They allow an existing table to serve as a template for a streaming insert request. The generated table will have the same schema, and will be created in the same dataset and project as the template table. Better yet, when the schema of the template table is updated, the schemas of the tables generated from it will be updated as well.
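On the wire this is the templateSuffix field of a tabledata.insertAll request: rows are written to a table named for the template plus the suffix, created on first use. A sketch under the same assumptions as above (google-cloud-bigquery Python client, hypothetical table names):

```python
from google.cloud import bigquery

client = bigquery.Client()
template = "my-project.my_dataset.events"  # hypothetical template table

# Rows land in a table named events_20151215, created automatically with
# the template's schema if it does not already exist.
errors = client.insert_rows_json(
    template,
    [{"user_id": "u3", "action": "click", "ts": "2015-12-15T12:00:02"}],
    template_suffix="_20151215",
)
assert not errors, errors
```

Suffixing by date, as here, is a common way to shard streaming ingest across daily tables.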
No more “warm-up” delay. After streaming the first row into a table, we no longer require a warm-up period of a couple of minutes before the table becomes available for analysis. Your data is available immediately after the first insertion.
Create a paper trail of queries with Audit Logs Beta
BigQuery Audit Logs form an audit trail of every query, every job and every action taken in your project, helping you analyze BigQuery usage and access at the project level, or down to individual users or jobs. Please note that Audit Logs is currently in Beta.
Audit Logs can be filtered in Cloud Logging, or exported back to BigQuery with one click, allowing you to analyze your usage and spend in real time using SQL.
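For instance, once the logs are exported to a dataset you can total billed bytes per user over the last week. The sketch below is only illustrative: the auditlogs dataset name and the audit-log field paths are assumptions modeled on the export's typical layout, and may differ in your project.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Dataset name and field paths below are assumptions; the export writes
# daily tables, and BigQuery job details live under protoPayload.
sql = """
SELECT
  protoPayload.authenticationInfo.principalEmail AS user,
  SUM(protoPayload.serviceData.jobCompletedEvent.job
      .jobStatistics.totalBilledBytes) AS billed_bytes
FROM
  TABLE_DATE_RANGE(auditlogs.cloudaudit_googleapis_com_data_access_,
                   DATE_ADD(CURRENT_TIMESTAMP(), -7, 'DAY'),
                   CURRENT_TIMESTAMP())
GROUP BY user
ORDER BY billed_bytes DESC
"""

# Legacy SQL matches the era of this export; TABLE_DATE_RANGE scans the daily tables.
job = client.query(sql, job_config=bigquery.QueryJobConfig(use_legacy_sql=True))
for row in job.result():
    print(row.user, row.billed_bytes)
```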
With today’s announcements, BigQuery gives you more control and visibility. BigQuery is already very easy to use, and with recently launched products like Datalab (a data science notebook integrated with BigQuery), just about anyone in your organization can become a big data expert. If you’re new to BigQuery, take a look at the Quickstart Guide – the first 1TB of data processed per month is on us. To fully understand the power of BigQuery, check out the documentation and feel free to ask your questions using the “google-bigquery” tag on Stack Overflow.
- Posted by Tino Tereshko, Technical Program Manager