Google Cloud Platform Blog
Product updates, customer stories, and tips and tricks on Google Cloud Platform
Distributed Load Testing Using Kubernetes
Monday, June 29, 2015
Load testing is an important part of the software development lifecycle: it’s critical that you understand how your applications and infrastructure will hold up once they go into production. But basic load testing isn’t quite enough; you need to be able to model user behavior at scale to really get a feel for how your systems will perform under load. Today we’re introducing a new solution paper and a reference implementation that will show you how to simulate user behavior and conduct distributed load testing using Google Cloud Platform and Kubernetes.
When load testing your application, you first need to set up and provision compute instances, then deploy and execute load testing software. This is a good start, and it establishes a baseline understanding of your application’s performance. But then it comes time to scale up the load testing to see how your application performs against increasing load. This usually involves provisioning more infrastructure and then executing the tests again. Over time, you’ll have to repeat this process again and again to make sure you’re truly testing the limits of your application. Couple that process with having to execute these tests on a regular basis, for example after each new build, and you’ve created additional DevOps work for yourself and your team.
Containers and Kubernetes can help minimize the DevOps work associated with load testing. Containers rapidly scale, making them a great choice for simulating clients. And while working with individual containers is straightforward, managing a fleet of containers and orchestrating them can be difficult. Kubernetes, an open source container orchestration system, makes it easy to deploy and manage many containers.
The solution paper and reference implementation describe how you can deploy and scale distributed load testing using Docker containers and Kubernetes. You’ll learn how to create a container cluster using Google Container Engine, deploy a Docker-ized load testing framework, and scale the clients as you continue testing.
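To give a rough sense of that scaling step, here is a minimal sketch using the official Kubernetes Python client, which is not the tooling the solution paper itself uses; the deployment name load-worker, the namespace, and the replica count are placeholders rather than names from the reference implementation.

```python
from kubernetes import client, config

# Load cluster credentials from the local kubeconfig
# (e.g. after "gcloud container clusters get-credentials").
config.load_kube_config()

apps = client.AppsV1Api()

# Scale the hypothetical "load-worker" deployment that runs the simulated
# clients; patching the scale subresource changes only the replica count.
apps.patch_namespaced_deployment_scale(
    name="load-worker",
    namespace="default",
    body={"spec": {"replicas": 20}},
)
```

The same effect can be achieved with a single kubectl scale command against whatever controller manages the client containers; the point is that adding more simulated clients is a one-call operation once the cluster is running.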
Check out the Distributed Load Testing Using Kubernetes solution and read through the details to learn about Kubernetes. Then take a look at the reference implementation tutorial to deploy a complete working example. Feedback is welcome and appreciated; comment here, submit a pull request, create an issue, or find me (@crcsmnky) on Twitter and let me know how I can help.
- Posted by Sandeep Parikh, Solutions Architect
Google Genomics and Broad Institute Team Up to Tackle Genomic Data
Wednesday, June 24, 2015
One of the fastest growing types of biological data is the As, Cs, Gs, and Ts of DNA sequencing, already in the tens of petabytes and on track to reach exabytes. With today’s high-throughput sequencing technology, it’s much easier to generate genomic data than to transform it into information or knowledge that can improve human health. That’s why Google Genomics and Broad Institute of MIT and Harvard are teaming up to combine the power, security, and scale of Google Cloud Platform with Broad Institute’s expertise in scientific analysis.
Google Genomics
To deal with genomic information at scale, the life science community needs new technologies to store, process, explore, and share. Two years ago, with this in mind, we formed Google Genomics to help the life science community organize the world’s genomic information and make it accessible and useful, using some of the same technologies that power Google services like Search and Maps. Meanwhile, over the past decade Broad Institute has collected and either sequenced or genotyped the equivalent of more than 1.4 million biological samples. Just as important, Broad has developed and openly shared many of the most trusted methods for processing the resulting data, enabling valuable scientific discoveries, with hundreds of published findings in top journals.
In order to scale up by the next order of magnitude, Broad and Google will work together to explore how to build new tools and find new insights to propel biomedical research, using deep bioinformatics expertise, powerful analytics, and massive computing infrastructure. Collaboration between the world’s premier genomics and biomedical research center and the most advanced computing infrastructure can help develop a new generation of tools and services that will enable scientists – from large academic institutions, commercial organizations, or small research labs in remote corners of the world – to uncover a wealth of biological insight.
Our collaboration is a natural fit. We share many deep values -- a passion for being the best at what we do, a belief in the power of technology to improve life, a focus on the end beneficiaries of our work, and a track record of creating the innovations that lead our industries.
Eric Lander, President and Director of Broad Institute said, “Large-scale genomic information is accelerating scientific progress in cancer, diabetes, psychiatric disorders and many other diseases. Storing, analyzing and managing these data is becoming a critical challenge for biomedical researchers. We are excited to work with Google’s talented and experienced engineers to develop ways to empower researchers around the world by making it easier to access and use genomic information.”
Our first joint product is Broad Institute GATK on Google Cloud Platform, a managed service available now as an alpha release to a limited set of users. GATK, the Genome Analysis Toolkit, has become the standard for converting raw genomic data into reliable information about genetic variants. By running GATK as a service through Google Genomics, scientists can be more confident that they’re processing their data according to best practices, without worrying about managing IT infrastructure. Google provides the infrastructure, and Broad provides the trusted analysis methods.
Through our collaboration with Broad Institute, our work with the Global Alliance for Genomics and Health, and the life science community, we believe we can make a difference in improving human health. By making it easier for researchers to ask big questions and find answers amid complexity, we hope to unleash scientific creativity that could significantly improve our understanding of health and disease. We are at the beginning of a genomics-driven healthcare revolution, and it’s a privilege to be contributing to it with organizations like the Broad.
Visit the GATK forum and www.broadinstitute.org/google to learn more.
- Posted by Jonathan Bingham, Product Manager, Google Genomics
Container Engine & Container Registry Updates - New Features & Pricing
Monday, June 22, 2015
Containers are changing the way that people deploy and manage applications. Today, we are announcing the beta release of Google Container Engine, including pricing information and new features that give you more control of your container cluster. We’re also announcing that Google Container Registry is generally available, allowing you to easily store and access your container images from a private repository.
Container Engine beta: New features and pricing information
While containers make packaging apps easier, DevOps and IT administrators need better tools to unlock the promise of containerization. Container Engine makes it easy for you to set up a container cluster and manage your application. Simply define your containers’ needs, such as CPU and memory requirements, and Container Engine schedules your containers into your cluster and manages them automatically. Also, because it’s built on Kubernetes, the open source container orchestration system, you can move workloads or take advantage of multiple cloud providers.
"Container Engine unlocks the power of Google infrastructure for our startup, without locking us in. It gives us peace of mind for infrastructure, and lets us focus on writing great software." - Brian Fitzpatrick, Founder & CTO,
Tock
"Container Engine and Kubernetes helped us go from one deployment, with an hour downtime a week, to 8 zero-downtime deployments a day."
Frits Vlaanderen, Systems Engineer,
Travix
New features give you more control of your container cluster:
- Create a container cluster in minutes that supports the v1 Release Candidate of Kubernetes (also released today); a sketch of creating a cluster through the API follows this list
- Container Engine manages the uptime of Kubernetes, so it’s always ready to schedule your containers
- We manage updates to the underlying Kubernetes system and let you choose when to accept each update. You can now run a single command and your container cluster will be upgraded to the latest version.
- If you use Google Cloud VPN to connect your datacenter to Google, you can reserve an IP address range for your container cluster, allowing your cluster IPs to coexist with private network IPs
- You can now enable Google Cloud Logging with a single checkbox, making it even easier to gain insight into how your application is running
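To make the first item above concrete, here is a minimal sketch of creating a cluster through the Container Engine API using the Google API Python client; it assumes application-default credentials are configured, and my-project, us-central1-a, demo-cluster, and the node count are placeholder values rather than anything from this announcement.

```python
from googleapiclient import discovery

# Build a client for the Container Engine API using application-default
# credentials (e.g. from "gcloud auth application-default login").
service = discovery.build("container", "v1")

# Request a small three-node cluster; Container Engine provisions the VMs
# and the managed Kubernetes control plane for you.
request = service.projects().zones().clusters().create(
    projectId="my-project",
    zone="us-central1-a",
    body={"cluster": {"name": "demo-cluster", "initialNodeCount": 3}},
)
operation = request.execute()
print(operation["name"])  # a long-running operation you can poll for completion
```

The Developers Console and gcloud offer the same capability; the API route is handy when cluster creation needs to be part of an automated pipeline.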
During the beta, you will continue to pay no additional charge for Container Engine above the underlying Google Cloud Platform resources you use. Starting at general availability, we will have two levels of pricing for Container Engine:
- Standard clusters will be charged $0.15 per hour. A standard cluster can comprise up to 100 virtual machine nodes, and Google will manage the cluster availability for you (a rough monthly figure follows this list).
- Basic clusters allow you to try Container Engine on up to 5 virtual machine nodes. Upgrading to standard is easy if you want managed uptime. Under our current promotion, we don't charge you extra to use basic clusters, but we may start charging for them in the future.
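At $0.15 per hour, the management fee for a standard cluster works out to about $3.60 per day, or roughly $108 over a 30-day month, in addition to the cost of the underlying virtual machine nodes.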
To learn more about pricing, check out our Container Engine pricing details.
Announcing Google Container Registry General Availability
Google Container Registry makes it easy for you to store your container images in a private and encrypted registry, built on Cloud Platform. Pricing for storing images in Container Registry is simple: you only pay Google Cloud Storage costs. Pushing images is free, and pulling Docker images within a Google Cloud Platform region is free (Cloud Storage egress costs apply when pulling from outside a region).
Container Registry is now ready for production use:
- Encrypted and Authenticated - Your container images are encrypted at rest, and access is authenticated using Cloud Platform OAuth and transmitted over SSL
- Fast - Container Registry is fast and can handle the demands of your application, because it is built on Cloud Storage and Google Cloud Networking
- Simple - If you’re using Docker, just tag your image with a gcr.io tag and push it to the registry to get started (a sketch follows this list). Manage your images in the Google Developers Console.
- Local - If your cluster runs in Asia or Europe, you can now store your images in ASIA or EU specific repositories using asia.gcr.io and eu.gcr.io tags
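As a rough illustration of the "Simple" item above, this sketch tags a locally built image with its gcr.io name and pushes it using the Docker SDK for Python; my-project, myapp, and the v1 tag are placeholders, and it assumes your Docker client is already authorized to push to the registry (for example via your gcloud credentials).

```python
import docker

# Connect to the local Docker daemon.
client = docker.from_env()

# Tag an existing local image with its gcr.io repository name.
image = client.images.get("myapp:latest")
image.tag("gcr.io/my-project/myapp", tag="v1")

# Push the tagged image to Container Registry, streaming progress output.
for line in client.images.push("gcr.io/my-project/myapp", tag="v1",
                               stream=True, decode=True):
    print(line)
```

The equivalent docker tag and push commands work just as well from the command line; the SDK form is convenient inside build scripts.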
Here’s what customers who are using Container Registry in production are saying:
“We’ve loved using Container Registry since we started with containers and Kubernetes last fall. It’s easy to forget how valuable it is, because it just works.” - Steve Reed, Principal Software Engineer at zulily
"Container Registry is a critical component in our move to a containerized deployment environment. It provides a simple environment for staging our Docker containers that gives us confidence they are secure and safe from tampering." - Dave Tucker, Vice President of Engineering, Workiva
“Container Registry allows our developers to create private repositories inside our projects' existing security settings to efficiently deploy our code in private, pre-built containers. At this scale, saving time on deployment saves money. Having very high speed network between Container Registry and compute instances is critical to begin processing our data on thousands of instances in seconds rather than minutes.” - Tim Kelton, Co-Founder, Descartes Labs
“The killer feature for Google Container Registry is the performance. Hands-down fastest to push and pull images.” - Avi Cavale, CEO & Co-Founder, Shippable
Get Started
To try out Container Engine, visit our site and documentation. And if your team has feedback and would like to work with us, please sign up here. To learn more about Container Registry, visit the documentation or provide feedback here. Together, Container Engine and Container Registry will enable you to unlock the promise of containerization.
- Posted by Eric Han & Kit Merker, Product Managers on Google Cloud Platform
Near Real-Time Log Streaming and Analysis with Google Cloud Platform & Logentries
Monday, June 22, 2015
At Google we spend a lot of time thinking about how we can make DevOps easy for Google Cloud Platform customers. Whether you are using Google App Engine, Google Compute Engine, or any other service, you want access to logs produced by your system and applications.
Google Cloud Platform delivers support for centralized logging via Google Cloud Logging, which provides you with the ability to view, search, and analyze log data. Cloud Logging includes the capability for log archival in Google Cloud Storage and the ability to send logs to Google BigQuery. In addition, Cloud Logging allows you to forward these logs to any custom endpoint, including third party log management services, for advanced and tailored log analytics via the near real-time streaming Google Cloud Pub/Sub API.
We are happy to announce a real-time integration of Logentries, a third party log analytics service, with Google Cloud Platform. Log management and analytics is a critical customer need, and we are excited to offer Google customers a choice to easily send logs to a key provider like Logentries. This integration offers Google Cloud Platform customers an easily configurable choice for log management and advanced analytics, including anomaly detection. Customers can now use Logentries with Google App Engine and services like Cloud Dataflow, which makes it even easier to get started. At Google, we are committed to creating an open ecosystem with an easy path of integration for partners, and Logentries provides a great example of a leading partner.
"Thanks to Google Cloud Logging export feature to Pub/Sub, it was easy to build a direct integration between Logentries and Google Cloud," explained Marc Concannon, VP of Product at Logentries. "The Pub/Sub API was well documented and the Google's commitment to developing an open collaboration made the integration smooth”.
Overview of the Google Cloud Pub/Sub API
Cloud Pub/Sub is a powerful messaging service responsible for routing data between applications at scale; it delivers notifications within milliseconds, even when handling more than 1 million messages per second. In essence, it is a near real-time, many-to-many, asynchronous messaging service that helps you create simple, reliable, and flexible applications by decoupling senders and receivers. It allows for secure and highly available communication between independently written applications.
Cloud Pub/Sub is thus an ideal service for transporting your logs, and it allows you to either push your log events or pull them as they happen.
Figure one: Google Cloud Pub/Sub Data Flow Schema
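To sketch what the pull side looks like in practice, the snippet below reads a handful of exported log messages from a Pub/Sub subscription using the current Python client library (which postdates this post); my-project and exported-logs-sub are placeholder names.

```python
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription = subscriber.subscription_path("my-project", "exported-logs-sub")

# Pull up to ten log messages that Cloud Logging has exported to Pub/Sub.
response = subscriber.pull(
    request={"subscription": subscription, "max_messages": 10}
)

for received in response.received_messages:
    # For Cloud Logging exports, the message data is typically the
    # JSON-serialized log entry.
    print(received.message.data.decode("utf-8"))

# Acknowledge the messages so they are not redelivered.
if response.received_messages:
    subscriber.acknowledge(
        request={
            "subscription": subscription,
            "ack_ids": [m.ack_id for m in response.received_messages],
        }
    )
```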
Logentries, a near real-time log analytics service, is the first third party service to integrate with Google Cloud Pub/Sub near real-time log streaming, allowing users to configure alerts and perform anomaly detection as well as advanced analytics.
How To Configure Logentries with Google Cloud Logging
Streaming Google Cloud Platform logs to Logentries can be configured as follows:
1. Enable the Cloud Pub/Sub API
2. Add the Logentries Service Account to your project
3. Configure Export to Cloud Pub/Sub (a programmatic sketch of this step follows below)
4. Add a Log in Logentries
Step by step instructions are available to get configured quickly.
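Step 3 above can also be expressed in code. Here is a minimal sketch, using the current Cloud Logging Python client rather than the tooling described in the instructions, of creating an export sink that forwards log entries to a Pub/Sub topic; my-project, logentries-export, and logentries-topic are placeholders, and the topic must already exist with publish permission granted to the sink.

```python
from google.cloud import logging

client = logging.Client(project="my-project")

# Create a sink that forwards log entries to a Pub/Sub topic; Logentries
# (or any other consumer) can then read from a subscription on that topic.
# No filter is set here, so all entries are exported; add one to narrow it.
sink = client.sink(
    "logentries-export",
    destination="pubsub.googleapis.com/projects/my-project/topics/logentries-topic",
)
sink.create()
```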
Near Real-Time Log Analytics
Logentries uses a unique pre-processing engine to perform advanced analysis on Google logs in near real time. Because the data is pre-analyzed, you need fewer complex search queries on your logs to identify important system or user activity.
Figure two: Logentries Log Management and Analysis Flow
Logentries’ integration with Google Cloud Platform enables you to pinpoint issues quickly as well as look at long term trends across your log data.
Some of the most useful capabilities of Logentries for Google Cloud Platform customers include:
Live Tail with Event Tagging: The Logentries pre-processing engine automatically tags important events such as exceptions, warnings, or errors, allowing you to easily spot issues in a live view of your log data.
Figure three: Live Tail of Logs in Logentries
Near Real-Time Notifications and Inactivity Alerts: Get notified about important events within seconds of them occurring. Notifications can be configured to be sent to email, or can be integrated with other third party APIs and tools (e.g. Slack, HipChat, PagerDuty…).
Use your Logs as Data: Logs contain lots of very useful information beyond stack traces and error codes. Field level log analytics allows you to extract key metrics (e.g. server resource usage, or API response time) from your logs and roll these metrics up into interesting charts and graphs.
Figure four: Live Charts in Logentries
Google Cloud Logging supports a long list of known log formats via the google-fluentd collector - e.g. Apache, Chef, MongoDB, NginX and several others. Logentries also provides out-of-the-box intelligence (tags, alerts and dashboards) for these log formats via the Logentries community packs, such that you do not need to spend time configuring rules or queries.
Get started with Logentries, now paired with the Google Cloud Logging service, today.
We are excited by this collaboration between Google Cloud Platform and Logentries, and we welcome your feedback. You can find more on the Logentries forum, as well as send us feedback at cloud-logging-feedback@google.com.
- Posted by Deepak Tiwari (Product Manager, Google Cloud Platform) and Trevor Parsons (Co-founder and Chief Scientist, Logentries)