Getting started with Red Hat OpenShift on Google Cloud Platform

March 22, 2016
Sami Zuhuruddin

Solutions Architect, Google Cloud Platform

We recently announced that Red Hat’s container platform OpenShift Dedicated will run on Google Cloud Platform, letting you hook up your OpenShift clusters to the full portfolio of Google Cloud services. So what’s the best way to get started?

We recommend starting with a Kubernetes-based sample solution. In the example below, we'll analyze incoming tweets using Google Cloud Pub/Sub (Google’s fully managed real-time messaging service that allows you to send and receive messages between independent applications) and Google BigQuery (Google's fully managed, no-ops, low-cost analytics database). This can be the starting point for incorporating social insights into your own services.

Step 0: If you don’t have a GCP account already, sign up for Cloud Platform, set up billing, and activate APIs.
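If you prefer the command line, this setup can also be scripted with the gcloud CLI. A minimal sketch, assuming the placeholder project ID my-project and the two APIs this example uses:

    # Point gcloud at your project (my-project is a placeholder).
    gcloud config set project my-project

    # Enable the APIs used in this walkthrough.
    gcloud services enable pubsub.googleapis.com bigquery.googleapis.com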

Step 1: Next you'll set up a service account. A service account is a way to interact with your GCP resources using an identity separate from your primary login, and is generally intended for server-to-server interaction. From the GCP Navigation Menu, click on "Permissions."

[Screenshot: the GCP Navigation Menu with "Permissions" selected]

Once there, click on "Service accounts."

[Screenshot: the "Service accounts" tab on the Permissions page]

Click on "Create service account," which will prompt you to enter a service account name. Provide a name relevant to your project and click on "Furnish a new private key." The default "JSON" Key type should be left selected.

[Screenshot: the "Create service account" dialog with "Furnish a new private key" checked and the "JSON" key type selected]

Step 2: Once you click "Create," a service account key file (“.json”) will be downloaded to your browser’s downloads location.

Important: Like any credential, this represents an access mechanism to authenticate and use resources in your GCP account — KEEP IT SAFE! Never place this file in a publicly accessible source repo (e.g., public GitHub).
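For reference, the console steps above can also be done from the gcloud CLI. A sketch, using placeholder names (my-openshift-sa, my-project, gcp-credentials.json):

    # Create the service account.
    gcloud iam service-accounts create my-openshift-sa \
        --display-name "OpenShift tutorial account"

    # Generate and download a JSON key for it.
    gcloud iam service-accounts keys create gcp-credentials.json \
        --iam-account my-openshift-sa@my-project.iam.gserviceaccount.com

The same KEEP IT SAFE caveat applies to gcp-credentials.json.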

Step 3: We’ll be using the JSON credential via a Kubernetes secret deployed to your OpenShift cluster. To do so, first perform a base64 encoding of your JSON credential file:

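On Linux, something like the following (assuming the key was saved as gcp-credentials.json; substitute your actual filename):

    # -w 0 disables GNU base64's default 76-column line wrapping,
    # so the output is one long string.
    base64 -w 0 gcp-credentials.json

On macOS, use base64 -i gcp-credentials.json (BSD base64 takes its input file via -i and doesn't wrap its output).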

Keep the output (a very long string) ready for the next step, where you’ll replace ‘BASE64_CREDENTIAL_STRING’ in the secret definition below with it.

Important: Note that base64 is an encoding (not encryption) and can be readily reversed, so any file containing the base64 string is just as confidential as the credential file above.
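To see why, note that anyone who obtains the string can recover the key in one step (encoded.txt is a placeholder file holding the base64 string):

    # Decoding reproduces the original JSON credential verbatim.
    base64 --decode encoded.txt > recovered-credentials.json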

Step 4: Next you’ll create the Kubernetes secret inside your OpenShift cluster. A secret is the proper place to make sensitive information available to pods running in your cluster (like passwords or the credentials downloaded in the previous step). This is what your secret definition will look like (e.g., google-secret.yaml):

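A sketch of google-secret.yaml (the secret name google-services-secret and the key google-services.json are placeholders; your pods will reference both, so keep them consistent):

    apiVersion: v1
    kind: Secret
    metadata:
      name: google-services-secret
    type: Opaque
    data:
      # Kubernetes expects secret data values to be base64-encoded,
      # which is exactly what you produced in the previous step.
      google-services.json: BASE64_CREDENTIAL_STRING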

Replace ‘BASE64_CREDENTIAL_STRING’ with the base64 output from the prior step.

You’ll want to add this file to your source-control system (minus the credentials).

Step 5: Deploy the secret to the cluster:

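With the OpenShift CLI this is a single command (kubectl create -f google-secret.yaml works the same way if you're using kubectl directly):

    oc create -f google-secret.yaml

You can verify it was created with oc get secrets.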

Step 6: Now you’re in a position to use Google APIs from your OpenShift cluster. To take your GCP-enabled cluster for a spin, try going through the steps detailed in the write-up: https://cloud.google.com/solutions/real-time/kubernetes-pubsub-bigquery

You’ll need to make two minor tweaks for the solution to work on your OpenShift cluster:

  • For any pods that need to access Google APIs, modify the pod definition to reference the secret and set the “GOOGLE_APPLICATION_CREDENTIALS” environment variable in the pod; Application Default Credentials use it to locate the key file. More info on how they work:
    https://developers.google.com/identity/protocols/application-default-credentials#howtheywork

    In the Pub/Sub-BigQuery solution, that means you’ll modify two pod definitions:

    • pubsub/bigquery-controller.yaml
    • pubsub/twitter-stream.yaml

    For example:
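A sketch of the relevant fragment of a modified pod definition (the container name, mount path, and secret/key names follow the placeholders used earlier; adjust them to match the solution's actual specs):

    spec:
      containers:
        - name: bigquery-controller
          # Application Default Credentials reads this variable
          # to locate the mounted key file.
          env:
            - name: GOOGLE_APPLICATION_CREDENTIALS
              value: /etc/secretspath/google-services.json
          volumeMounts:
            - name: gcp-credentials
              mountPath: /etc/secretspath
              readOnly: true
      volumes:
        - name: gcp-credentials
          secret:
            secretName: google-services-secret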
