Dev/Test Best Practices – The Cloud Paradigm

Project Creation
Everything starts with a project, the construct under which all other components in GCP live. In this example, our project will be “sada-super-fun-devtest.”
Instance Creation
We’ll assume that our front door is an IP-enabled IoT device capable of sending relevant information to another IP-enabled system. We’ll spin up a Google Compute Engine virtual machine to ingest this data and SSH into it.
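As a minimal sketch, the instance can be created and reached with the Cloud SDK’s gcloud tool; the instance name, zone, and machine type below are placeholders chosen for this example rather than details from the exercise itself:

    # Create a small VM to receive data from our IoT devices
    # (instance name, zone, and machine type are placeholders)
    gcloud compute instances create iot-ingest-vm \
        --project=sada-super-fun-devtest \
        --zone=us-central1-a \
        --machine-type=n1-standard-1

    # Open an SSH session on the new instance
    gcloud compute ssh iot-ingest-vm \
        --project=sada-super-fun-devtest \
        --zone=us-central1-a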
Storage Setup
Great, now we have access to our new instance, and it only took a few seconds. Next, let’s create a place to keep all of the data we’re gathering from our IoT devices. With Google Cloud Storage (GCS) we’re able to access infinitely scalable BLOB storage with no need for pre-provisioning. Even better, we’ll only pay for what we use (as with all things GCP), so this test will only use a few pennies’ worth of storage per month. Since our virtual machine instance is already in our project, it has permission to work with our project’s Google Cloud Storage, with no need to authorize or otherwise gain access. Creating a new storage bucket from this instance is therefore extremely straightforward. Let’s create a new bucket and move a piece of test data in to ensure it’s working.
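From that SSH session, a sketch of the bucket setup with gsutil might look like the following; the bucket and file names are placeholders for this example (bucket names must be globally unique):

    # Create a new bucket in the project (bucket name is a placeholder)
    gsutil mb -p sada-super-fun-devtest gs://sada-super-fun-devtest-iot-data

    # Copy a sample IoT reading into the bucket and confirm it arrived
    gsutil cp sample-reading.json gs://sada-super-fun-devtest-iot-data/
    gsutil ls gs://sada-super-fun-devtest-iot-data/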
Google BigQuery
Google BigQuery is an analytics engine provided as a service. With BigQuery, we’re able to analyze massive datasets without any regard for the underlying compute infrastructure required. There is no need to procure the compute or the human resources required to spin up a Hadoop environment, which is great for us since this is a dev/test experiment where time and budget are tightly controlled. Again, because BigQuery is another service in the same Google Cloud Platform project, moving data from our storage bucket to BigQuery is simple.
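As a sketch of that step, the bq command-line tool can create a dataset and load the object straight from Cloud Storage; the dataset, table, and file names below are assumptions for this example:

    # Create a dataset in the project (dataset name is a placeholder)
    bq --project_id=sada-super-fun-devtest mk iot_devtest

    # Load the sample reading from Cloud Storage into a new table,
    # letting BigQuery detect the schema automatically
    bq --project_id=sada-super-fun-devtest load \
        --autodetect \
        --source_format=NEWLINE_DELIMITED_JSON \
        iot_devtest.readings \
        gs://sada-super-fun-devtest-iot-data/sample-reading.json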

Bringing it Together
Now we know we’re able to take the data from our devices, store it safely and securely in Google Cloud Storage, and ingest it into Google BigQuery for analysis. Surely our executive team won’t want to run these commands each time they want to learn something new about this dataset. Luckily, Google App Engine, GCP’s Platform-as-a-Service offering, allows the same tight integration seen in the products above. Since Google App Engine removes the need to worry about provisioning network or compute infrastructure, and since authentication is already built in across the services in our project, we can write some simple code to provide a GUI that taps into this BigQuery data (a minimal sketch of the idea follows at the end of this section). The frontend application we’ve built will scale automatically based on usage, so we’re only concerned with the logic behind our code. Because we’re not concerned with the mechanics of the app, we can spend far more time building a great user experience for our executive team.

All in all, this exercise took about 45 minutes from start to finish and cost approximately $0.01. You read that correctly, one cent – it wasn’t a typo. In that time, and with almost no spend, we set up a virtual machine to ingest our data, built an infinitely scalable storage bucket to hold it, ingested that data into a data warehouse, queried the data and spun up a front-end UI to display it.

This is just a small example of the power, ease and affordability of developing and testing in Google Cloud Platform. Contact us today to get started and see what you can build.
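For illustration only, here is the kind of query the App Engine front end might run on the executive team’s behalf, along with the deploy step; the dataset, table, column names, and app.yaml file are assumptions for this sketch, not details from the exercise above:

    # A query the front end might run on behalf of the executive team
    # (dataset, table, and column names are placeholders)
    bq --project_id=sada-super-fun-devtest query --use_legacy_sql=false \
        'SELECT device_id, COUNT(*) AS readings
         FROM `sada-super-fun-devtest.iot_devtest.readings`
         GROUP BY device_id'

    # Deploy the App Engine front end from its source directory
    # (assumes an app.yaml is present in that directory)
    gcloud app deploy app.yaml --project=sada-super-fun-devtest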