Organizations are in constant need of sandboxes for developing the next great thing.
I wrote a blog post back in 2013 about how the ease of access and low cost of public clouds enable computer science students to develop, test, and learn at a quicker pace. I realize now that this doesn't apply exclusively to students – even Snapchat started out as an unproven concept before becoming the behemoth it is today.
Google Cloud Platform (GCP) provides the ideal environment for such development and test workloads. Without the traditional bounds of procuring computing resources for dev/test, teams can develop, iterate, test, fail, and learn at a quicker clip without the costs previously associated with such work.
Furthermore, GCP provides access to numerous intelligent management tools which are fully integrated within the platform. GCP also provides access to a new generation of developer tools, designed to allow developers to focus on their code with little regard for the underlying infrastructure. Essentially, they can expect things to “just work.”
In this post, I’ll review the extremely simple process required to spin up (and down) a dev/test environment based in Google Cloud Platform. We’ll explore the work required to create an environment which will collect, store, process, and present information from our front door (entry scans).
Everything starts with a project, the construct under which all other GCP components are organized. In this example, our project will be “sada-super-fun-devtest.”
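If you prefer the command line to the Cloud Console, creating and selecting the project with the gcloud CLI might look like the following sketch (the project ID matches the example above; project IDs must be globally unique, so yours will differ):

```shell
# Create the project (ID must be globally unique)
gcloud projects create sada-super-fun-devtest

# Point subsequent gcloud/gsutil/bq commands at the new project
gcloud config set project sada-super-fun-devtest
```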
We’ll assume that our front door is an IP-enabled IoT device capable of sending relevant information to another IP-enabled system. We’ll spin up a Google Compute Engine virtual machine to ingest this data and SSH into it.
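Creating the instance and connecting to it takes two commands. A minimal sketch (the instance name, zone, and machine type here are illustrative choices, not requirements):

```shell
# Spin up a small VM to act as our ingest listener
# (name, zone, and machine type are illustrative)
gcloud compute instances create entry-scan-listener \
    --zone=us-central1-a \
    --machine-type=e2-small

# SSH into the new instance
gcloud compute ssh entry-scan-listener --zone=us-central1-a
```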
Great, now we have access to our new instance, and it only took a few seconds. Next, let’s create a place to keep all of the data we’re gathering from our IoT devices. With Google Cloud Storage (GCS) we’re able to access infinitely scalable BLOB storage with no need for pre-provisioning. Even better, we’ll only pay for what we use (as with all things GCP), so this test will only use pennies/month worth of storage.
Since our virtual machine instance is already in our project, it has permissions to work with our project’s Google Cloud Storage — no need to authorize or otherwise gain access. Creating a new storage bucket from this instance is therefore extremely straightforward. Let’s create a new bucket and move a piece of test data into it to ensure everything is working.
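From the instance's shell, that looks something like this sketch (the bucket name and the shape of the test record are assumptions for illustration; bucket names are globally unique):

```shell
# Create a bucket (name is illustrative and must be globally unique)
gsutil mb gs://sada-super-fun-devtest-entry-scans

# Write a sample entry-scan record and copy it up to confirm access
echo '{"door":"maintenance","event":"entry","ts":"2017-01-01T09:00:00Z"}' > test-scan.json
gsutil cp test-scan.json gs://sada-super-fun-devtest-entry-scans/

# Verify the object arrived
gsutil ls gs://sada-super-fun-devtest-entry-scans/
```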
Easy. Now we have our VM talking to our newly created infinite (and cheap) BLOB store. Let’s suppose that our IoT agent is now able to send data to the listener software we’ve installed on the instance. That software is sending the data it receives to our GCS bucket in real-time after it performs some data normalization for us.
Now that our data is safely stored in Google Cloud Storage, what can we do with it? Ultimately this test is meant to determine when people enter and exit our space. For the sake of illustration, let’s assume we have a few years of data ingested now. We need a data warehousing solution to analyze all of this data as efficiently as possible.
Google BigQuery is an analytics engine provided as a service. With BigQuery, we’re able to analyze massive datasets without any regard for the underlying compute infrastructure. There is no need to procure either the compute or the human resources required to stand up a Hadoop environment, which is great for us since this is a dev/test experiment where time and budget are tightly controlled. And because BigQuery is another service within the same Google Cloud Platform project, moving data from our storage bucket into BigQuery is simple.
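Loading the bucket's contents can be done with the bq command-line tool. A sketch, assuming our listener wrote newline-delimited JSON and using illustrative dataset and table names:

```shell
# Create a dataset to hold our entry-scan table (names are illustrative)
bq mk entry_scans

# Load the newline-delimited JSON from GCS, letting BigQuery infer the schema
bq load --source_format=NEWLINE_DELIMITED_JSON --autodetect \
    entry_scans.events \
    gs://sada-super-fun-devtest-entry-scans/*.json
```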
Now our data has been ingested into Google BigQuery and we’re ready to start querying. Easy, right? Now it’s time to get some value from this data. Let’s find out how many times the maintenance door was used according to our data.
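That question is a one-line SQL query. A sketch, assuming the dataset/table names from the load step and that our schema includes a `door` column:

```shell
# Count uses of the maintenance door
# (the "door" column name is an assumption about our ingested schema)
bq query --use_legacy_sql=false \
    'SELECT COUNT(*) AS maintenance_uses
     FROM entry_scans.events
     WHERE door = "maintenance"'
```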
Bringing it Together
Now we know we’re able to take our data from our devices, store it safely and securely in Google Cloud Storage, and ingest it into Google BigQuery for analysis. Surely our executive team won’t want to perform these commands each time they want to know a new piece of information about this dataset. Luckily, Google App Engine, GCP’s Platform-as-a-Service offering, allows the same tight integration seen in the products above.
Since Google App Engine removes the need for us to worry about provisioning network or compute infrastructure, and since authentication is already built into the other services in our project, we can write some simple code to provide a GUI that taps into this BigQuery data. The frontend application we build will scale automatically based on usage, so we’re only concerned with the logic behind our code. Since we’re not concerned with the mechanics of the app, we can spend far more time building a great user experience for our executive team.
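Deploying such an app is a single command once the application code is in place. A minimal sketch (the runtime choice is an assumption, and the app code itself isn't shown here):

```shell
# A minimal App Engine config (app.yaml); runtime choice is an assumption
cat > app.yaml <<'EOF'
runtime: python37
EOF

# Deploy the app; App Engine handles provisioning, scaling, and serving
gcloud app deploy app.yaml
```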
All in all, this exercise took about 45 minutes from start to finish and cost approximately $0.01. You read that correctly, one cent – it wasn’t a typo. In that time and with almost no spend, we set up a virtual machine to ingest our data, created an infinitely scalable storage bucket to hold our data, ingested that data into a data warehouse, queried the data, and spun up a front-end UI to display the data.
This is just a small example of the power, ease and affordability of developing and testing in Google Cloud Platform. Contact us today to get started and see what you can build.