Today, we are announcing the beta availability of a new serverless compute offering called Cloud Run that lets you run stateless, HTTP-driven containers without worrying about the underlying infrastructure. Cloud Run is a fully serverless offering: it takes care of all infrastructure management, including provisioning, configuring, scaling, and managing servers. It automatically scales up or down within seconds, even to zero depending on traffic, ensuring you pay only for the resources you actually use.
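The contract Cloud Run asks of a container is small: serve HTTP on the port passed in the PORT environment variable (conventionally defaulting to 8080). As a minimal sketch, here is a standard-library-only Python service that honors that contract; the handler and greeting are illustrative, not part of any Cloud Run API:

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer


def get_port():
    # Cloud Run injects the port to listen on via the PORT env var;
    # fall back to 8080 for local runs.
    return int(os.environ.get("PORT", "8080"))


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A stateless handler: every response is computed from the
        # request alone, which is what lets Cloud Run scale to zero.
        body = b"Hello from Cloud Run!"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


def run():
    # Container entry point: bind to the injected port and serve
    # until the instance is stopped.
    HTTPServer(("", get_port()), Handler).serve_forever()
```

Packaged with any Dockerfile that calls `run()`, this image can be deployed unchanged to fully managed Cloud Run or to Cloud Run on GKE.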
Veolia, a global leader in optimized water, waste, and energy management solutions, is already benefiting from Cloud Run:
“Cloud Run removes the barriers of managed platforms by giving us the freedom to run our custom workloads at lower cost on a fast, scalable, and fully managed infrastructure. Our development team benefits from a great developer experience without limits and without having to worry about anything.” —Hervé Dumas, Group Chief Technology Officer, Veolia
Cloud Run is also available on GKE, meaning you can deploy the same stateless HTTP services to your existing GKE clusters while abstracting away complex Kubernetes concepts.
Using Cloud Run on GKE also gives you access to custom machine types, Compute Engine networks, and the ability to run side-by-side with other workloads deployed in the same cluster. It combines the deployment simplicity of Cloud Run with the flexibility of GKE. Customers such as Airbus Aerial are already using Cloud Run on GKE to process and stream aerial images.
“With Cloud Run on GKE, we are able to run lots of compute operations for processing and streaming cloud-optimized aerial images into web maps without worrying about library dependencies, auto-scaling or latency issues.” —Madhav Desetty, Chief Software Architect, Airbus Aerial
We are continuing to strengthen our serverless portfolio through deep partnerships with industry leaders such as Datadog, NodeSource, GitLab, and StackBlitz. These partnerships provide integration support for Cloud Run across application monitoring, coding, and deployment stages.
Enabling portability with Knative
We recognize that you may want to run some workloads on-premises or across multiple clouds. Cloud Run is based on Knative, an open API and runtime environment that lets you run your serverless workloads anywhere you choose—fully managed on Google Cloud Platform, on your GKE cluster, or on your own self-managed Kubernetes cluster. Thanks to Knative, it’s easy to start with Cloud Run and move to Cloud Run on GKE later on. Or you can use Knative in your own Kubernetes cluster and migrate to Cloud Run in the future. By using Knative as the underlying platform, you can move your workloads across platforms, substantially reducing switching costs.
Since it launched eight months ago, Knative has already reached version 0.5, with over 50 contributing companies, 400 contributors, and more than 3,000 pull requests. Visit the Knative project to learn more and get involved.
New enhancements to Cloud Functions
For developers looking to quickly and easily connect cloud services, we’ve got you covered. Google Cloud Functions is an event-driven serverless compute platform that lets you write code that responds to events, without worrying about the underlying infrastructure. Cloud Functions makes it easy to connect to cloud services such as BigQuery, Pub/Sub, Firebase, and many more.
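To make the event-driven model concrete, here is a minimal sketch of a Python 3.7 background function triggered by a Pub/Sub message. The function name is hypothetical; the `(event, context)` signature and the base64-encoded `data` field follow the documented background-function shape:

```python
import base64


def hello_pubsub(event, context):
    """Background Cloud Function triggered by a Pub/Sub message.

    Args:
        event (dict): the Pub/Sub message; the message body arrives
            base64-encoded under the "data" key.
        context: event metadata (event ID, timestamp, etc.),
            unused in this sketch.
    """
    name = "World"
    if event.get("data"):
        name = base64.b64decode(event["data"]).decode("utf-8")
    message = f"Hello, {name}!"
    print(message)  # appears in Stackdriver Logging when deployed
    return message
```

Cloud Functions invokes the handler once per event; there is no server to write or scale.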
Today, we are also announcing a number of new and frequently requested features to help you adopt functions seamlessly within your current environment:
- Support for new language runtimes: Node.js 8, Python 3.7, and Go 1.11 in general availability; Node.js 10 in beta; Java 8 and Go 1.12 in alpha.
- The new open-source Functions Framework, available for Node.js 10, will help you take the first step towards making your functions portable. You can now write a function, run it locally and build a container image to run it in any container-based environment.
- Serverless VPC Access, which creates a VPC connector that lets your function talk to your existing GCP resources that are protected by network boundaries, without exposing the resources to the internet. This feature allows your function to use Cloud Memorystore as well as hundreds of third-party services deployed from the GCP Marketplace. It is available in beta starting today.
- Per-function identity, which lets you scope access controls to an individual function, is now generally available.
- Scaling controls, now available in beta, help prevent your auto-scaling functions from overwhelming backends that cannot scale up as quickly as serverless functions do.
Functions provide agility and simplicity to make your developers more productive. But not all applications need to be broken down into granular functions. Sometimes you want to deploy large applications, while still leveraging the benefits of serverless.
New second generation runtimes in App Engine
Google pioneered serverless computing more than 11 years ago with App Engine, a serverless application platform for deploying highly scalable web and mobile apps. Since its inception, App Engine has evolved to meet developers where they are, whether it’s adding capabilities or support for new runtimes.
Today, we are announcing support for new second generation runtimes: Node.js 10, Go 1.11, and PHP 7.2 in general availability, and Ruby 2.5 and Java 11 in alpha. These runtimes provide an idiomatic developer experience and faster deployments, remove previous API restrictions, and support native modules. The above-mentioned Serverless VPC Access also lets you connect to your existing GCP resources from your App Engine apps in a more secure manner, without exposing them to the internet.
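One practical consequence of the removed API restrictions is that the second-generation Python runtime serves any standard WSGI application rather than requiring App Engine-specific APIs. A minimal stdlib-only sketch (the callable name and response are illustrative; in a real deployment an `app.yaml` declaring the runtime would accompany it):

```python
def app(environ, start_response):
    # A plain WSGI callable; App Engine's second-generation Python
    # runtime can serve this through any WSGI server (e.g. gunicorn)
    # named as the app's entrypoint.
    body = b"Hello, App Engine!"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]

# For local development you could serve this with the standard
# library, e.g. wsgiref.simple_server.make_server("", 8080, app);
# in production App Engine fronts the app with its own HTTP layer.
```

Because the interface is plain WSGI, frameworks such as Flask or Django work unmodified on the same runtime.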
Build full-stack serverless apps
Perhaps the biggest benefit of developing applications with Google’s approach to serverless is the ease with which you can tap into a full stack of additional services. You can build end-to-end applications by leveraging services across databases, storage, messaging, data analytics, machine learning, smart assistants, and more, without worrying about the underlying infrastructure.