Welcome to serverless computing or functions as a service (FaaS).
The move from on-premises, monolithic applications to virtualized environments took about 15 years, as explained in our blog post “Are monolithic software applications doomed for extinction?”. Since then, development tools delivered as cloud services, such as AWS Lambda, have matured.
In the past few years we have seen a trend toward containers and microservices for designing, building, testing, deploying and managing enterprise applications. Serverless computing is the logical next step for developers building small, stateless applications that require no concern for the provisioning, scaling and maintenance of the hardware needed to run them.
Serverless computing shares traits with microservices, its more mature sibling. Both break down monolithic applications into smaller building blocks. But the blocks in serverless environments are called functions and, existing as just snippets of code, represent the most elemental building block in software development. Processing an image, encoding a piece of video or transforming a block of data all represent a single-purpose block of code a developer can create to accomplish a very specific business outcome.
For example, The Seattle Times uses an AWS Lambda function to automatically resize images uploaded to Amazon Simple Storage Service (S3). The act of uploading the image triggers the resize, which is a single function.
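The shape of such a function is simple enough to sketch. The sketch below is illustrative rather than taken from The Seattle Times' actual code: the event layout mirrors S3 put notifications, and the real download, resize and upload calls (e.g. via boto3 and Pillow) are reduced to comments so only the single-purpose logic remains.

```python
# Illustrative sketch of a single-purpose, S3-triggered resize function.
# All names and sizes are assumptions for the example.

TARGET_WIDTH = 640  # placeholder output width


def parse_s3_event(event):
    """Pull (bucket, key) out of an S3 put-notification payload."""
    record = event["Records"][0]["s3"]
    return record["bucket"]["name"], record["object"]["key"]


def scaled_size(width, height, target_width=TARGET_WIDTH):
    """Compute output dimensions that preserve the aspect ratio."""
    return target_width, round(height * target_width / width)


def handler(event, context=None):
    """Entry point the cloud service invokes once per uploaded image."""
    bucket, key = parse_s3_event(event)
    # Real code would fetch the object, resize it (e.g. with Pillow),
    # and write the result to a *different* bucket so the upload does
    # not re-trigger this same function.
    return {"source": f"{bucket}/{key}", "target_size": scaled_size(3000, 2000)}
```

The function does one thing, is triggered by one event, and carries no server configuration of its own.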
Contrast this with microservices. For example, in a containerized/microservices environment, five basic functions in a bundle might collectively represent “load shopping cart” or “provide account summary.”
Serverless computing allows the developer to construct single-purpose functions when the context warrants it. Developers don’t always need an entire microservice composed of several functions bundled into an outcome or capability. A snippet of code will do.
One way to think of the difference: functions are building blocks that can be part of a microservice or stand on their own. The microservice is what the developer actually talks to, because it presents a common API in front of its functions.
Another difference is hardware overhead. While lighter weight and easier to provision than a monolith, microservices still require an operating system, a deployment mechanism, and monitoring of both OS availability and application configuration. (Creating a library of configuration profiles that, when married to your microservice, are called an image makes this easier, but the overhead still exists.)
A serverless application development approach offers several advantages in specific business circumstances.
It’s truly serverless:
All software needs hardware, and serverless is no exception. The difference is in how much developers need to worry about hardware compatibility and rely on hardware for software deployment. Those who opt for serverless don’t need to worry about it at all.
As application development evolved from legacy on-premises computing to cloud and virtualization to microservices to FaaS, hardware became less and less of a concern, and now it can be no concern at all. Developers truly can spend all their time building, testing and deploying application functions. Once they upload their code to AWS, the cloud service manages the capacity, scaling, patching and administration of the hardware required to run each function.
Historically, developers have been acutely concerned with how much hardware will be needed at launch and whether the program will scale adequately to users' needs. For some workloads, FaaS is vastly more scalable than legacy or even virtualized environments: developers no longer have to over-provision in anticipation of spikes in user demand.
A simple example might involve a large web retailer keeping customers informed of new products. The website builds a function to collect email addresses and add them to a database. The site is then hit with several hundred thousand unique visitors, many of whom sign up for the new-product notification. The cloud provider easily handles the spike in traffic: each event, a request to be updated about new product releases, triggers automated hardware provisioning.
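A minimal sketch of that signup function follows. The event shape, field names and validation rule are assumptions for illustration, and an in-memory set stands in for the database a deployed function would write to:

```python
import json
import re

# In-memory stand-in for the real datastore (e.g. a managed database);
# a deployed function would call out to an external store instead.
SIGNUPS = set()

# Deliberately loose email check, just for illustration.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")


def handler(event, context=None):
    """Runs once per signup request. Absorbing a traffic spike means the
    provider running more copies in parallel; no capacity planning appears
    anywhere in this code."""
    email = json.loads(event.get("body", "{}")).get("email", "").strip().lower()
    if not EMAIL_RE.match(email):
        return {"statusCode": 400, "body": "invalid email"}
    SIGNUPS.add(email)
    return {"statusCode": 200, "body": "subscribed"}
```

Whether one visitor or several hundred thousand hit the site, the developer's code is identical; only the number of concurrent copies the provider runs changes.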
Faster time to market:
Revisit the previous example. If the product website had deployed its own hardware, either on-premises or in the cloud, it would confront serious issues of capacity planning and testing before the functionality could be added. Legacy computing environments simply could not respond with the speed and agility the current marketplace demands.
Pricing more aligned to customer need:
The prevailing business model with FaaS is that customers pay only for the computing time they use. This is ideal for startups with uncertain IT needs or for workloads that run infrequently. The provider simply charges by a combination of the number of requests and their execution time.
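Back-of-the-envelope math makes the model concrete. The rates below are placeholders, not any provider's actual price list; the structure of the calculation, a per-request fee plus a fee per unit of memory-time consumed, is the point:

```python
def monthly_cost(invocations, avg_ms, mem_gb,
                 per_request=0.0000002, per_gb_second=0.0000166667):
    """Pay-per-use cost: a request fee plus duration x memory,
    at placeholder (not real) rates."""
    gb_seconds = invocations * (avg_ms / 1000.0) * mem_gb
    return invocations * per_request + gb_seconds * per_gb_second


# One million 200 ms invocations at 512 MB come to a couple of dollars
# at these illustrative rates, and an idle month costs nothing at all.
cost = monthly_cost(1_000_000, avg_ms=200, mem_gb=0.5)
```

For a startup that cannot predict its load, paying a function-shaped bill beats paying for idle servers.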
Conclusion: Developers are free to do more
Serverless computing is a bit of a misnomer when considering the benefits of FaaS. As hardware's value in enterprise computing continues to erode with this evolutionary step, enterprise application developers take a big leap upward in their value proposition. They can say: “I focus on building useful software while someone else worries about what hardware will be used, whether there will be enough of it, and whether it will be available when needed.”
Mike is a hands-on software architect with 24 years of experience in cloud architectures, custom application development, e-commerce, distributed systems and big data implementations. Throughout his career, he has continually pushed the use of test-driven development, the principles of continuous delivery and emphasized the merits of Agile-based development. Mike has co-presented with Google / K8s luminaries such as Kelsey Hightower, Alan Naim and Carter Morgan.