Serverless Is Not Just Lambda

Let’s take a look at the serverless AWS offerings that will save us a lot of money and effort.


What is Serverless?

Serverless means building and running applications without thinking about servers.

There are servers, of course; they're just managed by someone else. Developers are not concerned with virtual or physical server machines, nor with other infrastructure aspects such as host-level networking.

Benefits

Cost savings, together with implicit scaling and high-availability capabilities, are the greatest advantages of serverless architectures.

Moreover, serverless computing is much greener: no resources are kept up and running when they aren't needed, which avoids wasting energy and helps save our planet.

Drawbacks

Serverless comes with some disadvantages to take into account. Due to latency issues, vendor lock-in, and potential development difficulties, Serverless doesn't fit every use-case.

Use-Cases

Serverless architecture is a great fit for asynchronous, stateless applications that can be started instantaneously as well as for use cases that see infrequent, unpredictable surges in demand.

Typical use-cases for Serverless are:

Serverless vs FaaS (Function as a Service)

While FaaS is serverless, Serverless does not mean only FaaS. Of the AWS serverless services covered here (Lambda, Fargate, Aurora Serverless), only AWS Lambda is a representative of FaaS.

In some literature, serverless services other than FaaS are called BaaS (Backend as a Service). Taking BaaS into account, we can simply say that Serverless equals FaaS + BaaS.

Serverless vs PaaS (Platform as a Service)

Serverless is similar to PaaS; however, it goes beyond the traditional usage of containers by removing the concept of long-lived server components.

With PaaS (such as Cloud Foundry), developers have much more low-level control over operating the application and still have to provision and scale the platform. Serverless is designed to bring entire applications up and down for individual requests automatically.

Another difference can be found in the pricing models. While with PaaS the costs are tied to reserved compute, network, and storage resources, with serverless you never pay for resources that sit idle.

Serverless vs SaaS (Software as a Service)

Simply put, SaaS is about using applications while Serverless is about building them.

SaaS usually offers a complete, ready-to-use business solution (such as Salesforce CRM, Microsoft Dynamics 365, or Dropbox) with a different pricing model (user-based subscriptions).

Serverless offerings, however, consist of building blocks for application development, such as backend services (storage, databases, etc.) and computation models (functions and container engines).

AWS Lambda

Lambda is a serverless compute service to run code. The code is organized into independent units called functions. Each function has well-defined inputs and outputs and is written in one of the languages that Lambda supports. Alternatively, a custom execution container image can be assigned to the function.
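
For illustration, a minimal Python function that Lambda can run might look like the sketch below. The handler name and the shape of the event are assumptions you configure per function; the event and context arguments are supplied by the Lambda runtime:

```python
import json

def lambda_handler(event, context):
    """Entry point invoked by the Lambda runtime for every request.

    `event` carries the input (for example an API Gateway request payload),
    `context` exposes runtime metadata such as the remaining execution time.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```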

Lambda provides high-availability infrastructure out of the box, including server and operating system maintenance, capacity provisioning and automatic scaling, and code monitoring and logging.

You pay as you go based on the number of invocations, the consumed compute time, and the amount of memory assigned to the function.
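
As a rough back-of-the-envelope sketch of this pricing model (the per-GB-second and per-request rates below are assumptions based on commonly published figures; they vary by region, so always check the current AWS price list):

```python
# Hypothetical monthly cost for a Lambda function, ignoring the free tier.
PRICE_PER_GB_SECOND = 0.0000166667    # USD per GB-second of compute (assumed rate)
PRICE_PER_REQUEST = 0.20 / 1_000_000  # USD per invocation (assumed rate)

memory_gb = 0.5          # function configured with 512 MB
duration_s = 0.2         # average execution time of 200 ms
invocations = 1_000_000  # invocations per month

compute_cost = memory_gb * duration_s * invocations * PRICE_PER_GB_SECOND
request_cost = invocations * PRICE_PER_REQUEST
print(f"~${compute_cost + request_cost:.2f} per month")  # roughly $1.87
```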

AWS Fargate

AWS Fargate is a serverless compute engine for containers that works with both EKS and ECS.

Fargate is a launch-type alternative to EC2 in which provisioning and management of the infrastructure resources (compute and memory) are fully handled by AWS.

Fargate supports all of the common container use cases, such as microservices architectures, batch processing, and machine learning.

With Fargate, applications are decoupled from the underlying hosts. As the underlying servers are abstracted away, direct interactions with them, such as logging in or customizing the host, are no longer possible.
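
As a sketch of what the launch-type switch looks like in practice, the boto3 call below runs an ECS task on Fargate; the cluster name, task definition, subnet, and security group are hypothetical placeholders:

```python
import boto3

ecs = boto3.client("ecs")

# Running a task on Fargate instead of EC2 is a matter of choosing the FARGATE
# launch type; AWS provisions the compute and memory behind the scenes.
response = ecs.run_task(
    cluster="demo-cluster",        # placeholder cluster name
    launchType="FARGATE",
    taskDefinition="demo-task:1",  # placeholder task definition
    count=1,
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],
            "securityGroups": ["sg-0123456789abcdef0"],
            "assignPublicIp": "ENABLED",
        }
    },
)
print(response["tasks"][0]["taskArn"])
```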

Fargate with EKS

Alongside self-managed EC2 nodes and managed node groups, AWS Fargate is the third compute option for an EKS cluster. A Fargate profile is the interface for configuring Fargate on an EKS cluster.
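
For illustration, a Fargate profile could be created with boto3 roughly as follows; eksctl or CloudFormation work just as well, and the cluster name, namespace, role ARN, and subnet IDs are hypothetical placeholders:

```python
import boto3

eks = boto3.client("eks")

# A Fargate profile tells EKS which pods to run on Fargate: any pod whose
# namespace (and optional labels) matches a selector is placed on Fargate capacity.
eks.create_fargate_profile(
    clusterName="demo-cluster",
    fargateProfileName="demo-profile",
    podExecutionRoleArn="arn:aws:iam::123456789012:role/demo-pod-execution-role",
    subnets=["subnet-0123456789abcdef0", "subnet-0fedcba9876543210"],
    selectors=[
        {"namespace": "demo", "labels": {"runtime": "fargate"}},
    ],
)
```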

You don’t see any virtual machines in your account when running EKS pods on Fargate — it’s completely serverless.

The Fargate scheduler is installed on EKS as an add-on extension to the Kubernetes scheduler; it schedules and runs pods on Fargate.

Cluster scaling is fully managed by AWS; you scale only your application.

Load balancing can easily be done using the AWS Load Balancer Controller.

Limitations of running EKS on Fargate

Amazon Aurora Serverless

Amazon Aurora Serverless is an on-demand, auto-scaling configuration for Amazon Aurora, the MySQL- and PostgreSQL-compatible relational database.

Serverless Aurora scales automatically up and down, starts up, and shuts down based on your application’s demands.

Serverless Aurora covers the typical use-cases of a serverless database: infrequent, intermittent, or unpredictable workloads.

For instance, if you run a test environment where testing is performed a few times a day and nothing is going on during the night, Serverless Aurora will definitely save you a couple of bucks...
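
A minimal sketch of creating such a cluster with boto3, assuming the Aurora Serverless v1 engine mode; the identifier, credentials, and capacity settings are placeholders to be tuned to your workload, and you may also need to pin an EngineVersion that supports the serverless engine mode:

```python
import boto3

rds = boto3.client("rds")

# Create an Aurora Serverless (v1) cluster that scales between 1 and 8 capacity
# units and pauses itself after 5 idle minutes, which suits the test-environment
# scenario described above. All identifiers and credentials are placeholders.
rds.create_db_cluster(
    DBClusterIdentifier="demo-serverless-cluster",
    Engine="aurora-mysql",
    EngineMode="serverless",
    MasterUsername="admin",
    MasterUserPassword="change-me-please",
    ScalingConfiguration={
        "MinCapacity": 1,
        "MaxCapacity": 8,
        "AutoPause": True,
        "SecondsUntilAutoPause": 300,
    },
)
```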