My Serverless in Practice: The Big Front-End Trend of Serverless, from Getting Started to Going Deep

Introduction: The big tech companies are all working on Serverless, but what is it? Many people may never have heard of it. It is said to be a trend in the big front end, and it has been hot for the last two years. If you have not heard of it yet, it is worth your attention. Today I will take you through the basics of Serverless.

Table of Contents


What does Serverless want to solve?

What does Serverless do?

Some common serverless platforms

How to understand serverless technology-FaaS and BaaS

How does serverless computing work?

Serverless usage scenarios

Serverless advantages and disadvantages

Beginner's Guide

Workflow

What does Serverless want to solve?

The problem: after the front end and back end were split apart, they became independently deployed, which forces the front end to deal with concerns that used to belong to the back end, such as:

  • The full launch process for a back-end application
  • Machine management and operations: scaling up and down
  • Degradation, circuit breaking, and rate limiting
  • Domain names, performance, and monitoring

Does this hit a blind spot for many of you? As front-end developers, most of us really do not understand the server-side environment, deployment infrastructure, and so on. But now that the front end is deployed independently, it inevitably has to face these things. This is the problem Serverless sets out to solve.

What does Serverless do?

The question: can we have a tool that lets us care only about our front-end code, while the server side is handled for us automatically?

This is exactly what Serverless does. As the figure below illustrates, we only need to care about our business code, not the server infrastructure.

Some common serverless platforms

(Below are some common serverless platforms. You can try them out, and those who are interested can explore them further.)

How to understand serverless technology-FaaS and BaaS

In Serverless, the server-side logic written by developers runs in stateless compute containers: it is triggered by events, fully managed by a third party, and its business-level state is kept in the databases and storage services the developer uses. Serverless covers many technologies, which fall into two categories: FaaS and BaaS.

FaaS (Function as a Service, function as a service)

FaaS means running back-end code directly without managing your own server systems or long-lived server applications. That second point, having no long-lived server application of your own, is the biggest difference between FaaS and other modern architectures such as containers and PaaS (Platform as a Service).

FaaS can replace dedicated service-processing servers (which might be physical machines, but in any case have to run some kind of application), so that you neither provision servers yourself nor run an application full-time.

FaaS products do not require a specific framework or library for development; in terms of language and environment, FaaS functions are regular applications. For example, AWS Lambda functions can be implemented in JavaScript, Python, any JVM language (Java, Clojure, Scala), and more. A Lambda function can also execute any process bundled with its deployment artifact, so any language that can be compiled into a Unix process will work. FaaS functions do have real architectural limitations, though, especially around state and execution time.

When moving to FaaS, the only code that usually needs to change is the "main method"/startup code: you may have to delete the top-level message-handler code (the implementation of a "message listener interface"), though often changing the method signature is enough. In the FaaS world, all the rest of the code (such as the code that writes to the database) needs no changes.

Compared with traditional systems, deployment changes significantly: you upload the code to the FaaS provider, and the provider takes care of everything else. Today this usually means uploading a new definition of the code (such as a zip or JAR file) and then calling a proprietary API to start the update process.

Functions in FaaS are triggered by event types defined by the provider. For Amazon AWS, trigger events include S3 (file) updates, time (scheduled tasks), and messages added to a message bus (such as Kinesis). Your function usually has to declare, via configuration, which event source it binds to.

Most vendors also allow functions to be triggered in response to incoming HTTP requests, which usually arrive through an API gateway (for example, AWS API Gateway or Webtask).

BaaS (Backend as a Service, backend as a service)

BaaS (Backend as a Service) means that we no longer write or manage all the server-side components ourselves, but instead use off-the-shelf remote components (rather than in-process libraries) to provide common domain services. To understand BaaS, you need to understand how it differs from PaaS.

First, BaaS is not PaaS. The difference is that PaaS participates in the lifecycle management of applications, while BaaS only provides third-party services the application depends on. A typical PaaS platform gives developers the means to deploy and configure applications, for example automatically deploying an application into a Tomcat container and managing its life cycle; BaaS includes none of that. BaaS only exposes the back-end services an application depends on, such as databases and object storage, in the form of APIs. BaaS may be offered by public cloud providers or third-party vendors. Second, from a functional point of view, BaaS can be regarded as a subset of PaaS: the part that provides third-party dependent components.

BaaS also lets us rely on application logic that others have already implemented. Authentication is a good example: many applications write their own code for registration, login, password management, and similar logic, and this code is largely the same across applications. Extracting these repetitive tasks into external services is exactly the goal of products such as Auth0 and Amazon Cognito. They provide comprehensive authentication and user management, so the development team no longer has to write or maintain the code for these features.

How does serverless computing work?

Synchronous invocation means the client expects the server to return the computed result immediately. When a request reaches the function-compute service, an execution environment is allocated at once to run the function.

Take an API gateway as an example: the gateway triggers the function synchronously, and the client waits for the server's execution result. If an error occurs during execution, the function-compute service returns the error directly without retrying; in this case the client needs its own retry mechanism for error handling.

Asynchronous invocation means the client does not need the function's result right away: the function-compute service puts the request into a queue and returns success immediately, instead of waiting for the function call to finish.

The function-compute service then gradually consumes the requests in the queue, allocates execution environments, and runs the functions. If an error occurs during execution, the failed request is retried: function errors are retried three times, while system errors are retried indefinitely with exponential backoff until they succeed.
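The retry policy described above can be sketched as follows. This is an illustration of exponential backoff, not the platform's actual implementation; the function and parameter names are made up:

```javascript
// Sketch of retry-with-exponential-backoff: wait 1x, 2x, 4x... the base
// delay between attempts, giving up after maxAttempts failures.
async function invokeWithBackoff(task, maxAttempts = 3, baseDelayMs = 100) {
  for (let attempt = 0; attempt < maxAttempts; attempt += 1) {
    try {
      return await task();
    } catch (err) {
      if (attempt === maxAttempts - 1) throw err; // out of retries
      const delayMs = baseDelayMs * 2 ** attempt; // 100 ms, 200 ms, 400 ms...
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

The doubling delay spreads retries out over time, so a transient failure (a cold start, a throttled dependency) gets a chance to clear instead of being hammered by immediate retries.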

Asynchronous invocation suits data processing: an OSS trigger that runs a function to process audio and video, or a log trigger that runs a function to clean logs. These are scenarios that are not latency-sensitive but need the task to succeed as reliably as possible. If you need to inspect failed requests and handle them yourself, you can use a destination function. Finally, "serverless" function compute does not mean there are no servers; it means developers do not have to care about them, because the platform allocates instances to run functions on their behalf.

Serverless usage scenarios

Send notification

Services such as push notifications, email, and SMS all need infrastructure to run, and their real-time requirements are relatively low: even if a message arrives a few seconds late, users can accept it. In SMS sending, for example, it is generally assumed that the user will receive the message within 60 seconds, so an extra second of latency will not annoy anyone.

Lightweight API

Serverless is particularly suitable for lightweight, fast-changing APIs. Honestly, I have struggled to think of a perfect example. An auto-suggest API might be one, but such APIs sometimes come with quite complex services behind them. So I will offer feature toggles as an example: it is not a perfect fit, but it may be the most valuable part.

Internet of Things

When we talk about the Internet of Things, we discuss event triggers, transmission protocols, and massive amounts of data (storage and analysis). With Serverless, processing data at any scale becomes much easier. On the server side of an IoT application, the system has to collect data from many places and build a pipeline to process, filter, and transform that data before storing it in a database. For hardware developers, integrating different hardware is itself a challenge, and using services such as AWS IoT directly can, to some extent, help us write the server-side connection logic more easily.

Data statistical analysis, etc.

Statistics themselves require only a small amount of computation, and charts can be generated on a schedule. When receiving data, we do not need to worry about latency: a delay of 50 to 200 ms will not affect the system.

Serverless advantages and disadvantages


Advantages

1. Faster time to market. Applications can reach the market faster because operations become simpler, which helps developers focus on development. The Ops team does not need to write code to handle scaling or worry about the underlying infrastructure. Teams can also build applications faster with the help of third-party integrations, such as OAuth, Twitter, and Maps APIs.

2. High scalability. Every company wants its applications to run well, with zero downtime, and to scale quickly and easily as traffic grows, but with a monolithic application this can become very difficult. As load increases, the Ops team must stay vigilant about scaling the underlying infrastructure, and downtime caused by traffic spikes wastes time and money. Serverless computing, by contrast, is highly scalable: applications can scale up and down within seconds.

3. Low cost. In serverless computing, developers pay only while a function is running, unlike IaaS and PaaS, which charge for each server 24/7. This matters for companies with large sets of applications, APIs, or microservices that currently run around the clock and hold resources whether they are needed or not. With serverless we can execute functions on demand and share resources, greatly reducing idle time while keeping applications responsive.

4. Latency and geolocation. An application's scalability depends on three factors: the number of users, their location, and network latency. Today applications have global audiences, which can increase latency. A serverless platform can greatly reduce this risk: on every event, a container is instantiated to run the function, and that container can be created near the user's geographic region, automatically improving the application's performance.


Disadvantages

1. Increased complexity. The more finely we decompose an application, the more complex the whole becomes. The code of each function may get simpler, but the application as a whole gets more complex: if we decompose it into 10 different microservices, we have to manage 10 different deployables, whereas a monolith is just one application to manage.

2. Lack of tool support. Suppose we decompose a monolithic application into 50 different functions; we still need processes and tools to manage, log, monitor, and deploy the overall application. Since serverless is new on the market, monitoring or logging applications that run for only a few seconds is limited and challenging, though over time effective tools will appear.

3. It is hard to determine the right granularity of functions as the architecture grows more complex, and it is time-consuming to evaluate, implement, and test the options.

4. Managing too many functions is troublesome, while ignoring granularity leads to mini-monoliths.

5. Implementation drawbacks. The biggest challenge of serverless is the difficulty of integration testing.

Beginner's Guide


Serverless is a Node.js CLI tool, so the first thing you need to do is install Node.js on your machine.

Note: Serverless runs on Node v6 or higher.

Open a terminal and install Serverless by typing:

npm install -g serverless

After the installation completes, verify that Serverless was installed successfully by running the following command in the terminal:

serverless

To view the installed Serverless version:

serverless --version


Separate business logic from the FaaS provider

class Users {
  constructor(db, mailer) {
    this.db = db;
    this.mailer = mailer;
  }

  save(email, callback) {
    const user = {
      email: email,
      created_at:, // timestamp; restored value, the original was lost in extraction
    };
    // arrow function keeps `this` bound to the Users instance
    this.db.saveUser(user, (err) => {
      if (err) {
        callback(err);
      } else {
        this.mailer.sendWelcomeEmail(email);
        callback();
      }
    });
  }
}

module.exports = Users;
const db = require('db').connect();
const mailer = require('mailer');
const Users = require('users');

let users = new Users(db, mailer);

module.exports.saveUser = (event, context, callback) => {
  // reconstructed call; the argument list was lost in extraction, callback);
};


Now the business logic lives in its own class. The code responsible for setting up dependencies, injecting them, invoking the business-logic function, and interacting with AWS Lambda sits in its own file, which changes less often. This way, the business logic does not depend on the provider and is easier to reuse and test.

Furthermore, this code does not need any external services to run. Instead of real services, we can pass in mocks and assert that sendWelcomeEmail is called with the appropriate parameters.

It is easy to write unit tests covering the class above. To add an integration test, you can invoke the deployed function (with serverless invoke) using a fixture email address, check that the user was actually saved to the database, and check that the welcome email was received, to verify that everything works end to end.


Deploy using the serverless framework:

serverless deploy

The Serverless Framework translates all the syntax in serverless.yml into a single AWS CloudFormation template. By relying on CloudFormation for deployments, users of the framework gain CloudFormation's safety and reliability.
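For reference, the kind of serverless.yml the framework translates this way might look like the following minimal sketch (the service, function, and handler names are made-up examples):

```yaml
service: my-service            # arbitrary example service name

provider:
  name: aws
  runtime: nodejs14.x

functions:
  saveUser:
    handler: handler.saveUser  # file handler.js, exported function saveUser
    events:
      - http:                  # expose the function via API Gateway
          path: users
          method: post
```

On deploy, each entry under `functions` becomes a Lambda function (plus its IAM role and event mappings) in the generated CloudFormation template.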

  • An AWS CloudFormation template is created from your serverless.yml.
  • If the stack has not yet been created, it is created with no resources except an S3 bucket, which will store the zip file of your function code.
  • If you use a locally built ECR image, a dedicated ECR repository is created for your service, and if needed you are also logged into that repository via docker login.
  • Then, your function code will be packaged as a zip file.
  • If you use locally built ECR images, then build and upload them to ECR.
  • Serverless obtains the hashes of all files from the previous deployment (if any) and compares them with the hashes of the local files.
  • If all the file hashes are the same, Serverless terminates the deployment process.
  • The zip file of your function code is uploaded to your code S3 bucket.
  • Any IAM roles, functions, events, and resources are added to the AWS CloudFormation template.
  • The CloudFormation stack is updated with the new CloudFormation template.
  • Each deployment publishes a new version for each function in the service.

Deploy a function

serverless deploy function --function myFunction

Deploy a package

serverless deploy --package path-to-package
  • The --package flag takes the path to a package directory previously created by Serverless (with serverless package).
  • The deployment process then bypasses the packaging step and uses the existing package to deploy and update the CloudFormation stack.

Workflow

Development Process

  1. Write your functions.
  2. Use serverless deploy only when you have changed the service configuration, or from a CI/CD system.
  3. Use serverless deploy function -f myFunction to quickly deploy changes while you are working on a specific AWS Lambda function.
  4. Use serverless invoke -f myFunction -l to test your AWS Lambda function on AWS.
  5. Open a separate tab in your console and stream the logs via serverless logs -f myFunction -t.
  6. Write tests to run locally.

Use stages

  • At a minimum, use separate stages for development and production.
  • Use different AWS accounts for those stages.
  • In larger teams, each member should use a separate AWS account and their own development stage.

Larger projects

  • Break your application/project into multiple serverless services.
  • Model serverless services around data models or workflows.
  • Keep the functions and resources in serverless services to a minimum.

Common commands

Create a new service

serverless create -p [SERVICE NAME] -t aws-nodejs

Install a service

serverless install -u [GITHUB URL OF SERVICE]

Deploy everything

serverless deploy -s [STAGE NAME] -r [REGION NAME] -v

Deploy a function

serverless deploy function -f [FUNCTION NAME] -s [STAGE NAME] -r [REGION NAME]

Invoke a function

serverless invoke -f [FUNCTION NAME] -s [STAGE NAME] -r [REGION NAME] -l

Stream logs

serverless logs -f [FUNCTION NAME] -s [STAGE NAME] -r [REGION NAME]








[This article participates in the "100% prizes | My Serverless in Practice" call for papers.]