Everything Can Be Serverless: Cloud Function Cold and Warm Starts

Posted Jun 16, 2020 · 7 min read

This article walks through the cold and warm start process of cloud functions, and the issues developers need to watch out for in each startup mode.

This article was contributed by the Serverless community user "Yuyouyou".

Results

Cloud function called for the first time (cold start)

Cloud function called multiple times consecutively (warm start)

Cold and warm start modes of cloud functions

First, let's clarify what cold and warm starts mean for a cloud function.

  • A cold start means the platform creates a new space on a server for a function instance to run in. The process is a bit like running the function inside a virtual machine: before each run you must boot the VM and load the function, which is time-consuming, so cloud functions try to minimize the number of cold starts.
  • A warm start means that if a cloud function keeps being triggered, the platform does not release the instance; the next request is served by the instance created earlier. It is as if, after the function finishes, we leave the virtual machine powered on and on standby, waiting for the next trigger. The advantage is that the time-consuming "power-on" step is skipped; the disadvantage is that the machine must be kept running, so the system overhead is larger.
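The difference can be observed from inside the function itself. Here is a minimal sketch, assuming a Python runtime where module-level code runs once per instance (as on most FaaS platforms); the handler name `main_handler` follows the SCF convention used later in this article:

```python
import time

# Module-level code runs once per instance, i.e. only on a cold start.
INIT_TIME = time.time()

def main_handler(event, context):
    # Handler code runs on every invocation, cold or warm.
    age = time.time() - INIT_TIME
    # On a cold start `age` is close to zero; on a warm start it grows
    # with the instance's lifetime, so it roughly distinguishes the two.
    return {"instance_age_seconds": round(age, 3)}
```

A large `instance_age_seconds` in the response tells you the request was served by a previously created (warm) instance.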

Of course, we do not need to worry about resource allocation here: the underlying layer of the cloud function platform handles deployment through its own algorithms.

The Introduction in the Tencent Cloud Serverless Cloud Function documentation contains the following description:

Tencent Cloud Serverless Cloud Function (SCF) is a serverless execution environment provided by Tencent Cloud. You only need to write a simple, single-purpose cloud function and associate it with events generated by your Tencent Cloud infrastructure and other cloud services.
When using cloud functions, you only need to write code in the languages supported by the platform (Python, Node.js, PHP, Golang, and Java). Tencent Cloud fully manages the underlying computing resources, including server CPU, memory, network and other configuration/resource maintenance, code deployment, elastic scaling, load balancing, security upgrades, and resource monitoring. This also means you cannot log in to or manage the servers, or customize the system and environment.
Cloud functions are automatically deployed across multiple availability zones in the same region, providing very high fault tolerance. During execution, a cloud function expands and shrinks according to the request load: from a few requests per day to thousands of requests per second, the underlying layer scales by itself. You do not need to configure or intervene manually; you only pay for running cloud functions, which keeps the service available and stable in different scenarios. If a cloud function is not running, there is no cost.
You can customize when a cloud function runs, for example when a file is uploaded to or deleted from a COS bucket, when the function is invoked through the SDK, or on a fixed schedule. You can use cloud functions as data-processing triggers for COS to easily implement IFTTT-style logic, or build flexible, controllable software architectures with scheduled automation tasks that replace manual operations.
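As an illustration of the COS-trigger scenario mentioned above, here is a hypothetical sketch of such a handler. The event structure (`Records` / `cos` / `cosObject` / `key`) is an assumption made for illustration only; check the platform's trigger documentation for the exact message format:

```python
def main_handler(event, context):
    # Assumed event shape: each record describes one object uploaded to COS.
    # The field names here are illustrative, not the guaranteed format.
    keys = []
    for record in event.get("Records", []):
        key = record.get("cos", {}).get("cosObject", {}).get("key", "")
        if key:
            keys.append(key)
    # IFTTT-style logic would go here, e.g. generating a thumbnail
    # for each uploaded image.
    return {"processed": keys}
```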

Note this sentence:

During execution, the cloud function expands and shrinks according to the request load: from a few requests per day to thousands of requests per second, the underlying layer scales by itself.

So the number of cloud function instances is scaled by an algorithm at the bottom of the system.

Let's read on:

In Serverless 2.0, we have not only thoroughly rebuilt and optimized the control-flow and data-flow modules, the virtualization layer, the network layer, and the scheduling layer, but also comprehensively upgraded security, availability, and performance. Through lightweight virtualization technology, VPC proxy forwarding, and other optimizations on a unified underlying architecture, the platform can automatically expand and shrink in real time, completely avoiding the widely criticized cold start problem of traditional serverless architectures.
Cloud functions no longer restrict running time and support richer application scenarios. For example:
Service functions do not limit the duration of a single request. When requests keep arriving, the service stays in a long-running mode, with no cold or warm start delay.
Service functions support WebSocket long connections.
Event functions (trigger functions) do have a limit on the duration of a single invocation, but when requests keep arriving, the service stays in a long-running mode, with no cold or warm start delay.

Note this sentence:

The trigger function has a limit on the duration of a single invocation, but when requests keep arriving, the service stays in a long-running mode, with no cold or warm start delays.

In other words, the cloud function instances we trigger in various ways are not always completely cold-started; a request may be served by an instance created by a previous call.

Let's run an experiment together:

import json

# Module-level state: this survives across warm starts of the same instance
global_v = 0


# api gateway reply message format
def apiReply(reply, code=200):
    return {
        "statusCode": code,
        "headers": {'Content-Type': 'application/json', "Access-Control-Allow-Origin": "*"},
        "body": json.dumps(reply, ensure_ascii=False)
    }


def main_handler(event, context):
    global global_v
    global_v += 1  # counts how many requests this instance has served
    return apiReply({"count": global_v})

The above is a simple Python cloud function. Let's add an API Gateway trigger to it and test what it returns:

  • The first call returns 1, indicating that our cloud function was cold-started.

  • Continuing to call, this time it returns 2, indicating that our cloud function was warm-started on the previous instance.

After a few more attempts, we found that some calls were warm starts, while others were still cold starts:




But this behavior is clearly not what we expect: a previous request should not affect the result of a later invocation. That is exactly the problem.
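The unpredictability can be reproduced locally with a small simulation. Each `Instance` below stands in for one function instance with its own copy of the "global" counter; the class is purely illustrative, not a platform API:

```python
class Instance:
    """Stand-in for one cloud function instance."""
    def __init__(self):
        self.global_v = 0          # fresh state on every cold start

    def handle(self):
        self.global_v += 1         # state carried across warm starts
        return self.global_v

a = Instance()                     # cold start: a brand-new instance
print(a.handle())                  # -> 1
print(a.handle())                  # -> 2 (warm start on the same instance)
b = Instance()                     # platform scales out: another cold start
print(b.handle())                  # -> 1 (the counter silently resets)
```

Which instance serves a given request is decided by the platform's scheduler, so from the caller's point of view the returned value is unpredictable.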

Okay, let's go and see what the official documentation says:

Will SCF reuse function instances?
To improve performance, SCF retains your function instance for a certain period of time and reuses it for subsequent requests. However, your code should not assume that this always happens.
Why keep SCF functions stateless?
Keeping functions stateless allows SCF to start as many instances as necessary to meet the request rate.

In other words, we must make sure an SCF function is stateless when writing it; otherwise, strange and unpredictable problems will appear.

So what is statelessness? Put bluntly: your cloud function must not depend on the state or results of a previous invocation, so avoid global variables wherever possible!
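One common fix is to move the state out of the function into shared external storage (Redis, a database, and so on). The sketch below uses an in-memory class named `ExternalStore` as a stand-in for such a store; in a real deployment it would be replaced by a client for an actual shared service, so everything about it here is illustrative:

```python
import json

class ExternalStore:
    """In-memory stand-in for shared storage such as Redis."""
    def __init__(self):
        self._data = {}

    def incr(self, key):
        # Atomic-increment semantics, as offered by e.g. Redis INCR.
        self._data[key] = self._data.get(key, 0) + 1
        return self._data[key]

store = ExternalStore()

def main_handler(event, context):
    # The counter lives outside the instance, so the result no longer
    # depends on whether this invocation was a cold or warm start.
    count = store.incr("counter")
    return {
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"count": count}),
    }
```

With real shared storage, every instance reads and writes the same counter, so cold starts no longer reset it.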

As our experiment showed, the value of a global variable becomes unpredictable across cold and warm starts of the cloud function, which is a disaster for later debugging.

For more FAQs about Tencent Cloud Serverless Cloud Function (SCF), please refer to the official documentation.

Serverless Framework 30-day trial plan

We invite you to experience the most convenient way to develop and deploy serverless applications. During the trial period, the related products and services provide free resources and professional technical support to help your business go serverless quickly and conveniently!

For details, please refer to: Serverless Framework Trial Program

One More Thing

What can you do in 3 seconds? Take a sip of water, read an email, or - deploy a complete Serverless application?

Copy the link into a PC browser to visit: https://serverless.cloud.tenc...

Deploy in 3 seconds and experience the fastest Serverless HTTP deployment ever!


Welcome to visit Serverless Chinese Network, where you can explore more about serverless application development in Best Practices!

Recommended reading: "Serverless Architecture: From Principle and Design to Project Practice"