Flask + Serverless — API in AWS Lambda the easy way

Apr 27, 2022

If you need to quickly deploy a small API, or you've decided to migrate your codebase to take advantage of AWS Lambda, the combination of Flask and the Serverless framework is a powerful one. In fact, any WSGI application, such as Django, works too.

There are of course other guides on how to accomplish this, either by packing and uploading your app yourself or by using Zappa. The process gets trickier when you realize some dependencies are not compatible (Lambda runs on Linux) and you need to handle that too. We are going to take a look at a basic setup of the Serverless framework together with two amazing plugins that make it a breeze.

Let's say you already have an AWS account. The second thing you need is the Serverless framework installed — that part is incredibly easy and you can follow the two-step guide here.

Now is the time to get your hands dirty and make it happen. But first, we need a Flask app. Let's build a simple random quote API. It's going to be a very simple example, but the process is much the same for larger apps. Also, I will be using virtualenv, but it works with Pipenv too :)

mkdir quotes
cd quotes/

# create virtualenv, activate it
virtualenv venv -p python3
. venv/bin/activate.fish

# install Flask and freeze requirements
(venv) pip install Flask
(venv) pip freeze > requirements.txt

And that's it, we have the foundation ready.

For the API itself, let's say we have a mighty service like this. No rocket science.

from collections import namedtuple
from random import choice

from flask import Flask, jsonify

Quote = namedtuple("Quote", ("text", "author"))

quotes = [
    Quote("Talk is cheap. Show me the code.", "Linus Torvalds"),
    Quote("Programs must be written for people to read, and only incidentally for machines to execute.", "Harold Abelson"),
    Quote("Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live",
          "John Woods"),
    Quote("Give a man a program, frustrate him for a day. Teach a man to program, frustrate him for a lifetime.", "Muhammad Waseem"),
    Quote("Progress is possible only if we train ourselves to think about programs without thinking of them as pieces of executable code.",
          "Edsger W. Dijkstra"),
]

app = Flask(__name__)

@app.route("/quote", methods=["GET"])
def get_random_quote():
    return jsonify(choice(quotes)._asdict())
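Before wiring this into Serverless, it's worth a quick sanity check with Flask's built-in test client. Here is a minimal sketch that inlines a one-quote version of the app above so the result is deterministic:

```python
from collections import namedtuple
from random import choice

from flask import Flask, jsonify

Quote = namedtuple("Quote", ("text", "author"))

# a single quote keeps the check deterministic
quotes = [Quote("Talk is cheap. Show me the code.", "Linus Torvalds")]

app = Flask(__name__)

@app.route("/quote", methods=["GET"])
def get_random_quote():
    return jsonify(choice(quotes)._asdict())

# exercise the route without starting a real server
with app.test_client() as client:
    response = client.get("/quote")
    data = response.get_json()
```

The test client calls the WSGI app directly, which is handy because it exercises exactly the interface Lambda will talk to later.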

Now it's time to wire it up with the Serverless framework. To do so, we create a serverless.yml file in our project root manually. The file will look like this:

service: quotes # name this whatever you want

provider:
  name: aws
  runtime: python3.6
  region: us-east-1
  memorySize: 128

That's roughly the minimum you need to declare a service, although nothing links to our API (function) yet. Before we do that, let's install two Serverless plugins. The first makes Lambda understand WSGI (the protocol Flask and Django speak); the second makes Serverless pack our Python requirements into the deployment package.

sls plugin install -n serverless-wsgi
sls plugin install -n serverless-python-requirements

Those two commands do the job and we can see that Serverless registered those two in our serverless.yml file.

Note: the serverless-wsgi plugin can pack requirements too, but I have found the second plugin more configurable — it can even pack them inside Docker (useful when they need compilation and you are running a non-Linux system). It also supports Pipenv, which we use on one of our projects.

service: quotes

provider:
  name: aws
  runtime: python3.6
  stage: dev
  region: us-east-1
  memorySize: 128

plugins:
  - serverless-wsgi
  - serverless-python-requirements

custom:
  wsgi:
    app: app.app
    packRequirements: false

functions:
  app:
    handler: wsgi.handler
    events:
      - http: ANY /
      - http: 'ANY {proxy+}'

You can see a new section (custom) which holds the configuration for those plugins. The wsgi part tells the plugin where your app object lives and turns off its own requirement packing (serverless-python-requirements handles that instead). The last part (functions) declares what our service contains. A service can hold multiple functions, each with specific permissions; in this case, we simply route all requests through the WSGI handler provided by the installed plugin.

Local development

Before we deploy our API, we can verify everything works locally. As the serverless-wsgi plugin is smart, we can simply run serverless wsgi serve to have a local environment up and running :)

Deploy time!

It's as easy as typing one command. Run serverless deploy in your terminal and Serverless will do the job. The first deployment takes a little while due to the initial setup, but later ones will be faster.

And that's it. We have our service up & running with the power of AWS Lambda behind it.

Final Notes

  • You can easily make python-requirements use docker to pack requirements which require compilation
  • If you get an internal error or want to see recent logs, open CloudWatch in your AWS console or just type serverless logs -f app
  • In most cases, you can fit into AWS free tier — which is nice especially for prototypes
  • You can specify stages to distinguish between staging and production environments with the --stage option
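The Docker-based packing mentioned above takes only a few extra lines in serverless.yml. A sketch, extending the custom section shown earlier:

```yaml
custom:
  wsgi:
    app: app.app
    packRequirements: false
  pythonRequirements:
    dockerizePip: true # build requirements inside a Lambda-like Docker image
```

With dockerizePip enabled, serverless-python-requirements compiles dependencies in a container matching the Lambda environment, so packages with native extensions work even when you deploy from macOS or Windows.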
