Python – A shared Python library between multiple APIs on AWS

I have several different Python APIs (i.e. Python scripts) running on AWS Lambda. The standard approach is to generate a zip file that contains all the external libraries your Lambda function needs and upload it to AWS. Some functions are common between the different APIs (custom utilities such as parsing text files or dates). Currently I just copy the file utils.py into each zip file, but this is inefficient (I don't like duplicating code). I would like an S3 bucket that contains all my shared .py files, and have my APIs load those files directly. Is this possible?
An easy way is to download the files to the /tmp folder and load them, but I'm not sure this is the best/fastest way:

import boto3

# /tmp is the only writable location inside a Lambda execution environment
client_s3 = boto3.client("s3")
client_s3.download_file("mybucket", "utils.py", "/tmp/utils.py")
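
Loading the downloaded file afterwards would then look something like this (a sketch using importlib; alternatively /tmp could simply be added to sys.path):

import importlib.util

# Load the downloaded file as a regular module; "utils" matches the file above
spec = importlib.util.spec_from_file_location("utils", "/tmp/utils.py")
utils = importlib.util.module_from_spec(spec)
spec.loader.exec_module(utils)
# utils.<your helper>(...) can now be called as usual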

Can this be done in a more elegant way?

Solution

This is not a simple problem. We used Lambda layers for a while; they are designed to solve exactly this problem, letting you share common code between functions.
The problem with Lambda layers is that when you change something inside the layer, you have to redeploy twice: the layer and every Lambda function that uses it. This quickly becomes a pain in the neck, and it can also cause problems in a CI/CD pipeline.
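
For reference, publishing shared code as a layer and attaching it to a function could look roughly like this with boto3 (the layer and function names here are made up; the zip must contain utils.py under a python/ directory so that it ends up on the Lambda's sys.path):

import boto3

lambda_client = boto3.client("lambda")

# Publish the shared code as a new layer version
# (utils-layer.zip contains python/utils.py)
with open("utils-layer.zip", "rb") as f:
    layer = lambda_client.publish_layer_version(
        LayerName="shared-utils",               # hypothetical layer name
        Content={"ZipFile": f.read()},
        CompatibleRuntimes=["python3.12"],
    )

# Attach the new layer version to one of the functions (hypothetical name)
lambda_client.update_function_configuration(
    FunctionName="my-api-function",
    Layers=[layer["LayerVersionArn"]],
)

Note that this second step is exactly the pain point described above: every function that uses the layer has to be updated to the new layer version.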

We tried it for a while, but in the end we went back to packaging the shared code and including it in each Lambda. It's not great if you want to avoid code duplication, but at least you avoid the bugs that come from forgetting to redeploy dependent functions.
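
In practice the packaging step can be automated so the duplication only exists in the built artifact, not in the repository. A rough sketch of such a build script (folder and function names are made up):

import shutil
import zipfile
from pathlib import Path

SHARED = Path("shared/utils.py")            # single source of truth in the repo
FUNCTIONS = ["api_orders", "api_users"]     # hypothetical Lambda source folders

for name in FUNCTIONS:
    build_dir = Path("build") / name
    shutil.rmtree(build_dir, ignore_errors=True)
    shutil.copytree(name, build_dir)        # the function's own code
    shutil.copy(SHARED, build_dir)          # copy utils.py into the package
    with zipfile.ZipFile(f"build/{name}.zip", "w", zipfile.ZIP_DEFLATED) as zf:
        for path in build_dir.rglob("*"):
            zf.write(path, path.relative_to(build_dir))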
