Getting Python to AWS Lambda From Non-Linux Environments

AWS Lambda supports a variety of ways to get code and dependencies up and running in the cloud. You can use Lambda layers, custom runtimes, or, for ultimate control, specify a full container definition and push it to ECR/Docker Hub for use by the Lambda platform. However, as flexible as Lambda is, two approaches stand above the rest as "drop-dead simple": you can write your code in the browser with the built-in editor (though let's be honest, that isn't the best development experience), or you can zip your assets up locally and push them to the cloud. This works provided your assets/dependencies are under the 50MB cap in place at the time of writing. The rest of this post focuses on the zip deployment approach and the problems you can run into when working in a Mac/Windows environment, where most professional developers are likely to spend the bulk of their time.

Python, the target language for our function, has some design considerations to orient ourselves with when we do a pip install of package dependencies locally. When no prebuilt wheel is available for a package, its native extensions are compiled on the platform it is installed on. As a result, you can run into problems where macOS does things differently than Windows or Linux during that compilation. For example, macOS (a BSD-derived operating system) handles cryptographic libraries differently than either Windows or Linux, so a dependency compiled there may link against libraries that simply do not exist on Linux. The compiled dependency will likely be incompatible with the Linux OS running in the Lambda platform. Any zip file we push up to the cloud must have its dependencies compiled for the Linux host context!
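One way to see whether a locally installed dependency contains platform-compiled code at all is to look for native extension files in the install directory. Here is a small illustrative sketch; the helper name and suffix list are our own, not part of pip:

```python
from pathlib import Path
from typing import List

# Compiled extension suffixes: .so (Linux/macOS wheels),
# .pyd (Windows), .dylib (macOS shared libraries).
NATIVE_SUFFIXES = {".so", ".pyd", ".dylib"}

def native_extensions(pkg_dir: str) -> List[str]:
    """Return the compiled extension files found under pkg_dir.

    If you ran pip install on macOS or Windows, any hits here are
    candidates for the Linux-incompatibility problem described above.
    """
    return sorted(
        str(p.relative_to(pkg_dir))
        for p in Path(pkg_dir).rglob("*")
        if p.suffix in NATIVE_SUFFIXES
    )
```

Pure-Python dependencies will return an empty list and are generally safe to zip from any OS.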

So how do we solve the previously described problem? Enter the lambci collection of Docker images, which includes lambci/lambda:build-python3.8 for Python 3.8. This image includes all the necessary build tooling for installing Python dependencies with pip. The goal of the lambci project is to produce a Docker image that is as close to the AWS host environment of a Lambda function as possible, which also makes it a great way to test your function code locally while iterating quickly during development. Using the build image previously mentioned, we can build our zip file inside Docker for uploading to AWS and share it out locally via a Docker volume. Here is a handy bash script you can use for creating your Python code archive:


# Execute from your python3.8 code directory

# Copy contents to a temp directory so we don't pollute the source tree
cp -r ./ ./temp && cd ./temp

# Run the lambci build container with the working directory mounted as a volume,
# and run the whole pipeline inside the container via bash -c (otherwise the
# && chain would execute on the host, not in the container). pip installs the
# dependencies, zip archives the source while excluding the packages directory,
# and the final zip -ur merges the installed packages into the archive root.
docker run --rm -v "$PWD":/var/task lambci/lambda:build-python3.8 bash -c "\
    cd /var/task && \
    pip install --target ./packages -r requirements.txt && \
    zip -r function.zip ./ -x 'packages/*' && \
    cd packages/ && \
    zip -ur ../function.zip ./"

The comments explain what is going on, but it is important to mention that when we create the zip file we initially exclude the packages directory created by the targeted pip install. We then add the installed packages at the root of the zip, avoiding the folder nesting that would prevent the dependencies from being recognized by the Lambda runtime. Take a look at this documentation to upload your zip into AWS Lambda via the AWS CLI! If all goes according to plan, you will be running your function with properly detected dependencies!
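Before uploading, you can sanity-check that the dependencies actually landed at the root of the archive. Here is a quick sketch using only the standard library; the archive name and the example package name are assumptions for illustration:

```python
import zipfile

def dependency_at_root(zip_path: str, package: str) -> bool:
    """True if the package directory sits at the top level of the zip,
    where the Lambda Python runtime expects importable modules."""
    with zipfile.ZipFile(zip_path) as zf:
        return any(name.split("/")[0] == package for name in zf.namelist())

# Example: dependency_at_root("function.zip", "requests")
```

If this returns False for a dependency you installed, the package is probably still nested under packages/ and the import will fail at runtime.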