
I have a directory structure like the following in my serverless application (kept minimal to avoid clutter), which I created using AWS SAM with Python 3.8 as the runtime:

├── common
│   └── a.py
├── hello_world
│   ├── __init__.py
│   ├── app.py
│   └── requirements.txt
└── template.yaml

I would like to import the common/a.py module inside the Lambda handler, hello_world/app.py. I know that I can import it normally in Python by adding its path to PYTHONPATH or sys.path, but that doesn't work when the code runs in Lambda inside a Docker container. When invoked, the Lambda handler runs from the /var/task directory and the original folder structure is not preserved.

I tried inserting /var/task/common, /var/common, /var and even /common into sys.path programmatically like this:

import sys
sys.path.insert(0, '/var/task/common')
from common.a import Alpha

but I still get the error:

ModuleNotFoundError: No module named 'common'
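To see what actually ends up in the deployment package, one thing that helps (just a debugging sketch, not a fix) is to list the contents of /var/task from inside the handler:

import json
import os

def lambda_handler(event, context):
    # Debugging sketch: show what was actually packaged into /var/task.
    # If 'common' does not appear here, tweaking sys.path cannot help,
    # since the folder was never included in the deployment package.
    return {
        "statusCode": 200,
        "body": json.dumps(sorted(os.listdir("/var/task"))),
    }

If only the contents of hello_world/ show up, that confirms sam build packages just the function's CodeUri directory, which is why the import fails.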

I am aware of Lambda Layers, but given my scenario I would like to reference the common code directly from multiple Lambda functions without having to upload it to a layer. I want something like the serverless-python-requirements plugin from the Serverless Framework, but for AWS SAM.

So my question is: how should I add this path to common to PYTHONPATH or sys.path? Or is there an alternative workaround, like serverless-python-requirements, to directly import a module from a parent folder without using Lambda Layers?

  • You have to create a package "common" and import it. geeksforgeeks.org/create-access-python-package Commented Jan 16, 2020 at 6:43
  • Lambda runs in a virtual environment; /var/task/ is created at runtime, so you cannot rely on that import path. It would be better to create a package named "common". Commented Jan 16, 2020 at 7:04
  • I am not able to understand "It would be better to create a package named common." Do you mean deploying the package separately? As I mentioned, the common folder is already a package in Python 3.3+, and I know Lambda runs in a virtual environment; that is exactly why I want a workaround. Can you please elaborate on this? Commented Jan 16, 2020 at 7:13
  • I am saying you can't use "common" as the folder or import name, because your custom folder can't override the one in the virtual environment. Use "shared" or another name that doesn't clash with a system name. Commented Jan 16, 2020 at 7:22
  • sys.path or no sys.path, it doesn't matter to me whatsoever. I just want a way to import modules from the parent folder in a Lambda function handler. If you know the answer, please consider posting one and mention the details! Commented Jan 16, 2020 at 7:54

2 Answers

3

I didn't find exactly what I was looking for, but I ended up with a solution: create a single Lambda function at the root that handles all the different API calls. My Lambda function is integrated with API Gateway, and I can get the HTTP method and request path from event["httpMethod"] and event["path"] respectively. I can then put all the packages under the root and import them from one another.

For example, say I have two API paths, /items and /employees, and both need to handle GET and POST methods; the following code suffices:

if event["path"] == '/items':
    if event["httpMethod"] == 'GET':
        ...
    elif event["httpMethod"] == 'POST':
        ...
elif event["path"] == '/employees':
    if event["httpMethod"] == 'GET':
        ...
    elif event["httpMethod"] == 'POST':
        ...
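
Wrapped into an actual handler, a minimal sketch of the same idea looks like this (the imported modules and function names are hypothetical placeholders for the packages shown further down, not code from the original project):

import json

# Hypothetical modules under the packages shown below; the names are
# placeholders chosen for illustration.
from application import items, employees

def lambda_handler(event, context):
    path = event["path"]
    method = event["httpMethod"]

    if path == '/items' and method == 'GET':
        body = items.list_items()
    elif path == '/items' and method == 'POST':
        body = items.create_item(json.loads(event["body"]))
    elif path == '/employees' and method == 'GET':
        body = employees.list_employees()
    elif path == '/employees' and method == 'POST':
        body = employees.create_employee(json.loads(event["body"]))
    else:
        return {"statusCode": 404, "body": json.dumps({"error": "Unknown route"})}

    return {"statusCode": 200, "body": json.dumps(body)}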

So now I can have as many packages as I want under this Lambda function. For example, this is how the repository looks now:

├── application
│   └── *.py
├── persistence
│   └── *.py
├── models
│   └── *.py
├── rootLambdaFunction.py
└── requirements.txt

This way, I can import packages at will wherever I want within the given structure.
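
For instance, inside rootLambdaFunction.py (or any module under application/), imports like the following resolve, because all the packages are deployed together under /var/task; the module names here are made up purely for illustration:

# rootLambdaFunction.py -- hypothetical module names, for illustration only
from models.item import Item                              # /var/task/models/item.py
from persistence.item_repository import ItemRepository    # /var/task/persistence/item_repository.py
from application.item_service import ItemService          # /var/task/application/item_service.py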

1

I was having a similar issue with package dependencies, and according to the AWS Knowledge Center you should put all of your items at the root level:

You might need to create a deployment package that includes the function modules located in the root of the .zip file with read and execute permissions for all files.

According to https://aws.amazon.com/premiumsupport/knowledge-center/build-python-lambda-deployment-package/
