serverless-python-requirements: Issue with serverless deploy, requirements.txt not found
I have the following `serverless.yml`:

```yaml
# Welcome to Serverless!
#
# This file is the main config file for your service.
# It's very minimal at this point and uses default values.
# You can always add more config options for more control.
# We've included some commented out config examples here.
# Just uncomment any of them to get that config option.
#
# For full config options, check the docs:
#    docs.serverless.com
#
# Happy Coding!

service: awsTest

# You can pin your service to only deploy with a specific Serverless version
# Check out our docs for more details
# frameworkVersion: "=X.X.X"

plugins:
  - serverless-python-requirements

custom:
  pythonRequirements:
    invalidateCaches: true
    dockerizePip: true
    dockerImage: lambda-python3.6-with-mysql-build-deps

provider:
  name: aws
  runtime: python3.6
  role: arn:aws:iam::443746630310:role/EMR_DefaultRole

# you can overwrite defaults here
#  stage: dev
#  region: us-east-1

# you can add statements to the Lambda function's IAM Role here
#  iamRoleStatements:
#    - Effect: "Allow"
#      Action:
#        - "s3:ListBucket"
#      Resource: { "Fn::Join" : ["", ["arn:aws:s3:::", { "Ref" : "ServerlessDeploymentBucket" } ] ] }
#    - Effect: "Allow"
#      Action:
#        - "s3:PutObject"
#      Resource:
#        Fn::Join:
#          - ""
#          - - "arn:aws:s3:::"
#            - "Ref" : "ServerlessDeploymentBucket"
#            - "/*"

# you can define service wide environment variables here
#  environment:
#    variable1: value1

# you can add packaging information here
#package:
#  include:
#    - include-me.py
#    - include-me-dir/**
#  exclude:
#    - exclude-me.py
#    - exclude-me-dir/**

functions:
  emotion-analysis:
    handler: handler.emotionAnalysis
    events:
      - http:
          path: emotionAnalysis
          method: post
  audio-analysis:
    handler: handler.audioAnalysis
    events:
      - http:
          path: vokaturiAnalysis
          method: post

#    The following are a few example events you can configure
#    NOTE: Please make sure to change your handler code to work with those events
#    Check the event documentation for details
#    events:
#      - http:
#          path: users/create
#          method: get
#      - s3: ${env:BUCKET}
#      - schedule: rate(10 minutes)
#      - sns: greeter-topic
#      - stream: arn:aws:dynamodb:region:XXXXXX:table/foo/stream/1970-01-01T00:00:00.000
#      - alexaSkill
#      - iot:
#          sql: "SELECT * FROM 'some_topic'"
#      - cloudwatchEvent:
#          event:
#            source:
#              - "aws.ec2"
#            detail-type:
#              - "EC2 Instance State-change Notification"
#            detail:
#              state:
#                - pending
#      - cloudwatchLog: '/aws/lambda/hello'
#      - cognitoUserPool:
#          pool: MyUserPool
#          trigger: PreSignUp

#    Define function environment variables here
#    environment:
#      variable2: value2

# you can add CloudFormation resource templates here
#resources:
#  Resources:
#    NewResource:
#      Type: AWS::S3::Bucket
#      Properties:
#        BucketName: my-new-bucket
#  Outputs:
#    NewOutput:
#      Description: "Description for the output"
#      Value: "Some output value"
```
and the `requirements.txt`:

```
cycler==0.10.0
decorator==4.1.2
imutils==0.4.3
Keras==2.1.1
matplotlib==2.1.0
networkx==2.0
numpy==1.13.3
olefile==0.44
opencv-python==3.3.0.10
pandas==0.21.0
Pillow==4.3.0
pyparsing==2.2.0
python-dateutil==2.6.1
pytz==2017.3
PyWavelets==0.5.2
PyYAML==3.12
scikit-image==0.13.1
scikit-learn==0.19.1
scipy==1.0.0
six==1.11.0
sklearn==0.0
dlib==19.7.0
```
I am using this Dockerfile to compile dlib and boost:

```dockerfile
FROM amazonlinux:latest
RUN touch /var/lib/rpm/*
RUN yum install -y yum-plugin-ovl && cd /usr/src
#RUN yum check-update
#RUN rpm --rebuilddb
RUN yum history sync
RUN yum install -y wget
RUN yum install -y sudo
RUN yum install -y sudo && sudo yum install -y yum-utils && sudo yum groupinstall -y development
RUN sudo yum install -y https://centos6.iuscommunity.org/ius-release.rpm && sudo yum install -y python36u && yum install -y python36u-pip && yum install -y python36u-devel
#RUN yum install -y grub2
RUN ln -s /usr/include/python3.6m /usr/include/python3.6
RUN wget --no-check-certificate -P /tmp http://flydata-rpm.s3-website-us-east-1.amazonaws.com/patchelf-0.8.tar.gz
RUN tar xvf /tmp/patchelf-0.8.tar.gz -C /tmp
RUN cd /tmp/patchelf-0.8 && ./configure && make && sudo make install
RUN yum install -y blas-devel boost-devel lapack-devel gcc-c++ cmake git
RUN git clone https://github.com/davisking/dlib.git
RUN cd dlib/python_examples/
RUN mkdir build && cd build
RUN cmake -DPYTHON_INCLUDE_DIR=$(python3.6 -c "from distutils.sysconfig import get_python_inc; print(get_python_inc())") -DPYTHON_LIBRARY=$(python3.6 -c "import distutils.sysconfig as sysconfig; print(sysconfig.get_config_var('LIBDIR'))") -DUSE_SSE4_INSTRUCTIONS:BOOL=ON dlib/tools/python
RUN sed -i 's/\/\/all/all/' Makefile && sed -i 's/\/\/preinstall/preinstall/' Makefile
RUN cmake --build . --config Release --target install
RUN cd ..
RUN mkdir ~/dlib
RUN cp dlib.so ~/dlib/__init__.so
RUN cp /usr/lib64/libboost_python-mt.so.1.53.0 ~/dlib/
RUN touch ~/dlib/__init__.py
RUN patchelf --set-rpath '$ORIGIN' ~/dlib/__init__.so
```
When I run `serverless deploy`, I get the following error:

```
Error --------------------------------------------------

Error: Could not open requirements file: [Errno 2] No such file or directory: '.serverless/requirements.txt'
    at ServerlessPythonRequirements.installRequirements (/Users/manavdutta1/Downloads/awsTest/node_modules/serverless-python-requirements/lib/pip.js:80:11)
From previous event:
    at PluginManager.invoke (/usr/local/lib/node_modules/serverless/lib/classes/PluginManager.js:366:22)
    at PluginManager.spawn (/usr/local/lib/node_modules/serverless/lib/classes/PluginManager.js:384:17)
    at Deploy.BbPromise.bind.then.then (/usr/local/lib/node_modules/serverless/lib/plugins/deploy/deploy.js:120:50)
From previous event:
    at Object.before:deploy:deploy [as hook] (/usr/local/lib/node_modules/serverless/lib/plugins/deploy/deploy.js:110:10)
    at BbPromise.reduce (/usr/local/lib/node_modules/serverless/lib/classes/PluginManager.js:366:55)
From previous event:
    at PluginManager.invoke (/usr/local/lib/node_modules/serverless/lib/classes/PluginManager.js:366:22)
    at PluginManager.run (/usr/local/lib/node_modules/serverless/lib/classes/PluginManager.js:397:17)
    at variables.populateService.then (/usr/local/lib/node_modules/serverless/lib/Serverless.js:104:33)
    at runCallback (timers.js:785:20)
    at tryOnImmediate (timers.js:747:5)
    at processImmediate [as _immediateCallback] (timers.js:718:5)
From previous event:
    at Serverless.run (/usr/local/lib/node_modules/serverless/lib/Serverless.js:91:74)
    at serverless.init.then (/usr/local/lib/node_modules/serverless/bin/serverless:42:50)
    at <anonymous>
```
I have no idea why this is happening. The `requirements.txt` is present under `.serverless` in my local directory and it looks fine. Does anyone know what could be causing this?
About this issue
- Original URL
- State: open
- Created 7 years ago
- Comments: 33 (2 by maintainers)
I had this issue as well - or a similar one, anyway - running inside GitLab CI (docker-in-docker) with `dockerizePip: true`, and I accidentally stumbled upon a workaround.

I was already going to start using download and static caching, and I wanted the cache dir to be inside my `.serverless` directory, so that it would be saved and restored between jobs. So I ended up with caching settings pointing `cacheLocation` there, and, lo and behold, that also fixed the packaging issue.
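A sketch of what such caching settings might look like; the cache path here is illustrative, while `useDownloadCache`, `useStaticCache`, and `cacheLocation` are the plugin's documented caching options:

```yaml
custom:
  pythonRequirements:
    dockerizePip: true
    # cache pip downloads and built requirements between CI jobs
    useDownloadCache: true
    useStaticCache: true
    # keep the cache inside .serverless so the CI job cache picks it up
    cacheLocation: ./.serverless/.requirements-cache
```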
If you notice above, the plugin is trying to map a `requirements.txt` file into the container at `/var/task/`. My guess is that the GitLab CI runner disallows this or interferes with it somehow, because when I set `cacheLocation` as above, the file is mapped from the cache location instead, which works perfectly.

This could also be because the `/builds/group/project` directory is already being mapped into the container as a volume, allowing `pip` in the container to find the path to `requirements.txt`. Either way, hopefully this helps someone else with a similar dockerish problem.

If you have `pyproject.toml` in your project but you don't use `poetry`, please remember to set `usePoetry: false`.

Related code:
https://github.com/UnitedIncome/serverless-python-requirements/blob/64e20db2a4acbf95a3d9391797b0c12544234a0c/index.js#L41
https://github.com/UnitedIncome/serverless-python-requirements/blob/master/lib/pip.js#L65
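The linked code effectively runs pip inside a container with the service directory bind-mounted at `/var/task`. A simplified sketch of that invocation, where the image name, paths, and flags are illustrative assumptions rather than the plugin's exact command:

```shell
# Host directory that holds the generated requirements.txt
REQUIREMENTS_DIR="$PWD/.serverless"

# The plugin bind-mounts the host directory to /var/task in the container
BIND_MOUNT="${REQUIREMENTS_DIR}:/var/task"

# Print (rather than run) the approximate docker command, so this
# sketch does not require a docker daemon to be present
echo docker run --rm -v "$BIND_MOUNT" \
    lambci/lambda:build-python3.6 \
    pip install -r /var/task/requirements.txt -t /var/task
```

If the host path on the left of the `-v` mapping is unreachable from where the docker daemon actually runs (as in docker-in-docker), pip inside the container cannot see the requirements file, which matches the error above.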
Ah, yeah, I'll have to check docker-in-docker out at some point.
Re this @thesmith:

You can do something like the following, assuming you have a `CI` env var set to `true` in CI (CircleCI does this automatically; not sure how standard it is, but it'd be easy to add the var or adapt this technique to your CI provider).

Just thought of something it might be… Add this to your `Dockerfile`:

I've been playing with this a bit more and there's definitely something about running the pip install through docker, from within another docker.

I guess one way to get around this would be to run the pip install command without docker, given we're already within a docker container - as long as the host docker is the right kind to build the package for lambda.
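The `CI` env var technique mentioned above can be sketched in `serverless.yml`; the map-lookup variable syntax and the `'false'` default are assumptions for illustration, not the commenter's exact config:

```yaml
custom:
  # map the CI env var ('true'/'false') to the opposite dockerizePip value:
  # in CI we are already inside a container, so skip dockerizing pip there
  dockerizePipByCI:
    'true': false
    'false': true
  pythonRequirements:
    dockerizePip: ${self:custom.dockerizePipByCI.${env:CI, 'false'}}
```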
If there was an extended version of https://github.com/lambci/docker-lambda/tree/master/python3.6 that we could use to run `serverless deploy` from, then we could set `dockerizePip: false`.

Check your shared drives in the Docker for Mac settings.
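The extended image suggested above could be sketched roughly as follows; the Node.js setup steps and versions are assumptions, not an existing published image:

```dockerfile
# Start from the Lambda-like Python build image so packages compile
# against the same environment Lambda runs on
FROM lambci/lambda:build-python3.6

# Add Node.js and the Serverless CLI so `serverless deploy` can run
# inside this container with dockerizePip: false
RUN curl -sL https://rpm.nodesource.com/setup_8.x | bash - \
    && yum install -y nodejs \
    && npm install -g serverless
```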
@dschep thanks for the tip! I'll give it a try, and if it works without `serverless-package-python-functions` I'll happily remove it! Less clutter is better.

Getting the same issue as @chubzor, any news on a fix?