Getting started with Serverless Framework and Python (Part 2)

Radio tower at Stax Museum of American Soul Music

This is a continuation of my previous post, which offered some tips for setting up Serverless Framework and concluded with generating a template service in Python. By this point you should have a file, serverless.yml, that will allow you to define your service. Again, the documentation is quite good. I suggest reading from Services down to Workflow. This gives a good overview and should enable you to start hacking away at your YAML file and adding your functions, but I’ll call out a few areas where I had some trouble.

Testing

Lambda functions are hard to test outside the context of AWS, but any testing within AWS is going to cost you something (even if it is pennies). The Serverless folks suggest that you “write your business logic so that it is separate from your FaaS provider (e.g., AWS Lambda), to keep it provider-independent, reusable and more easily testable.” OK! This separation, if we can create it, would allow us to write typical unit tests for each discrete function.

All of my previous Lambdas contained the handler function as well as any other functions required to process the incoming event. I was not even aware that it was possible to import local functions into my handler, but you can! And it works great!

Here’s my handler:

from getDhash import image_to_dhash

def dhash(event, context):
    image = event['image']
    dhash = image_to_dhash(image)
    return dhash

The handler accepts the image file contents as a byte string, passes them to the imported image_to_dhash function, and returns the resulting dhash.

And here is the image_to_dhash function, which I’ve stored separately in getDhash.py:

from PIL import Image
import imagehash
from io import BytesIO
	
def image_to_dhash(image):
    return str(imagehash.dhash(Image.open(BytesIO(image))))
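For intuition, a difference hash compares each pixel to its right-hand neighbour in a small grayscale grid and packs the resulting bits into a hex string. Here is a minimal sketch of that idea in plain Python (an illustration only, not the actual imagehash implementation):

```python
def tiny_dhash(grid):
    """Sketch of a difference hash. grid is a list of rows of grayscale
    ints, each row one pixel wider than the number of bits per row."""
    bits = []
    for row in grid:
        # Compare each pixel to its right-hand neighbour.
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    # Pack the bits into an integer, then render it as hex.
    value = 0
    for bit in bits:
        value = (value << 1) | bit
    return format(value, 'x')
```

imagehash does essentially this on a small downscaled grayscale version of the image, which is how you end up with compact hex strings like the one in the test below.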

Now, I can simply write my tests against getDhash.py and ignore the handler entirely. For my first test I have a local image (test/image.jpg) and a Python script, test.py, containing my unit tests:

import unittest
from getDhash import image_to_dhash

class TestLambdaFunctions(unittest.TestCase):
    
    with open('test/image.jpg', 'rb') as f:
        image = f.read()
        
    def testGetDhash(self):
        self.assertEqual(image_to_dhash(self.image), 'db5b513373f26f6f')
        
if __name__ == '__main__':
    unittest.main()

Running test.py should return some testing results:

(myenv) dfox@dfox-VirtualBox:~/myservice$ python test.py 
.
----------------------------------------------------------------------
Ran 1 test in 0.026s

OK

Environment variables

AWS Lambdas support the use of environment variables. These variables can also be encrypted, in case you need to store some sensitive information along with your code. In other cases, you may want to use variables to supply slightly different information to the same Lambda function, perhaps corresponding to a development or production environment. Serverless makes it easy to supply these environment variables at the time of deployment. And, thanks to Serverless’ feature-rich variable system, we have a few options for doing so.

Referencing local environment variables:

functions:
  dhash:
    handler: handler.dhash
    environment:
      MYENVVAR: ${env:MYENVVAR}

Or, referencing a different .yml file:

functions:
  dhash:
    handler: handler.dhash
    environment:
      MYENVVAR: ${file(./serverless.env.yml):${opt:stage}.MYENVVAR}

The above also demonstrates how to reference CLI options, in this case the stage we provided with our deploy command:

serverless deploy --stage dev

And, for completeness’ sake, the serverless.env.yml file:

dev:
    MYENVVAR: "my environment variable is the best"
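However the value is supplied, Lambda exposes it to your function as an ordinary process environment variable, readable with os.environ. A minimal sketch (the helper name is my own, not part of Serverless):

```python
import os

def get_config(name, default=None):
    # Variables declared under `environment:` in serverless.yml show up
    # as ordinary process environment variables inside the Lambda runtime.
    return os.environ.get(name, default)
```

Using a default keeps local test runs working even when the variable has not been set outside of a deployment.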

Dependencies

In the past, I found that dealing with Python dependencies and Lambda could be a real pain. Inevitably, my deployment package would be incorrectly compiled. Or, I’d build the package fine, but the unzipped contents would exceed the size limitations imposed by Lambda. Using Serverless along with a plugin, Serverless Python Requirements, makes life (specifically your Lambda-creating life) a lot easier. Here’s how it works.

Get your requirements ready:

pip freeze > requirements.txt

In my case, this produced something like the following:

certifi==2017.4.17
chardet==3.0.4
idna==2.5
ImageHash==3.4
numpy==1.13.0
olefile==0.44
Pillow==4.1.1
PyWavelets==0.5.2
requests==2.17.3
scipy==0.19.0
six==1.10.0
urllib3==1.21.1

Call the plugin in your serverless.yml file:

plugins:
  - serverless-python-requirements

And that’s it. 🙂

Now, if you have requirements like mine, you’re going to hit the size limitation (note the inclusion of Pillow, numpy, and scipy). So, take advantage of the built-in zip functionality by adding the following to your serverless.yml file:

custom:
  pythonRequirements:
    zip: true

This means your dependencies will remain zipped. It also means you need to unzip them when your handler is invoked.

When you deploy the service, the Python requirements plugin will add a new module to your directory called unzip_requirements.py. This module will extract the required dependencies when they are needed by your Lambda functions. You will have to import it before all of your other imports. For example:

import unzip_requirements
from PIL import Image

There does seem to be a drawback here, however. Until you run the deploy command, unzip_requirements.py will not exist in your directory, and therefore all of your local tests will fail with an ImportError:

ImportError: No module named unzip_requirements

Of course, I may be doing something wrong here.
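One workaround is to guard the import so that local runs simply skip it (a sketch; the flag name is mine, added only so the behavior is visible):

```python
try:
    # Present only inside the deployed package built by the plugin.
    import unzip_requirements
    DEPS_WERE_ZIPPED = True
except ImportError:
    # Running locally (e.g., under test.py): dependencies are already
    # importable from the virtualenv, so there is nothing to unzip.
    DEPS_WERE_ZIPPED = False
```

With the guard in place, the same handler module imports cleanly both in Lambda and on your development machine.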

Questions

  • There are actually two Python requirements plugins for Serverless. Am I using the best one?
  • As I add functions to my service, do I reuse the existing handler.py? Or do I create new handler scripts for each function?

2 thoughts on “Getting started with Serverless Framework and Python (Part 2)”

  1. Hi David,

    Thanks for taking the time to publish this. I came here to see how you were testing with Serverless/Python. From what I’m seeing here the unit tests are pretty standard (sls doesn’t seem to introduce anything new here).

    I did get some additional understanding about handling the requirements (which I had really just faked).

    I was working around the required libs by doing a “pip install -r requirements.txt -t vendor/libs”.

    Then being sure that was imported in my serverless.yml and including it in the python modules that had dependencies.

    The method you’re sharing here seems a bit cleaner and more explicitly outlines that dependencies are being included with the lambdas.

    Much appreciated!

    1. Hi Jay, Thanks for the comment. Since I wrote this I’ve experimented with a few other requirements packaging plugins. There are several and they all seem to have positives and negatives. I ended up using serverless-package-python-functions, because I liked being able to set requirements per function and because I had one function that I wanted to package and zip manually (it was huge and needed some trimming to get it to the right size). I guess I should write another post. I’m still not totally thrilled with my testing scheme, but keeping the business logic out of the handler has made debugging easier so I will likely keep doing that.
