Testing AWS Lambda Locally with LocalStack

AWS Lambda allows you to write code without having to configure, provision, or maintain servers. It simplifies and shortens the path of code from development to production. It is also a cost-effective alternative to servers – you pay only for the computation time you consume.

The challenge with AWS Lambda comes when testing the code of the Lambda functions we upload to it. AWS provides a framework, AWS SAM, which allows you to invoke Lambda functions locally via the SAM CLI. However, if your Lambda function depends on other AWS services, a connection to the cloud is required: the Lambda may run locally, but every service it depends on must be available to it in the cloud, which means paying for the computation time of our tests, the storage space we use, and so on. Imagine a CI/CD pipeline running tests on every change request in a system of 100 Lambdas – the more tests, the higher the expenses. Then factor in the time it takes to spin up new AWS environments. To avoid all that, we would like to run our tests entirely in a local environment – cheap and fast.

LocalStack

This is where LocalStack comes into the picture. It simulates an extensive set of AWS services; for the list of supported services, see: https://github.com/localstack/localstack#overview. It exposes an API for each service it simulates, and requests and responses look just as if they were made to the actual AWS services. You can even inject errors that occur in the real AWS cloud environment, such as ProvisionedThroughputExceededException.

LocalStack can be installed on your local machine, or you can use a docker-compose.yml file – for example, the one from the LocalStack repository: https://github.com/localstack/localstack/blob/master/docker-compose.yml – to get LocalStack running in a Docker container. LocalStack also has a dashboard, running on port 8080, which you can use to observe the services you have created. Note that the dashboard version provided by the free tier is outdated at the time of this writing; an up-to-date version is available in the PRO edition.



In this tutorial we are going to demonstrate how to test a simple Lambda function that is triggered by a DynamoDB event and writes to an S3 bucket.

Writing a Lambda function

First we write the Lambda function which has the following specifics:

  1. It has a public, zero-argument constructor, which the AWS Lambda service uses to instantiate our Lambda class;
  2. It is triggered by a DynamoDB event;
  3. It takes information from the DynamoDB event and writes it to an S3 bucket.

public class Lambda implements RequestHandler<DynamodbEvent, String> {

 private final Random random = new Random();
 private final AmazonS3 amazonS3;

 public Lambda() {
   amazonS3 = AmazonS3ClientBuilder
     .standard()
     .build();
 }

 //Used for testing to inject mocked instance of AmazonS3
 Lambda(final AmazonS3 amazonS3) {
   this.amazonS3 = amazonS3;
 }

 @Override
 public String handleRequest(
     final DynamodbEvent dynamodbEvent, final Context context) {
   final Map<String, AttributeValue> newImage = dynamodbEvent
     .getRecords()
     .get(0)
     .getDynamodb()
     .getNewImage();

   final String id = String.valueOf(random.nextInt());
   final String name = newImage.get("name").getS();

   amazonS3.putObject("person", String.valueOf(id), name);

   return id;
 }

}

For a test, we would like to verify that the Lambda has actually written the information to S3 – and, preferably, to do so entirely locally. Remember, AWS S3 is running on localhost thanks to LocalStack.

Running LocalStack in a Docker container

To get LocalStack instance running, we will be using Docker containers. Here is the docker-compose.yml file:

version: '3.7'

services:
  localstack:
    image: localstack/localstack
    environment:
      SERVICES: s3:4572,dynamodb:4569
    ports:
      - 4572:4572
      - 4569:4569
      - 8080:8080

For services, we have specified only S3 and DynamoDB, since we do not need the rest. In fact, we do not need DynamoDB either, but it is included as an example of how to specify multiple services.

We have also mapped the ports on which the APIs are exposed in the Docker container to the same ports on the host machine, so our test can make requests to localhost.
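With the compose file in place, starting LocalStack and checking that the local S3 endpoint responds is a couple of commands. This is a sketch assuming docker-compose and the AWS CLI are installed; the credentials are dummy values, since LocalStack does not validate them:

```shell
# Start LocalStack in the background, from the directory containing docker-compose.yml
docker-compose up -d

# Dummy credentials -- LocalStack accepts any values
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test
export AWS_DEFAULT_REGION=eu-west-1

# List buckets against the local S3 endpoint instead of the real cloud
aws --endpoint-url=http://localhost:4572 s3 ls
```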

Writing a test

And finally we write the test that does the following:

  1. Maps a DynamoDB event, given as JSON, to a DynamodbEvent object.
  2. Calls the handler method of the Lambda function with that event and a MockLambdaContext object.
  3. Asserts that the information we intended to write is present in the “local S3”.

@Test
void handleRequest() throws IOException {
 final InputStream dynamoEventStream = getClass()
   .getClassLoader()
   .getResourceAsStream("dynamo-db-event.json");

 final DynamodbEvent testEvent =
   objectMapper.readValue(dynamoEventStream, DynamodbEvent.class);

 final String id = lambda.handleRequest(testEvent, lambdaContext);
 resourcesToBeCleanedUp.add(id);

 final S3Object person = amazonS3.getObject("person", id);
 assertEquals("Peter Parker", transformInput(person.getObjectContent()));
}
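The test reads dynamo-db-event.json from the classpath. Here is a minimal sketch of what such a file could look like – a DynamoDB Streams record whose NewImage carries the name attribute the Lambda reads; the concrete values are made up for illustration:

```json
{
  "Records": [
    {
      "eventName": "INSERT",
      "eventSource": "aws:dynamodb",
      "dynamodb": {
        "NewImage": {
          "name": { "S": "Peter Parker" }
        },
        "StreamViewType": "NEW_IMAGE"
      }
    }
  ]
}
```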

When instantiating our Lambda function that depends on S3, we need to provide it with a client pointed at the local S3 instance running in the Docker container:

amazonS3 = AmazonS3ClientBuilder.standard()
 .withEndpointConfiguration(
   new AwsClientBuilder.EndpointConfiguration(
     "http://localhost:4572", "eu-west-1"))
 .withPathStyleAccessEnabled(true)
 .build();

It is time to run the test – just make sure the LocalStack container is running! The test should pass :).

If you would like to test the flow end-to-end – in this case, including the trigger – you can run the Lambda itself in the Docker container as well. That way, all of our code, except the test itself, will run in our simulated cloud. To do that, you need to:

  1. Create a DynamoDB table, via the AWS CLI, in the local DynamoDB service;
  2. Create an S3 bucket, via the AWS CLI, in the local S3 service;
  3. Deploy the Lambda function, via the AWS CLI, to the local Lambda service running in the container, and set the DynamoDB table as a trigger for it.
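The three steps above can be sketched with the AWS CLI pointed at the local endpoints. This is a hedged sketch, not the exact setup from the article: the table, bucket, function, and jar names are assumptions, the stream ARN placeholder must be filled in from the create-table output, and it assumes you have also enabled the Lambda service in the compose file (e.g. lambda:4574 in SERVICES, with port 4574 mapped):

```shell
# 1. Create the DynamoDB table with a stream enabled (local DynamoDB on port 4569)
aws --endpoint-url=http://localhost:4569 dynamodb create-table \
  --table-name person \
  --attribute-definitions AttributeName=name,AttributeType=S \
  --key-schema AttributeName=name,KeyType=HASH \
  --provisioned-throughput ReadCapacityUnits=1,WriteCapacityUnits=1 \
  --stream-specification StreamEnabled=true,StreamViewType=NEW_IMAGE

# 2. Create the S3 bucket the Lambda writes to (local S3 on port 4572)
aws --endpoint-url=http://localhost:4572 s3 mb s3://person

# 3. Deploy the Lambda to the local Lambda service and wire the
#    table's stream to it as an event source (names are hypothetical)
aws --endpoint-url=http://localhost:4574 lambda create-function \
  --function-name person-lambda \
  --runtime java8 \
  --handler com.example.Lambda \
  --role arn:aws:iam::000000000000:role/lambda-role \
  --zip-file fileb://target/lambda.jar
aws --endpoint-url=http://localhost:4574 lambda create-event-source-mapping \
  --function-name person-lambda \
  --event-source-arn <stream-arn-from-step-1>
```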

The test will then simply write an item to the DynamoDB table. That action triggers the Lambda function, which in turn writes to the specified S3 bucket. All that is left is to check in our test that the information has been written to S3.
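Writing that triggering item can also be done from the command line against the local DynamoDB endpoint – a sketch assuming the hypothetical person table from the setup steps:

```shell
# Insert an item into the local table; the table's stream should trigger
# the deployed Lambda, which then writes the name to the local S3 bucket
aws --endpoint-url=http://localhost:4569 dynamodb put-item \
  --table-name person \
  --item '{"name": {"S": "Peter Parker"}}'
```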

If the Lambda function needs access to a relational database, unfortunately we cannot rely on the free tier of LocalStack, since it does not support AWS RDS – only the PRO version does. A simple solution is to run the database in a separate container.

Conclusion

In this tutorial, we have learned how to test Lambda functions without making any requests to the cloud, even if those functions depend on other AWS services. We have shown how to do that with the help of LocalStack, and presented one way to write our tests so they run quickly without costing us a small fortune.

The code can be found here, feel free to use our ideas and make your project better!

Was this article useful? Have you found another way to run such tests in isolation? Let us know in the comments below!

