AWS Cloud is pretty awesome and growing at a very rapid pace. If planned and executed correctly, developing and managing solutions in the cloud brings many tangible benefits. The wide range of cloud services can make life easy, as there is a service for almost everything you need: storage, backup, version control, load balancing, auto-scaling, etc.
In the development phase, testing your solutions against AWS services can become very expensive and eat up your budget very quickly! For larger enterprises, it's less of a concern if the number of AWS users is small. But it can be a huge concern for startups on tight budgets, as well as for enterprises that are cloud-based or have many AWS users.
To help with the development and lower the costs, there are many open-source solutions out there that you can benefit from. In this article, I have listed all the major and widely used modules that you can use to mock AWS services in your development phase. With these, you will not even need to have an AWS account to start your development.
I have personally evaluated and used all of these mocking modules, and highly recommend them for your development.
Fake S3 is a lightweight server that is useful for testing S3 calls in your code: it creates a mock server that responds to the same API as Amazon S3.
You can install fakes3 using the command: gem install fakes3
To start the fakes3 server, you can run the command:
fakes3 server -p <PORT> --root=ROOT
e.g. fakes3 server -p 4569 --root=/tmp/fakes3
Use it as if you were interacting with a normal S3 bucket. By default, when you create a boto3 S3 client, it points to the AWS S3 server located at s3.amazonaws.com. All you need to do is change it to point to your fakes3 server.
s3 = boto3.client('s3', endpoint_url='http://localhost:<PORT>')
e.g. s3 = boto3.client('s3', endpoint_url='http://localhost:4569')
Note here that you don't even need to provide any real tokens/credentials to authenticate (secret access key and access key ID). That means you can test your code using dummy credentials, and it will still work when deployed in your real environment.
For instance, take a simple (and not at all realistic!) example where you have a Lambda function that prints all the visible bucket names. As you can see in the example below, your Lambda pulls the server URL from an environment variable. This allows you to invoke the Lambda handler locally and test it out before deploying to your AWS account.
In the AWS account, you can use an IAM role to grant permissions to the Lambda function, so there is no need to provide credentials.
For this example, the lambda handler would look like this:
The default value of the environment variable s3_endpoint_url is empty. You will need to set it (for example, through the Lambda configuration API) before invoking the Lambda function. Developing this way makes the Lambda function easily testable against a mock.
To demonstrate how the mock responds to API requests, in the screenshot below I have started the fakes3 server and am issuing API requests against it from a Python REPL. You can see that it simulates S3's behavior.
Moto mocks most AWS services, not just S3. Each service can be used as a decorator, a context manager, or in raw form, giving much more flexibility across different test architectures. The majority of services can be tested with just the Python package installed; the only special case is Lambda, for which you also need Docker installed.
Moto in Python:
Here I am only demonstrating the use of a decorator. For more details, you can visit the project documentation.
You can install as python module using the command: pip install moto
Example Function: lambda_example.py
Moto Standalone Server Mode:
This allows you to utilize the backend structure of Moto even if you don't use Python. It uses flask, which isn't a default dependency.
You can install as python module using the command: pip install moto[server]
You can start the moto server using the command: moto_server s3 -p<PORT>
e.g. moto_server s3 -p3000
Just like with fakes3, use it as if you were interacting with a normal S3 bucket. By default, when you create a boto3 S3 client, it points to the AWS S3 server located at s3.amazonaws.com. All you need to do is change it to point to your moto server.
s3 = boto3.client('s3', endpoint_url='http://localhost:<PORT>')
e.g. s3 = boto3.client('s3', endpoint_url='http://localhost:3000')
Just like with fakes3, I have started a mock moto server for S3 in the screenshot below to demonstrate its behavior. Note that this example only shows S3; moto can mock many other AWS services as well.
This is an implementation of Amazon's DynamoDB that executes locally. It was more useful before Amazon's own DynamoDB Local became available. Since you can now install Amazon's DynamoDB Local on your dev machine for your development needs, I suggest using that instead.
Amazon's DynamoDB Local
Amazon's DynamoDB Local is available as an executable Java archive (JAR) file. It will run on Windows, Mac, and Linux systems and is compatible with version 7 of the Java Runtime Environment (JRE). It will not work on older versions of Java. It is fully maintained by Amazon itself and is very stable.
Download the DynamoDB Local JAR, put it in the directory of your choice, and open a command prompt in that directory. Launch DynamoDB Local like this:
$ java -Djava.library.path=. -jar DynamoDBLocal.jar
Configure your application so that it uses the local endpoint. DynamoDB Local listens on port 8000 by default; you can change this by specifying the --port option when you start it. If you are using the default port, the local endpoint will be localhost:8000.
You can install kinesalite using npm with the command:
npm install -g kinesalite
Here is a basic example; for more details, please check out their website.
LocalStack is built on top of other existing mocking tools, adding more functionality, support for more services, and interoperability. The benefits of using LocalStack include:
1. Error injection: LocalStack allows you to inject errors that frequently occur in real cloud environments.
2. Actual HTTP REST services: All services in LocalStack allow actual HTTP connections on a TCP port.
3. Language agnostic: Although LocalStack is written in Python, it works well with arbitrary programming languages and environments.
4. Isolated processes: All services in LocalStack run in separate processes.
The easiest way to install LocalStack is via pip:
pip install localstack
Once installed, run the infrastructure using the following command:
localstack start
You can also run it in Docker using the command:
localstack start --docker
As LocalStack runs as a set of REST APIs, you can use it with the AWS CLI by specifying --endpoint-url:
aws --endpoint-url=http://localhost:4568 kinesis list-streams
To avoid specifying the endpoint URL on every command line, try awslocal, a thin CLI wrapper around the AWS CLI for LocalStack.
awslocal kinesis list-streams
To use it with boto3, all you need to change is the endpoint URL, just like with the other mocking modules. I will not go through that in much detail here, as it is similar to what we saw earlier.
Besides that, LocalStack can be easily integrated with Python nosetests and JUnit frameworks.
In my experience, without these mocks the costs are low at the beginning but rise rapidly as your development nears completion, since you keep adding support for more features that use more services.
This is an especially important consideration when you have regression testing set up. I recommend setting up mocks early in the development phase of your project and designing your solutions so that they can be easily configured to run against the mocks.
I hope that this post gives you enough information to get started with your development, and hopefully also saves you a bunch of money! If you have more questions or would like to discuss something, I would be happy to hear from you.