Any and all contributions to awslimitchecker are welcome. Guidelines for submitting
code contributions in the form of pull requests on GitHub
can be found below. For guidelines on submitting bug reports or feature requests,
please see the Getting Help documentation.
For any contributions that don’t fall into the above categories, please open an issue
for further assistance.

Please cut all pull requests against the “develop” branch. I’ll do my best to merge them as
quickly as possible. If they pass all unit tests and have 100% coverage, it’ll certainly be
easier. I work on this project only in my personal time, so I can’t always get things merged
as quickly as I’d like. That being said, I’m committed to doing my best, and please call me
out on it if you feel like I’m not.

All pull requests should be made against the develop branch, NOT master.

If you have not contributed to the project before, all pull requests must include
a statement that your contribution is being made under the same license as the
awslimitchecker project (or any subsequent version of that license if adopted by
awslimitchecker), may perpetually be included in and distributed with awslimitchecker,
and that you have the legal power to agree to these terms.

If you have difficulty writing tests for the code, feel free to ask for help or
submit the PR without tests. This will increase the amount of time it takes to
get merged, but I’d rather write tests for your code than write all the code myself.

Commit messages should be meaningful, and reference the Issue number
if you’re working on a GitHub issue (e.g. “issue #x - <message>”). Please
refrain from using the “fixes #x” notation unless you are sure that
the issue is fixed in that commit.

Unlike many F/OSS projects on GitHub, there is no reason to squash your commits;
this just loses valuable history and insight into the development process,
which could prove valuable if a bug is introduced by your work. Until GitHub
provides a way to preserve that history when squashing, we’ll live with
a potentially messy git log in order to keep it.

First, note that all calls to boto3 client (“low-level”) methods whose dict responses
can include ‘NextToken’ or another pagination marker should be made through
paginate_dict() with the appropriate parameters.
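To make the convention concrete, here is a minimal, self-contained sketch of a paginated call. The helper below is a simplified stand-in for the real awslimitchecker.utils.paginate_dict (the alc_* keyword names follow its calling convention, but this toy version only handles single-element paths), and describe_widgets is a made-up client method:

```python
# Simplified stand-in for awslimitchecker.utils.paginate_dict(); the
# alc_* keyword names follow the real helper's calling convention, but
# this toy version only handles top-level (single-element) paths.
def paginate_dict(func, *args, **kwargs):
    marker_path = kwargs.pop('alc_marker_path')
    data_path = kwargs.pop('alc_data_path')
    marker_param = kwargs.pop('alc_marker_param')
    result = func(*args, **kwargs)
    data = list(result[data_path[0]])
    # Keep calling until the response no longer contains the marker
    while result.get(marker_path[0]):
        kwargs[marker_param] = result[marker_path[0]]
        result = func(*args, **kwargs)
        data += result[data_path[0]]
    merged = dict(result)
    merged[data_path[0]] = data
    return merged


# Made-up boto3-style client method that returns two pages
def describe_widgets(**kwargs):
    if 'NextToken' not in kwargs:
        return {'Widgets': [{'Id': 'a'}], 'NextToken': 'page2'}
    return {'Widgets': [{'Id': 'b'}]}


resp = paginate_dict(
    describe_widgets,
    alc_marker_path=['NextToken'],
    alc_data_path=['Widgets'],
    alc_marker_param='NextToken',
)
print([w['Id'] for w in resp['Widgets']])  # ['a', 'b']
```

The merged response looks like a single un-paginated API response, which is what the rest of the service code expects.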

Add a new AwsLimit instance to the return value of the
Service class’s get_limits() method. If Trusted Advisor
returns data for this limit, be sure the service and limit names match those
returned by Trusted Advisor.
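A hedged sketch of what this looks like in a service class. AwsLimitStub stands in for the real awslimitchecker.limit.AwsLimit; the positional-argument order (name, service, default limit, warning threshold, critical threshold) is assumed from existing service classes, and all service/limit names here are hypothetical:

```python
# AwsLimitStub is a stand-in for awslimitchecker.limit.AwsLimit; the
# argument order (name, service, default_limit, warn, crit) is assumed
# from existing service classes. All service/limit names are made up.
class AwsLimitStub:
    def __init__(self, name, service, default_limit, warn, crit):
        self.name = name
        self.service = service
        self.default_limit = default_limit


class MyService:
    service_name = 'MyService'
    warning_threshold = 80
    critical_threshold = 99

    def __init__(self):
        self.limits = None

    def get_limits(self):
        # Build the limit dict once, then return the cached copy
        if self.limits is not None:
            return self.limits
        limits = {}
        limits['Widgets per region'] = AwsLimitStub(
            'Widgets per region',  # match Trusted Advisor's limit name
            self,
            100,                   # AWS default limit value
            self.warning_threshold,
            self.critical_threshold,
        )
        self.limits = limits
        return limits


svc = MyService()
print(sorted(svc.get_limits().keys()))  # ['Widgets per region']
```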

In the Service class’s find_usage() method (or a method
called by that, in the case of large or complex services), get the usage information
via self.conn and/or self.resource_conn and pass it to the appropriate AwsLimit object via its
_add_current_usage() method. For all but trivial
services (those with only 2-3 limits), find_usage() should be broken into
multiple methods, generally one per AWS API call.
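A sketch of that flow, using stand-ins throughout: LimitStub mimics only the _add_current_usage() method named above, FakeConn replaces the boto3 client, and all names are hypothetical:

```python
# Stand-ins: LimitStub mimics only the _add_current_usage() method named
# above; FakeConn replaces the boto3 client. All names are hypothetical.
class LimitStub:
    def __init__(self):
        self.usage = []

    def _add_current_usage(self, value, resource_id=None, aws_type=None):
        self.usage.append((value, resource_id))


class FakeConn:
    def describe_widgets(self):
        return {'Widgets': [{'Id': 'w-1'}, {'Id': 'w-2'}]}


class MyService:
    def __init__(self):
        self.conn = FakeConn()
        self.limits = {'Widgets per region': LimitStub()}

    def find_usage(self):
        # One helper per AWS API call, per the guideline above
        self._find_usage_widgets()

    def _find_usage_widgets(self):
        widgets = self.conn.describe_widgets()['Widgets']
        self.limits['Widgets per region']._add_current_usage(len(widgets))


svc = MyService()
svc.find_usage()
print(svc.limits['Widgets per region'].usage)  # [(2, None)]
```

Splitting find_usage() into one helper per API call keeps each method small and easy to test with a mocked connection.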

If the service has an API call that retrieves current limit values, and its results
include your new limit, ensure that this value is updated in the limit via its
_set_api_limit() method. This should be done in the Service
class’s _update_limits_from_api() method.
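A sketch of that update path, again using stand-ins: LimitStub exposes only the _set_api_limit() method named above, and FakeConn fakes a DescribeAccountAttributes-style call with made-up attribute names:

```python
# Stand-ins: LimitStub exposes only the _set_api_limit() method named
# above; FakeConn fakes a DescribeAccountAttributes-style API call.
class LimitStub:
    def __init__(self, default_limit):
        self.default_limit = default_limit
        self.api_limit = None

    def _set_api_limit(self, value):
        self.api_limit = value


class FakeConn:
    def describe_account_attributes(self):
        return {'AccountAttributes': [
            {'AttributeName': 'max-widgets',
             'AttributeValues': [{'AttributeValue': '250'}]},
        ]}


class MyService:
    def __init__(self):
        self.conn = FakeConn()
        self.limits = {'Widgets per region': LimitStub(100)}

    def _update_limits_from_api(self):
        # Overwrite the hard-coded default with the API-reported value
        attrs = self.conn.describe_account_attributes()['AccountAttributes']
        for attr in attrs:
            if attr['AttributeName'] == 'max-widgets':
                value = int(attr['AttributeValues'][0]['AttributeValue'])
                self.limits['Widgets per region']._set_api_limit(value)


svc = MyService()
svc._update_limits_from_api()
print(svc.limits['Widgets per region'].api_limit)  # 250
```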

Ensure complete test coverage for the above.

In cases where the AWS service API has a different name than what is reported
by Trusted Advisor, or legacy cases where Trusted Advisor support is retroactively
added to a limit already in awslimitchecker, you must pass the
ta_service_name and ta_limit_name parameters to the AwsLimit
constructor, specifying the string values that are returned by Trusted Advisor.
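A small sketch of where those overrides slot in. The stand-in constructor below assumes the positional-argument order used by existing service classes, and all string values are made up:

```python
# Stand-in constructor showing where the ta_service_name and
# ta_limit_name overrides slot in; positional-argument order is assumed
# from existing service classes, and all string values are made up.
class AwsLimitStub:
    def __init__(self, name, service, default_limit, warn, crit,
                 ta_service_name=None, ta_limit_name=None):
        self.name = name
        # Fall back to the limit's own name when TA uses the same string
        self.ta_limit_name = ta_limit_name or name
        self.ta_service_name = ta_service_name


lim = AwsLimitStub(
    'Widgets per region', None, 100, 80, 99,
    ta_service_name='WidgetService',  # as returned by Trusted Advisor
    ta_limit_name='Widgets',
)
print((lim.ta_service_name, lim.ta_limit_name))
```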

As noted above, all calls to boto3 client (“low-level”) methods whose dict responses
can include ‘NextToken’ or another pagination marker should be made through
paginate_dict() with the appropriate parameters.

The new service name should be in CamelCase, preferably one word (if not one word, it should be underscore-separated).
In awslimitchecker/services, use the addservice script; this will create a templated service class in the
current directory, and create a templated (but far from complete) unit test file in awslimitchecker/tests/services:

./addservice ServiceName

Find all “TODO” comments in the newly-created files; these have instructions on things to change for new services.
Add yourself to the Authors section in the header if desired.

Add an import line for the new service in awslimitchecker/services/__init__.py.

Be sure to set the class’s api_name attribute to the correct name of the
AWS service API (i.e. the parameter passed to boto3.client). This string can
typically be found at the top of the Service page in the boto3 docs.

Write at least high-level tests; TDD is greatly preferred.

Implement all abstract methods from _AwsService and any other methods you need;
small, easily-testable methods are preferred. Ensure all methods have full documentation. For simple services, you need only
search for “TODO” in the new service class created by the addservice script. See Adding New Limits for further information.

If your service has an API action to retrieve limit/quota information (e.g. DescribeAccountAttributes for EC2 and RDS), ensure
that the service class has an _update_limits_from_api() method which makes this API call and updates each relevant AwsLimit
via its _set_api_limit() method.

Test your code; 100% test coverage is expected, and mocks should use autospec or spec_set.

As there is no programmatic way to validate IAM policies, once you are done writing your service, grab the
output of awslimitchecker --iam-policy, log in to your AWS account, and navigate to the IAM page.
Click through to create a new policy, paste the output of the --iam-policy command, and click the
“Validate Policy” button. Correct any errors that occur; for more information, see the AWS IAM docs on
Using Policy Validator.
It would also be a good idea to run any policy changes through the
Policy Simulator.

So long as the Service and Limit name strings returned by the Trusted Advisor (Support) API exactly match
how they are set on the corresponding _AwsService and AwsLimit objects, no code changes
are needed to support new limit checks from TA.

Integration tests are automatically run in TravisCI for all non-pull request
branches. You can run them manually from your local machine using:

tox -r -e integration,integration3

These tests simply run awslimitchecker’s CLI script for both usage and limits, for all services and each service individually. Note that this covers a very small amount of the code, as the account that I use for integration tests has virtually no resources in it.

If integration tests fail, check the required IAM permissions. The IAM user for Travis integration tests is configured via Terraform, which must be re-run after policy changes.

Pursuant to Sections 5(b)
and 13 of the license,
all users of awslimitchecker - including those interacting with it remotely over
a network - have a right to obtain the exact, unmodified running source code. We
have done as much as possible to make this transparent to developers, with no additional
work needed. See the guidelines below for information.

If you’re simply running awslimitchecker via the command line, there’s nothing to worry about;
just use it like any other software.

If you’re using awslimitchecker in your own software in a way that allows users to interact with it over the network (e.g. in your
deployment or monitoring systems), but not modifying it, you also don’t need to do anything special; awslimitchecker will log a
WARNING-level message indicating where the source code of the currently-running version can be obtained. So long as you’ve installed
awslimitchecker via Python’s packaging system (e.g. with pip), its current version and source will be automatically detected. This
suffices for the AGPL source code offer provision, so long as it’s displayed to users and the currently-running source is unmodified.

If you wish to modify the source code of awslimitchecker, all you need to do is ensure that _get_version_info()
always returns correct and accurate information (a publicly-accessible URL to the exact version of the running source code, and a version number).
If you install your modified version directly from an editable (i.e. pip install -e), publicly-accessible Git repository, and ensure
that changes are available in the repository before they are present in the code running for your users, this should be automatically
detected by awslimitchecker and the correct URL provided. It is strongly recommended that any such repository is a fork of the
project’s original GitHub repository. It is solely your responsibility to ensure that the URL and version information presented
to users is accurate and reflects source code identical to what is running.

If you’re distributing awslimitchecker with modifications or as part of your own software (as opposed to simply an
editable requirement that gets installed with pip), please read the license and ensure that you comply with its terms.

If you are running awslimitchecker as part of a hosted service that users somehow interact with, please
ensure that the source code URL and version are correct and visible in the output given to users.


* [ ] Cut a branch off ``develop`` for this issue.
* [ ] Build docs locally (``tox -e localdocs``) and ensure they're current; commit any changes.
* [ ] Run ``dev/terraform.py`` in the awslimitchecker source directory to update the integration test user's IAM policy with what is actually being reported by the current code.
* [ ] Ensure that Travis tests are passing in all environments.
* [ ] Ensure that test coverage is no less than the last release (ideally, 100%).
* [ ] Build docs for the branch (locally) and ensure they look correct. Commit any changes.
* [ ] Increment the version number in awslimitchecker/version.py and add version and release date to CHANGES.rst. Ensure that there are CHANGES.rst entries for all major changes since the last release, and that any breaking changes or new required IAM permissions are explicitly mentioned.
* [ ] Run ``dev/release.py gist`` to convert the CHANGES.rst entry for the current version to Markdown and upload it as a Github Gist. View the gist and ensure that the Markdown rendered properly and all links are valid. Iterate on this until the rendered version looks correct.
* [ ] Commit all changes, mention the issue in the commit, and push to GitHub.
* [ ] Confirm that README.rst renders correctly on GitHub.
* [ ] Upload package to testpypi, confirm that README.rst renders correctly.
* Make sure your ~/.pypirc file is correct (a repo called ``test`` for https://testpypi.python.org/pypi).
* ``rm -Rf dist``
* ``python setup.py sdist bdist_wheel``
* ``twine upload -r test dist/*``
* Check that the README renders at https://testpypi.python.org/pypi/awslimitchecker
* [ ] Create a pull request for the release to be merged into master. Upon successful Travis build, merge it.
* [ ] Continue at [#13 on the Release Checklist](http://awslimitchecker.readthedocs.io/en/develop/development.html#release-checklist).