I think there's some confusion here; there are four setups:

1. Cloud-based setup with our cloud
2. Cloud-based setup with a local hybrid gateway; configuration is stored in our cloud system
3. On-premise (or your own cloud, e.g. AWS, Azure) with dashboard and gateway - here the dashboard stores the API definitions for you in MongoDB
4. On-premise (or your own cloud, e.g. AWS, Azure) without dashboard - i.e. file-based installation, a.k.a. Community Edition

In options 1, 2 and 3 the Dashboard REST API…

We are using v2.0 on our only server with file-based API + policy config (/app/*.json).

We want to add a statistics and analytics dashboard. I know there are two options:

1. Using "Tyk Hybrid" - but it is not compatible with file-based API config, right?
2. Using self-hosted "Tyk Pump" - does this support file-based API + policy config?
3. Any suggestions if we want to stick with file-based config?

The reasons we want file-based config: a) we are afraid of human mistakes when using a GUI to modify APIs, and …

So it seems there is demand for that.

We are considering using Tyk at the enterprise level, but we cannot move forward because of this.

Option 1: Use community edition

If you can do without the dashboard altogether, then you can use the Community Edition, which is configured entirely with files: all the policies and API definitions are stored as files on disk, so they can be versioned and stored however you like. This has been the case since Tyk was launched.
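As a sketch, the relevant gateway settings in tyk.conf for a fully file-based install might look like this (the paths are illustrative and match the /app layout mentioned above; check the field names against the gateway configuration reference for your version):

```json
{
  "app_path": "/app",
  "policies": {
    "policy_source": "file",
    "policy_record_name": "/app/policies.json"
  }
}
```

With this, the gateway loads every API definition JSON in `/app` and reads policies from the named file, with no dashboard or MongoDB involved.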

Option 2: Use Pro and the import/export scripts

The Tyk Dashboard installation folder has a utils/scripts folder with scripts that can export the parent organisation, API definitions and policies from a dashboard installation; those files can then be imported into a new environment - retaining all the IDs.

If you were to import into a pre-populated dashboard (e.g. moving from one revision to another instead of from zero to a revision), then you would need to drop the policies and API definitions collections in MongoDB before importing. This way, all API definition and policy IDs are retained.
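The drop step might look like this, assuming the default dashboard database and collection names (`tyk_analytics`, `tyk_apis`, `tyk_policies`) - verify these against the `mongo_url` in your dashboard config before running anything:

```shell
# Drop the API definition and policy collections before re-importing,
# so the import recreates them with the original IDs intact.
# Database and collection names are assumptions based on a default install.
mongosh "mongodb://localhost:27017/tyk_analytics" --eval '
  db.tyk_apis.drop();
  db.tyk_policies.drop();
'
```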

What you would do is:

1. Set up a staging environment
2. Create the org and the initial APIs / API definitions in this staging environment
3. Export these
4. Create a production environment
5. Import everything
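A rough sketch of the export/import round trip - the actual script names and flags live in the dashboard's utils/scripts folder and vary by version, so treat the file names below as assumptions and check your installation:

```shell
# On staging: export the org, API definitions and policies.
# Script name and flags are illustrative; see utils/scripts in your install.
cd /opt/tyk-dashboard/utils/scripts
./export.sh --orgid="$ORG_ID" > /tmp/tyk-export/org-export.json

# On prod: import the exported data, retaining the original IDs.
./import.sh /tmp/tyk-export/org-export.json
```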

This sets everything up ready for the pipeline. Next:

1. Modify API definitions and policies in the staging GUI
2. Use the export scripts to export the API definitions and policies
3. Put the exported files under version control

Your pipeline would then:

1. Build artefacts
2. Deploy services
3. Drop tyk_apis and tyk_policies in the production MongoDB
4. Run the import script for APIs and policies
5. Trigger a hot reload
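The deployment steps above could be sketched as a single script. The Mongo host, database name, artifact path and import script name are assumptions; the `/tyk/reload/group` endpoint is the gateway's hot-reload API, authenticated with the gateway secret from tyk.conf:

```shell
#!/usr/bin/env sh
set -e

# 1. Drop the prod collections (names assume a default dashboard install)
mongosh "mongodb://prod-mongo:27017/tyk_analytics" --eval \
  'db.tyk_apis.drop(); db.tyk_policies.drop();'

# 2. Re-import the version-controlled export (script name is illustrative)
/opt/tyk-dashboard/utils/scripts/import.sh /ci/artifacts/org-export.json

# 3. Trigger a hot reload across the gateway group
curl -s -H "x-tyk-authorization: $GATEWAY_SECRET" \
  http://prod-gateway:8080/tyk/reload/group
```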

You can disconnect the hot-reload signal from the dashboard if you want to explicitly "publish" changes to the gateway instead of having that happen automatically on import; this would reduce risk during the change-over.

Option 3: Use the dashboard API

You could always use the dashboard API: if you have the policies set up initially in a staging environment, or even just as files, you can use the dashboard API to update API definitions and policies that already exist, and use the import scripts to import new ones.
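For instance, updating an existing API definition in place might look like this - the host, `$AUTH_KEY` (a dashboard user's API access key) and `$API_ID` are placeholders, so check the dashboard API documentation for your version:

```shell
# Replace an existing API definition via the dashboard API.
# api-definition.json is the exported/edited definition to push.
curl -s -X PUT \
  -H "Authorization: $AUTH_KEY" \
  -H "Content-Type: application/json" \
  -d @api-definition.json \
  "http://dashboard.example.com:3000/api/apis/$API_ID"
```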