What is the limit on the number of features in an AGOL FeatureServer feature layer?

What is the largest number of vector features in a feature layer that you have ever published on ArcGIS Online as a feature service (not tiled)? What was the geometry type (point, line, polygon)? How did you get it to perform?

I have a feature service with a table that contains over 1 million records. A million rows is nothing for a typical database server to handle, but through the AGOL REST endpoint, a query for all rows returned only 2000 records by default, a count-only query timed out, and a query with a specific where clause just spins and never responds.

If you are running into issues with your data returning, take a look at the data itself. If the max record count is 2000 (the default), the server shouldn't have any problem fulfilling the request. To display many features, the web application sends multiple smaller requests to the server rather than one enormous, slow-responding request.
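To illustrate the multiple-request pattern described above, here is a minimal sketch of how a client can page through a query using the `resultOffset` and `resultRecordCount` parameters of the feature layer's `query` operation. The page size, where clause, and the idea of precomputing the parameter dicts are illustrative choices, not a prescribed workflow:

```python
# Sketch: build one query-parameter dict per "page" of a hosted feature
# layer query, so no single request exceeds the service's max record count.
# Each dict would be sent to <layer-url>/query (e.g. with requests.get).

def page_params(total_count, page_size=2000, where="1=1"):
    """Return a list of query-parameter dicts, one per page of results."""
    pages = []
    offset = 0
    while offset < total_count:
        pages.append({
            "where": where,
            "outFields": "*",
            "f": "json",
            "resultOffset": offset,        # skip rows already fetched
            "resultRecordCount": page_size # cap rows in this response
        })
        offset += page_size
    return pages
```

A 5,000-row result at the default 2000-record page size would then take three requests (offsets 0, 2000, and 4000).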

A few questions:

Is the data hosted in ArcGIS Online?

Is the data complex, either in geometry or in the number of fields, domains, etc.?

Are you trying to display the entire dataset or a specific subset of the data (what is your zoom level)?

Take a look at these resources and if your service is still performing at a slow rate, give tech support a call to look into the issue further.

In this case I am not trying to display thousands of features on a map, but to show a tabular result from a query that may return up to a couple hundred rows from a table within a hosted feature service. The feature service contains a primary feature layer (index 0) consisting of a few thousand polygons, and a related table (index 1) that has no geometry and contains over a million records, in a one (polygon)-to-many (records) relationship.

Since I limit my application to querying one polygon at a time, the related data returned for a single polygon ID is not overwhelmingly large. Perhaps this is not the best model for displaying our dataset, but uploading and testing the related table raises the question: what is the size and performance limit of even simple tabular data on ArcGIS Online? It seems we've hit it.
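For what it's worth, the one-polygon-at-a-time lookup described above maps onto the REST API's `queryRelatedRecords` operation on the polygon layer. Here is a hedged sketch of the request parameters; the relationship ID and the assumption that layer 0 holds the polygons are specific to my service, so treat them as placeholders:

```python
# Sketch: parameters for <service-url>/0/queryRelatedRecords, fetching the
# related table rows for a single polygon. relationship_id=0 is an
# assumption about how this particular relationship class was published.

def related_query_params(polygon_oid, relationship_id=0, out_fields="*"):
    """Return query parameters for fetching one polygon's related records."""
    return {
        "objectIds": str(polygon_oid),      # the single polygon being queried
        "relationshipId": relationship_id,  # which relationship to traverse
        "outFields": out_fields,
        "returnGeometry": "false",          # the related table has no shapes
        "f": "json",
    }
```

Sending these parameters to the layer's `queryRelatedRecords` endpoint should return only the handful of rows tied to that polygon, rather than touching the full million-record table.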

I understand there are certain output limits on a hosted feature layer, such as the 2000-record limit per query, but I wonder if you have done any internal testing on how LARGE a single hosted dataset can be, in terms of number of features, rows, and fields. It would be nice to have some guidelines so we can design our catalog to work efficiently, without manually curating each dataset for different resolutions and display scales. We often work at a California statewide scale, and it is already difficult to keep track of publishing and using three different types of services for each dataset (tile layer, feature layer, and dynamic map image layer).

I'd suggest getting in contact with support services so we can better understand your workflow and data structure. I haven't heard of anyone testing out this particular workflow, so it would be great to get a full understanding.