I have a polygon shapefile with a 100 MB .dbf and a 500 MB .shp component. It is so large because it is a landbase classification for an entire district.

Every time I view the file in ArcCatalog or ArcMap and move the view window, however slightly, the entire file redraws from scratch. The problem gets worse the further out I zoom. I have tried spatial indexing and importing into a geodatabase; neither approach provides any noticeable improvement in rendering.

Esri's help page suggests that to improve shapefile performance, the user can generalize the file. While this would obviously work, I do not want to lose any information. Splitting the file up is not ideal, since I do a lot of geoprocessing and querying across its entire area. I suppose I could avoid viewing the entire area at once, but sometimes it is useful, for example, to see which parts of the file a query has selected.

Is there any other approach I could take to improve rendering performance?

(In theory, building shapefile "pyramids" would be ideal - I'm unsure why ArcGIS has never supported such an approach, at least as far as I am aware...)

Having such a large shapefile is just asking for trouble. In my experience, large shapefiles tend to corrupt very easily. Get it into a file geodatabase to avoid corruption; better drawing performance would be an added bonus.
–
Devdatta Tengshe, Oct 24 '12 at 5:49

As I briefly note above, I found that importing a large shapefile into a geodatabase yields no improvement from a purely rendering point of view. You are quite right, though, that from a general perspective it makes little sense not to keep a large shapefile in a geodatabase (for all sorts of reasons).
–
youzer, Oct 24 '12 at 19:27

Have you considered using a raster instead of a shapefile?
–
Kirk Kuykendall, Oct 30 '12 at 20:06

I suspect that could be worked around by having a second layer in the TOC pointing at the same source (not in the basemap layer), which is usually turned off but can be displayed when required for analysis or selection.
–
PolyGeo♦, Oct 24 '12 at 5:34

PolyGeo - thanks for the answer. I tried a basemap layer, and indeed a major performance improvement is that the file does not render from scratch when "zoom to layer" is clicked. The workaround you suggest, given blah238's comment, would perhaps work for some projects, but I found an additional thing that limits the use of a basemap layer: you cannot symbolize it on its attributes (the .dbf). With that limitation I might as well keep a generalized copy for reference (not a basemap layer) and unhide the "real" layer when needed. Not really a great solution... I'll be voting for the "pyramids" idea as you suggest!
–
youzer, Oct 24 '12 at 19:41

Generally very good advice. I did indeed implement all of these in my initial tests; it seems that more, or different, tricks are needed to deal with very large feature classes and shapefiles.
–
youzer, Oct 24 '12 at 20:04

As a bit of a follow-up to Aaron's answer, you could also use a definition query to limit the number of features drawn (it affects analysis as well - I believe it functions much like a selection). If not all features are needed for viewing at any given moment and you are not switching regions constantly, a definition query could be a workable solution, though not an exact answer to your question. A minimal scripted version is sketched below.
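If you want to script it, here is a minimal sketch of setting a definition query with arcpy (the ArcGIS 10.x arcpy.mapping API). The layer name "landbase" and the REGION field are hypothetical placeholders - substitute your own layer and attribute:

    import arcpy

    # Grab the open map document and the target layer by name.
    mxd = arcpy.mapping.MapDocument("CURRENT")
    layer = arcpy.mapping.ListLayers(mxd, "landbase")[0]

    # Only features matching the query are drawn (and used in analysis).
    # Shapefile SQL delimits field names with double quotes.
    layer.definitionQuery = "\"REGION\" = 'NORTH'"

    arcpy.RefreshActiveView()  # force a redraw with the new query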

You can improve rendering by setting the layer's display parameters so that it does not display when zoomed out beyond a given scale (e.g. beyond 1:10,000). You can find this option in the layer properties: Layer Properties > General tab > "Don't show layer when zoomed out beyond...". The same threshold can also be set programmatically, as sketched below.
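Here is a rough arcpy.mapping sketch of that setting; the layer name "landbase" is again a hypothetical placeholder:

    import arcpy

    mxd = arcpy.mapping.MapDocument("CURRENT")
    layer = arcpy.mapping.ListLayers(mxd, "landbase")[0]

    # Hide the layer at scales smaller (more zoomed out) than 1:10,000.
    layer.minScale = 10000  # "Don't show layer when zoomed out beyond 1:10,000"
    layer.maxScale = 0      # no limit when zoomed in

    arcpy.RefreshActiveView()
    arcpy.RefreshTOC()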

Storage location also matters: if the data sits on an old server with poor bandwidth, you are guaranteed poor performance. I routinely handle 1 GB+ of vector data over a server, which makes me question whether your system specs need updating (for reference, I'm running 12 GB RAM, a 2nd-gen i7, and an average graphics card).

I hear your frustration. I routinely work with large shapefiles like this and generally don't have display problems. I agree with all the comments above, especially making sure everything is in the same projection, including the data frame. I assume you have copied the file locally and are not trying to access it over the network? One thing that will cause display issues with shapefiles this size is an extreme number of vertices, as in a stream network. The only solution I have found for that is a Python script that sets layer definitions on the fly, so I am only drawing a few features at a time (a sketch of that idea follows below). Another thing would be to upgrade your computer's graphics memory and graphics card. Good luck!
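To illustrate the on-the-fly layer definition idea, here is a rough sketch (assuming ArcGIS 10.1+ for arcpy.da). The layer name "landbase" and the TILE_ID field that partitions the data are hypothetical - substitute whatever attribute makes sense for your dataset:

    import arcpy

    # Grab the open map document, data frame, and target layer.
    mxd = arcpy.mapping.MapDocument("CURRENT")
    df = arcpy.mapping.ListDataFrames(mxd)[0]
    layer = arcpy.mapping.ListLayers(mxd, "landbase", df)[0]

    # Build a polygon from the current data frame extent.
    ext = df.extent
    extent_poly = arcpy.Polygon(arcpy.Array([
        arcpy.Point(ext.XMin, ext.YMin),
        arcpy.Point(ext.XMin, ext.YMax),
        arcpy.Point(ext.XMax, ext.YMax),
        arcpy.Point(ext.XMax, ext.YMin),
    ]), df.spatialReference)

    # Find which tiles fall within the current view...
    arcpy.SelectLayerByLocation_management(layer, "INTERSECT", extent_poly)
    tiles = {row[0] for row in arcpy.da.SearchCursor(layer, ["TILE_ID"])}
    arcpy.SelectLayerByAttribute_management(layer, "CLEAR_SELECTION")

    # ...then restrict drawing to just those tiles.
    if tiles:
        layer.definitionQuery = '"TILE_ID" IN (%s)' % ",".join(
            "'%s'" % t for t in tiles)
        arcpy.RefreshActiveView()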