
Remove the list view threshold (5000 by default)

This limit has always been a bit laughable, and is even more so as we develop more client side applications. In SharePoint 2007 we didn't have this limit and were allowed to make our own mistakes. Now that hardware is so much more powerful, we need this limit removed so that we can build enterprise-class applications.

We are continuing to make our large list experiences better; please keep the feedback coming.

Spring 2018 update:
- We now support being able to manually add indexes to lists of any size (increased from lists up to 20,000 items previously).
- Starting with the February release of the Office 365 Excel client, you will be able to export your full list instead of getting cut off part of the way through.

What we are working on now:
- Predictive indexing will start to work for lists larger than 20,000 items so your views will automatically cause the right indexes to be added to your lists.

In our backlog:
- Being able to index/sort/filter by lookup column types (like person, lookup or managed metadata columns) without being throttled.
- Making sure that our REST APIs support querying in ways that will guarantee that the call will not be throttled.
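To make the REST backlog item concrete, here is a rough sketch of the kind of call that should stay under the throttle once the filtered column is indexed. The site URL, list name, and "Category" field are all invented for illustration; `$filter`, `$top`, and `$orderby` are standard OData query options, but whether a given call is throttled still depends on the list's indexes, so treat this as a sketch rather than a guarantee.

```python
# Sketch: querying a large SharePoint list through the REST API in a way
# intended to avoid the 5,000-item throttle, assuming the filtered column
# ("Category" here, a hypothetical field) has an index. The URL and names
# are illustrative, not a documented contract.

def build_list_query(site_url, list_title, indexed_filter, page_size=1000):
    """Build a REST URL that filters on an indexed column and pages results."""
    return (
        f"{site_url}/_api/web/lists/getbytitle('{list_title}')/items"
        f"?$filter={indexed_filter}"
        f"&$top={page_size}"   # keep each page well under the threshold
        f"&$orderby=ID"        # stable order so server-side paging works
    )

url = build_list_query(
    "https://contoso.sharepoint.com/sites/ops",   # hypothetical site
    "Invoices",                                   # hypothetical list
    "Category eq 'Hardware'",   # Category assumed to be an indexed column
)
print(url)
```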

For a general update on large list capabilities, the video at myignite.microsoft.com/videos/53861 (the large lists discussion starts at 42 minutes and 25 seconds) describes some of the changes that we delivered in the second half of 2017:
- Modern UI now has a lot of support for adding indexes to large lists and libraries on the fly, reducing the number of throttling errors experienced by our users, and some new UI for browsing through items in large lists with our paging model
- SharePoint runs predictive indexing jobs to automatically add indexes as lists get larger based on your view definitions, and updates these indexes when you add/update your views

We are changing the status here back to “Working on it”, as we are not yet done with all that we want to do here, and as many of you have pointed out. We apologize for prematurely changing the status to done; please keep the feedback coming.

The video on https://myignite.microsoft.com/videos/53861 (focusing on large lists at 42 minutes and 25 seconds) describes some of the changes that we delivered back in the second half of 2017:
- Modern UI now has a lot of support for adding indexes to large lists and libraries on the fly, reducing the number of throttling errors experienced by our users, and some new UI for browsing through items in large lists with our paging model
- SharePoint runs predictive indexing jobs to automatically add indexes as lists get larger based on your view definitions, and updates these indexes when you add/update your views
- We now support being able to manually add indexes to lists as large as 20,000 items (increased from 5,000 previously).

What we are working on now:
- We are working on adding the capability to manually add indexes to lists of all sizes (i.e., larger than 20,000 items).
- We will also enable this capability for predictive indexing soon thereafter so your views will automatically cause the right indexes to be added to your lists.
- Starting with the February release of the Office 365 Excel client, you will be able to export your full list instead of getting cut off part of the way through.

In our backlog:
- Being able to index/sort/filter by lookup column types (like person, lookup or managed metadata columns) without being throttled.
- Making sure that our REST APIs support querying in ways that will guarantee that the call will not be throttled.

Again, we regret that the item was set to “done” before we were done with the full range of changes we wanted to do. We are taking this issue seriously, and are continuing to piece together changes that ultimately result in an experience for our users where this is no longer an issue.

Apologies for the late update here. We’re going to close out this suggestion and direct folks to the recent updates we shared at Ignite 2017 around smarter indexing and UX improvements for querying and displaying content from large lists (for more details, check out this Ignite session: https://myignite.microsoft.com/videos/53861). For specific scenarios or feedback on this approach, please post new suggestions (noting that SQL will never scan more than 5k rows, so specific feedback on scenarios that the current approach still blocks would be welcome). Thanks!

The 5,000 item threshold is likely here to stay. Getting past this limitation takes work on both sides: list owners have to build the right fields, indexes, and indexed views so that end-user queries can run successfully in large lists. And then, end users have to run the correct queries – they can’t just open the root of the list; they have to use a filtered, indexed view.
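For list owners wondering what "build the right indexes" looks like in practice, here is a rough sketch of a REST request that marks a column as indexed. The URL shape and the `Indexed` property follow SharePoint's usual field-update pattern, but treat the exact headers and property names as assumptions to verify against current documentation; the site, list, and field names are invented.

```python
# Sketch of marking a column as indexed through the REST API so that
# filtered views on it can run against a large list. No network call is
# made here; the function just assembles the request pieces.

import json

def build_index_request(site_url, list_title, field_name):
    """Return (url, headers, body) for a MERGE that sets Indexed = true."""
    url = (f"{site_url}/_api/web/lists/getbytitle('{list_title}')"
           f"/fields/getbytitle('{field_name}')")
    headers = {
        "Content-Type": "application/json;odata=verbose",
        "X-HTTP-Method": "MERGE",   # update-in-place semantics
        "IF-MATCH": "*",
    }
    body = json.dumps({"__metadata": {"type": "SP.Field"}, "Indexed": True})
    return url, headers, body

url, headers, body = build_index_request(
    "https://contoso.sharepoint.com/sites/ops", "Invoices", "Category"
)
```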

We want to pursue work that helps on both of these ends – for example, proactively indexing more lists, suggesting views for list owners to create that will help end users, and helping end users pick filters at query time that will unblock their queries. Additionally, we want to increase our support for running more types of queries in large lists when an index is present – for example, instead of requiring an indexed filter for every query, maybe an indexed sort could suffice.
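As a toy illustration of the query-time help described above – checking whether a view's filter touches an indexed column and, if so, which clause could be applied first – here is a minimal sketch with invented column names:

```python
# Sketch of a query-time helper: given a view's filter clauses and the
# set of indexed columns, report the first clause that can use an index
# (the kind of hint that could unblock a query against a large list).
# All column names below are made up for illustration.

def first_indexed_filter(filters, indexed_columns):
    """Return the first (column, value) clause on an indexed column, or None."""
    for column, value in filters:
        if column in indexed_columns:
            return (column, value)
    return None

indexed = {"Modified", "Department"}
filters = [("Status", "Open"), ("Department", "Finance")]
print(first_indexed_filter(filters, indexed))  # ('Department', 'Finance')
```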

No guarantees or timelines yet, but that’s the way we’re thinking about the problem. Does that sound reasonable?

Here's an idea. While loading a list, just show the first 5000 items with a prompt that says, "Sorry, due to limitations we can only show 5000 of your 5001 items. Please refer to this help document to resolve view and filter options." This would be better than just locking you out completely.

The issue of displaying the list has been improved, but as soon as you exceed 5000 records you still cannot change views or create new ones. This is a fundamental requirement for end users – and a SQL procedure could do the same paging that you built for the list display. Of course you cannot download 1 million records of metadata in a single postback – it would be detrimental to the network – but as a web coder and a SQL DBA, I know it is not that difficult to provide procedures to bring the data down. At the very least, allow modifications of views and limit batches to 5000 items.
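The batching idea in the comment above can be sketched client-side as well: walk the indexed ID column and pull pages of at most 5,000 items so no single request exceeds the threshold. `fetch_page` here is a stand-in for a real REST call (e.g. a `$filter=ID gt {last}` plus `$top={size}` query) and is injected so the loop can be exercised with fake data:

```python
# Sketch of downloading a large list in batches of at most 5,000 items
# by walking the indexed ID column, so no single request trips the
# threshold. fetch_page is a stand-in for a real REST call.

def download_in_batches(fetch_page, batch_size=5000):
    """fetch_page(last_id, size) -> list of items with ascending 'ID'."""
    items, last_id = [], 0
    while True:
        page = fetch_page(last_id, batch_size)
        if not page:
            break
        items.extend(page)
        last_id = page[-1]["ID"]   # resume after the highest ID seen
    return items

# Fake data source standing in for a 12,001-item list.
data = [{"ID": i} for i in range(1, 12002)]

def fake_fetch(last_id, size):
    chunk = [it for it in data if it["ID"] > last_id]
    return chunk[:size]

print(len(download_in_batches(fake_fetch)))  # 12001
```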

So it takes 2 and a half years to tell us you're not going to do anything about this? And you pretend it's resolved like we're going to believe that? Disgraceful but oh so typical of this cr*p service.

Just so people aren't honeydicked by your misleading "Boom! It's Done": the issue IS NOT FIXED!!! Microsoft failed to deliver on this and is abandoning all hope. The SQL limit is a sad excuse and shouldn't be used as a technical limitation.

This limit is terrible and the fact that once you are over the limit you can't index or view the data to reduce the number is crazy. We are exploring other alternatives, so we can migrate away from SP, even though we have been customers since SP2007.

This is not acceptable to our organization, which has over 20,000 files in each work area, already divided. If you have problems with SQL, do something about it now. We pay a lot of money for the service. It's a joke in 2018 ....