Suppose I had a website where each user could post the best song they heard that day and make a comment (just a silly example). Assume we expect 1,000 users with unique user names, so I might expect roughly one record created per user per day. Would I just save an individual record with the three fields and let the database gradually grow that way? After 365 days we could expect about 365,000 records.

Suppose a user wants to see all of his records. It seems like the search has to look through all those records and all those user names, which seems really wasteful. What am I supposed to be thinking about? Or maybe databases can just handle this without any problem. I'm using PostgreSQL, and I'm starting to think the problem could soon get huge. At that size, should I start to think carefully about organizing everything? Or is a typical database search so quick that I needn't even worry about a 'slow' search?
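To make it concrete, here's roughly what I have in mind (the table and column names are just placeholders I made up):

    -- Sketch of the table I'm imagining: one row per user per day
    CREATE TABLE daily_song (
        id        bigserial PRIMARY KEY,
        username  text NOT NULL,
        song      text NOT NULL,
        comment   text,
        posted_on date NOT NULL DEFAULT CURRENT_DATE
    );

    -- The query I'm worried about: one user pulling up all of their records
    SELECT song, comment, posted_on
    FROM daily_song
    WHERE username = 'some_user'
    ORDER BY posted_on;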

Don't worry about it. 365,000 rows isn't that much. For a typical request, network latency will be greater than the time to run the query, and you can always add indexes on the columns you're querying. As with most performance questions, the answer is: implement your system first, load test it, and see whether there are any issues before thinking about performance optimization.
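For example, a single B-tree index on the username column (using the hypothetical names from the question) turns the per-user lookup into an index scan rather than a scan over all 365,000 rows:

    -- Index the column used in the WHERE clause
    CREATE INDEX idx_daily_song_username ON daily_song (username);

    -- Optionally, a composite index matching both the filter and the sort
    CREATE INDEX idx_daily_song_user_date ON daily_song (username, posted_on);

    -- EXPLAIN shows whether the planner actually uses the index
    EXPLAIN ANALYZE
    SELECT song, comment, posted_on
    FROM daily_song
    WHERE username = 'some_user'
    ORDER BY posted_on;

With the composite index, Postgres can both find a user's rows and return them already sorted by date, so neither the filter nor the ORDER BY has to touch the whole table.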

Thanks for the guidance. It looks like my concern is addressed by indexing (as mentioned): perhaps index just one column, and the index keeps that column's values in sorted order so searching on it is quicker. I read a little here: