When someone visits their own website, the browser is instructed to cache the page, so it does. They then add a new post, visit their website ... and get the cached version of it. The same will obviously hold true for anyone hitting a top-level URL (the root or a stub file). Short of undoing the smart stuff we did in .htaccess, I don't know how to make this problem go away.

I just commented out the line that sets 24 hours for anything not covered by one of the specific rules, so I guess it'll take some time to see whether that actually benefits the client or not.

In .htaccess, I mean. First it sets a default of 24 hours, then a time for some file types, then a longer time for other file types. I went with commenting out the default because neither HTML nor PHP actually appears in either of the other two statements.
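For anyone following along, here is a sketch of what that kind of .htaccess block typically looks like with mod_expires. The specific types and durations below are illustrative assumptions, not the client's actual file, but the structure (a default, then per-type overrides) matches what I described:

```apache
<IfModule mod_expires.c>
  ExpiresActive On

  # The default rule I commented out, so HTML/PHP pages fall through
  # uncached instead of being held for a day:
  # ExpiresDefault "access plus 24 hours"

  # Type-specific rules (times here are examples only):
  ExpiresByType image/png   "access plus 1 month"
  ExpiresByType image/jpeg  "access plus 1 month"
  ExpiresByType text/css    "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Since `text/html` isn't listed in any `ExpiresByType` line, removing the `ExpiresDefault` means pages get no Expires header at all, and browsers fall back to their own heuristics.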

Ed, have the client check their browser settings (sorry, I don't know the details). The way most browsers are set up, usually by default (so your client or a plugin may have changed something), the browser is supposed to check its cached copy against the server in some fashion; my memory is fuzzy on exactly how it does it, but if the page has changed, it fetches a fresh copy, and if not, it serves the cached one. Sometimes when making changes to the CSS we do have to manually refresh the page with Shift-F5 (or whatever it is) because the browser won't detect the changes in the CSS; it will use the cached copy regardless.
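If I remember right, that check is a conditional request (the browser sends `If-Modified-Since` or `If-None-Match` and the server answers 304 if nothing changed). If the goal is to make sure pages are always revalidated rather than served stale, something like this mod_headers sketch might help; the file pattern is an assumption about how the site's pages are named:

```apache
<IfModule mod_headers.c>
  # Ask browsers to revalidate pages with the server on every visit
  # instead of trusting a cached copy (sketch; adjust pattern to suit):
  <FilesMatch "\.(html|php)$">
    Header set Cache-Control "no-cache"
  </FilesMatch>
</IfModule>
```

Note that `no-cache` doesn't forbid caching; it just forces the browser to check with the server before using the cached copy, which is exactly the behavior we want for pages that change when a post is published.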

Anyway, I have never had a problem where I made a new post and it wasn't reflected when going to the website; usually we click Publish or Save or whatnot and it takes us to the page with the notifications at the top. Try to get some details from your client on how they are operating.

I guess we need to read up on how a website communicates site changes to browsers as well.