Recent Answers

You set caching based on the nature of your content. Are you saying your page completely changes after 30 minutes? That's highly unlikely. I normally set the output cache to 24 hours or more. Secondly, you can use substitution macros. For example, if your page is pretty much static except for the news, you can render the news via a substitution macro so only that fragment is generated per request.

P.S. It takes 4 seconds? You have to look at the SQL queries: Settings > System > Debug.
Make sure that "Disable debugging" is unchecked at the top, then check "Enable SQL query debug" and "Display SQL query debug on live site". Now your page will display all SQL queries at the bottom. See what queries run on your page and why they take that long.

Those might help you get to the root of the slowness. If that doesn't help, you could also increase your cache time, or set up something that periodically requests the site to keep the cache warm. I think you can set something like this up with Azure Application Insights availability tests.
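If you'd rather roll the keep-warm job yourself than rely on Application Insights, the loop is trivial to sketch. The fetch callable is injected here so the logic stays testable; in real use you would pass something like `lambda u: urllib.request.urlopen(u).read()`, and the URL list and interval are placeholders you'd fill in for your site.

```python
import time

def warm_cache(urls, fetch, rounds=1, interval=0.0):
    """Request each URL so the server re-renders and re-caches it.

    fetch:    callable taking a URL and raising on failure
    rounds:   how many warm-up passes to make
    interval: seconds to sleep between passes (set just under cache expiry)
    Returns a list of (url, ok) results from the final pass.
    """
    results = []
    for i in range(rounds):
        results = []
        for url in urls:
            try:
                fetch(url)
                results.append((url, True))
            except Exception:
                # A failed ping shouldn't kill the whole warm-up run
                results.append((url, False))
        if i + 1 < rounds:
            time.sleep(interval)
    return results
```

Schedule it just under your cache lifetime so the expensive render always happens on the pinger's request, never on a real visitor's.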

Sure, the Web Crawler is just another type of Smart Search index you can run. Unlike the normal Pages index, it actually loads each page so it can index the fully rendered content (the normal Pages index only indexes the content stored on the page, not rendered output such as repeaters and the like).
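The distinction can be illustrated generically (this is not Kentico's implementation): a pages-style index sees only the fields stored on the document, while a crawler-style index fetches the rendered HTML, strips the markup, and indexes whatever actually appears on the page, including content a repeater pulled in from elsewhere.

```python
import re

def index_stored_fields(doc):
    """Pages-style indexing: only the content stored on the document itself."""
    return set(doc["body"].lower().split())

def index_crawled(doc, render):
    """Crawler-style indexing: fetch the rendered HTML, then index its text."""
    html = render(doc)
    text = re.sub(r"<[^>]+>", " ", html)  # strip tags, keep visible text
    return set(text.lower().split())
```

With a renderer that appends repeater output, a search for a word that only appears in the repeater will hit the crawled index but miss the stored-fields one.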

Possibly, but a domain alias shouldn't change the content, and I don't think it would affect the cache (the same page's cache should hit for all of its domain aliases, I believe), so crawling one of the domains should warm the cache for all the aliases. If not, you can create multiple Smart Search indexes, one for each domain.

Yeah, if you look at adding a custom loading module, its OnInit runs once per application start, so you can add the call to run the Smart Search rebuilds there. Just be careful with running multiple searches on the same pages at once; they tend to freeze up, so it's best to run them one after another.
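As a generic sketch of that shape (again not Kentico's actual module API; the class and method names are made up for illustration): an init hook fires once at startup and kicks the rebuilds off on a background thread, and the worker runs them strictly one after another so two rebuilds never touch the same pages at the same time.

```python
import threading

def rebuild_all(indexes, rebuild, log):
    # Run strictly sequentially; parallel rebuilds over the same pages
    # tend to contend with each other and hang.
    for name in indexes:
        rebuild(name)      # block until this rebuild completes
        log.append(name)   # record completion order

class SearchWarmupModule:
    """Illustrative stand-in for a loading module whose init runs once per app start."""

    def __init__(self, indexes, rebuild):
        self.indexes = indexes
        self.rebuild = rebuild
        self.log = []
        self._thread = None

    def on_init(self):
        # Run on a background thread so application startup isn't blocked
        self._thread = threading.Thread(
            target=rebuild_all, args=(self.indexes, self.rebuild, self.log))
        self._thread.start()

    def wait(self):
        self._thread.join()
```

The background thread keeps startup fast while still guaranteeing the one-at-a-time ordering.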