Sadly dealt with this a lot last year. You know what happens when your stack is a forked version of WP where the former engineering team hacked the core for their own custom plugins?

You end up with a lot of disgruntled ex-employees who have installed multiple reverse shells and who attack the sites constantly between 8 AM and 5 PM, M-F, for 3 months, taking breaks at lunch and in the afternoon.

Our solution, after three days of attacks and restoring everything from backup, was to lengthen the TTL of the CDN cache. Once we detected the hacks starting, we took the sites down and left the clean, static cached version of the site up for users. Then we identified and patched the exploits and waited for the next attack.
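For the curious, the caching piece boils down to response headers. A minimal sketch, assuming a hypothetical Flask app (the header values are illustrative, not what we actually used):

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def home():
    resp = app.make_response("<html>...site content...</html>")
    # Short max-age for browsers, long s-maxage for shared caches like a CDN:
    # if the origin goes down (or we take it down during an attack), the CDN
    # can keep serving this clean static copy for up to a week.
    resp.headers["Cache-Control"] = "public, max-age=300, s-maxage=604800"
    return resp
```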

Over the weekend, I found a URL that is not just mobile-friendly, but actually hostile to desktop users. It simply refuses to show any useful content to desktop browsers and instead redirects desktop users to a page that tells them about the content they're missing out on :/

That's very wrong. Please don't do it.

We live in a world where content can be consumed on any device. Thinking mobile-first is good because mobile is really important now (has been for a while, will be for a while). But ignoring a class of devices is wrong for exactly the same reasons ignoring mobile users is wrong.

A lot of work, both in offices and on trains and planes, is still done on larger screens. In fact, we live in an era of both tiny and extremely large screens (think phones vs. browsing the web on TVs).

Our main challenge nowadays at +Summit Web Solutions is not mobile screens but how to effectively serve super-sized screens. Responsive and adaptive techniques are needed because we have to serve users effectively regardless of how they access a website; i.e., meet people where they are, not just where we think they ought to be.

The Webmaster Tools team is looking for testers of a new 'search query' feature.

If you regularly use search query information from Webmaster Tools, we're working on something new for you! We're looking for testers of an early update to the search query feature. If you'd like to try it out, and give us feedback along the way, feel free to sign up here.

If you launch a new site, a brand new refresh, something awesome, something to be proud of and tell your family and friends about, something that may win you a design award, something into which you've put a lot of effort, please also don't forget to remove the noindex directives when you go live.
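A quick smoke test before launch can catch this. Here's a rough sketch using the requests library (the URL and helper name are mine, purely illustrative) that checks both places a noindex directive can hide:

```python
import re
import requests

def has_noindex(url: str) -> bool:
    """Return True if the page carries a noindex directive."""
    resp = requests.get(url, timeout=10)
    # Check the HTTP header form...
    header = resp.headers.get("X-Robots-Tag", "")
    if "noindex" in header.lower():
        return True
    # ...and the robots meta tag form (a loose regex is fine for a smoke test).
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*>', resp.text, re.IGNORECASE
    )
    return bool(meta and "noindex" in meta.group(0).lower())

print(has_noindex("https://example.com/"))  # should print False on launch day
```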

As part of our ongoing commitment to help build an interoperable, secure web that “just works,” we're excited to announce support for HTTP Strict Transport Security (HSTS) in Internet Explorer. This change can be previewed using Internet Explorer in the Windows 10 Technical Preview, ...
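For anyone wondering what HSTS looks like from the server side: the whole feature is a single response header. A minimal sketch, assuming a hypothetical Flask app (the max-age value is just an example):

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def add_hsts(resp):
    # Ask HSTS-aware browsers (now including IE, per the announcement above)
    # to use HTTPS for this host for the next year, subdomains included.
    # Note: browsers only honor this header when it arrives over HTTPS.
    resp.headers["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
    return resp
```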

Today we're taking our first small steps to support crawling, indexing, and ranking of locale-adaptive pages.

What are these? A locale-adaptive page is one that changes its response based on the perceived geographic location or language preference of the visitor. For example, a page may dynamically serve different content to users in the USA and Canada on the same URL, may block access from some countries, or may respond differently based on the visitor's Accept-Language header.
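To make that concrete, here's a rough sketch of a locale-adaptive handler, assuming a hypothetical Flask app (the content strings are placeholders):

```python
from flask import Flask, request

app = Flask(__name__)

CONTENT = {"en": "Hello!", "fr": "Bonjour !"}  # placeholder content per language

@app.route("/")
def home():
    # One URL, different responses: pick the best match from the
    # visitor's Accept-Language header, falling back to English.
    lang = request.accept_languages.best_match(list(CONTENT)) or "en"
    resp = app.make_response(CONTENT[lang])
    # Tell caches (and crawlers) that the response varies by this header.
    resp.headers["Vary"] = "Accept-Language"
    resp.headers["Content-Language"] = lang
    return resp
```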

Today Googlebot gets two new capabilities:

1. Geo-distributed crawling, where you may start seeing real (verifiable!) Googlebot user-agents crawling from IP addresses that appear to be coming from outside the USA (see the verification sketch after this list).

2. Language-dependent crawling, where Googlebot may set different Accept-Language headers in the HTTP request.
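On the "verifiable" part of point 1: the standard check is a reverse DNS lookup followed by a forward-confirming lookup. A small sketch in Python (the googlebot.com / google.com suffixes are the documented ones):

```python
import socket

def is_real_googlebot(ip: str) -> bool:
    """Verify a claimed Googlebot IP: reverse DNS, then forward-confirm."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # A spoofer can fake reverse DNS, so confirm the hostname
        # resolves back to the same IP.
        return socket.gethostbyname(host) == ip
    except socket.gaierror:
        return False
```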

A few important points:

1. We still (very strongly) recommend having separate URLs for different locales and using rel-alternate-hreflang annotations for them (a sketch of these annotations follows this list). Separate URLs are better for users, and that's what really counts. Locale-aware crawling is for the few edge cases where it's not possible for you to have separate URLs.

2. It's early days and the countries that Googlebot will appear to come from and the Accept-Language headers it may try do not cover all combinations of countries and languages around the world. Also, we will continue to tweak things as we build out this feature. This is another reason to have separate URLs.

3. Locale-aware crawling gets enabled algorithmically if we detect your site may benefit from it. You don't need to do anything :)
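Since point 1 mentions rel-alternate-hreflang: the annotations are just link elements in the head. A tiny sketch of generating them (the locales and URLs are placeholders):

```python
# Placeholder locale-to-URL mapping for a hypothetical site.
LOCALES = {
    "en-us": "https://example.com/en-us/",
    "en-ca": "https://example.com/en-ca/",
    "fr-ca": "https://example.com/fr-ca/",
}

def hreflang_links(locales):
    """Build the <link rel="alternate" hreflang=...> tags for the <head>."""
    # Every locale page should carry the full set of alternates,
    # including a link to itself.
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in locales.items()
    )

print(hreflang_links(LOCALES))
```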

I work at Google as a Webmaster Trends Analyst. This is my personal profile.

What do I do at Google? I help webmasters build better websites, so you'll see me talking to webmasters on our and other forums, writing blog posts, saying things like "I'll ask internally" and the like.