2011 was a busy year for Web Designers, Web Developers, Search Engine Optimization Specialists (SEOs), and Search Engine Marketers (SEMs). So much so that it could make your head spin at times trying to keep up with the breakneck pace. There was simply too much to cover in a single article, so I’d like to touch on a few of the highlights that I think made the biggest differences.

Web Design and Web Development 2011 Highlights

In 2011, HTML5 really found its stride. While the official spec still hasn’t been finalized, enough of it is complete that many developers have started implementing it in current projects. What really made HTML5 usable was the invention of the “HTML5 Shiv” by the brilliant developer Remy Sharp, and later Modernizr.js. These allow developers to help older browsers (yeah, I’m talking about you, Internet Explorer) understand some of the new HTML5 elements. Along came other polyfills, which use JavaScript to plug holes in older browsers where they lack support for these newer features. Another really helpful Open Source project, spearheaded by Paul Irish with the help of some other incredibly smart and talented people, was the HTML5 Boilerplate. It became the starting point for many web designs because of its built-in development “best practices”.
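To illustrate the polyfill pattern described above, here is a minimal sketch of my own (not code from html5shiv or Modernizr): it supplies `String.prototype.trim` only when the browser lacks it, which is exactly how polyfills plug holes in older browsers like IE8 without disturbing modern ones.

```javascript
// Minimal polyfill sketch: provide String.prototype.trim for old browsers
// (e.g. IE8) that lack it. Modern browsers keep their native version.
function fallbackTrim(s) {
  // Strip leading and trailing whitespace with a regular expression.
  return String(s).replace(/^\s+|\s+$/g, '');
}

if (!String.prototype.trim) {
  String.prototype.trim = function () {
    return fallbackTrim(this);
  };
}
```

The guard (`if (!String.prototype.trim)`) is the important part: the fallback only loads where the feature is genuinely missing.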

CSS3 began finding more support thanks to the rapid development of browsers like Firefox, Google Chrome, and Opera, and the release of Internet Explorer 9, Microsoft’s first new browser in several years and the first that finally brought the company in line with web standards. Thanks to Modernizr, we now have feature detection, which lets developers check whether the browser supports various CSS3 features and use them when present, choosing graceful degradation or falling back to other technologies when they are not available.
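As a sketch of how Modernizr-driven feature detection is typically used in a stylesheet (the `.button` selector and image file are my own placeholders; the class names are the ones Modernizr conventionally adds to the `<html>` element):

```css
/* Modernizr adds a class such as "borderradius" or "no-borderradius"
   to the <html> element depending on what the browser supports. */
.borderradius .button {
  border-radius: 6px;  /* capable browsers get real CSS3 rounded corners */
}
.no-borderradius .button {
  /* hypothetical fallback image for browsers without border-radius */
  background: url(button-rounded.png) no-repeat;
}
```

This is graceful degradation in practice: one rule for capable browsers, one fallback rule for the rest, with no browser sniffing.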

The use of the CSS @font-face rule saw a big gain this year as more Open Source fonts became available. Previously, because so many fonts were proprietary, you simply couldn’t use them as real text on the page and had to resort to either making images of text (usually for headings) or using tricks like sIFR or Cufón to make the text look like a specific font. Now, with an ever-growing library of Open Source fonts, you are beginning to see more variety in the fonts used across the web, mostly on newer websites.
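A basic @font-face sketch; the font name and file paths below are placeholders, not any specific open-source font:

```css
@font-face {
  font-family: 'ExampleSans';  /* a name you choose; hypothetical here */
  src: url('fonts/examplesans.woff') format('woff'),      /* placeholder paths */
       url('fonts/examplesans.ttf') format('truetype');
}

h1, h2 {
  /* Use the web font, falling back to common system fonts if it fails to load. */
  font-family: 'ExampleSans', Arial, sans-serif;
}
```

The fallback stack after the web font matters: if the font file doesn’t load, the headings still render in real, selectable text.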

We continued to witness the demise of Adobe Flash on websites this year. A few months back, Adobe announced that it would discontinue Flash for mobile devices. This has expedited the end of days for Flash on websites, except as a fallback for video when the HTML5 video formats are not supported by the browser in use.
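The fallback arrangement mentioned above typically looks something like this (the file names and player are placeholders): browsers that understand `<video>` pick a supported source, and anything older falls through to the Flash object inside it.

```html
<video controls width="640" height="360">
  <source src="clip.mp4"  type="video/mp4">   <!-- H.264 source -->
  <source src="clip.webm" type="video/webm">  <!-- WebM source -->
  <!-- Older browsers ignore <video> entirely and render this Flash
       fallback instead (player.swf is a hypothetical player file). -->
  <object type="application/x-shockwave-flash" data="player.swf"
          width="640" height="360">
    <param name="movie" value="player.swf">
    <param name="flashvars" value="file=clip.mp4">
  </object>
</video>
```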

2011 was also the year Responsive Design took off. Responsive Design was first conceived by Ethan Marcotte, who released a book about it, aptly titled “Responsive Web Design”, published by A Book Apart. The book came almost a year after his first article on the subject at A List Apart. Since then, several responsive grids have been built to help developers achieve Responsive Design more easily. Responsive Design is based on the idea that a web design should “respond” to the size of the screen viewing it. A user on a desktop with a large screen gets the full, maximum-width view of the website, while the same site viewed on a smaller screen, such as a laptop or tablet, shows a shrunk-down view, possibly with narrower columns, or with one column slipping down underneath another. A user on a smartphone receives a further modified layout custom designed for that screen size. This is all handled via CSS Media Queries.

The idea came about as a smart alternative to building a completely separate mobile version of the same website. Truthfully, there are pros and cons to both ways of building for mobile, and which to use depends on the website itself, its target audience, and so on. But Responsive Design has gained a lot of traction, especially as developers continue to figure out fixes for issues that can pop up with this method (such as downloading full-size images and shrinking them down, instead of serving smaller images to save bandwidth). It also saves much of the time and money spent making custom apps for iOS and Android devices and for various screen sizes and resolutions.
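A minimal media-query sketch of the behavior described above (class names and breakpoint values are illustrative, not prescriptive):

```css
/* Wide desktop screens: a full-width, multi-column layout. */
.page   { max-width: 960px; margin: 0 auto; }
.column { float: left; width: 33%; }

/* Tablets and small laptops: columns narrow, or pair up. */
@media screen and (max-width: 768px) {
  .column { width: 50%; }
}

/* Smartphones: a single-column layout custom-fit to the small screen. */
@media screen and (max-width: 480px) {
  .column { float: none; width: 100%; }
}
```

One set of HTML, three presentations; the browser applies whichever rules match its current screen width.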

Speaking of the Mobile Web, numbers continued to grow at an amazing rate for smartphones and tablets in 2011, and the trend is expected to grow even more in the next few years. Smartphone use was up 54% from the previous year as of the three-month period ending July 2011. Because of this, web designers and developers can no longer think only in terms of developing for desktops and laptops. However, not every site needs a mobile version, and building one is a lot more work for designers and developers, which ultimately means that if a client wants a special mobile version or a responsive version of their site, the cost can go up dramatically compared to a desktop / laptop-only website.

Search Engine Optimization (SEO) 2011 Highlights

Search Engine Optimization, for those who are not aware of the term, covers the things you can do on your own website to achieve better search engine rankings. This year saw a few major changes, including Schema.org, page speed, duplicate site content, pagination, and “content freshness”.

Beginning with page speed: Google announced that it would begin using a website’s loading speed as part of its ranking signal. This encourages web developers to ensure that a site loads as quickly as possible by following various best practices. Tools such as Google’s Page Speed plugin and Yahoo’s YSlow plugin help designers find where problems exist so that they can be fixed.
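As a sketch of the kind of fixes those tools suggest, here is a hypothetical Apache `.htaccess` fragment applying two common recommendations, gzip compression and far-future Expires headers (the content types and cache lengths are examples, not a definitive configuration):

```apache
# Compress text responses before sending them (a common Page Speed/YSlow tip).
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache static assets instead of re-downloading them every visit.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css  "access plus 1 month"
</IfModule>
```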

Microdata has also gained a lot of traction this year. Microdata consists of HTML attributes that help define information for search engines. They can be used to mark up things such as people, movies, recipes, reviews, ratings, and much more, in such a way that robots understand the context in which they are being used.
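A short Schema.org microdata sketch marking up a person (the name, title, and company are made up for illustration):

```html
<div itemscope itemtype="http://schema.org/Person">
  <!-- itemprop attributes tell robots what each piece of text means. -->
  <span itemprop="name">Jane Doe</span> is a
  <span itemprop="jobTitle">web developer</span> at
  <span itemprop="affiliation">Example Co.</span>
</div>
```

To a human reader this is an ordinary sentence; to a search engine, it is a person with a name, a job title, and an employer.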

Canonical links. Many sites make it possible to reach the same content by way of different URLs. Perhaps you write articles that live in categories and also carry tags, so the same content can be found by following either the category links or the tag links, each producing a different URL. It has become more important to tell search engines the “canonical link”, which essentially means: which one of those pages is the one true content page. This keeps search engines from splitting the ranking between multiple URLs and gives the full rank to the main URL instead. The same long-standing issue occurs when both the www and non-www versions of a domain point to the same content; they would be treated as two different sites, even though they are, in fact, identical.
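Declaring the canonical URL is a one-line tag in each page’s `<head>` (the URLs below are placeholders):

```html
<!-- Both /category/widgets/my-article and /tag/tools/my-article can declare
     the same preferred URL, so ranking consolidates on one page. -->
<link rel="canonical" href="http://www.example.com/articles/my-article">
```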

Content freshness. Many types of businesses require fresh content, and Google is now making more use of it. Freshness really only affects subjects where timing is of greater importance. For example, sports scores, election results, and current news require that breaking news gets priority over older news, which makes perfect sense. This should encourage businesses to produce more fresh, timely content. It won’t affect sites where the timeliness of the content doesn’t matter; some things just don’t change over time. For example, if you sell hammers, Google doesn’t need breaking news about hammers, because the technology doesn’t really change that frequently, so a great article about hammers from five years ago might still rank in the #1 position.

Search Engine Marketing (SEM) 2011 Highlights

Search Engine Marketing, for those who are not aware of the term, covers the things that happen off of your website that help you achieve better search engine rankings.

Search Engine Marketing companies have had their hands full this year with the roll-out of Google’s “Panda” updates, which turned into a months-long process that affected the rankings of millions of websites and dropped some of the highest-ranked sites, full of poor content, way down the rankings. The Panda updates fixed some of the ways sites were able to game Google by putting out massive numbers of articles, some of it content basically copied from other sources, or poorly researched pieces from low-waged writers who were paid by the article, which encouraged them to write as many articles as possible without any fact checking. Panda also helped address the issue where a higher-ranked site that copied an article, possibly by way of an RSS feed, would get more credit for it than the original source, although this is a problem that still seems to exist to some degree.

Other big Search Engine Marketing news was the launch of Google+, Google’s answer to Facebook and Twitter in the social space. Google understood very well that people have been searching both less and differently since the “Social Web” was born. Instead of querying a search engine, they might simply ask their friends a question and get answers that way. Google+ lets Google tap into the social market and make use of the social signal as part of its website ranking algorithm.

The author tag begins to take effect. Google is now beginning to rank authority based on authors in addition to websites. A site may have many authors, while an author may write content for many sites. A well-known and respected author can build their own authority by using the rel="author" attribute in HTML. To make this work, you point the link to your Google Profile page and, in return, list the sites you write for on your Google Profile. As the author gains more reputation, their articles will receive higher rankings.
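In practice, the two-way link looks something like this (the profile URL and author name are placeholders):

```html
<!-- On the article page: link the byline to your Google Profile. -->
<a rel="author" href="https://plus.google.com/112233445566778899000">
  by Jane Doe
</a>
<!-- Then, on the Google Profile itself, list this site under
     "Contributor to" so the relationship is confirmed both ways. -->
```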

What’s coming in 2012?

It’s really difficult to say, but one thing’s for certain. Microsoft has announced automatic updates for Internet Explorer 6 and Internet Explorer 7, which will mean that the lowest browser developers will have to design and develop for is Internet Explorer 8, which is still not great, but certainly better than the other two. I’m hopeful that Microsoft will change their decision to cap Windows XP at Internet Explorer 8 and go ahead and let them update to Internet Explorer 9 since Windows XP still has a couple more years of support before it’s Microsoft stops supporting it. I understand Microsoft wanting to push people to purchase their newer operating systems, but at the same time, leaving outdated browsers floating around is bad for the web as a whole. Getting Internet Explorer 6 off of the web will make a massive difference to a lot of websites that need to have extra code to help them achieve some of the effects that newer browsers are able to accomplish which slows those sites down.

Other than that, I expect a lot of web developers will be playing catch-up and honing their skills with some of the changes mentioned above.

And as always, there’s certainly something new coming out that none of us have conceived of yet.
