UK Judge: Search is Theft

paidContent UK’s “NLA Ruling Summary: How PRs Break Copyright Law Online” offers the highlights of a 148-paragraph ruling by the British High Court “that PRs who subscribe to paid news monitors are breaking UK law by effectively copying a substantial part of online news articles.”

The product in question is Meltwater News, an online global media monitoring service that allows subscribers to track “keywords, phrases, and topics in over 130,000 sources from over 190 countries and 100 languages, monitored consistently throughout the day.”

The judge argues that in reprinting publications’ headlines, or extracts longer than 256 characters, the service is “stealing” the publishers’ content, even though Meltwater quite naturally provides links so users who are interested in a given piece of content can click through to the original. And since these summaries and headlines are cached on my computer, as an end-user I am complicit in the theft of content I didn’t pay for, says the judge.

If this ruling sticks, and if it ripples out, it will cripple or kill existing and emerging services that help people find content.

A List Apart No. 304

Like CSS, JavaScript works best when stored in an external file that can be downloaded and cached separately from our site’s individual HTML pages. To increase performance, we limit the number of external requests and make our JavaScript as small as possible. The inventor of Extreme JavaScript Compression with YUI Compressor reveals coding patterns that interfere with compression, and techniques to modify or avoid these coding patterns so as to improve the YUI Compressor’s performance. Think small and live large.
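A minimal sketch of the kind of pattern the article describes (the function names here are my own, not the article’s): YUI Compressor can shorten local variable names, but it must preserve global identifiers in full, and it gives up on renaming inside any scope that uses eval() or with.

```javascript
// Compresses poorly: every reference to the global `Math` must be
// emitted in full in the minified output.
function distancePoor(x, y) {
  return Math.sqrt(Math.pow(x, 2) + Math.pow(y, 2));
}

// Compresses better: alias the global once, and the compressor can
// rename `m` to a single letter, shrinking every later reference too.
function distanceBetter(x, y) {
  var m = Math;
  return m.sqrt(m.pow(x, 2) + m.pow(y, 2));
}
```

Both functions behave identically; only their minified size differs, which is the whole point of writing with the compressor in mind.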

Faceted navigation may be the most significant search innovation of the past decade. It features an integrated, incremental search and browse experience that lets users begin with a classic keyword search and then scan a list of results. It also serves up a custom map that provides insights into the content and its organization and offers a variety of useful next steps. In keeping with the principles of progressive disclosure and incremental construction, it lets users formulate the equivalent of a sophisticated Boolean query by taking a series of small, simple steps. Learn how it works, why it has become ubiquitous in e-commerce, and why it’s not for every site.
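The incremental mechanics described above can be sketched in a few lines (the product data and function names are hypothetical, for illustration only): each facet selection is one small step, but together the selections behave like an AND-ed Boolean query, and the remaining counts give users that “custom map” of the result set.

```javascript
var products = [
  { name: "Trail Shoe",  brand: "Acme", color: "red",  price: 80  },
  { name: "Road Shoe",   brand: "Acme", color: "blue", price: 120 },
  { name: "Hiking Boot", brand: "Peak", color: "red",  price: 150 }
];

// Keep only the items that match every selected facet value.
function applyFacets(items, facets) {
  return items.filter(function (item) {
    return Object.keys(facets).every(function (key) {
      return item[key] === facets[key];
    });
  });
}

// Count the remaining values for one facet, so the UI can
// display choices like "red (2) / blue (1)".
function facetCounts(items, key) {
  return items.reduce(function (counts, item) {
    counts[item[key]] = (counts[item[key]] || 0) + 1;
    return counts;
  }, {});
}
```

Selecting brand “Acme” and then color “red” narrows the list to one item, the same result a user would get from the Boolean query `brand:Acme AND color:red`, without ever having to write one.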

Of Google and Page Speed

Google’s addition of a page speed signal to its search rankings algorithm officially links performance with search engine marketing. The loading speed of a web page affects user psychology in a number of ways, and now it can affect the page’s rankings as well.

This back-to-basics message catches us at a funny time in web design history.

“Make more of less” has long been the norm

Most of us who’ve been designing sites for quite a while, and who consider ourselves user- and standards-focused, have traditionally built sites that loaded faster than the competition. We did it by using caching technologies (CSS instead of table layouts, linked instead of inline JavaScript, and so on). For many, many years, we also did it by keeping images to a minimum, using system fonts instead of pictures of type, CSS colors instead of faux backgrounds, and so on.

As the web audience grew, heavily trafficked sites became even more restrictive in their decorative flourishes, whether they cared about web standards or not. Thus Google, while happily using bad CSS and markup, exerted monk-like discipline over its designers. Not only were images out, even such details as rounded corners were out, because the tiny images needed to facilitate rounded corners prior to CSS3 added a tenth of a kilobyte to page weight, and a tenth of a kilobyte multiplied by a billion users was too much.

Of late, we have grown fat

Yet in the past few years, as broadband became the norm, every mainstream site and its brother started acting as if bandwidth didn’t matter. Why use 1K of web form when you could use 100K of inline pseudo-Ajax? Why load a new page when you could load a lightbox instead?

Instead of medium-quality JPEGs with their unimportant details painstakingly blurred to shave KB, we started sticking high-quality PNG images on our sites.

As these bandwidth-luxuriant (and not always beautiful, needed, or useful) practices became commonplace on mainstream sites, many advanced, standards-focused web designers were experimenting with web fonts, CSS3 multiple backgrounds, full-page background images, and other devices to create semantic, structurally lean sites that were as rich (and heavy) as Flash sites.

So now we face a dilemma. As we continue to seduce viewers via large, multiple background images, image replacement, web fonts or sIFR, and so on, we may find our beautiful sites losing page rank.

Search Party

Although site search often receives the most traffic, it’s also the place where the user experience designer has the least influence. Few tools exist to appraise the quality of the search experience, much less to strategize ways to improve it. But relevancy testing and precision testing offer hope: two tools you can use to analyze and improve the search user experience.
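A rough sketch of the precision-testing idea (the scoring scheme here is illustrative, not necessarily the article’s exact method): have a tester judge the top results for a sample query as relevant or not, then score the query as the fraction of relevant hits.

```javascript
// Fraction of the top k results judged relevant for one query.
function precisionAtK(judgedResults, k) {
  var top = judgedResults.slice(0, k);
  var relevant = top.filter(function (r) { return r.relevant; }).length;
  return relevant / top.length;
}

// Hypothetical judgments for the top five results of one test query.
var judged = [
  { url: "/help/refunds",    relevant: true  },
  { url: "/blog/refund-faq", relevant: true  },
  { url: "/about",           relevant: false },
  { url: "/help/shipping",   relevant: false },
  { url: "/legal/returns",   relevant: true  }
];
```

Scoring a representative sample of real user queries this way turns “our search feels bad” into a number you can benchmark and re-measure after each change.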

Your clickstream data is missing a key ingredient: customer intent. You have all the clicks, the pages people viewed, and where they bailed, but not why they came to the site. Your internal site-search data supplies that missing ingredient. Learn five ways to analyze your internal site-search data—data that’s easy to get, to understand, and to act on.

Top-down analytics are great for creating measurable goals you can use to benchmark and evaluate the performance of your content and designs. But bottom-up analysis teaches you something new and unexpected about your customers—something goal-driven analysis can’t show you. Discover the kinds of information users want, and identify your site’s most urgent mistakes.