I am excited to announce the winners of the third and final round of WhichTestWon’s first-annual Adobe Target Awards. If you have not been following the contest, we asked Adobe Target users to submit their best A/B tests for a chance to win a free trip to Adobe Summit 2016 in Las Vegas. This time around, the winners include Steve Rude of Thomson Reuters for Web and Wendy Melemed of The McClatchy Group for Mobile. Wendy and Steve will join our other four winners at Adobe Summit in less than two weeks for session S910, Step Right Up and Guess the Test Winner: WhichTestWon’s Adobe Target Awards.

Without further ado, here are our final winning tests and the people who submitted them.

For the Web Category, Steve Rude

For the Web category, Steve Rude, Senior Optimization Specialist at Thomson Reuters, submitted a test that demonstrated the power of simplification. Steve works in the Legal Division of Thomson Reuters, which provides solutions, support, and print products for those in the legal profession throughout the U.S. and around the world. He wanted to apply the learnings he had gained from testing a site redesign that experienced huge success because it simplified the visitor’s experience, reducing their “cognitive overload.”

In the test submitted for the contest, Steve simplified the purchase process. The winning variation not only increased one-time purchases, but also overall projected annual revenue — the real measure of success for this test.

Steve was part of the beta testing team for Adobe Analytics for Adobe Target, which he views as “outstanding.” He notes, “There’s nothing we can’t do that we want to do as far as post-test analysis goes. Without that integration, if you don’t think of all the questions you want to answer up front, you can’t get to that data. With that integration, we can get to everything — even things we didn’t know we wanted. It’s so powerful!”

He is also a big fan of the visual experience composer of Adobe Target. He appreciates that non-technical people can easily set up tests from start to finish — even when adding scripts and layering experiences on top of each other. This opens up a much bigger pool of candidates when the time comes to further expand his optimization team. Plus, he says it is much easier and faster to set up the main success metric to measure interaction with a specific element on a page. He just navigates to the element on the page, selects it, and he is done — a process that used to take up to 30 minutes and now takes about 5 seconds.

Steve loves how willingly people in the testing and optimization community share their experiences and help each other, noting that WhichTestWon exemplifies this attitude. He is grateful for the opportunity to share back at the Adobe Summit session.

For the Mobile Category, Wendy Melemed

For our Mobile category, Wendy Melemed, Analytics Testing Specialist at The McClatchy Group, submitted a test that focused on one of the most heavily used elements of any digital channel: the navigation bar. The McClatchy Company is a leading newspaper and Internet publisher working with well-known newspapers such as the Sacramento Bee, the Miami Herald, and others known not only in their local markets, but also across the U.S. The McClatchy Group oversees and optimizes these newspapers’ websites.

Wendy explains that, due to the shift of their audience to mobile, the company has taken a “mobile first” approach to optimization in which mobile means smartphone and tablet. She explains that, given the tight real estate of mobile, “It’s easier to expand from mobile to desktop than the other way around.” In practical terms, this means that they optimize for smartphones and tablets first and then translate the experience to desktop. However, they measure the impact of everything they test on all three — smartphone, tablet, and desktop — because they have discovered time and again that users respond differently on different devices. This test supported that truism yet again.

In the test she submitted, Wendy tested the impact on page views per visit and time spent per visit when hiding or showing the navigation bar when scrolling down or up the page. It turns out that tablet users responded very favorably to one test variation over the others, increasing both key metrics.

For Wendy, when it comes to mobile, one of the most valuable capabilities of Adobe Target is its advanced targeting to specific segments. She notes, “I can set up rules-based targeting, and all I have to do is click a button, and all of a sudden, I’m targeting mobile.”

Wendy recalls that when she first started using Adobe Target, working with Adobe Target consultants proved extremely valuable. She says, “They don’t just give you test ideas; they ask about your business goals and what you’re trying to achieve. They also posit things that you wouldn’t necessarily know, such as testing every element on the page to see what has the most impact.”

To learn about the tests that won the previous rounds, read my blog posts about the first- and second-round winners. Better yet, join us at Adobe Summit on March 22 for session S910. In that session, Andrea Warner of WhichTestWon.com and I will highlight each winning test, let the audience guess which test-experience variation won, and hear from the winners as they explain the backstory and key takeaways of their tests. Plus, we will announce the Grand Prize Winner and let the audience vote for the winner of the People’s Choice Award.