A tiny research team at Tableau is building tomorrow’s UX for data

Tableau Software is many things: a fast-growing thorn in the side of legacy analytics vendors, stock-market gold and the poster child for the next generation of user-friendly data analysis. It’s also a company with a deeply rooted and growing research culture that’s responsible for nearly everything users see when they open its popular visualization application.

In fact, Tableau itself is the product of a Stanford Ph.D. dissertation by co-founder and Chief Development Officer Chris Stolte, in conjunction with his then-professor and eventual co-founder Pat Hanrahan. Their project, called Polaris, combined a structured query language with a declarative language for describing data visualization. When they commercialized the research by founding Tableau, that combination — which came together into a technology called VizQL — became the defining feature of the drag-and-drop Tableau experience.
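The core idea behind that combination, a declarative description of a chart that the system compiles into database queries, can be sketched roughly as follows. (This is an illustration only: VizQL is proprietary, and the spec format and `build_query` helper here are invented for the example, not Tableau’s actual language.)

```python
# Hypothetical sketch of a VizQL-style idea: a chart described declaratively,
# then compiled into a SQL aggregate query. Field names are invented.

def build_query(spec):
    """Translate a tiny declarative chart spec into a SQL aggregate query."""
    dims = ", ".join(spec["rows"] + spec["columns"])
    measure = spec["measure"]
    return (
        f"SELECT {dims}, SUM({measure}) AS {measure} "
        f"FROM {spec['table']} GROUP BY {dims}"
    )

# A bar chart of sales by region (rows) and year (columns),
# described as data bindings rather than drawing instructions:
chart_spec = {
    "table": "sales",
    "rows": ["region"],
    "columns": ["year"],
    "measure": "amount",
}

print(build_query(chart_spec))
# SELECT region, year, SUM(amount) AS amount FROM sales GROUP BY region, year
```

The point of the sketch is the division of labor: the user describes *what* view they want in terms of their data, and the system derives both the query and the rendering, which is what made the drag-and-drop experience possible.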


However, the true value of what Stolte and Hanrahan created wasn’t just that it let mainstream users query data visually and generate graphs, said Jock Mackinlay, Tableau’s vice president of visual analysis, who joined in 2004 after 18 years specializing in data visualization at Xerox PARC. There had been a lot of research, including his own, into ideal ways to visualize data, but it often focused on customized views of a single problem or type of analysis.

“The real power [of Tableau] was to go through a bunch of different views to answer one question,” Mackinlay said. “All you have to be an expert at is your data and the questions you want to ask of it.”

Jock Mackinlay. Source: Tableau Software

Now, a new research division within Tableau (technically created about a year and a half ago) is trying to imagine and create the next set of technologies that will change the way data analysis is done. The five-person team, which Mackinlay heads, consists of four visualization experts (including Mackinlay), a couple of whom also specialize in statistics and one of whom specializes in high-performance computing. The fifth member specializes in natural-language processing and computer graphics.

Like any research division, the team writes academic papers and works on some projects that might not be applicable for years, but Mackinlay made it pretty clear that the researchers expect everything they’re doing could be commercialized. If there was one thing that separated the famous Bell Labs from Xerox PARC or even Microsoft Research, he said, it was that Bell was exceptionally good at doing first-rate research that made its way into products. Good research labs need to find the middle ground between nearsighted product upgrades and pie-in-the-sky ideas, he explained: “You have to have absolutely no gap between the research scientists … and the people who are actually doing the work.”

A still image of an interactive Story Points slideshow. Source: Tableau Public user Matt Francis

It’s at a much, much smaller scale than Bell Labs, but Mackinlay thinks Tableau is following that same path. For example, he said, the Story Points feature in the latest release of the company’s software, which lets users create data slideshows of sorts, was the result of close collaboration between the product team and researcher Robert Kosara, who had been working in the area for years. As data volumes, dataset complexity and user sophistication all increase, Mackinlay said, systems-level research into data processing (including how to optimize for increased client-side computing power) has helped, and will continue to help, deliver a smooth user experience.

He’s understandably less forthcoming about what, specifically, we can expect to see from Tableau in the near term, but Mackinlay did discuss a few areas of interest. One is making it easier to use aesthetically pleasing icons rather than text labels in charts, an area where he and colleague Vidya Setlur (the aforementioned NLP and graphics specialist) recently published a paper. He’s also interested in text analysis and NLP, and more generally in adding new types of visualizations, some of which those forms of analysis will help enable. For example, “node-link diagrams” (aka graphs) will happen, he said, although he can’t put an exact date on when.

A figure from Mackinlay’s and Setlur’s paper, which was presented at the 2014 SIGCHI Conference on Human Factors in Computing Systems.

Mackinlay also suggested that Tableau might expand beyond its current product lineup, which is essentially the same software delivered via the desktop (free and paid), server or cloud. “We can make our existing products easy to use,” Mackinlay said. “We can also make new products that are easy to use — perhaps radically easier than our existing products.”

Although “easy” is used to describe Tableau and other user-friendly software quite often, the word is something of a misnomer. “Easy” connotes shallowness, Mackinlay said, making an analogy to the evolution of the telephone. Phones have evolved a great deal, from those where users just rang the operator, to rotary phones, to modern smartphones. With every iteration, manufacturers had to strike the right balance between maintaining a recognizable experience and adding more capabilities.

“We use the two words ‘simple’ and ‘useful,’” he said. “… If you don’t make sure you’re useful, people just aren’t going to stick with you.”