Tagged: information design

I am joining a discussion on Information Visualization and Interaction Design, and the integral role of Cognitive Art in delivering innovative HCI (human-computer interfaces).

Here are sample projects that I have been involved in. This set showcases multi-modal user interfaces, metaphorical abstractions, and cognitive models, as well as ergonomic form factors that optimize for extreme ease of use.

d.SCI refers to a methodology that I am working on which purposely intersects design and science. In this particular discussion, human cognition and affect are the topics of interest.

“Service Design is big. Being holistic, it includes the researching, envisioning and orchestrating of service experiences that happen over time and across multiple touch points with many stakeholders involved, both frontstage and backstage.”

“At Service Design Week, we seek to strip away any fluff, examining service design methods and processes at their core, and unpack the practical tools and skill-sets, hard and soft, needed for this way of working. Service Design Week will gather service design leaders from various functions and disciplines across all flavors of Service Design. With content for all levels of Service Design maturity, we look forward to drawing both fledging and experienced service designers.”

I am looking forward to joining Service Design Week and I would like to thank Michel DeJager and the team at the International Quality & Productivity Center for their kind invitation. My talk will discuss C3LM, Customer Co-Creation Lifecycle Methodology, in the context of Blended Service Design, which I will take care of defining and demystifying in my talk.

I am proud to share that C3LM is the recipient of a Nokia Innovation Award. My work seeks to interweave a set of known and brand-new interdisciplinary practices to best address end-to-end solutions for complex and dynamic environments, also known as soft systems given their organic and morphing nature. Most importantly, it achieves that by optimizing for the delivery of quality experiences while humanizing low and high tech in the process.

Widespread digitalization of our everyday activities is not just far reaching; it is also leading to a renaissance in Human Factors disciplines. The delivery of “effective quality services” with “highly efficient end-to-end solutions” is the rationale behind creating C3LM. This brave new world entails Blended Services that intersect Data Science, Automation and Programmability, all orchestrated with Human Centered Design in mind.

My talk will also cover how we can best experience Artificial Intelligence and how to make it transparent within Blended Services. That will be a sneak preview of another talk that I am giving early next year. In case you have already heard what Elon Musk has to say about AI, let me share that Human Factors Engineering has been revisited and redefined to come to the rescue. More on that when we meet at Service Design Week : )

“Innovation is a risky business and the failure rate is high. Traditional approaches to consumer research may exacerbate the problem. There are many shortcomings with traditional research approaches, and one of the main ones is that data collection focuses on what people say they do, rather than on what is actually driving behavior.” – Behavioral Science – Do people do what they say they will do? by Innovia.

I would like to thank Tim Goldrein for accepting my invitation to discuss the impact of Human Factors in tech innovation with our Solutions team in the Applications & Analytics Group.

Tim works for Innovia Technology and will be visiting Nokia’s Chicago Technology Center, Naperville Campus, on Monday, May 8. He is a physicist from the University of Cambridge, UK, with a research background in ballistics who has spent the past 15 years on human-factors-led innovation.

Tim will share insights from recent projects as well as highlights of work done for Nokia back in 2003. About 15 years have gone by, and he will conduct a retrospective to reveal who ended up implementing those concepts in today’s market.

Post May 8 Session Notes – Tim’s talk covered the need for gaining a deeper understanding of people as both individuals and collectives to best inform the design of new products, services and business models. Tim emphasized the value of a holistic approach to problem solving and a focus on behavioral drives. He stated that conventional research solely looking at attitudes and beliefs can miss critical insights.

Nokia’s community can access Tim’s presentation and recording on my work blog.

I am now taking the chance to share my thoughts on this topic. Whether we call it “stated vs. observed behavior” or “reported vs. actual paradoxes,” the point is that those of us working in Human Factors Engineering and/or leveraging Design Thinking cannot rely solely on product or service requirements as described by customers and end users themselves.

Therefore, on-location ethnographic research, coupled with instrumenting objects, tools and environments to gather telemetry as they are used over their useful lives, is also of the essence, provided users grant permission, since this entails privacy concerns.

“According to Alan Mulally, former Ford Motor Company CEO, Henry Ford said that if, when he founded his company, he had asked potential customers what they wanted, they would have said faster horses.” – Quote Investigator.

Speaking of ethnographic research, on my very first day as a student of Human Factors Engineering at BarcelonaTech, we covered the so-called Hawthorne Effect.

Hawthorne Works was a Western Electric factory in the Chicago area, which is part of Bell Labs’ outstanding legacy.

A side personal note: I now live in Chicagoland and have worked with Bell Labs, now part of Nokia.

Nearly a century ago, back in the 1920s and ’30s, Hawthorne Works undertook a study to assess which lighting levels correlated with higher productivity.

However, the research findings revealed that (a) workers’ awareness of being observed, in the context of (b) attention being paid to their needs in the workplace, elevated their motivation and productivity, and this trumped other factors such as lighting levels, whether set low or high.

I would also like to share another interesting observation, this one involving Bell Labs’ own John Karlin:

“The Times, who refer to Karlin as widely considered the father of human-factors engineering in American industry, relates an amusing story of an earlier project, one that demonstrates his keen understanding of human behavior: an early experiment involved the telephone cord.”

“In the postwar years, the copper used inside the cords remained scarce. Telephone company executives wondered whether the standard cord, then about three feet long, might be shortened.”

“Mr. Karlin’s staff stole into colleagues’ offices every three days and covertly shortened their phone cords, an inch at a time. No one noticed, they found, until the cords had lost an entire foot. From then on, phones came with shorter cords.”

Once again, I’d like to thank Tim for his talk and for the equally interesting discussions that preceded and followed the session. We both agree on the positive impact of holistic and interdisciplinary practices, which lead to a disciplined and robust approach to defining value-based outcomes.

This is about innovative solutions humanizing technology in everyone’s best interest. So, it definitely pays to leverage Behavioral Sciences and Behavioral Economics when addressing serial innovation programs.

“To envision information – and what bright and splendid visions can result – is to work at the intersection of image, word, number, art […] the principles of information design are universal – like mathematics.” “We envision information in order to reason about, communicate, document, and preserve knowledge – activities nearly always carried out on two-dimensional paper and computer screen. Escaping this flatland and enriching the density of data displays are the essential tasks of information design.” – “Envisioning Information” by Edward R. Tufte.

“In today’s data driven world, professionals need to know how to express themselves in the language of graphics effectively and eloquently […] the ability to create effective charts and graphs has become almost as indispensable as good writing.” “Yet information graphics is rarely taught in schools or is the focus of on-the-job training.” “With computer technology, anyone can create graphics, but few of us know how to do it well.” “Ultimately, it is content that makes graphics interesting. When a chart is presented properly, information just flows to the viewer in the clearest and most efficient way.” – “The Wall Street Journal Guide to Information Graphics” by Dona M. Wong.

This week I attended Edward Tufte’s “Presenting Data and Information” in Chicago. E.T. teaches at Yale and is a leading expert in information design whose books are worth studying. It is worth noting that information theory was first laid out by Claude E. Shannon, an MIT grad working at Bell Labs, Alcatel-Lucent’s research arm, in the 1940s.

I am now taking time to process my own thoughts and get new discussions on this topic going. From a marketing angle, advancing quality content is key to positioning thought leadership with the advent of emerging technologies. Strategy-wise, the working assumption is that topic authorities and first movers can leapfrog competitors and capture significant industry mindshare. This eventually converts into actual market share in capital-intensive industries such as network infrastructure and platforms in the telecommunications sector.

As more vendors join a nascent technology space, more voices prompt a pressing need to collaborate, share terms and constructs, and make an honest individual difference to rise above the pack. Technology evangelization programs and marketing efforts fostering the diffusion of innovations are only as good as the information and the intellectual leadership behind them. Success stories involving customers and partners (early adopters) and game-changing breakthroughs solidify that narrative with added credibility and reputation.

Instead of just a slide with bullet points, text boxes, or a checkmark table for that matter, I created the above map as an abstraction to help visualize, plot, brainstorm and discuss what I think are key attributes of quality content. Note a sweet spot right in the middle. We can envision elegant content as [a] captivating and engaging, as well as thought provoking and worth [b] consuming, immersing, sharing and referring to. Elegant content results in high response levels and outperforms.

This last statement couples two sets of indicators, [a] leading and [b] lagging, given the need to define success metrics based on cause-effect correlations so that we know what works and what doesn’t. Leading indicators are understood as predictors of success (or failure), while lagging indicators provide the evidence. In this particular case, both the [a] and [b] sets speak to the end user’s experience.

As an example, “immersing” entails depth of engagement and interactivity, which includes a feedback loop: a virtuous circle where the end user not only gains new knowledge of interest, but can also annotate, enhance and build new content upon what’s provided.
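Before trusting a leading indicator as a predictor of a lagging one, it pays to check that the two actually correlate. Here is a minimal sketch of that check; the metric names and the weekly figures are hypothetical, purely for illustration:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    varx = sum((x - mx) ** 2 for x in xs)
    vary = sum((y - my) ** 2 for y in ys)
    return cov / (varx * vary) ** 0.5

# Hypothetical weekly data for one content campaign:
# leading indicator = shares per week, lagging indicator = qualified inquiries
shares = [12, 30, 45, 20, 60, 75]
inquiries = [1, 3, 4, 2, 6, 7]

r = pearson(shares, inquiries)
print(f"correlation: {r:.2f}")
```

A coefficient near 1 would support using the leading metric as an early predictor; correlation is of course not causation, so a result like this only justifies a deeper look, not a conclusion.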

Reading the above map’s horizontal axis:

extreme left – oversimplification defeats the purpose of synthesizing content, as not enough meaningful information is delivered or the message is just too cryptic

sweet spot – simple messages that are crisp and condense memorable insights, as well as sophisticated ones that appeal to the recipient’s curiosity, personal discovery and intellectual excitement

extreme right – excess complexity fails to educate, dilutes messages and creates confusion due to diminishing and even negative returns from information overload

Regarding the map’s vertical axis:

extreme top – time wasters negate any benefits as there is no interest, which triggers a desire to abandon the session and harms reputation in the process

sweet spot – information that is relevant to the audience and consumable in a user-friendly and progressive manner; users can browse and get what they need when they need it, on demand

extreme bottom – time-consuming exercises translate into overhead, that is, more work than needed to infer information, causing detrimental fatigue and rising opportunity costs

At this point, if you are wondering whether my map is just stating the obvious, you would be right. The objective is to deliver a construct we can all easily agree with and a workable framework for measurements. Beyond that, this exercise has more to do with plotting where any given content marketing project would fall, then discussing observations on best practices, as well as evolving and changing the above visual as needed. Once populated, it becomes an infographic.

Just to provide some examples: content depicting highly technical subjects can easily fall in the lower right if complexity is not adequately addressed; vaporware and smoke-and-mirrors material can populate the upper left quadrant; seemingly simple yet cryptic content finds a home in the lower left.
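To make the map workable for measurement, each piece of content could be scored on the two axes and bucketed into a region. The sketch below is my own illustration of that idea; the score scale and the sweet-spot thresholds are assumptions, not part of the original map:

```python
def plot_region(complexity, time_cost, low=-1.0, high=1.0):
    """Map a content item's scores to a region of the quality map.

    complexity: horizontal axis, -3 (oversimplified) .. +3 (overly complex)
    time_cost:  vertical axis,  -3 (time consuming) .. +3 (time wasting)
    Scores strictly between `low` and `high` on both axes land in the sweet spot.
    """
    if low < complexity < high and low < time_cost < high:
        return "sweet spot"
    horiz = "left" if complexity <= low else "right" if complexity >= high else "center"
    vert = "upper" if time_cost >= high else "lower" if time_cost <= low else "middle"
    return f"{vert} {horiz}"

# Hypothetical placements, mirroring the examples in the text:
print(plot_region(2.5, -2.0))   # dense technical content -> lower right
print(plot_region(-2.0, 2.5))   # vaporware, smoke and mirrors -> upper left
print(plot_region(0.2, -0.4))   # relevant and crisp -> sweet spot
```

Once several projects are scored this way, the populated map doubles as the infographic mentioned above.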

Meeting Edward Tufte at “Presenting Data and Information” in Chicago, April 2014.

I would also like to take this chance to comment on the fact that infographics have grown in popularity. This growth is also exposing deficits: quite a few perform poorly, and others drive misleading insights, confuse the issue, appear gratuitous and become a disservice. The same applies to a fair amount of slideware. Nonetheless, we shouldn’t overgeneralize, because presenting data benefits from good graphic work.

Additionally, it makes sense to shift from “presentation” software to “discussion tools” on many occasions. Presentation formats assume a scripted narrative conveyed sequentially slide after slide. Existing technologies allow more options and possibilities. Earlier in the year I created a discussion tool allowing for interactive narratives based on a modular storyboard. Each live customer conversation was actually customized on the fly, as the discussion progressed in real time.

Instead of working with conventional slideware and text-box or bullet-point style charts, the tool allows instant access to any item by means of an Internet browser, all featuring eye-friendly informational graphics as well as relevant photography (no stock photos). If interested, I will share more on this in future articles.
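The contrast with sequential slideware can be sketched as a data-structure choice: a modular storyboard is a keyed collection of standalone items that a live conversation can jump between on demand. The module names and descriptions below are hypothetical, not from the actual tool:

```python
# A modular storyboard: each module stands alone, so a live discussion can
# jump to any item on demand instead of following a fixed slide order.
storyboard = {
    "overview":   "One-screen framing of the problem space.",
    "use-cases":  "Customer scenarios with supporting graphics.",
    "deep-dive":  "Technical detail for engineering audiences.",
    "next-steps": "Proposed follow-ups and owners.",
}

def navigate(topic):
    """Return the module requested during the conversation, if it exists."""
    return storyboard.get(topic, "No module on that topic; capture it as a follow-up.")

print(navigate("use-cases"))
```

The design choice is simply random access over a scripted sequence, which is what lets each customer conversation be customized on the fly.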