
Today’s marketing and business development professionals use a wide array of big data collection and analytical tools to create and refine sophisticated profiles of market segments and their customer bases. These are deployed in order to systematically and scientifically target and sell their goods and services in steadily changing marketplaces.

These processes can include, among a multitude of other vast data sets and methodologies, demographics, web user metrics and econometrics. Businesses are always looking for a data-driven edge in highly competitive sectors, and such profiling, when done correctly, can be very helpful in detecting and interpreting market trends and in consistently keeping ahead of rivals. (The Subway Fold category of Big Data and Analytics now contains 50 posts about a variety of trends and applications in this field.)

To this I will briefly add my own long-term yet totally unscientific study of office-mess-ographics. Here I have been looking for any correlation between the relative states of organization – or entropy – in people’s offices and the quality and output of their work. The results still remain inconclusive after years of study.

One of the most brilliant and accomplished people I have ever known had an office that resembled a cave deep in the earth with piles of paper resembling stalagmites all over it. Even more remarkably, he could reach into any one of those piles and pull out exactly the documents he wanted. His work space was so chaotic that there was a long-standing joke that Jimmy Hoffa’s and Judge Crater’s long-lost remains would be found whenever he retired and his office was cleaned out.

Speaking of office-focused analytics, an article posted on VentureBeat.com on March 5, 2016, entitled CMOs: ‘Technographics’ is the New Demographics, by Sean Zinsmeister, brought news of a most interesting new trend. I highly recommend reading this in its entirety. I will summarize and add some context to it, and then pose a few question-ographics of my own.

New Analytical Tool for B2B Marketers

Marketers are now using a new methodology called technographics to analyze their customers’ “tech stack“, a term of art for the composition of their supporting systems and platforms. The objective of this approach is to deeply understand what this says about them as a company and, moreover, how it can be used in business-to-business (B2B) marketing campaigns. Thus applied, technographics can identify “pain points” in products and alleviate them for current and prospective customers.

Using established consumer marketing methods, there is much to be learned and leveraged about how technology is being used by very granular segments of user bases. For example:

By virtue of this type of technographic data, retailers can target their ads in anticipation of “which customers are most likely to shop in store, online, or via mobile”.

Next, by transposing this well-established consumer marketing approach onto B2B commerce, the objective is to carefully examine the tech stacks of current and future customers in order to gain a marketing advantage. That is, to “inform” a business’s strategy and identify potential new roles and needs to be met. These corporate tech stacks can include systems for:

Technographics can provide unique and valuable insights into assessing, for example, whether a customer values scalability or ease-of-use more, and then act upon those insights.

As well, some of these technographic signals can be indicative of other factors not, per se, directly related to technology. This was the case at Eloqua, a marketing automation company. They noticed that their marketing systems had predictive value in determining the company’s best prospects. Furthermore, they determined that companies running their software were inclined “to have a certain level of technological sophistication”, and were often large enough to have the capacity to purchase higher-end systems.
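Purely as a back-of-the-envelope illustration of why such signals can be predictive, here is a toy Python lead-scoring sketch. The tool categories and their weights are my own hypothetical assumptions invented for demonstration, not Eloqua’s actual model:

```python
# Toy technographic lead-scoring sketch. The tool categories and
# weights below are hypothetical, invented purely for illustration.

TOOL_WEIGHTS = {
    "marketing_automation": 3,  # suggests a technologically sophisticated org
    "enterprise_crm": 2,        # suggests budget for higher-end systems
    "cloud_analytics": 1,
}

def score_lead(tech_stack: set[str]) -> int:
    """Sum the weights of the tools a prospect is known to run."""
    return sum(TOOL_WEIGHTS.get(tool, 0) for tool in tech_stack)

print(score_lead({"marketing_automation", "enterprise_crm"}))  # → 5
```

In practice, of course, such weights would be fit against historical win/loss data rather than hand-assigned.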

As business systems continually grow in their numbers and complexity, interpreting technographic nuances has also become more of a challenge. Hence, the application of artificial intelligence (AI) can be helpful in detecting additional useful patterns and trends. In a July 2011 TED Talk directly on point here, entitled How Algorithms Shape Our World, Kevin Slavin discussed how algorithms and machine learning are needed today to help make sense of the massive and constantly growing amounts of data. (The Subway Fold category of Smart Systems contains 15 posts covering recent developments and applications involving AI and machine learning.)

Technographic Resources and Use Cases

Currently, technographic signals are readily available from various data providers including:

They parse data using such factors as “web hosting, analytics, e-commerce, advertising, or content management platforms”. Another firm called Ghostery has a Chrome browser extension illuminating the technologies upon which any company’s website is built.
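As a rough sketch of how such parsing might work, here is a minimal Python example that scans a page’s HTML source for fingerprints of common platforms. The signature strings and function name are illustrative guesses on my part, not the actual detection logic of any of these vendors:

```python
# Hypothetical technographic fingerprinting sketch. The signature
# strings below are illustrative assumptions, not any vendor's
# actual detection rules.

SIGNATURES = {
    "Google Analytics":    ["google-analytics.com/analytics.js", "gtag("],
    "Shopify":             ["cdn.shopify.com"],
    "WordPress":           ["/wp-content/", "/wp-includes/"],
    "Amazon Web Services": [".cloudfront.net", "amazonaws.com"],
}

def detect_tech_stack(html: str) -> list[str]:
    """Return the platforms whose fingerprints appear in the page source."""
    return [platform
            for platform, markers in SIGNATURES.items()
            if any(marker in html for marker in markers)]

sample = ('<script src="https://www.google-analytics.com/analytics.js"></script>'
          '<link href="https://cdn.shopify.com/s/files/theme.css">')
print(detect_tech_stack(sample))  # → ['Google Analytics', 'Shopify']
```

Real services layer many more signals on top of this, such as HTTP headers, DNS records and JavaScript variables, but the substring-matching idea is the same.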

The next key considerations are to “define technographic profiles and determine next-best actions” for specific potential customers. For instance, an analytics company called Looker creates “highly targeted campaigns” aimed at businesses that use Amazon Web Services (AWS). The greater the number of marketers who undertake similar pursuits, the more they raise the value of their marketing programs.

Technographics can likewise be applied for competitive leverage in the following use cases:

Sales reps prospecting for new leads can be supported with more focused messages for potential new customers. These are shaped by understanding their particular motivations and business challenges.

Locating opportunities in new markets can be achieved by assessing the tech stacks of prospective customers. Such analytics can further be used for expanding business development and product development. An example is the online training platform Mindflash. They detected a potential “demand for a Salesforce training program”. Once it became available, they employed technographic signals to pinpoint customers to whom they could present it.

Enterprise-wide decision-making benefits can be achieved by adding “value in areas like cultural alignment”. Familiarity with such data for current employees and job seekers can aid businesses in understanding the “technology disposition” of their workers, which can then be aligned with that of their “customers or partners”. Furthermore, identifying areas where additional training might be needed can help to alleviate productivity issues resulting from “technology disconnects between employees”.

Many businesses are not yet using technographic signals to their full advantage. By increasing such initiatives, businesses can acquire a much deeper understanding of their customers’ values. The resulting insights can have a significant effect on the experiences of those customers and, in turn, elevate loyalty, retention and revenue, as well as the magnitude of deals done.

My Questions

Would professional service industries such as law, medicine and accounting, and the vendors selling within these industries, benefit from integrating technographics into their own business development and marketing efforts?

Could there be, now or in the future, an emerging role for dedicated technographics specialists, trainers and consultants? Alternatively, should these new analytics just be treated as another new tool to be learned and implemented by marketers in their existing roles?

If a company identifies some of their own employees who might benefit from additional training, how can they be incentivized to participate in it? Could gamification techniques also be applied in creating these training programs?

What, if any, privacy concerns might surface in using technographics on potential customer leads and/or a company’s own internal staff?

Like most training and support documentation, car owner’s manuals are usually consulted only when something goes wrong with a vehicle. Some people look through them after they purchase a new ride, but usually their pages are about as interesting as watching paint dry on the wall. Does anyone really care to spend much quality time immersed in the scintillating details of how the transmission works unless they really must?

Not surprisingly, the results of a Google search on “car owner’s manuals” were, well, manifold, as there exist numerous free sites online that contain deep and wide digital repositories of this highly detailed and exhaust-ively diagrammed support material.

Now comes news that the prosaic car owner’s manual has been transformed into something entirely new with its transposition into an augmented reality (AR) application. This was the subject of a fascinating report on CNET.com on November 10, 2015 entitled Hyundai Unveils an Augmented-Reality Owner’s Manual by Andrew Krok. I will summarize and annotate it, and then pose a clutch of my own questions.

Adding an entirely new meaning to the term “mobile” app, it is officially called the Hyundai Virtual Guide and can be used on a smartphone or tablet. It will soon be available for download on both Google Play and Apple’s App Store. The app compresses “hundreds of pages of information” and, in conjunction with the owner’s mobile device’s camera, can recognize dozens of features and several basic maintenance operations. It includes 82 videos and 50 more informational guides; its equivalent, if traditionally formatted on paper, would run to hundreds of pages.

The augmented reality implementation in the app consists of six 3D images. When the user scans his or her mobile device over a component of the car such as the engine or dashboard, the on-screen image will be enhanced with additional “relevant information”. Another example is pointing the mobile’s camera and then tapping on “Engine Oil”, which is followed by instructions on how to use the dipstick to check the oil level.

At first, the app will be available only for the 2015 Hyundai Sonata model. Other models will later be made compatible with the app.

Hyundai chose which systems of the car to include in the app by surveying buyers on “the most difficult features to figure out”. Because everyone today is so accustomed to accessing information on a screen, the company determined that this was among the best ways to inform buyers about their new Sonatas.

The company has previously created other virtual means to access some of their other manuals. These have included an iPad configured with the owner’s manual of their Equus sedan and another app that displays the manual inside the passenger compartment on the “infotainment system’s touchscreen”.

My questions are as follows:

While this AR app is intended for consumers, is Hyundai considering extending the scope of this development to engineer a more detailed and sophisticated app for car dealers and service stations for maintenance and repairs?

Could an AR app make the process of yearly state car inspections faster and more economical?

Are the heads-up displays that project some of the dashboard’s information into the lower part of the windshield for drivers to see in some production models another form of automotive AR that could somehow be used together with the new AR manual app?

Would other consumer products such as, among others, electronics and recreational equipment benefit from AR manuals?

Just as we now see traditional car owner’s manuals gathered, cataloged and accessed online as described above, will future automotive AR apps similarly be imported into dedicated online libraries?

What entrepreneurial opportunities might be forming in the design, production and implementation of AR manuals and other automotive AR apps?

Could other important personal items such as prescription drug packaging benefit from an AR app because so few people ever read the literature accompanying their medicines? In other words, would an AR app increase the probability that important information on dosages and potential adverse reactions will be read because of the more engaging and interactive format?

What does a song actually look like in 3D? Everyone knows that music has always been evocative of all kinds of people, memories, emotions and sensations. In a Subway Fold post back on November 30, 2014, we first looked at Music Visualizations and Visualizations About Music. But can a representation of a tune now be taken further and transformed into a tangible object?

Yes, and it looks pretty darn cool. A fascinating article was posted on Wired.com on July 15, 2015, entitled What Songs Look Like as 3-D Printed Sculptures by Liz Stinson, about a new Kickstarter campaign to raise funding for the NYC startup called Reify working on this. I will sum up, annotate and try to sculpt a few questions of my own.

Reify’s technology uses sound waves in conjunction with 3D printing¹ to shape a physical “totem” or object of it. (The Wired article and the Reify website contain pictures of samples.) Then an augmented reality² app in a mobile device will provide an on-screen visual experience accompanying the song when the camera is pointed towards it. This page on their website contains a video of a demo of their system.

The firm is led by Allison Wood and Kei Gowda. Ms. Wood founded it in order to study “digital synesthesia”. (Synesthesia is a rare condition where people can use multiple senses in unusual combinations to, for example, “hear” colors, and was previously covered in the Subway Fold post about music visualization linked to above.) She began to explore how to “translate music’s ephemeral nature” into a genuine object and came up with the concept of using a totem.

Designing each totem is an individualized process. It starts with analyzing a song’s “structure, rhythm, amplitude, and more” by playing it through the Echo Nest API.³ In turn, the results generated correspond to measurements including “height, weight and mass”. The tempo and genre of a song also have a direct influence on the shaping of the totem. As well, the musical artists themselves have significant input into the final form.
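Reify has not published its actual algorithm, but purely to make the idea concrete, here is a hypothetical Python sketch in which a song’s duration, amplitude and tempo drive a totem’s measurements. Every formula below is an invented assumption of my own:

```python
# Hypothetical mapping from song features to totem dimensions.
# Reify's real algorithm is not public; these formulas are invented
# purely for illustration.

def totem_dimensions(tempo_bpm: float, amplitude: float, duration_s: float) -> dict:
    """Map tempo, average amplitude (0-1) and length to totem measurements."""
    height_mm = 40 + duration_s / 10      # longer songs -> taller totems
    width_mm = 20 + amplitude * 30        # louder songs -> wider totems
    ridges = int(tempo_bpm // 10)         # faster songs -> more surface ridges
    mass_g = height_mm * width_mm * 0.05  # crude proxy for printed volume
    return {"height_mm": round(height_mm, 1),
            "width_mm": round(width_mm, 1),
            "ridges": ridges,
            "mass_g": round(mass_g, 1)}

# A 4.5-minute song at 120 BPM with fairly loud average amplitude:
print(totem_dimensions(tempo_bpm=120, amplitude=0.8, duration_s=270))
```

The real pipeline presumably works from the full time-series of audio features rather than three scalar summaries, but the translation step, from measured sound to physical geometry, is the same in spirit.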

The mobile app comes into play when it is used to “read” the totem and interpret its form “like a stylus on a record player or a laser on a CD”. The result is that, while the music is playing, the augmented reality component of the app captures the totem and then generates an animated on-screen visualization incorporating it. The process is vividly shown in the demo video linked above.

Reify’s work can also be likened to a form of information design in the form of data visualization⁴. According to Ms. Wood, the process involves “translating data from one form into another”.

My questions are as follows:

Is Reify working with, or considering working with, Microsoft on its pending HoloLens augmented reality system and/or companies such as Oculus, Samsung and Google on their virtual reality platforms as covered in the posts linked to in Footnote 2 below?

How might Reify’s system be integrated into the marketing strategies of musicians? For example, perhaps printing up a number of totems for a band and then distributing them at concerts.

Would long-established musicians and performers possibly use Reify to create totems of some of their classics? For instance, what might a totem and augmented reality visualization for Springsteen’s anthem, Born to Run, look like?