Can a reader typically tell that a story was generated by a robot? It seems likely that, over time, regular subscribers to a sports site will notice that their newsletters (or whatever the output is) follow exactly the same structure and lack the flourishes a human writer might add.

And, if they do notice, do they perceive the quality as better, worse or on par with a human-written article?

I have read a few Narrative Science articles, and I spotted an error or two. But this is just the beginning: these bugs will get worked out, and automated writing will likely become a trend. There is so much information out there today to be written about that algorithms will be needed to put some of these stories together.

But other than mining data, I'm not sure how these software programs will write more complex stories for the time being.

I wrote about Narrative Science shortly after it received a round of funding in September 2013 that brought its total funding at that point to $20 million. The concept began as a Northwestern University research project called StatsMonkey. Two professors, Kris Hammond and Larry Birnbaum, advised computer science and journalism students on developing software that could generate an account of a baseball game solely from batting statistics. After college, two of those students, John Templon and Nick Allen, obtained funding to launch the business, which was incorporated as Narrative Science in January 2010. StatsMonkey was subsequently replaced by the more sophisticated Quill™, a "patented artificial intelligence authoring platform."

In a guest blog on HBR, entitled "The Value of Big Data Isn't the Data," Hammond argued that algorithms write better narratives from big data than people do, because algorithms make the process of turning that data into narratives people can relate to seamless. "By embracing the power of the machine, we can automatically generate stories from the data that bridge the gap between numbers and knowing."

And the narratives are not supposed to sound robotic. Jonathan Morris, COO of Data Explorers, a financial analysis firm that set up a securities newswire using Narrative Science technology, was quoted in a Wired article as saying, "You can get anything, from something that sounds like a breathless financial reporter screaming from a trading floor to a dry sell-side researcher pedantically walking you through it."

I agree with you that any story or piece of text generated by a robot should be as human-like as possible.

No human would write the same story or text in exactly the same way twice. We naturally use synonyms, varying both the vocabulary we choose and the structure and form of our sentences and paragraphs.

However, only one software company holds a US patent on this feature.

Yseop (full disclosure: I work for them) is natural language generation software, based on artificial intelligence, that writes truly like a human being. We are able to do this thanks to a unique, patented aspect of our technology that incorporates synonyms into the text, both in word choice and in sentence structure. Each customer can specify the vocabulary most appropriate to their industry and the type of output they want the software to produce, and Yseop ensures the human-like nature of the text.

In our ideal world, readers wouldn't ask this question because they wouldn't notice the difference!

Only Yseop can be installed on a customer's own servers (so they can maintain data confidentiality); Yseop can also be run in the cloud.

Only Yseop allows users to build & maintain applications on their own.

Only Yseop provides non-regression, impact, and coherence testing tools to ensure the accuracy and consistency of the generated text.

Only Yseop holds a patent on its unique ability to write using synonyms.

However, to answer your question more specifically: it is really this last point that ensures Yseop writes "like a human being." As I mentioned in my previous comment, the software incorporates synonyms in both word choice and sentence structure, so no two sentences of the same type of output are written in exactly the same way. Yseop knows the difference between a subject, a verb, and a complement, which allows it to construct sentences dynamically from the rules of grammar. We do not use templates.
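To make the general idea concrete (this is only a toy sketch of my own, not Yseop's patented technology, and all names here are hypothetical), here is what generating varied sentences from grammatical roles plus synonym sets might look like, as opposed to filling in a single fixed template:

```python
import random

# Hypothetical synonym sets for one domain fact: "revenue increased by 5%".
SYNONYMS = {
    "revenue": ["revenue", "sales", "turnover"],
    "increase": ["rose", "climbed", "grew"],
}

# Several hypothetical sentence structures built from the same roles
# (subject, verb, complement), so the same fact can surface in
# different grammatical forms.
STRUCTURES = [
    "{subject} {verb} by {amount}.",
    "a {amount} rise was recorded in {subject}.",
]

def realize(fact, rng=random):
    """Render a structured fact as one of several surface sentences."""
    subject = rng.choice(SYNONYMS[fact["subject"]])
    verb = rng.choice(SYNONYMS[fact["verb"]])
    structure = rng.choice(STRUCTURES)
    sentence = structure.format(subject=subject, verb=verb, amount=fact["amount"])
    return sentence[0].upper() + sentence[1:]  # capitalize the first word

fact = {"subject": "revenue", "verb": "increase", "amount": "5%"}
print(realize(fact))  # e.g. "Sales climbed by 5%."
```

Run repeatedly on the same fact, this produces different but equivalent sentences, which is the basic effect described above; a real system would of course also handle agreement, tense, and coherence across sentences.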
