It's hard to discuss the role of Artificial Intelligence (AI) in the workplace until you decide what AI is. Some academics tell us -- using lots of words -- that AI is computers that think, learn, and ultimately act like humans, while others hold that maximizing the interaction between computers and their humans -- as in Human-Computer Interaction, or HCI -- is the closest thing to AI we are likely to see. Until you decide on which side of that dichotomy you fall, it's difficult to understand how, or whether, AI contributes to business, and if it does, how to improve its contributions. Our fascination with the idea of machines that think like humans goes back millennia, but only recently has it appeared to be within reach. And while AI research has uncovered some amazing technological capabilities, it has also run into a quagmire in its attempts to 1) agree on just what human intelligence is, and 2) determine the extent to which technology might be capable of replicating it.

For most of my life, I have earned my living as a computer vision professional, busy with image processing tasks and problems. In the computer vision community there is a widespread belief that artificial vision systems faithfully replicate human vision abilities, or at least very closely mimic them. It was a great surprise to me when one day I realized that computer and human vision have next to nothing in common. The former is occupied with extensive data processing, carrying out massive pixel-based calculations, while the latter is busy with meaningful information processing, concerned with smart object-based manipulations. The gap between the two is insurmountable. To resolve this confusion, I had to go back and re-evaluate the vision phenomenon itself, defining more carefully what visual information is and how to treat it properly. In this work I have not been, as is usually the case, biologically inspired. On the contrary, I have drawn my inspiration from a purely mathematical theory, Kolmogorov's complexity theory. The results of that work have already been published elsewhere. The objective of this paper, then, is to apply the insights gained in the course of that enterprise to the more general case of information processing in the human brain and the challenging issue of human intelligence.
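The pixel-based character of machine vision described above can be made concrete with a minimal sketch: a 3x3 box blur, one of the most basic image-processing operations. Every quantity in it is a raw intensity value, and the computation is exhaustive arithmetic over pixels, with no notion of "objects" anywhere in the loop. The function and variable names here are illustrative, not drawn from the original text or from any published system.

```python
# Minimal illustration of pixel-based processing: a 3x3 box blur over a
# grayscale image stored as a list of lists of intensity values. The
# algorithm touches every pixel and its neighbors; nothing in it knows
# about edges, shapes, or objects.

def box_blur(image):
    """Return a 3x3 box-blurred copy of a grayscale image (list of lists)."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            # Average the pixel with its in-bounds 8-neighborhood.
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

# A tiny "image": one bright pixel (9) in a dark field.
img = [[0, 0, 0],
       [0, 9, 0],
       [0, 0, 0]]
blurred = box_blur(img)
```

Even on this toy input, the operation is pure number-crunching over a grid; scaling it to megapixel images multiplies the arithmetic but adds no meaning, which is the gap the passage above is pointing at.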

"When you are born, you know nothing." This is the kind of statement you expect to hear from a philosophy professor, not a Silicon Valley executive with a new company to pitch and money to make. A tall, rangy man who is almost implausibly cheerful, Hawkins created the Palm and Treo handhelds and cofounded Palm Computing and Handspring. His is the consummate high-tech success story: the brilliant, driven engineer who beat the critics to make it big. Now he's about to unveil his entrepreneurial third act, a company called Numenta. But what Hawkins, 49, really wants to talk about -- in fact, what he has really wanted to talk about for the past 30 years -- isn't gadgets or source code or market niches.

According to scientists and legal experts responding to the bank's warning this November, there is now an urgent need to put the development of intelligent algorithms on the political agenda. Top of the agenda, as far as Lightfoot is concerned, is the economic impact if AI eliminates large numbers of jobs and the incomes of the people who hold them: how will they make a living, and what will they do? It is a concern that Professor Toby Walsh, an expert in AI at Australia's University of New South Wales and a prominent campaigner against the use of AI in military weapons, says is justified and needs to be urgently considered. Professor Walsh and fellow AI expert Murray Shanahan, Professor of Cognitive Robotics at London's Imperial College, were nonetheless wary of calls for regulation of the sector, which, they said, would inhibit research. According to Professor Walsh, scientists working in AI have already started to exercise a degree of self-restraint over the exploitation of their discoveries; the areas that need to be focused on are the ramifications of the technology.