The idea of the ‘runaway brain’

A friend of mine recently introduced me to the idea of the ‘runaway brain’ – a theory first published in 1993 outlining the uniqueness of human evolution. Here, we take a look at how artificial intelligence is developing into something comparable to the human brain, and the potential caveats that concern us as human beings.

The theory considers how humans have created a complex culture by continually challenging their brains, driving the development of ever more complex intellect throughout human evolution – a process that continues today and will no doubt continue for years to come. This, theorists claim, is what drives human intelligence towards its peak.

There are many ways in which we can define why ‘human intelligence’ is considered unique. In essence, it’s characterised by perception, consciousness, self-awareness, and desire.

It was while speaking to that friend that I began to wonder: with human intelligence now developing alongside artificial intelligence (AI), is it possible for the ‘runaway brain’ to reach a new milestone? After further research, I found some who say it already has.

They label it ‘runaway superintelligence’.

Storage capacity of the human brain

Most neuroscientists estimate the human brain’s storage capacity at between 10 and 100 terabytes, with some estimates closer to 2.5 petabytes. In fact, new research suggests the human brain could hold as much information as the entire internet.

As surprising as that sounds, it’s not necessarily impossible. It has long been said that the human brain is like a sponge, absorbing as much information as we throw at it. Of course we forget a large amount of that information, but consider those with photographic memory, those who combine innate skill with learned tactics and mnemonic strategies, or those with an extraordinary knowledge base.

Why can machines still perform better?

Ponder this – if human brains have the capacity to store significant amounts of data, why do machines continue to outperform humans in decision-making?

The human brain has a huge range – data analysis and pattern recognition, alongside the ability to learn and retain information. A human needs only to glance at a car to recognise one they’ve seen before, whereas AI may need to process hundreds or even thousands of samples before it can come to a conclusion. Call it human pre-emptive assumption, if you will – skipping the finer details rather than analysing for an exact match. Conversely, while AI functions may be more complex and varied, the human brain cannot process the same volume of data as a computer.

It’s this efficiency of data processing that leads researchers to believe AI will indeed dominate our lives in the coming decades, eventually leading to what we call the ‘technological singularity’.

Technology singularity

The technological singularity is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilisation.

According to this hypothesis, an upgradable intelligent agent, such as a software-based artificial general intelligence, could enter a ‘runaway reaction’ cycle of self-learning and self-improvement, with each new and increasingly intelligent generation appearing more rapidly. The result would be an intelligence explosion: a powerful superintelligence that would, qualitatively, far surpass human intelligence.
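The ‘runaway reaction’ described above can be caricatured in a few lines of code. This is purely a toy illustration – the growth rate and the timing model are invented assumptions, not predictions:

```python
def intelligence_explosion(start=1.0, improvement=0.5, generations=10):
    """Return (generation, intelligence level, cumulative time) tuples."""
    level, elapsed, history = start, 0.0, []
    for gen in range(1, generations + 1):
        level *= 1 + improvement      # each generation improves on the last
        elapsed += 1.0 / level        # smarter generations arrive faster
        history.append((gen, round(level, 2), round(elapsed, 2)))
    return history

for gen, level, t in intelligence_explosion():
    print(f"generation {gen}: level {level} at time {t}")
```

The point of the sketch is the shape of the curve: intelligence compounds while the interval between generations shrinks, which is exactly the ‘explosion’ the hypothesis describes.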

Ubiquitous AI

When it comes to our day-to-day lives, algorithms often save time and effort. Take online search tools, Internet shopping and smartphone apps using beacon technology to provide recommendations based upon our whereabouts.

Today, AI uses machine learning. Provide an AI with an outcome-based scenario and, to put it simply, it will remember and learn. The computer is taught what to learn, how to learn, and how to make its own decisions.
What’s more fascinating is how new AIs are modelling the human mind, using techniques similar to our own learning processes.

Do we need to be worried about runaway artificial general intelligence?

If we are to listen to anyone, let it be the cautiously wise Stephen Hawking, who said that “success in creating AI would be the biggest event in human history”, before adding: “unfortunately, it might also be the last, unless we learn how to avoid the risks”.

Whether we should be worried depends on too many variables for a definitive answer. It is difficult to argue, however, that AI will not play a growing part in our lives and businesses.

Rest assured: 4 things that will always remain human

It’s inevitable that one might raise the question: is there anything that humans will always be better at?

1. Unstructured problem solving. Solving problems for which the rules do not yet exist, such as creating a new web application.

2. Acquiring and processing new information. Deciding what is relevant, like a reporter writing a story.

3. Non-routine physical work. Performing complex tasks in three-dimensional space, requiring a combination of skills #1 and #2 – something that is proving very difficult for computers to master. This leads scientists such as Frank Levy and Richard J. Murnane to argue that we need to prepare children with an “increased emphasis on conceptual understanding and problem-solving”.

4. And last but not least – being human. Expressing empathy, making people feel good, taking care of others, being artistic and creative for the sake of creativity, expressing emotions and vulnerability in a relatable way, and making people laugh.

Are you safe?

We all know that computers, machines and robots will have an impact – positive and/or negative – on our lives in one way or another. The rather ominous elephant in the room is whether your job can be done by a robot.

I am sure you will be glad to know there is an algorithm for that…
A recent BBC article predicts that 35% of current jobs in the UK are at ‘high risk’ of computerisation in the coming 20 years, according to a study by Oxford University and Deloitte.

It remains the case that jobs relying on empathy, creativity and social intelligence are considerably less at risk of being computerised. By comparison, roles including retail assistants (37th), chartered accountants (21st) and legal secretaries (3rd) all rank among the top 50 jobs at risk.

Fanni Vig has been Data Analytics Division Director at Logicalis UK since March 2016. Prior to that she held a number of roles with Trovus, starting as Business Development Director in 2011, then Client Services Director and finally COO before the company was bought by Logicalis.

Fanni is an industry champion for helping businesses get over the hurdles and address the excuses that will allow them to truly benefit from data insight.

Year by year we are generating increasingly large volumes of data which require more complex and powerful tools to analyse in order to produce meaningful insights.

What is machine learning?

Anticipating the need for more efficient ways of spotting patterns in large datasets, machine learning was developed to give computers the ability to learn without being explicitly programmed.

Today, it largely remains a human-supervised process, at least in the developmental stage. This consists of monitoring a computer’s progress as it works through a number of “observations” in a dataset arranged to help train the computer to spot patterns between attributes as quickly and efficiently as possible. Once the computer has started to build a model representing the patterns identified, it goes through a looping process, seeking a better model with each iteration.
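The looping process described above can be sketched in miniature. This is a simplified illustration, not any particular product’s algorithm: a one-parameter model is repeatedly nudged to better fit a handful of invented observations:

```python
def train(observations, epochs=100, lr=0.01):
    """Fit y ≈ w * x by repeatedly reducing the error on each observation."""
    w = 0.0                              # initial, deliberately poor model
    for _ in range(epochs):              # each loop seeks a better model
        for x, y in observations:
            error = w * x - y            # how wrong is the current model?
            w -= lr * error * x          # nudge w to shrink that error
    return w

data = [(1, 2.1), (2, 3.9), (3, 6.2)]    # observations roughly on y = 2x
slope = train(data)
print(f"learned slope: {slope:.2f}")     # converges close to 2
```

Real machine learning works with far more parameters and data, but the principle is the same: each pass over the observations leaves the model a little better than the last.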

How is it useful?

The aim is to allow computers to learn for themselves, anticipating fluctuations between variables and so helping us forecast what may happen in the future. With a model trained on a specific data problem or relationship, data professionals can produce reliable decisions and results, leading to the discovery of insights that would have remained hidden without this analytical technique.

Real-world Examples

Think this sounds like rocket science? Every time you’ve bought something from an online shop and received recommendations based on your purchase – that’s machine learning. Over thousands of purchases, the website has aggregated the data, spotted correlations in real users’ buying patterns, and presented the most relevant ones back to you based on what you viewed or bought. You may see these as “recommended for you” or “frequently bought together”. Amazon and eBay have been doing this for years, and more recently, Netflix.
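A drastically simplified sketch of how ‘frequently bought together’ can work – counting how often pairs of products appear in the same basket – might look like this. The products and baskets are made up for illustration; production recommenders are far more sophisticated:

```python
from collections import Counter
from itertools import combinations

baskets = [
    {"phone", "case", "charger"},
    {"phone", "case"},
    {"phone", "charger"},
    {"laptop", "mouse"},
]

# Count how often each pair of products is purchased together.
pair_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1

def recommend(item, top_n=2):
    """Return the products most often bought alongside `item`."""
    scores = Counter()
    for (a, b), n in pair_counts.items():
        if a == item:
            scores[b] += n
        elif b == item:
            scores[a] += n
    return [product for product, _ in scores.most_common(top_n)]

print(recommend("phone"))
```

For the shopper who just bought a phone, the case and charger surface as the strongest co-purchases – the same signal behind a real site’s “this was frequently bought with that”.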

This sounds fantastic – but where can this help us going forward?

Deep learning

This is distinguished from other data science practices by the use of deep neural networks. This means that the data models pass through networks of nodes, in a structure which mimics the human brain. Structures like this are able to adapt to the data they are processing, in order to execute in the most efficient manner.
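As a rough illustration of those ‘networks of nodes’, here is a minimal two-layer feedforward pass, in which each node weighs its inputs and applies a non-linearity. The weights are arbitrary assumptions chosen for the example; real deep networks learn millions of them from data:

```python
import math

def sigmoid(x):
    """Non-linear activation: squashes any input into the range (0, 1)."""
    return 1 / (1 + math.exp(-x))

def layer(inputs, weights):
    """Each row of weights defines one node; each node weighs all inputs."""
    return [sigmoid(sum(w * i for w, i in zip(row, inputs))) for row in weights]

hidden_weights = [[0.5, -0.6], [0.1, 0.8]]   # 2 inputs -> 2 hidden nodes
output_weights = [[1.0, -1.0]]               # 2 hidden nodes -> 1 output node

x = [1.0, 0.5]                               # an arbitrary input vector
hidden = layer(x, hidden_weights)
output = layer(hidden, output_weights)
print(output)                                # a single activation in (0, 1)
```

Deep learning stacks many such layers, so the signal passed forward becomes a progressively more abstract representation of the input – loosely analogous to stages of processing in the brain.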

Using these leading techniques, some examples now look ready to have a profound impact on how we live and interact with each other. We are currently looking at the imminent launch of commercially available real-time language translation, which requires a speed of analysis and processing never available before. Similar innovations have emerged in handwriting-to-text conversion with “smartpads” such as the Bamboo Spark, which bridge the gap between technology and traditional note-taking.

Other applications mimic the human components of understanding: classifying, recognising, detecting and describing (according to SAS.com). This has now entered mainstream use with anti-spam measures on website contact forms, where the software learns which squares contain images of cars or street signs.

Within the healthcare industry in particular, huge leaps have been made: at Szechwan People’s Hospital in China, systems have been “taught” to spot the early signs of lung cancer in CT scan images. This meets a great need, as there is a shortage of trained radiologists to examine patients.

In summary, there have been huge leaps in data analysis and data science in the last couple of years. The future looks bright as we apply ever more sophisticated techniques to a wider range of real-world issues and tackle previously impossible challenges. Get in touch and let’s see what we can do for you.

It’s common knowledge that there is a global shortage of experienced IT security professionals, right across the spectrum of skills and specialities, and that this shortage is exacerbated by an ongoing lack of cyber security specialists emerging from education systems.

Governments are taking action to address this skills shortage, but it is nevertheless holding back advancement and exposing IT systems and Internet businesses to potential attacks.

Because of this – and despite the fear that other industries may have of artificial intelligence (AI) – the information security industry should be embracing it and making the most of it. As the connectivity requirements of different environments become ever more sophisticated, the number of security information data sources is increasing rapidly, even as potential threats grow in number and complexity. Automation and AI offer powerful new ways of managing security in this brave new world.

At the moment, the focus in AI is on searching and correlating large amounts of information to identify potential threats based on data patterns or user behaviour analytics. These first-generation AI-driven security solutions only go so far, though: security engineers are still needed to validate the identification of threats and to activate remediation processes.
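A first-generation behaviour-analytics check of this kind can be sketched very simply: flag activity that deviates sharply from a user’s historical pattern, and leave the verdict to a security engineer. The threshold and login counts below are illustrative assumptions; real platforms correlate far richer signals:

```python
import statistics

def flag_anomalies(history, recent, threshold=3.0):
    """Flag recent counts more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(history)
    spread = statistics.pstdev(history)
    return [x for x in recent if abs(x - mean) > threshold * spread]

logins_per_day = [12, 15, 11, 14, 13, 12, 14]      # a user's normal pattern
print(flag_anomalies(logins_per_day, [13, 95]))    # [95] – the burst is flagged
```

Only the statistically unusual burst of logins is surfaced, which is precisely the kind of candidate threat a human engineer would then validate before remediation.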

As these first generation solutions become more efficient and effective in detecting threats, they will become the first step towards moving security architectures into genuine auto-remediation.

To explore this, consider a firewall: it allows you to define access lists based on applications, ports or IP addresses. Working as part of a comprehensive security architecture, new AI-driven platforms will use similar access lists, built from a variety of complex and dynamic information sources. These lists will underpin your auto-remediation policy, which will integrate with other platforms to keep the defined security posture consistent.

As we move into this new era of security systems – in which everything comes down to gathering information for AI systems to process with security in mind – services will change as they adapt to the new capabilities. Those changes will be seen first in Security Operations Centres (SOCs).

Today’s SOCs still rely heavily on security analysts reviewing reports to provide the level of service expected by customers. They will be one of the first environments to adopt AI systems, as they seek to add value to their services and operate as a seamless extension to digital businesses of all kinds.

SOCs are just one example. The security industry as a whole stands to get the most out of AI, but it needs to start recognising what machines do best and what people do best. Using this technology will enable new tools and processes in the cybersecurity space that can protect devices and networks from threats even before a human can classify them.

Artificial intelligence techniques such as unsupervised learning and continuous retraining can keep us ahead of the cyber criminals. However, we need to be aware that hackers will also be using these techniques – so this is where the creativity of the good guys should focus, anticipating what is coming next while letting the machines do their job of continuous learning and protection.

Don’t miss out: to find out more, contact us – we’ll be delighted to help you with emerging technology and use it to your benefit.

As the benefits of hybrid IT have become clear, it has evolved from a temporary state to the chosen environment for many organisations looking to thrive.

Every organisation, regardless of size or sector, has a digital strategy. In fact, it’s hard to believe that IT once lingered on the fringes of business operations and decisions when today it is front and centre – a driving force behind both individual projects and overall business objectives.

And for the vast majority of organisations, it’s difficult to speak about digital strategy without mentioning cloud.

In fact, cloud’s ever-growing role and potential benefits are so widely publicised that it can feel almost unavoidable. After all, if your competitors adopt ‘cloud first’ strategies and you choose not to, don’t you risk getting left behind?

But, cloud doesn’t have to be all or nothing…

Enter hybrid IT

With hybrid IT, organisations can bring in cloud-based services that will run in parallel with their existing on-premise hardware. This may not necessarily be a new concept. However, its full potential is rarely realised.

Instead, more often than not, hybrid IT is built into digital strategies as a stepping stone to cloud and, as such, considered the transitional phase on a much bigger journey. It’s useful, but it’s also temporary… Simply a vehicle to get you and your organisation where you need to be by enabling you to join the elite and become a ‘cloud first’ company.

And it’s true, hybrid IT is a very useful tool for organisations looking to make the first small steps into a new cloud-centric world. You can test the waters by investing in new cloud-based technologies, without being all-in.

But hybrid IT can also open the door to a whole new world of possibilities, enabling businesses to operate in – and therefore reap the benefits associated with – both on-premise and cloud environments.

The best of both worlds

Traditional IT or cloud technologies… it used to be a one-off choice that organisations had to make. Once made, all your application workloads and databases were assigned to one environment. You were effectively tied into that environment until you actively decided to change and, with significant effort and probable financial cost, took steps to convert.

But, by using hybrid IT, organisations no longer have to commit to a single environment. They can have the best of both worlds and benefit from aligning specific workloads and applications to specific platforms. Hybrid IT grants:

The scalability and cost efficiency of cloud technologies

There’s no doubt about it: scaling a traditional infrastructure can be very expensive. By making the most of hybrid IT – utilising both public and private cloud environments – businesses can scale up IT operations quickly and at minimal cost, which is particularly useful for shorter-term projects. But it doesn’t stop there… with hybrid IT, organisations can also scale down. In effect, everything can be driven to reflect the actual demands on the business, saving both resources and money.

And if organisations are saving resource in those areas, it leaves more room for innovation. The exciting new projects that often have to be pushed aside due to more pressing concerns, such as keeping the lights on, can become a reality.

Hybrid IT is also often used in disaster recovery strategies. In our digital world, suffering an IT outage is every organisation’s worst nightmare. Why? Because the downtime that organisations suffer as a result can have a devastating and lasting impact, both financially and in terms of future reputation. By having primary data copied and stored in two different locations, organisations can recover faster while keeping downtime to a minimum.

Above all, hybrid IT gives organisations the freedom to make their own choices. It merges the best of old world technology with new world thinking. And, just as digital is no longer the sole territory of IT departments, it’s set to infiltrate the boardroom and play a key role in all future business decisions.

After all, hybrid IT is an enabler, allowing business leaders to make the right digital decision for their business, whether that is traditional IT, cloud-based technologies or a mixture of the two.

Contact us to find out more about Hybrid IT and how we can help you leverage it

Over the past decade or so, numerous planning and analytics solutions have come out in an effort to keep up with an increasingly complex business environment. Most solutions compete on speed, scalability, visualisation capabilities, scenario modelling and Excel integration. Our recent Global CIO survey revealed that analytics is still considered ‘very important’ or ‘critical’ for driving innovation and decision-making across the business.

Traditionally, planning tools have been aimed at the Finance department. Budgeting and forecasting needs, P&Ls, balance sheets and cash flows have been the bread and butter of planning and reporting solutions. However, this only scratches the surface of what can be achieved in the world of business planning. We are in an era when a truly successful planning practice is not based solely on financial analytics, but also includes customer, sales performance and workforce analytics.

Planning and analytics for the entire business

Although Finance is usually the right strategic area to begin implementing any planning solution, it should just serve as a starting point. A truly successful planning solution should incorporate your operational planning, giving you a more accurate and all-encompassing view of your business.

Apart from Finance, almost all business functions can benefit from agile planning processes and data analytics with payroll, sales and asset management at the top of the list.

Payroll analytics to decrease manual work

Payroll planning can be a very complex and frequently manual task for modern organisations. HR employees responsible for payroll face multiple components that add to the complexity of the process, such as NI adjustments, complex bonus schemes, salary increases and benefits. Add ongoing government changes, regulatory updates and HR-related modifications, and payroll can prove a stressful and time-consuming process.

The most effective way to evolve a historically manual process and increase the speed and accuracy of payroll planning is through data. By taking advantage of analytics, you can create timely, reliable payroll plans that put employee and business insight into action. You benefit from faster processes and a uniform view of the data, and simplify analytical steps that HR employees might not otherwise be able to execute.

Accurate sales forecasting with planning analytics

Sales is another department that can greatly benefit from planning analytics. For most organisations, sales planning and forecasting is their lifeblood, as it directs the efforts of each department and helps define the overall strategy. It is therefore crucial to set realistic and accurate targets based on existing data.

With agile planning and analytics, businesses today can forecast sales volumes and adjust costs and prices centrally to see the bottom-line impact of the sales department. More than any other part of the organisation, this is the ideal area in which to take advantage of seasonality forecasting, ‘what if’ scenario modelling and phasing. The result: successfully steered sales activities, maintained margins, and value delivered both to the client and the business.

Asset Management simplified through planning analytics

Often the biggest hurdle that companies face when managing their assets is the volume of data that needs to be collected, analysed and maintained. Increasing cost pressure, complex structures in supply chains and rising risks due to complex procurement mechanisms are just part of the challenge for modern businesses.

Effective and flexible networking of data is crucial in order to make fast and accurate decisions. With advanced planning and analytics, organisations can apply profiles to the assets to plan for depreciation and asset control.
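As a small illustration of planning for depreciation, a straight-line schedule spreads an asset’s cost evenly over its useful life. The figures are invented for the example; real asset profiles may use other depreciation methods:

```python
def straight_line_schedule(cost, salvage, years):
    """Year-end book values under straight-line depreciation."""
    annual = (cost - salvage) / years          # equal charge every year
    return [round(cost - annual * y, 2) for y in range(1, years + 1)]

# e.g. a 10,000 asset with a 1,000 salvage value over a 3-year life
print(straight_line_schedule(10_000, 1_000, 3))   # [7000.0, 4000.0, 1000.0]
```

Applying such profiles across thousands of assets at once is exactly where a planning and analytics platform takes over from manual spreadsheets.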

At Logicalis, we take a holistic approach to planning analytics, moving beyond finance and helping you make data-driven decisions for the entire business.

What is your approach to Digital Transformation and is your business structured for it?

All modern companies are looking at digital transformation, and the key decision they need to make is whether to “become digital” or to “do digital”. “Becoming digital” is deciding to turn the whole business or business unit digital, re-engineering from the ground up to take full advantage of the benefits of technology across the value chain. “Doing digital” implies taking specific processes, maybe a customer interaction or a B2B transaction process, and making it digital. Depending on which of these options a business chooses to take, the approach and qualities of the Digital Transformation function will change.

Digital transformation has grown as a concept over the last few years, but in general, is taken to mean building additional business benefit on the data and data processes that a business owns. This can mean finding efficiencies through process improvement and automation, new opportunities buried within the value of corporate data or new digital routes to market. A full transformation embraces all of these and more; the emergence of a connected environment, now known as IoT, is opening new opportunities with every technological development.

Becoming Digital: starts with a solid digital business culture

If a business has chosen to “become digital”, the leadership team needs to embrace the objective and fully support the change initiative. That said, the scale of investment and impact of the programme means that a single point of oversight is essential. In some businesses this might fall to a CIO, in others a Chief Digital Officer, however, these leaders will need support from a team with excellent project and technical skills. In addition, the cultural change will require consideration throughout the process. Probably the most critical attributes that the transformation leaders will need to have are a clear vision of what digital looks like, the skills to understand how it will be delivered and, most importantly, the drive to sustain a multi-year transformational programme.

In many ways, the Digital Transformation Officer will have to lead the senior team through this programme, and these qualities combined with the soft skills to enable this leadership will eventually determine the success of the programme. This role is well suited to an interventional style – enabling the business to focus on BAU while the digital programme is delivered in a defined manner. There have been well-publicised initiatives similar to this in major UK retail banks and across industries, like the airlines, where all aspects of customer interaction have become fully digital.

Doing Digital: requires greater focus on technical skills

Alternatively, if the choice is to “do digital”, the transformation challenge is much more bounded. In this case, the challenge is more to do with having the technical understanding and project management skills to deliver tightly defined digital projects. While these transform the particular process involved, they do not require wholesale change across the business. For most organisations, this will be the chosen option as there is less risk and disruption in such an iterative approach. We are seeing programs like this often linked to IoT initiatives across our customer base.

Clearly the CEO will take a close interest in any of these initiatives. However, in choosing to “become digital” he or she is betting the company, and as such will want the transformation leadership to be part of the senior team, empowered to drive the vision to a conclusion. In choosing to “do digital”, the CEO contains the risk to particular areas and should use the management team to direct these initiatives through a skilled and technically able programme manager. Whatever the approach, there will be a material cost, and the post-go-live benefits realisation needs to be driven and measured with similar control and vigour.

