Technology companies have been around for a very long time, and they have changed the kinds of lives we live and lead by helping to redefine nearly every aspect of them. In computing, the likes of HP – since split in two, into HP Inc. and Hewlett Packard Enterprise – have achieved a tremendous amount in the past, but that is just what it is: the past. To remain relevant at every point in time, companies are driven to introduce new technology that can revolutionise the scene and keep them in the game for a few more years, until they need to whip up something new to stay relevant. For HP, the answer may lie in what they call ‘The Machine.’
Expected to alter the way data is seen and processed for mutual benefit around the world, the idea has been public since as far back as 2014. Now, two years down the line, it seems to be gradually nearing completion and will soon leave the pipeline for all to see. The truth is that the amount of data a single entity collects these days is becoming too much to handle and crunch. That problem creates a need in the market that has to be filled, and from Hewlett-Packard’s point of view, The Machine is the answer to it. In other words, they have developed a computing system like no other which would, if their claims are on the money, change the way we look at crunching big data and drawing the deductions we need from it. As a maker of computers and personal units itself, there is no telling what amazing products we will start to see from this manufacturer once this ambitious project reaches completion and can be implemented in components. Not only would Hewlett Packard Labs make serious money from the sales and licensing of the technology to other users and interested firms, it would also take in considerable income from the sales of the components involved in the technology.
According to Antonio Neri, vice president at Hewlett Packard Enterprise, “We have achieved a major milestone with The Machine research project – one of the largest and most complex research projects in our company’s history” (Digital Trends, http://www.digitaltrends.com/computing/hpe-the-machine-memory-driven-computing-demonstration/). This new project is set to reset the way we think about computing, and one of the major things HPE has to do to get it fully running is to alter the way it builds the servers that cater to its computers and every other one on the network. Given that servers are a core product line for the Enterprise, it is understandable why the vice president would call this one of the most complex projects the company has ever laid its hands on. What sets The Machine apart from conventional computing is that it transmits information using light as its mechanism (photonics) rather than electrical signals over copper. That promises a whole new level of speed at which microprocessors can fetch data and process information – by HPE’s claims, up to 1,000 times what we have today.
This year, 2016, is when we are supposed to see what The Machine that HPE has been building will look like. As for commercial availability, though, there is no guarantee they are at that point yet, and we may have to keep looking to the near future. To paint a mental picture, The Machine would occupy about the same space as a server rack but pack a very large pool of memory (about 320 terabytes) and make use of up to 2,500 CPU cores – no small feat to be rushed. While all of this can be readily made and pieced together, one key component of the entire setup remains missing: the memristor. What makes the memristor very special is that it can act as non-volatile storage for information; even when the unit has been powered down, it still retains the information that has been written onto it. First demonstrated by HP Labs back in 2008, the memristor represents just one of four major components still remaining to get The Machine to where the company has dreamt it will be. If you were wondering what makes this component better than conventional DRAM, the energy efficiency it provides is the major shot caller: DRAM must be constantly refreshed to hold its contents, while a memristor holds its state without drawing power.
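The retention property described above can be illustrated with a toy sketch in plain Python. Here a disk file merely stands in for a memristor-style non-volatile cell, and the class and variable names are our own illustrative inventions, not anything from HPE:

```python
import os
import tempfile

class VolatileStore:
    """Behaves like DRAM: contents vanish when power is lost."""
    def __init__(self):
        self.data = {}

    def write(self, key, value):
        self.data[key] = value

    def power_cycle(self):
        self.data = {}  # without power (and refresh), DRAM loses everything

    def read(self, key):
        return self.data.get(key)

class PersistentStore:
    """Stands in for memristor-style non-volatile memory: survives a power cycle."""
    def __init__(self, path):
        self.path = path

    def write(self, key, value):
        with open(self.path, "a") as f:
            f.write(f"{key}={value}\n")

    def power_cycle(self):
        pass  # nothing to lose: the state persists in the cell itself

    def read(self, key):
        result = None
        with open(self.path) as f:
            for line in f:
                k, _, v = line.rstrip("\n").partition("=")
                if k == key:
                    result = v  # last write wins
        return result

dram = VolatileStore()
nvm = PersistentStore(os.path.join(tempfile.mkdtemp(), "nvm.log"))
for store in (dram, nvm):
    store.write("answer", "42")
    store.power_cycle()  # simulate pulling the plug

print(dram.read("answer"))  # None -- lost on power-down
print(nvm.read("answer"))   # '42' -- retained, like a memristor cell
```

The point of the sketch is only the contrast in behaviour after `power_cycle()`: the volatile store comes back empty, while the persistent one still answers reads – which is why non-volatile main memory would let a machine resume with its working set intact.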
While there have been snags in the manufacture and optimisation of the memristor component – so much so that HP Enterprise recently announced a partnership with SanDisk to help turn its fortunes around – HPE has also concerned itself with the hardware and software sides of this large-scale programme, which it claims will build upon each other in a “virtuous cycle.” By its own internal estimates, HPE is looking to make the technology commercially available from 2020, in form factors ranging from a credit card to a very big supercomputer. It is likewise keeping open the option of selling components of the prototype as soon as they become available.