Tuesday, 17 June 2014 09:05

HP shows off its vision of the Future of Computing: Meet the Machine


Yesterday at the HP Discover event, HP showed off a new concept device called the Machine. The Machine is an interesting concept that uses a number of existing technologies to deal with the massive influx of data from all of the connected devices we have now and will have in the future. To accomplish this, HP envisions a device that is unlike any of the traditional server offerings on the market today.

According to HP, the Machine will leverage specialized cores designed to perform their particular tasks more efficiently than larger, generalized cores. This means a much larger number of cores is needed to cover the range of tasks a computer will be asked to do, and as the core count climbs, so does the risk of a bottleneck in many places, including (but certainly not limited to) memory, inter-core communication, and storage.
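HP did not publish the math behind that concern, but the classic Amdahl's law (our framing, not HP's) illustrates why simply piling on cores runs into a wall once any part of the workload, such as memory access or inter-core traffic, is serialized. A minimal sketch:

```python
# Illustrative only: Amdahl's law, our framing rather than HP's own model,
# showing why adding cores hits diminishing returns once any part of the
# workload (memory access, inter-core traffic, storage I/O) is serial.

def amdahl_speedup(cores: int, serial_fraction: float) -> float:
    """Theoretical speedup on `cores` cores when `serial_fraction`
    of the work cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (8, 64, 512, 4096):
    # Even a 5% serial portion caps the benefit of thousands of cores at ~20x.
    print(f"{cores:5d} cores -> {amdahl_speedup(cores, 0.05):6.1f}x speedup")
```

That cap is exactly why HP is attacking the interconnect and memory bottlenecks rather than just adding more cores.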

To solve the inter-core communication conundrum HP would like to use photonics. What is photonics? Well, I am glad you asked. In the simplest terms, photonics is the study of light; as it relates to computing, it covers the transmission and reception of light for the purpose of moving data. We have been doing this since around 1976, when the first fiber optic cables came into use, and have hit some impressive speeds with the technology (400Gb/s is coming).

HP would like to utilize one of the emerging fields in photonics to connect the cores, memory and other components in the Machine. This would remove the limitations of copper interconnects while reducing the power needed to push the same amount of data. In theory, the speed of inter-core communication would increase dramatically through the use of photonics over copper. The challenge will be shrinking existing light emitters to a size that makes this a manageable device.
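To put the power claim in perspective, here is a back-of-envelope sketch. The picojoule-per-bit figures below are assumed ballpark numbers of the kind often quoted for electrical versus silicon-photonic links, not anything HP published:

```python
# Back-of-envelope only, with assumed figures (not HP-published numbers):
# off-chip electrical links are often quoted around ~10 pJ per bit, while
# integrated silicon-photonic links target roughly ~1 pJ per bit.

COPPER_PJ_PER_BIT = 10.0    # assumed figure for an electrical link
PHOTONIC_PJ_PER_BIT = 1.0   # assumed target for a photonic link
BANDWIDTH_GBPS = 400        # the 400Gb/s class of links mentioned above

def link_power_watts(pj_per_bit: float, gbps: float) -> float:
    # pJ/bit * bits/s -> watts (1 pJ = 1e-12 J, 1 Gb/s = 1e9 bit/s)
    return pj_per_bit * 1e-12 * gbps * 1e9

print(f"copper:   {link_power_watts(COPPER_PJ_PER_BIT, BANDWIDTH_GBPS):.1f} W")
print(f"photonic: {link_power_watts(PHOTONIC_PJ_PER_BIT, BANDWIDTH_GBPS):.1f} W")
```

Under those assumptions a single 400Gb/s link drops from about 4 W to about 0.4 W, and the Machine would have thousands of such links.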

On top of hundreds of specialized cores connected via a light-based network, HP sees the use of memristors as a logical move. These are resistors with an embedded memory function that allows them to retain their state (and thus the data they encode) even after a power loss. The technology was first postulated in 1971, but it took until 2008 before anyone figured out a reliable way of actually building one. The memristor can significantly reduce the space needed for memory in a given system and seems ready-made for nanoelectronic devices like the Machine.
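For the curious, that 2008 breakthrough from HP Labs came with a simple mathematical model, the linear ion-drift model. The sketch below simulates it with parameter values in the spirit of that work; it is an illustration of the idea, not production device physics:

```python
# A minimal sketch of the linear ion-drift memristor model published by
# HP Labs in 2008 (Strukov et al., Nature). The key property: resistance
# depends on the history of current through the device, and that state
# persists when power is removed.

R_ON, R_OFF = 100.0, 16_000.0   # bounding resistances (ohms)
D = 10e-9                       # device thickness (m)
MU_V = 1e-14                    # dopant drift mobility (m^2 s^-1 V^-1)
DT = 1e-4                       # simulation time step (s)

def memristance(w: float) -> float:
    """Resistance as a mix of doped (R_ON) and undoped (R_OFF) regions."""
    x = w / D
    return R_ON * x + R_OFF * (1.0 - x)

w = 0.5 * D                     # state: doped-region width, the 'memory'
for _ in range(400):            # drive a 1 mA write current
    w += MU_V * (R_ON / D) * 1e-3 * DT   # linear ion drift: dw/dt prop. to i(t)
    w = min(max(w, 0.0), D)              # clamp to physical bounds

# Power can now be removed; w (and thus the resistance) is retained.
print(f"resistance after write: {memristance(w):.0f} ohms")
```

Because the stored value is just a resistance, a memristor cell needs no refresh and no standby power, which is where the density and energy savings come from.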

The theoretical upshot of all of this is a system that HP estimates would be six times more powerful than existing systems while using one-eightieth of the energy to operate. Not bad when you think about it.

Before you envision a supercomputer in your home, HP also mentioned that the same basic philosophy and design can be reduced in size and scope to fit inside almost anything, even a phone. This is because the basic building blocks are all intended for use in smaller and smaller electronic devices. In theory, you can expand the processing power and memory without adding much in the way of space.

When can we expect these miracles of computing? Well, do not hold your breath. According to recent estimates, memristors will not be commercially viable until sometime in 2018, and the smaller photonic circuits are not expected to be production ready until late 2015 (possibly longer). This means we would not see even a working sample until around 2016 or so. HP did claim it might have one in 2015, but given the state of the technology needed to build one, we have our doubts right now.

Still, it is nice to take a look at the possible future of computing, isn’t it?

Tell us what you think in our Forum

