“The Machine” is what HP is calling it: a new type of computer that’s apparently capable of processing 640 TB of data in a billionth of a second. Whoa. It could be used in servers, personal computers, and even handheld devices, and it requires far less energy to run. My question is, how do we go from our current computer model, one where we seem to have hit an efficiency wall, to this new machine? It seems like we’re missing some growth steps here, but I’ll take it. Check out an excerpt from iflscience.com:
“In order to handle this flurry of information, it uses clusters of specialized cores as opposed to a small number of generalized cores. The whole thing is connected together using silicon photonics instead of traditional copper wires, boosting the speed of the system whilst reducing energy requirements. Furthermore, the technology features memristors, which are resistors that are able to store information even after power loss.
The result is a system six times more powerful than existing servers that requires eighty times less energy. According to HP, The Machine can manage 160 petabytes of data in a mere 250 nanoseconds. And, what’s more, this isn’t just for huge supercomputers; it could be used in smaller devices such as smartphones and laptops. During a keynote speech given at Discover, chief technology officer Martin Fink explained that if the technology were scaled down, smartphones could be fabricated with 100 terabytes of memory.
HP envisages a variety of future applications for this technology in numerous different settings, from business to medicine. For example, it could be possible for doctors to compare your symptoms or DNA with patients across the globe in an instant and without breaching privacy, improving health outcomes.”
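Those two headline figures are actually consistent with each other: 160 petabytes every 250 nanoseconds is exactly 640 TB per nanosecond. A quick back-of-the-envelope check (using decimal units, which press figures almost always mean):

```python
# Sanity check on the quoted figures: does 160 PB in 250 ns match
# the 640 TB-per-billionth-of-a-second claim? (Decimal units assumed.)
PB = 1000 ** 5               # bytes in a petabyte
TB = 1000 ** 4               # bytes in a terabyte

rate = (160 * PB) / 250e-9   # bytes per second
per_ns = rate * 1e-9 / TB    # terabytes per nanosecond

print(per_ns)                # → 640.0
```

So the two numbers in the article are the same claim stated at different scales, not two separate benchmarks.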
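The memristor is the piece of the excerpt that's easiest to demystify with a toy model. The standard way to describe HP Labs' device is the linear ion-drift model (Strukov et al., Nature 2008): the device's resistance depends on an internal state that only moves while current flows, so when power is cut the state, and therefore the stored value, stays put. The sketch below is that textbook model with illustrative parameter values; nothing here is a spec of The Machine itself.

```python
# Minimal sketch of the linear ion-drift memristor model.
# Parameter values are illustrative assumptions, not HP device specs.
R_ON, R_OFF = 100.0, 16_000.0  # ohms: fully doped / undoped resistance
D = 10e-9                      # m: device thickness
MU_V = 1e-14                   # m^2/(V*s): dopant mobility

def simulate(currents, dt=1e-6, w=0.5 * D):
    """Euler-integrate the state w; return resistance at each step."""
    resistances = []
    for i in currents:
        # Resistance is a mix of the doped and undoped regions.
        r = R_ON * (w / D) + R_OFF * (1 - w / D)
        # State drifts only while current flows: dw/dt = mu_v*(R_on/D)*i(t)
        w += MU_V * (R_ON / D) * i * dt
        w = min(max(w, 0.0), D)  # state is bounded by the device
        resistances.append(r)
    return resistances, w

# Drive with a current pulse, then cut the power. While i > 0 the
# resistance drifts; once i == 0 it freezes, i.e. the bit is retained
# with no power applied -- the non-volatility the article describes.
pulse = [1e-3] * 200 + [0.0] * 200
rs, w_final = simulate(pulse)
```

Running this, the resistance falls during the pulse and then holds perfectly steady through the zero-current steps. That frozen resistance is the stored information, which is why memristors can double as both memory and storage.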