Thursday, October 19, 2017


HPE unveils beast of a single-memory computer for the big data era



The launch is part of The Machine project, the largest R&D program in the history of the company.

“The secrets to the next great scientific breakthrough, industry-changing innovation, or life-altering technology hide in plain sight behind the mountains of data we create every day.”

So says HPE CEO Meg Whitman as the company unveils the world’s largest single-memory computer, with 160 terabytes (TB) of memory, as part of The Machine project.

The prototype is aimed at delivering Memory-Driven Computing, an architecture custom-built for the big data era.

Memory-Driven Computing puts memory, not the processor, at the centre of the computing architecture.
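As a rough illustration of that inversion, consider the sketch below, which uses Python’s standard shared-memory facilities rather than anything resembling HPE’s actual hardware or fabric (the pool name and sizes are invented for the example). Several workers compute directly against a single pool of memory, instead of each processor holding and shuttling around its own private copy of the data:

```python
# A minimal sketch of the memory-centric idea, NOT HPE's implementation:
# one large pool of data that multiple workers read and update in place,
# rather than copying the data out to each processor.
from multiprocessing import Process, shared_memory
import numpy as np

POOL_NAME = "demo_pool"  # hypothetical name, for this illustration only
N = 1_000_000            # elements in the shared pool

def worker(name: str, start: int, stop: int) -> None:
    # Attach to the existing pool by name; no data is copied to the worker.
    shm = shared_memory.SharedMemory(name=name)
    data = np.ndarray((N,), dtype=np.float64, buffer=shm.buf)
    data[start:stop] *= 2.0  # compute happens against the shared pool
    shm.close()

if __name__ == "__main__":
    # Create one pool of memory that every worker addresses directly.
    shm = shared_memory.SharedMemory(name=POOL_NAME, create=True, size=N * 8)
    data = np.ndarray((N,), dtype=np.float64, buffer=shm.buf)
    data[:] = 1.0

    chunk = N // 4
    workers = [Process(target=worker, args=(POOL_NAME, i * chunk, (i + 1) * chunk))
               for i in range(4)]
    for p in workers:
        p.start()
    for p in workers:
        p.join()

    print(data.sum())  # 2000000.0: every element was updated in place
    shm.close()
    shm.unlink()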

Whitman said: “To realise this promise, we can’t rely on the technologies of the past, we need a computer built for the Big Data era.”

Based on the current prototype, HPE expects the architecture could easily scale to an exabyte-scale single-memory system and, beyond that, to a nearly limitless pool of memory – 4,096 yottabytes. For context, that is 250,000 times the entire digital universe today.
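The arithmetic behind that comparison checks out, as the quick sketch below shows: dividing the pool by 250,000 implies a digital universe of roughly 16 zettabytes, broadly in line with industry estimates at the time.

```python
# Quick check of the article's scale comparison.
YB = 10 ** 24                      # one yottabyte, in bytes
ZB = 10 ** 21                      # one zettabyte, in bytes

pool = 4096 * YB                   # the "nearly limitless" pool
universe = pool / 250_000          # implied size of today's digital universe

print(f"{universe / ZB:.1f} ZB")   # ~16.4 ZB
```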

The company explains that with that amount of memory, it would be possible to work simultaneously with every digital health record of every person on earth, every piece of data from Facebook, every trip of Google’s autonomous vehicles, and every data set from space exploration.

Mark Potter, CTO at HPE and Director, Hewlett Packard Labs, said: “We believe Memory-Driven Computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society.

“The architecture we have unveiled can be applied to every computing category – from intelligent edge devices to supercomputers.”