Reducing data among techniques proposed to speed up computers
Future computer systems will need to be significantly faster than today's supercomputers, scientists believe. One reason is that properly analyzing complex problems, such as climate modeling, takes ever more work. Massive quantities of calculations, performed at high speed and delivered as error-free data analysis, are needed for the fresh insights and discoveries expected down the road.

Limitations, though, exist in current storage, processing and software, among other components.

The U.S. Department of Energy's four-year, $48 million Exascale Computing Project (ECP), started at the end of last year for science and national security purposes, plans to overcome those challenges. It explains some of the potential hiccups it expects to run into on its Argonne National Laboratory website. Part of the project is being studied at the lab.
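The article does not describe how ECP's data reduction would actually be done, so the following is only an illustrative sketch of one common approach in high-performance computing: discarding low-order mantissa bits of simulation output so it compresses better before it is written to storage, trading a little precision for much less I/O. The function name and parameters are hypothetical, not part of any ECP software.

```python
# Illustrative sketch only -- not ECP's actual method.
# Truncate the low-order mantissa bits of float32 simulation data, then
# compress it; the reduced bit entropy makes the compressed output smaller,
# which cuts the volume of data moved to storage and later analyzed.
import zlib
import numpy as np

def reduce_and_compress(field: np.ndarray, keep_mantissa_bits: int = 12) -> bytes:
    """Zero the least-significant mantissa bits of a float32 array, then compress."""
    drop = 23 - keep_mantissa_bits              # float32 has a 23-bit mantissa
    bits = field.astype(np.float32).view(np.uint32)
    truncated = (bits >> drop) << drop          # clear the low-order bits
    return zlib.compress(truncated.tobytes())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(size=1_000_000).astype(np.float32)  # stand-in for model output
    lossless = zlib.compress(data.tobytes())
    reduced = reduce_and_compress(data)
    print(f"lossless: {len(lossless)} bytes, reduced-precision: {len(reduced)} bytes")
```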
Neri will oversee HPE’s efforts to streamline and boost profits.
It will be interesting to see how Level 3 and CenturyLink merge their offerings.
Tower firm Crown Castle makes minority investment to speed deployment.
