The science fiction of a generation ago predicted a future in which humans were replaced by the reasoning might of a supercomputer. But in an unexpected twist, it appears that it is the supercomputer’s main output—scientific simulations—that could be replaced by an even higher order of intelligence.
While we will always need supercomputing hardware, the vast field of scientific computing, or high-performance computing, could also be in the crosshairs for disruptive change, altering the future prospects for scientific code developers but opening new doors to more energy-efficient, finer-grained scientific discovery. With code that can write itself based …
When Will AI Replace Traditional Supercomputing Simulations? was written by Nicole Hemsoth at The Next Platform.
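To make the idea concrete, here is a minimal, hypothetical sketch of a learned surrogate standing in for a traditional simulation. The toy solver, parameter ranges, and function names below are invented for illustration and are not drawn from the article:

```python
# Hypothetical sketch: a cheap surrogate model trained to stand in for an
# expensive numerical simulation. The "simulation" here is a toy analytic
# function; in practice it would be a full solver run on a supercomputer.
import numpy as np
from sklearn.neural_network import MLPRegressor

def expensive_simulation(params):
    # Stand-in for a costly solver: damped oscillation of a system
    # parameterized by (stiffness, damping).
    k, c = params
    t = np.linspace(0, 10, 50)
    return np.exp(-c * t) * np.cos(np.sqrt(k) * t)

# Run the real simulation a limited number of times to build training data.
rng = np.random.default_rng(0)
X = rng.uniform([1.0, 0.1], [9.0, 0.9], size=(200, 2))
y = np.array([expensive_simulation(p) for p in X])

# Fit a small network mapping parameters directly to the output trajectory,
# so "what-if" queries skip the solver entirely at inference time.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000).fit(X, y)
prediction = surrogate.predict([[4.0, 0.5]])
```

The trade-off, under these assumptions, is upfront training cost and approximation error in exchange for per-query evaluations that are orders of magnitude cheaper than re-running the solver.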
Apteligent can predict how application performance impacts revenue within the enterprise.
Effective network troubleshooting requires experience and a detailed understanding of a network’s design. And while many great network engineers possess both qualities, they still face the daunting challenge of manual data collection and analysis.
The storage and backup industries have long been automated, yet, for the most part, automation has eluded the network, forcing engineering teams to troubleshoot and map networks manually. Estimates from a NetBrain poll indicate that network engineers spend 80% of their troubleshooting time collecting data and only 20% analyzing it. With the cost of downtime only rising, an opportunity to significantly reduce the time spent collecting data is critical.
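As a rough illustration of what automating the collection step could look like, the sketch below probes a set of devices in parallel. The hostnames are hypothetical, and a real tool would gather far more than reachability (interface counters, routes, configs):

```python
# Minimal sketch: automate the data-collection step by probing a list of
# devices concurrently instead of checking each one by hand.
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Hypothetical device inventory.
DEVICES = ["core-sw1.example.com", "dist-rtr1.example.com", "edge-fw1.example.com"]

def probe(host):
    # One ping with a 2-second timeout (Linux-style ping flags assumed);
    # return code 0 means the device answered.
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", host],
        capture_output=True, text=True,
    )
    return host, result.returncode == 0

# Collect reachability for every device in parallel, so the engineer starts
# troubleshooting from a ready-made snapshot rather than gathering it manually.
with ThreadPoolExecutor(max_workers=10) as pool:
    for host, up in pool.map(probe, DEVICES):
        print(f"{host}: {'reachable' if up else 'DOWN'}")
```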
GPU computing has deep roots in supercomputing, but Nvidia is using that springboard to dive head first into the future of deep learning.
This changes the outward-facing focus of the company’s Tesla business from high-end supers to machine learning systems, with the expectation that those two formerly distinct areas will find new ways to merge, given their similar machine, scalability, and performance requirements. This is not to say that Nvidia is failing the HPC set, but there is a shift in attention from what GPUs can do for Top 500-class machines to what graphics processors can do …
The Year Ahead for GPU Accelerated Supercomputing was written by Nicole Hemsoth at The Next Platform.
This contributed piece has been edited and approved by Network World editors
Possession is nine-tenths of the law, right? But thanks to blockchain, this old adage may no longer be a viable way to settle property disputes.
Artists and enterprises alike have long struggled to prove ownership of their work after it has been disseminated, especially when it is uploaded online. What if there were a way to use technology to reliably track asset provenance with absolute certainty, from creation to marketplace and beyond? The reality is that this is already possible with the help of blockchain, and the benefits to the enterprise are many.
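A minimal sketch of the mechanism underneath such provenance tracking: each record embeds a hash of the one before it, so altering any past record invalidates everything after it. The record fields and helper names below are illustrative, not taken from any specific blockchain platform, and a real system would add digital signatures and distributed consensus on top:

```python
# Hedged sketch of a hash-chained provenance ledger.
import hashlib
import json
import time

def add_record(chain, owner, action):
    # Link this record to the previous one via its hash.
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"owner": owner, "action": action,
              "timestamp": time.time(), "prev_hash": prev_hash}
    payload = json.dumps(record, sort_keys=True).encode()
    record["hash"] = hashlib.sha256(payload).hexdigest()
    chain.append(record)
    return chain

def verify(chain):
    # Recompute every hash and check every back-link; any edit to history
    # breaks one of these checks.
    for i, rec in enumerate(chain):
        body = {k: v for k, v in rec.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["hash"] != expected:
            return False
        if i and rec["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_record(chain, "alice", "created artwork")
add_record(chain, "gallery", "listed for sale")
add_record(chain, "bob", "purchased")
assert verify(chain)
```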
The Internet Society has been closely monitoring the ransomware cyberattacks that have occurred over the last couple of days. The malware, which has gone by multiple names, including WannaCry, WannaDecryptor, and WannaCrypt, exploits a flaw in Microsoft Windows that was reportedly first discovered by the National Security Agency (NSA). A group of hackers leaked the code for exploiting this vulnerability earlier this year, and a patch had been available as far back as March 2017. Since Friday, 200,000 computers in 150 countries have been compromised using this exploit. The numbers are expected to grow as people settle back into their work routines and regular use of computer systems this week.