Digital signature service DocuSign hacked and email addresses stolen

Digital signature service DocuSign said Monday that an unnamed third party had gained access to the email addresses of its users after hacking into its systems.

The hackers gained temporary access to a peripheral sub-system used for communicating service-related announcements to users through email, the company said. After what it described as a complete forensic analysis, it confirmed that only email addresses were accessed, and not other details such as names, physical addresses, passwords, Social Security numbers or credit card data.

“No content or any customer documents sent through DocuSign’s eSignature system was accessed; and DocuSign’s core eSignature service, envelopes and customer documents and data remain secure,” DocuSign said in a post.

Small Cell Forum seeks advice from large enterprises

The carrier-led Small Cell Forum has revealed the names of the founding members of an Enterprise Advisory Council designed to help address the need for more wireless coverage to meet rising demand.

The Forum — whose members include the likes of AT&T, Cisco and Huawei — had previously said it was forming such a council, but had not disclosed any member organizations. It also had said it was working with hotels and others in the hospitality industry to plot strategies for boosting cellular coverage to meet the needs of their customers (an industry, by the way, that hasn't always been very hospitable when it comes to wireless service, as seen in the Wi-Fi blocking schemes of some).

Enabling the Software-Defined Branch with NSX

Reimagining the edge

While the importance of the cloud is obvious to anyone, the increasing importance of the edge is often overlooked. As digitization and the Internet of Things lead to exponential growth in the number of devices, the amount of data generated by sensors in devices such as self-driving cars, mobile endpoints and people-tracking systems for retail is astronomical. Analyzing that data and turning it into immediate action is key to success in the era of digitization.

The cloud enables massive data storage and processing, but it does not always lend itself to real-time processing and immediate action. Latency and the sheer amount of data to be transmitted are much less of a factor at the edge than in the data center. In order to make instant decisions, some of the data processing needs to happen at the edge. At the same time, a large number of employees no longer work from the corporate HQ, but have ever-increasing expectations for application access regardless of their physical location.

Distributed computing across the edge, along with high-performance cloud access and distributed security enforcement, gives organizations “the edge”. Centralizing management and operations with distributed control and …

The WannaCry ransomware might have a link to North Korea

As security researchers investigate last Friday’s massive attack by the WannaCry ransomware, they’ve noticed clues that may link it with a North Korean hacking group that has been blamed for attacking banks across the world.

The evidence is far from a smoking gun and may prove inconclusive. But security researchers have noticed a similarity between an earlier version of WannaCry and a hacking tool used by the Lazarus Group.

When Will AI Replace Traditional Supercomputing Simulations?

The science fiction of a generation ago predicted a future in which humans were replaced by the reasoning might of a supercomputer. But in an unexpected twist of events, it appears it is the supercomputer’s main output—scientific simulations—that could be replaced by an even higher order of intelligence.

While we will always need supercomputing hardware, the vast field of scientific computing, or high performance computing, could also be in the crosshairs for disruptive change, altering the future prospects for scientific code developers, but opening new doors in more energy-efficient, finer-grained scientific discovery. With code that can write itself based

When Will AI Replace Traditional Supercomputing Simulations? was written by Nicole Hemsoth at The Next Platform.

I Will Be Presenting For the First Time at CLUS 2017!

Well, it looks like another major item will get struck from my bucket list this year. I've been accepted to present at Cisco Live in Las Vegas this summer! This session is designed to walk through an enterprise network and look at how EIGRP can be engineered with purpose to best suit the needs of the different areas of the network. I will focus heavily on stability and scaling EIGRP, and will show the audience how, where, and when to leverage common EIGRP features such as summarization, fast timers, BFD, and wide metrics.
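To give a flavor of the kind of tuning the session covers, here is a sketch of a branch-facing EIGRP configuration combining summarization and BFD in named mode (where wide metrics are on by default). Interface names, addresses and the process name are hypothetical, and syntax should be verified against your IOS release:

```
! Named-mode EIGRP process; named mode uses 64-bit wide metrics by default
router eigrp CAMPUS
 address-family ipv4 unicast autonomous-system 100
  ! Toward the core: advertise one summary instead of every branch prefix
  af-interface GigabitEthernet0/0
   summary-address 10.20.0.0 255.255.0.0
   ! BFD gives sub-second failure detection without aggressive hello timers
   bfd
  exit-af-interface
 exit-address-family
```

The design intent is the one the session describes: summarization bounds query scope for stability, while BFD handles fast convergence so the hello/hold timers can stay conservative.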

Paying the WannaCry ransom will probably get you nothing. Here’s why.

Last Friday’s massive WannaCry ransomware attack means victims around the world are facing a tough question: Should they pay the ransom?

Those who do shouldn't expect a quick response -- or any response at all. Even after payment, the ransomware doesn’t automatically release your computer and decrypt your files, according to security researchers. Instead, victims have to wait and hope WannaCry’s developers will remotely free the hostage computer over the internet. It's a process that’s entirely manual and contains a serious flaw: The hackers have no way to prove who has paid the ransom.

"The odds of getting back their files decrypted is very small," said Vikram Thakur, technical director at security firm Symantec. "It's better for [the victims] to save their money and rebuild the affected computers."
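The attribution flaw is easy to illustrate. WannaCry pointed all victims at a handful of hard-coded Bitcoin addresses, so the public ledger records payments to an address but nothing tying a payment to a particular infected machine. A minimal sketch of the problem, using made-up placeholder data rather than real transactions:

```python
# Sketch: why payments to shared ransom addresses can't be attributed.
# All victims pay the same amount to one of a few shared addresses, so a
# ledger entry is just (address, amount) -- it carries no victim identity.

ledger = [
    {"to": "addr_A", "amount_btc": 0.17},  # hypothetical stand-in addresses
    {"to": "addr_A", "amount_btc": 0.17},
    {"to": "addr_C", "amount_btc": 0.17},
]

def payments_from(ledger, victim_id):
    """Try to find the payments made by one specific victim."""
    # No ledger field ever references victim_id, so this is always empty:
    # the attackers cannot tell which infected machine a payment came from.
    return [tx for tx in ledger if tx.get("victim") == victim_id]

print(payments_from(ledger, "victim-42"))  # -> []
```

That is why decryption could only ever be a manual, trust-based step on the attackers' side: the payment channel itself proves nothing about who paid.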

Why dynamic mapping is changing network troubleshooting for the better

Effective network troubleshooting requires experience and a detailed understanding of a network’s design. And while many great network engineers possess both qualities, they still face the daunting challenge of manual data collection and analysis.

The storage and backup industries have long been automated, yet, for the most part, automation has eluded the network, forcing engineering teams to troubleshoot and map networks manually. Estimates from a NetBrain poll indicate that network engineers spend 80% of their troubleshooting time collecting data and only 20% analyzing it. With the cost of downtime only rising, an opportunity to significantly reduce the time spent collecting data is critical.
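Taking NetBrain's 80/20 split at face value, the leverage of automating collection is simple arithmetic: if analysis time stays fixed, every hour of diagnosis implies four hours of manual data gathering, so cutting collection shrinks the whole incident almost proportionally. A back-of-the-envelope sketch (illustrative numbers only):

```python
# Back-of-the-envelope: effect of automating data collection on total
# troubleshooting time, using the cited 80/20 collection/analysis split.

def total_time(analysis_hours, collection_share=0.80, automated_fraction=0.0):
    """Total incident hours after automating some fraction of collection."""
    # An 80/20 split means collection takes 4x the analysis time.
    collection_hours = analysis_hours * collection_share / (1 - collection_share)
    return analysis_hours + collection_hours * (1 - automated_fraction)

# One hour of actual analysis implies four hours of manual collection...
print(total_time(1.0))                           # -> 5.0 hours
# ...so automating 75% of collection cuts the incident from 5 to 2 hours.
print(total_time(1.0, automated_fraction=0.75))  # -> 2.0 hours
```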

The Year Ahead for GPU Accelerated Supercomputing

GPU computing has deep roots in supercomputing, but Nvidia is using that springboard to dive head first into the future of deep learning.

This changes the outward-facing focus of the company’s Tesla business from high-end supers to machine learning systems with the expectation that those two formerly distinct areas will find new ways to merge together given the similarity in machine, scalability, and performance requirements. This is not to say that Nvidia is failing the HPC set, but there is a shift in attention from what GPUs can do for Top 500 class machines to what graphics processors can do

The Year Ahead for GPU Accelerated Supercomputing was written by Nicole Hemsoth at The Next Platform.

How to use blockchain: Following an asset through its lifecycle to learn more

This contributed piece has been edited and approved by Network World editors

Possession is nine-tenths of the law, right? But thanks to blockchain, this old adage may no longer be a viable way to settle property disputes.

Artists and enterprises alike have long struggled to prove ownership of their work after it has been disseminated, especially when it is uploaded online. What if there were a way to use technology to track asset provenance with certainty, from creation to marketplace and beyond? The reality is that this is already possible with the help of blockchain, and the benefits to the enterprise are many.
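The underlying mechanism can be sketched with a toy hash chain built from Python's standard library (this illustrates the principle, not any particular blockchain platform, and the event fields are hypothetical): each custody event embeds the hash of the previous record, so altering any earlier entry breaks every later link.

```python
import hashlib
import json

def add_event(chain, event):
    """Append a provenance event, linked to the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"event": event, "prev_hash": prev_hash}
    payload = json.dumps(record, sort_keys=True).encode()  # canonical form
    record["hash"] = hashlib.sha256(payload).hexdigest()
    chain.append(record)
    return chain

def verify(chain):
    """Recompute every link; any edited record invalidates the chain."""
    for i, record in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {"event": record["event"], "prev_hash": record["prev_hash"]}
        payload = json.dumps(body, sort_keys=True).encode()
        if record["prev_hash"] != expected_prev:
            return False
        if record["hash"] != hashlib.sha256(payload).hexdigest():
            return False
    return True

chain = []
add_event(chain, {"action": "created", "owner": "artist"})
add_event(chain, {"action": "sold", "owner": "gallery"})
print(verify(chain))                     # -> True
chain[0]["event"]["owner"] = "impostor"  # tamper with history...
print(verify(chain))                     # -> False: the chain detects it
```

A real blockchain adds distributed consensus and signatures on top of this linking, which is what turns a tamper-evident log into tamper-resistant shared provenance.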
