An optical chip the size of a fingernail has enabled Australian researchers to set new optical data-rate records on the country’s National Broadband Network (NBN). The raw data rate of 44.2Tbps over conventional optical cable is about three times the data rate of the entire NBN and about 100 times the speed of any single device currently used on the network, the researchers say.
While the technology is meant for metro-area networks and data centers, those bit rates would support downloads of 1,000 movies in less than a second.
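The 1,000-movies claim checks out with simple arithmetic, assuming a typical movie file of about 5GB (our assumption; the article doesn't give a file size):

```python
# Back-of-envelope check of the "1,000 movies in under a second" claim.
link_bps = 44.2e12          # 44.2 Tbps raw data rate
movie_bytes = 5e9           # ~5 GB per movie (assumed, not from the article)

total_bits = 1000 * movie_bytes * 8
seconds = total_bits / link_bps
print(f"{seconds:.2f} s")   # roughly 0.9 s for 1,000 movies
```

At ~0.9 seconds the claim holds even with a fairly generous file size per movie.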
Market research firm IDC predicts that by 2023, 25% of Fortune 500 companies will gain a competitive advantage from quantum computing. It’s a bold prediction given the current dearth of real-world examples of quantum computing in action. However, there’s plenty of industry activity to back up IDC’s forecast. In fact, early this year at the Consumer Electronics Show, the biggest buzz wasn’t the newest smartphone, wearable device or autonomous-driving technology, but rather unprecedented computing power based on an area of quantum physics Albert Einstein described as "spooky action at a distance."
Fujitsu has delivered all the components needed for a supercomputer in Japan that is expected to break the exaFLOP barrier when it comes online next year, and that delivery means the same class of hardware will soon be available for enterprise customers. The supercomputer, called Fugaku, is being assembled and brought online at the RIKEN Center for Computational Science. Installation of the 400-plus-rack machine started in December 2019, and full operation is scheduled for fiscal 2021, according to a Fujitsu spokesman.
All told, Fugaku will have a total of 158,976 processors, each with 48 cores running at 2.2GHz. Even partially deployed, the supercomputer already delivers half an exaFLOP of 64-bit double-precision floating-point performance and looks to be the first machine to reach a full exaFLOP. Intel says Aurora, the supercomputer it is building for the Department of Energy’s Argonne National Laboratory in Chicago, will be delivered by 2021, and it will break the exaFLOP barrier, too.
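Those core counts line up with the performance figures. A rough peak estimate for the full build-out, assuming each A64FX core sustains 32 double-precision FLOPs per cycle via two 512-bit SVE fused multiply-add pipelines (an assumption about the chip's microarchitecture, not a figure from the article):

```python
# Back-of-envelope peak-FLOPS estimate for the fully deployed Fugaku.
processors = 158_976
cores_per_processor = 48
clock_hz = 2.2e9
flops_per_cycle = 32   # assumed: 2 SVE pipes x 8 DP lanes x 2 ops (FMA)

peak_flops = processors * cores_per_processor * clock_hz * flops_per_cycle
print(f"{peak_flops / 1e18:.2f} EFLOPS peak")   # about 0.54 EFLOPS
```

A theoretical peak around 0.54 double-precision exaFLOPS is consistent with the half-exaFLOP figure reported for the partially deployed machine.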
Last year, Lenovo Data Center Group (DCG) announced single-socket ThinkSystem servers using AMD’s Rome generation, which offers up to 64 cores per processor. Dual-socket systems are de rigueur in enterprise servers, but that’s because those processors have only 20-odd cores. AMD’s pitch, which Lenovo and its competitors embraced, was that it could offer more compute in one 64-core processor than two 22-core processors, and for less money. This year Lenovo DCG is following up that launch with the 1U ThinkSystem SR645 and 2U ThinkSystem SR665 two-socket servers, featuring enhanced performance and I/O connectivity for higher-performance workloads. With 128 cores/256 threads in a 1U or 2U design, a whole lot of computation power can be squeezed into a small space.
Cisco has upgraded its core networking software with better support for enterprise multicloud integration and management, as well as tools to help telcos and hyperscalers tie together large-scale data-center networks. The new features are part of the 5.0 release of Cisco's Application Centric Infrastructure (ACI) software, which runs on the company's core data-center Nexus 9000 systems.
Schneider Electric has partnered with systems-management company Aveva on a package for managing multiple and hyperscale data centers through a single view, expanding visibility into day-to-day operations. The package combines Aveva Unified Operations Center with Schneider Electric’s EcoStruxure control and monitoring systems to handle predictive maintenance, staff training and the financial aspects of large data centers. The combined software provides a unified view of engineering, operations and performance, and improves workforce productivity by standardizing and de-siloing systems and processes across multiple sites for real-time decision making.
Aiming to help customers handle growing on-premises workloads, VMware and Dell EMC have bolstered their co-developed cloud software. First introduced in 2018, VMware Cloud on Dell EMC is intended to help enterprise customers move on-premises infrastructure and workloads to the cloud. Version 2.0 of VMware Cloud on Dell EMC brings improved support for high-density and high-performance data-center applications.
VMware Cloud on Dell EMC consists of VMware’s high-performance compute, storage and networking software, powered by VMware vSphere, vSAN and NSX, tightly integrated with Dell EMC VxRail hyperconverged infrastructure (HCI) hardware, and delivered as a service – all managed by VMware.
Nvidia, whose heritage lies in making chips for gamers, has announced its first new GPU architecture in three years, one clearly designed to efficiently support the computing needs of artificial intelligence and machine learning. The architecture, called Ampere, and its first iteration, the A100 processor, surpass the performance of Nvidia’s current Volta architecture, whose V100 chip appeared in 94 of the top 500 supercomputers last November. The A100 has an incredible 54 billion transistors, 2.5 times as many as the V100.
Tensor performance, so vital in AI and machine learning, has been significantly improved. FP16 floating-point calculations are almost 2.5 times as fast as on the V100, and Nvidia has introduced a new math mode called TF32, which it claims can provide up to 10-fold speedups compared with single-precision floating-point math on Volta GPUs.
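The trick behind TF32 is that it keeps FP32's 8-bit exponent, so the dynamic range is unchanged, but shortens the mantissa to FP16's 10 bits, which is what lets the tensor cores run it so much faster. A minimal software emulation of that rounding step (the bit manipulation below illustrates the number format; it is not Nvidia's implementation):

```python
import struct

def to_tf32(x: float) -> float:
    """Round a float to TF32 precision: 8-bit exponent (as in FP32)
    but only 10 mantissa bits (as in FP16)."""
    # Reinterpret the float's IEEE-754 bit pattern as a 32-bit integer.
    bits = struct.unpack('>I', struct.pack('>f', x))[0]
    # Round-to-nearest (half-up) on the 13 mantissa bits being discarded,
    # then zero them out, keeping sign + exponent + top 10 mantissa bits.
    bits += 1 << 12
    bits &= ~((1 << 13) - 1)
    return struct.unpack('>f', struct.pack('>I', bits))[0]

print(to_tf32(1.0 + 2**-20))  # rounds away: below TF32 precision
print(to_tf32(1.0 + 2**-10))  # preserved: fits in 10 mantissa bits
```

Values that differ only below the 10th mantissa bit collapse to the same TF32 number, which is the precision/speed trade-off the mode makes.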
Juniper intends to spread the gospel of artificial intelligence across enterprise networking in the coming months with new products and services. The expected moves are a continuation of a strategy that has been integral to Juniper since the company bought wireless and artificial-intelligence software maker Mist in 2019 for $405 million.
Mist’s cloud-based Wi-Fi Assurance system includes an AI-based engine called Marvis that features dynamic packet capture and machine learning to automatically identify, adapt to and fix network issues.
A company that specializes in creating second lives for IT hardware is expanding its initiative to reengineer and sell decommissioned data-center equipment from the major hyperscale operators, which aggressively replace relatively new hardware. ITRenew announced the plan at the recent Open Compute Project (OCP) conference, promising to sell full servers previously owned by the big operators – reengineered, warrantied and configured for turnkey uses like web serving and Kubernetes. ITRenew launched its first server racks two years ago and is now making the initiative available to all industries, so more potential customers can buy OCP-certified hardware.
Many edge-computing deployments are driven by very specific needs, but since new needs may arise down the road with a different set of edge requirements, IT leaders should adopt edge-computing architectures with flexibility and adaptability in mind. The fact that all edge-computing systems have certain things in common – complex combinations of hardware, applications, infrastructure software and networking – doesn’t mean they should all have the same design.
Every new project requires highly specialized software and integrated custom networking to deliver on project goals across applications as diverse as industrial control, autonomous transportation, health services, public safety and energy management. Each use case has its own requirements in terms of performance, response times, quantity of data to be gathered and processed, and cost.
Gartner this week said that IT spending across the globe is projected to total $3.4 trillion in 2020, a decline of 8% from 2019 due to the impact of the COVID-19 pandemic. In January, Gartner had forecast worldwide IT spending to total $3.9 trillion in 2020, an increase of 3.4% from 2019.
Gartner’s new forecast says all market segments, from enterprise software to communications services, will experience a decline in 2020, with devices and data-center systems seeing the largest drops in spending.
A biotech company that develops sensors to detect explosives and other chemicals on planes and in airports is teaming up with Airbus to create a sensor that could detect passengers who are positive for COVID-19. California-based Koniku and Airbus, which have been working since 2017 on contactless equipment that sniffs out chemicals, are trying to adapt that technology to sniff out pathogens, says Osh Agabi, founder and CEO of Koniku, in a blog post.
The invention of synthetic full backups is one of the most important advancements in backup technology of the last few decades, right up there with disk-based backups, deduplication, continuous data protection (CDP) and the cloud. Here’s how they came to be and what benefits they offer.
Traditional backup options
There are essentially two very broad categories of what the backup industry calls backup levels: you are either backing up everything (a full backup) or only what has changed (an incremental backup). There are different types of incremental backups, but that’s not relevant to this discussion. A typical setup runs incremental backups every night and full backups every week – or even less often than that.
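A synthetic full backup inverts that trade-off: instead of reading every file from the client again, the backup server merges the most recent full backup with the incrementals taken since it, producing a new full without moving data across the network. A minimal sketch of that merge, modeling each backup as a path-to-contents mapping (the model and file names are illustrative, not any product's on-disk format):

```python
def synthesize_full(last_full: dict, incrementals: list) -> dict:
    """Merge the most recent full backup with subsequent incrementals.

    Each backup maps file path -> contents; an incremental records only
    files changed since the previous backup, with None marking a deletion.
    Real products merge at the block or chunk level, but the principle
    is the same.
    """
    synthetic = dict(last_full)
    for inc in incrementals:                # apply oldest -> newest
        for path, contents in inc.items():
            if contents is None:
                synthetic.pop(path, None)   # file was deleted
            else:
                synthetic[path] = contents  # file added or changed
    return synthetic

full = {"/etc/hosts": "v1", "/var/app.db": "v1"}
incs = [{"/var/app.db": "v2"},
        {"/etc/hosts": None, "/home/report": "v1"}]
print(synthesize_full(full, incs))
# {'/var/app.db': 'v2', '/home/report': 'v1'}
```

The result is equivalent to a fresh full backup, but it was assembled entirely from data the backup server already holds.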
Backblaze, the cloud-backup vendor legendary for its quarterly hard-drive-failure reports, has decided to kick Amazon, Microsoft and Google in the shins with a much cheaper and more customer-friendly storage offering. Like other cloud-backup services, Backblaze uses a small app to back up and restore data on a PC. In 2015, in response to repeated requests for direct access to its storage, the company introduced an API and service under the name Backblaze B2 Cloud Storage, which now claims more than 100,000 customers.
The company has released beta versions of S3-compatible APIs that allow customers to redirect data workflows from S3 to Backblaze’s B2 Cloud Storage. The company says that through its services, customers will have infinitely scalable, durable offsite storage at a quarter of the price of S3, Azure and Google Cloud Storage.
While many functions have migrated to the cloud, data storage remains firmly on premises due to the cost of cloud storage, regulations or simply the desire to retain control over a firm’s data. That’s reflected in two new announcements. Dell EMC launched PowerStore, a storage-array line that unifies the overlapping midrange products Dell and EMC each brought to their merger. PowerStore hardware and software have been redesigned from the ground up, and the line comes with new consumption-based business models, a reflection of the growing popularity of pay-per-use hardware.
Much as HPE unified its multiple storage acquisitions under the Primera brand, PowerStore unifies storage-hardware products acquired over the years, including Dell’s EqualLogic and EMC’s Compellent and XtremIO – not to mention Dell’s acquisition of EMC itself. PowerStore also comes with migration tools to help move the contents of older Dell EMC hardware to the new line.
The future of remote work has arrived. With the work-at-home mandates triggered by COVID-19 quarantines, businesses have adapted on the fly to create remote-networking environments that maintain corporate security. Largely, they have done so by expanding traditional remote-access solutions, including VPN infrastructure and services, virtual desktop infrastructure, secure Wi-Fi access points and even SD-WAN for home use.
Deeply assimilating its Red Hat technology, IBM this week rolled out a set of new platforms and services designed to help customers manage edge-based application workloads and exploit artificial intelligence for infrastructure resiliency. The announcements came at IBM’s virtual Think 2020 event, which also featured the first Big Blue keynote by the company's new CEO, Arvind Krishna. Addressing the challenges of COVID-19, he told the online audience that "History will look back on this as the moment when the digital transformation of business and society suddenly accelerated," and that hybrid cloud and AI are the two dominant forces driving that transformation.
The cloud will not kill the data center, but it will transform it. That's one of the takeaways from the 2020 State of the Data Center report from AFCOM, the industry association for data-center professionals. In the near term, construction will slow way down, which aligns with what IDC analyst Rick Villars told me about data-center construction slowing after a big buildout. More than 60% of respondents to the AFCOM report said they have no plans to build a new facility in the next 12 months, although 53% said they'll have at least one data center in the works over the next 36 months.
NVIDIA’s plan to acquire Cumulus Networks, a pioneer in using open source for networking, is a sign that open networking is finally ready for a big leap forward. Open networking has been tightly coupled with software-defined networking (SDN) because the combination promises to make networks significantly more agile, open and easier to customize to specific needs. Cumulus has been working on it for years, and NVIDIA started pushing into the space when it acquired Mellanox last week.
The question the Cumulus acquisition raises is “why now?” The concept of open networking has been hotly debated since SDN came into prominence. The concept is sound, and open systems should disrupt the network industry much as they did the compute space. Yet while Linux and open source are wildly successful in the compute industry, open source has yet to take off in networking outside of webscale networks and a handful of large organizations.