The growth in data use and consumption means the needs of IT managers are changing, and a survey from Omdia (formerly IHS Markit) found data-center operators are looking for intelligence of all sorts, not just the artificial kind. Omdia analysts recently surveyed IT leaders from 140 North American enterprises with at least 101 employees working in North American offices and data centers and asked them what features they wanted most in their networking technology. Respondents said they expect to more than double their average number of data-center sites between 2019 and 2021, and the average number of servers deployed in data centers is expected to double over the same timeline.
A British chip startup has launched what it claims is the world's most complex AI chip, the Colossus MK2 or GC200 IPU (intelligence processing unit). Graphcore is positioning its MK2 against Nvidia's Ampere A100 GPU for AI applications. The MK2 and its predecessor MK1 are designed specifically to handle very large machine-learning models. The MK2 processor has 1,472 independent processor cores and 8,832 separate parallel threads, all supported by 900MB of in-processor RAM.
SEE ALSO: Nvidia unleashes new generation of GPU hardware
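The core, thread and memory figures quoted above imply a couple of per-core ratios worth sanity-checking; this is simple arithmetic on the article's own numbers, nothing more:

```python
# Sanity-check the Graphcore MK2 figures quoted above.
cores = 1472     # independent processor cores
threads = 8832   # separate parallel threads
ram_mb = 900     # in-processor RAM, in MB

print(threads // cores)         # 6 hardware threads per core
print(ram_mb * 1024 // cores)   # ~626 KB of on-chip RAM per core
```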
Mainframe users looking to bring legacy applications into the public or private cloud world have a new option: LzLabs, a mainframe software migration vendor. Founded in 2011 and based in Switzerland, LzLabs this week said it's setting up shop in North America to help mainframe users move legacy applications – think COBOL – into the more modern and flexible cloud application environment. Read also: How to plan a software-defined data-center network
At the heart of LzLabs' service is its Software Defined Mainframe (SDM), an open-source, Eclipse-based system that's designed to let legacy applications, particularly those without typically available source code, such as COBOL, run in the cloud without recompilation.
While the previously hot SD-WAN market has slowed and IT budgets overall are under pressure, the COVID-19 pandemic has created demand for other network capabilities such as improved network-management and collaboration tools, according to IDC. The virus has caused a recessionary economy that has forced enterprises across the globe to rapidly and dramatically shift their operations, according to Rohit Mehra, vice president, Network Infrastructure at IDC. “The reality of that is we have seen two years of IT digital transformation in two months,” Mehra told the online audience of an IDC webinar about the impact of the pandemic on enterprise networking.
About a year and a half ago, some Texas employees of the Federal National Mortgage Association (Fannie Mae) were leaving work early to work at home over the enterprise VPN because it gave them better application performance and less congestion than the office network. That’s also when the agency started moving toward a cloud-first environment and away from its legacy hub-and-spoke WAN. More about SD-WAN: How to buy SD-WAN technology: Key questions to consider when selecting a supplier • How to pick an off-site data-backup method • SD-Branch: What it is and why you’ll need it • What are the options for securing SD-WAN?
Between the pandemic and the subsequent economic upheaval, these are challenging times for everyone. But the networking industry has some elements in its favor. Technologies such as Wi-Fi, VPNs, SD-WAN, videoconferencing and collaboration are playing an essential role in maintaining business operations and will play an even greater role in the reopening and recovery phase.
The economic devastation of the global COVID-19 pandemic has many businesses fighting for survival, but dealing with chaos and uncertainty comes with the territory for a certain category of business: startups. They thrive on disruption (or at least that’s the message they pitch to investors), but is the lean, move-fast-and-break-things model one that can survive global disruptions? Unlike retail, travel, and tourism, which have been hammered by the downturn, data-center and networking businesses have fared better, with some, such as teleconferencing, seeing spikes in demand.
Ampere – the semiconductor startup founded by former Intel executive Renee James, and not to be confused with the new line of Nvidia cards – just introduced a 128-core, Arm-based server processor to complement its 80-core part. The new processor, Altra Max, comes just three months after the company launched its first product, the 80-core Altra. Ampere says the new processor will be fully socket compatible with the existing part, so customers can simply do a chip swap if they want.
"This is not replacing Altra," says Jeff Wittich, senior vice president of products at Ampere and one of many ex-Intel executives at Ampere. "I expect there will be workloads and customers who will use Altra and Altra Max together for a long time. Anything suited for Max will be suited for Altra for a long time."
The newly crowned fastest supercomputer in the world, according to the current TOP500 list, buried the reigning champion by turning in speeds that were 2.8 times faster.
Intel formally unveiled the third generation of its Xeon Scalable processor family, developed under the codename "Cooper Lake." This generation is aimed at the high end of the performance line for functions such as high-performance computing (HPC) and artificial intelligence (AI). The Cooper Lake line is targeted at four- and eight-socket servers. Xeons based on the Ice Lake architecture are due later this year and will target one- and two-socket servers. The latest announcement includes 11 new SKUs with between 16 and 28 cores, running at up to a 3.1GHz base clock (and up to 4.3GHz with Turbo Boost), plus support for up to six memory channels.
READ MORE: Data center sales dip amid COVID-19 fallout, but public cloud grows
Most things with a measurable temperature – human beings going about their daily routines, inert objects – generate terahertz waves, radiation that is sandwiched between infrared and microwave on the electromagnetic spectrum. So far, these waves haven’t proved very useful, but now scientists at MIT are trying to harness them with devices that use them to generate electricity that could charge the batteries of cellphones, laptops, even medical implants.
If successful, the charging devices would passively gather the waves and generate DC current at room temperature, something that hasn’t been accomplished before. Previous devices that can turn terahertz waves – T-rays – into electricity only work in ultracold environments, according to an MIT News article about the project.
Cisco has added features to its flagship network control platform, DNA Center, that introduce new analytics and problem-solving capabilities for enterprise network customers. DNA Center is the heart of Cisco’s Intent Based Networking initiative and is the core networking control platform that features myriad services, from analytics, network management and automation capabilities to assurance setting, fabric provisioning and policy-based segmentation for enterprise networks. The company extended DNA Center’s AI Endpoint Analytics application by adding the ability to analyze data gathered from Cisco packages such as its Identity Services Engine, Software Defined Application Visibility and Control, wireless LAN controllers or third-party components.
This should come as no surprise, but spending on data-center hardware and software dipped in Q1 and cloud sales grew, but neither as much as you would think. Q1 figures from Synergy Research Group show that spending on enterprise software and hardware shrank globally by a modest 2% year on year to $35.8 billion, with the biggest non-cloud players, such as Microsoft, Dell, HPE, Cisco and VMware, down 4%.
When it comes to public cloud infrastructure, sales rose 3% year on year to $9.66 billion. The top vendors were the ODMs favored by hyperscalers, followed by Dell, Microsoft, Inspur and Cisco.
Something as simple as how you tell your backup product which files and databases to back up can have a massive impact on your recoverability. Proper backup selection is essentially a balance between ensuring that everything that should be backed up is indeed backed up, while also trying not to back up worthless data.
Physical server inclusion
Virtually all backup products require some initial installation and configuration at the level of a physical server. This means that for any of the tactics mentioned in this article to work, one must first install the appropriate software and authorization on each physical server in the data center. This means every VMware or Hyper-V server (not to be confused with each VM on those servers), every physical UNIX or Windows server, and any cloud services that are being backed up. Someone must make that initial connection and authentication before the backup system can perform its magic.
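The include/exclude balance described above can be sketched as a simple selection rule. The roots and patterns below are hypothetical illustrations, not tied to any particular backup product:

```python
import fnmatch

# Hypothetical sketch of backup selection: explicit include roots paired
# with exclude patterns, so everything that should be backed up is
# selected while worthless data (temp files, caches) is skipped.
INCLUDES = ["/var/db", "/home", "/etc"]          # assumed example roots
EXCLUDES = ["*.tmp", "*/cache/*", "*.swap"]      # assumed example patterns

def selected_for_backup(path: str) -> bool:
    """True if the path falls under an include root and matches no
    exclude pattern."""
    under_include = any(path == root or path.startswith(root + "/")
                        for root in INCLUDES)
    excluded = any(fnmatch.fnmatch(path, pat) for pat in EXCLUDES)
    return under_include and not excluded

print(selected_for_backup("/home/alice/report.doc"))   # True
print(selected_for_backup("/home/alice/cache/x.bin"))  # False (excluded)
print(selected_for_backup("/tmp/scratch.tmp"))         # False (not included)
```

The point of the sketch is the order of evaluation: inclusion is decided first, then exclusions trim the result, which is the balance the paragraph above describes.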
Data centers and water seem to go together, despite the fact that water is bad for electronics. Many hyperscale data centers are built near rivers for use as hydroelectric power sources, liquid cooling is growing in popularity, and in one extreme case, Microsoft sank a mini data center off the coast of northern England. The next step, it seems, is the floating data center, one on the water and easily accessible but also mobile. A startup called Nautilus Data Technologies has lined up $100 million in funding to build a six-megawatt floating colocation facility that it says will be cheaper and more efficient than traditional facilities.
Airbus expects quantum computing to have major production, performance and efficiency benefits as the technology plays a role in its cybersecurity, aerospace and communications businesses. “We are users of quantum computing and intend to use it to deliver more powerful services and systems,” said Paolo Bianco, global research & technology cooperation manager for Airbus, to an online audience at the Inside Quantum Technology virtual event this week.
Cisco is telling customers of its Nexus core data-center switches to fix or work around a vulnerability that could leave the boxes open to a denial-of-service attack. The vulnerability, found in the Nexus NX-OS software, gets an 8.6 score out of 10 on the Common Vulnerability Scoring System, making it a “High” risk problem. Cisco said the vulnerability is due to an affected device unexpectedly decapsulating and processing IP-in-IP packets that are destined to a locally configured IP address. IP in IP is a tunneling protocol that wraps an IP packet within another IP packet.
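To make the tunneling format concrete, here is a minimal sketch (using assumed example addresses, and not an exploit) of how IP-in-IP nests one complete IPv4 header inside another; the outer header carrying protocol number 4 is what tells a receiver that the payload is itself an IPv4 packet to decapsulate:

```python
import struct

def ipv4_header(src: str, dst: str, proto: int, payload_len: int) -> bytes:
    """Build a bare 20-byte IPv4 header (checksum left 0 for this sketch)."""
    ver_ihl = (4 << 4) | 5                     # IPv4, 5 x 32-bit words
    total_len = 20 + payload_len
    to_bytes = lambda ip: bytes(int(o) for o in ip.split("."))
    return struct.pack("!BBHHHBBH4s4s",
                       ver_ihl, 0, total_len, 0, 0,
                       64, proto, 0,           # TTL, protocol, checksum
                       to_bytes(src), to_bytes(dst))

# Inner packet: an ordinary IPv4 header (protocol 6 = TCP, empty payload).
inner = ipv4_header("10.0.0.1", "10.0.0.2", 6, 0)
# Outer header: protocol 4 (IPIP) marks the payload as a nested IPv4 packet.
outer = ipv4_header("192.0.2.1", "198.51.100.7", 4, len(inner))

packet = outer + inner
print(packet[9])    # 4 -> outer protocol field says "IPv4 inside"
print(len(packet))  # 40 bytes: two stacked 20-byte headers
```

The advisory's issue is on the receiving side: a device that decapsulates such packets when it shouldn't can be made to process attacker-supplied inner packets addressed to it.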