Supercomputer maker Cray has announced what it calls its last supercomputer architecture before the era of exascale computing. It is code-named “Shasta,” and the Department of Energy, already a regular supercomputing customer, said it will be the first to deploy it, in 2020.

The Shasta architecture is unique in that it will be the first server (unless someone beats Cray to it) to support multiple processor types. Users will be able to deploy a mix of x86, GPU, ARM and FPGA processors in a single system.

Up to now, servers came with either x86 or, in a few select cases, ARM processors, with GPUs and FPGAs as add-in cards plugged into PCI Express slots. This will be the first case of all of them as fully native onboard processors, and I hardly expect Cray to remain alone in using this design.
Researchers reckon they could speed up the internet a hundredfold with a new technique that twists light beams within fiber optic cable rather than sending them along a straight path.

“What we’ve managed to do is accurately transmit data via light at its highest capacity in a way that will allow us to massively increase our bandwidth,” Dr. Haoran Ren of Australia’s RMIT University said in a press release.
The corkscrewing configuration, in development over the last few years and only recently miniaturized, uses a technique called orbital angular momentum (OAM).
For the past three years, the software-defined WAN, or SD-WAN, has been one of the most talked-about technology trends. By some estimates, more than 60 vendors are marketing products around the concept of SD-WAN, each vying to carve out a piece of this emerging multi-billion-dollar market. While all the discussion and hype around SD-WAN has helped shine a spotlight on the business value enterprises can realize by changing the way they build their wide area networks, it has also led to confusion and, in some cases, delays in adoption.

That is why today’s release of the first-ever 2018 Gartner Magic Quadrant for WAN Edge Infrastructure is such an important milestone for us. Gartner has long been one of the most influential industry analyst firms in the world, providing highly credible perspectives on technology vendors and markets. By listening to thousands of enterprise customers, reviewing each vendor’s solution in detail, and analyzing their Ability to Execute and Completeness of Vision, Gartner has published a comprehensive report giving its view of the Leaders, Challengers, Niche Players and Visionaries in this rapidly changing market.
Harrison Lewis wasn’t looking for SD-WAN, but he’s glad he found it.

Northgate Gonzalez, which operates 40 specialty grocery stores throughout Southern California, had distributed its compute power for years. Each store individually supported applications with servers and other key infrastructure and relied on batch processing to handle nightly backups and storage, according to Lewis, the privately held company’s CIO.
Over time, the company’s needs changed, and it began centralizing more services, including HR and buying systems, as well as Microsoft Office, in the cloud or at the company’s two data centers. With this shift came a heavier burden on the single T-1 lines running MPLS into each store and the 3G wireless backup. Complicating matters, Lewis says, rainy weather in the region would flood the wiring, taking down terrestrial-network connectivity.
A firewall is a network device that monitors packets going in and out of networks and blocks or allows them according to rules that define what traffic is permissible and what traffic isn’t.

Several types of firewalls have developed over the years, becoming progressively more complex and taking more parameters into consideration when determining whether traffic should be allowed to pass. The most modern are commonly known as next-generation firewalls (NGFW) and incorporate many technologies beyond packet filtering.
Initially placed at the boundaries between trusted and untrusted networks, firewalls are now also deployed to protect internal segments of networks, such as data centers, from other segments of organizations’ networks.
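The ordered-rule matching described above can be sketched in a few lines. This is an illustrative sketch only (the rule fields and helper names are invented for the example, not taken from any product): rules are checked in order, the first match decides, and anything unmatched is denied by default.

```python
# Minimal sketch of rule-based packet filtering, the core idea behind a
# traditional firewall. Assumed/simplified: rules match on a source-prefix
# string and an optional destination port; real firewalls match on far more.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rule:
    action: str                 # "allow" or "block"
    src: str                    # source network prefix, e.g. "10.0."
    dst_port: Optional[int]     # destination port, or None for any port

def evaluate(rules, src_ip, dst_port):
    """Return the action of the first matching rule, else 'block'."""
    for r in rules:
        if src_ip.startswith(r.src) and (r.dst_port is None or r.dst_port == dst_port):
            return r.action
    return "block"  # default-deny: traffic not explicitly allowed is dropped

rules = [
    Rule("allow", "10.0.", 443),   # internal clients may reach HTTPS
    Rule("block", "10.0.", None),  # everything else from inside is denied
]

print(evaluate(rules, "10.0.1.5", 443))  # -> allow
print(evaluate(rules, "10.0.1.5", 23))   # -> block
```

A next-generation firewall layers much more on top of this loop (application awareness, identity, intrusion prevention), but first-match-wins rule evaluation remains the foundation.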
When building networks in the ‘real world’ of city centers, stadiums, apartment buildings, and even office buildings, we frequently come across situations where many access points, installed independently or managed as one network, create overlapping coverage areas. When these access points choose the same channel, the performance of all users in such an area is reduced, because the Wi-Fi algorithm used to avoid collisions on the air is quite conservative.

One focus of the next Wi-Fi standard, 802.11ax, is to improve the performance of such real-world networks. To this end, the new standard includes a feature enabling more simultaneous transmissions, known as ‘spatial reuse’ or ‘BSS coloring.’
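The overlap problem above can be pictured as graph coloring: access points whose coverage areas overlap should carry different "colors" so their transmissions can be told apart, while non-overlapping APs may reuse the same color. The sketch below is an analogy under stated assumptions (the overlap map and greedy strategy are invented for illustration; it is not the 802.11ax color-assignment algorithm itself):

```python
# Greedy coloring of overlapping access points: neighbors never share a
# color, while APs out of range of each other may reuse one. This mirrors
# the intent of BSS coloring -- letting a station quickly distinguish its
# own BSS's traffic from an overlapping BSS's traffic.
def assign_colors(overlaps):
    """overlaps maps each AP name to the set of AP names it overlaps with.
    Returns a small-integer color per AP; overlapping APs get distinct colors."""
    colors = {}
    for ap in sorted(overlaps):              # deterministic visiting order
        taken = {colors[n] for n in overlaps[ap] if n in colors}
        color = 0
        while color in taken:                # lowest color not used by a neighbor
            color += 1
        colors[ap] = color
    return colors

# Three APs in a row: A overlaps B, B overlaps C, but A and C are far apart.
overlaps = {"A": {"B"}, "B": {"A", "C"}, "C": {"B"}}
print(assign_colors(overlaps))  # A and C can reuse color 0; B gets color 1
```

In the real standard the color is a small identifier carried in each frame's preamble, which is what lets a receiver decide early whether a detected transmission belongs to its own BSS and, if not, proceed with its own transmission.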
Last week, the tech press made a big deal out of a ruling by the Librarian of Congress and the U.S. Copyright Office to allow consumers to break vendors’ digital rights management (DRM) schemes in order to fix their own smartphones and digital voice assistants. According to The Washington Post, for example, the ruling — which goes into effect Oct. 28 — was a big win for consumer right-to-repair advocates.
IBM announced yesterday that it is buying Red Hat for $34 billion, making it IBM's largest deal to date and the third largest in the history of the US tech industry.

After announcing the plan to close the deal sometime in the second half of next year, executives from the two companies held a joint conference call fleshing out the details. Here's what they had to say.

According to Arvind Krishna, Senior Vice President of Hybrid Cloud at IBM, this move represents a "game changer" that will redefine the cloud market. Krishna was joined by Paul Cormier, Executive Vice President and President of Products and Technologies at Red Hat.
IBM has made a bold move to blast its way into the enterprise multicloud world by saying it would buy open source software pioneer Red Hat in a $34 billion stock acquisition.
The digital transformation era is upon us, and it’s changing the business landscape faster than ever. I’ve seen numerous studies showing that digital companies are more profitable and hold more share in their respective markets. Businesses that master being digital will be able to sustain market leadership, and those that can’t will struggle to survive; many will go away. This is why digital transformation is now a top initiative for every business and IT leader. A recent ZK Research study found that a whopping 89% of organizations now have at least one digital initiative under way, showing the level of interest across all industry verticals.

Digital success lies in the quality of data
The path to becoming a digital company requires more than a CIO snapping their fingers and declaring the organization digital. Success lies in being able to find the key insights in the massive amounts of data that businesses have today. This requires machine learning–driven analytics, and there has been a significant amount of media focus on that topic. The other half of the equation is data. Machine learning alone doesn’t do anything; it needs data to analyze, and as the old axiom goes, good data leads to good insights.
In a move that IBM says will make it the world’s leader in hybrid cloud, the company says it’s going to buy open-source giant Red Hat for $34 billion, banking on what it sees as Red Hat’s potential to become the operating system of choice for cloud providers.

IBM says it expects growth in the use of cloud services to blossom in the coming years, with enterprises poised to expand from using cloud for inexpensive compute power to placing more applications in the cloud.
“To accomplish this, businesses need an open, hybrid cloud approach to developing, running and deploying applications in a multi-cloud environment,” IBM says in a written statement.
Much of the hype around the Internet of Things is centered on a decentralized model of deployment: edge computing, where specialized devices sit close to the endpoints they’re managing or monitoring, is very much the flavor of the month.

Yet the cloud and the data center are still critical parts of the infrastructure, and the huge growth in IoT deployments is having an effect on them as well. Even deployments that lean heavily on edge compute can stream data back to a central hub for more detailed analysis. So it’s tough to argue that the rise of IoT hasn’t changed requirements and expectations in the data center.
Network security concerns remain an issue with the upcoming 5G and 6G wireless network standards.

That's because security measures aren't being adopted in the new 5G standards, and there's a newly discovered potential for man-in-the-middle attacks in terahertz-based 6G networks, multiple research studies have found.

One of those studies — a formal analysis of 5G authentication conducted by scientists from ETH Zurich, the University of Lorraine/INRIA, and the University of Dundee — found that criminals will be able to intercept 5G communications and steal data because “critical security gaps are present,” the group says in its press release. That’s in part because “security goals are underspecified” and there’s a “lack of precision” in the 3GPP standards, they say.
Arista has joined the parade toward high-speed Ethernet with new switches capable of supporting 400G speeds and aimed at hyperscale cloud and data-center networks.

The Arista 7060X4 Series platforms are based on Broadcom's 12.8 Tbps Tomahawk 3 silicon and feature 32 400G ports. Each 400G port can be split into four 100G ports, for a total of 128 100G ports in a 1U chassis. Arista also enhanced its EOS operating system to better support traffic management, load balancing, buffering and routing for the high-speed boxes.
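The port arithmetic in the announcement is easy to verify: 32 ports at 400G each matches the switch silicon's 12.8 Tbps of capacity, and splitting every 400G port four ways yields the 128 100G ports claimed. A quick check of the figures (the numbers come from the announcement; the code only confirms the math):

```python
# Verify the capacity and breakout figures for a 32 x 400G switch.
PORTS_400G = 32
GBPS_PER_PORT = 400
BREAKOUT_FACTOR = 4            # each 400G port splits into 4 x 100G

total_tbps = PORTS_400G * GBPS_PER_PORT / 1000
ports_100g = PORTS_400G * BREAKOUT_FACTOR

print(total_tbps)   # 12.8 -- matches the Tomahawk 3's switching capacity
print(ports_100g)   # 128  -- 100G ports available in full breakout mode
```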
Over time, Ethernet speed transitions have been the primary driver for improving both the throughput and the price-performance of data-center networks. 400G Ethernet is the next major transition on this journey, Andreas Bechtolsheim, Arista’s Chief Development Officer, wrote in a blog post about the announcement.