Data centers aren’t ready for AI, Schneider warns

Schneider Electric is warning that the power and cooling demands of AI are beyond what standard data center designs can handle, and it says new designs are necessary. That may be expected from a company like Schneider, which makes power and cooling systems used in data centers. But it doesn't mean Schneider isn't correct. AI is a different kind of workload than standard server-side applications, such as databases, and the old ways just don't cut it anymore.

Schneider's white paper notes that AI needs an ample supply of three things: power, cooling, and bandwidth. GPUs are the most popular AI processors and the most power-intensive. Whereas CPUs from Intel and AMD draw about 300 to 400 watts, Nvidia's newest GPUs draw 700 watts per processor, and they are often delivered in clusters of eight at a time.
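To put those wattage figures in perspective, here is a rough back-of-the-envelope comparison using only the numbers quoted above; the dual-socket assumption for the CPU server is ours, not Schneider's, so treat this as an illustrative sketch.

```python
# Back-of-the-envelope comparison based on the figures quoted above.
# The dual-socket CPU server is an illustrative assumption, not a Schneider figure.

CPU_WATTS = 400      # high end of the 300-400 W range quoted for server CPUs
GPU_WATTS = 700      # per-GPU draw quoted for Nvidia's newest parts
GPUS_PER_NODE = 8    # GPUs are "often delivered in clusters of eight"

gpu_node_watts = GPU_WATTS * GPUS_PER_NODE   # 5,600 W for the accelerators alone
cpu_node_watts = CPU_WATTS * 2               # assumed dual-socket CPU server

print(f"GPU node (accelerators only): {gpu_node_watts / 1000:.1f} kW")
print(f"Dual-CPU node (CPUs only):    {cpu_node_watts / 1000:.1f} kW")
print(f"Ratio: ~{gpu_node_watts / cpu_node_watts:.0f}x the processor power per node")
```

Even before counting CPUs, memory, and networking in the GPU node, that is roughly seven times the processor power draw of a conventional server, which is the gap driving the call for new power and cooling designs.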

Japan invests $1.3 billion in Micron to subsidize chip manufacturing: Report

The Japanese government on Tuesday said that it had invested $1.3 billion in Micron's Hiroshima factory as a subsidy for manufacturing more advanced chips that support or power AI and quantum workloads.

The investment is expected to cover the cost of installing ASML Holding's extreme ultraviolet lithography equipment at the factory, according to a Bloomberg report. Lithography machines are used to draw patterns on silicon chips using light, and the Dutch company ASML Holding is one of the top producers of these machines.

Importing Ansible Validated Content into Private Automation Hub

Introduction

Ansible validated content is a set of collections containing pre-built YAML content (such as playbooks or roles) to address the most common automation use cases. You can use Ansible validated content out-of-the-box or as a learning opportunity to develop your automation skills. It's a trusted starting point to bootstrap your automation: use it, customize it and learn from it!

This content is curated by experts like the Red Hat Automation Community of Practice so:

  • Use cases are based on successfully deployed customer examples
  • Content creators are trusted and verified subject matter experts
  • Content itself adheres to the latest best practices and guidelines issued by Red Hat’s engineering team
  • Ansible validated content is tested against supported versions of Red Hat Ansible Automation Platform

Ansible Automation Platform is a trusted delivery system to access and leverage Ansible validated content in your organization.

How can I get this Ansible validated content into my Ansible Automation Platform on clouds (AWS, Azure, Google Cloud) deployment?

To do this there are a few short steps. Let’s walk through these together.

As part of your Ansible Automation Platform on cloud deployment, you will also have a private automation hub. This is your own internal automation content …
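The step-by-step walkthrough is cut off above, but as a rough illustration of the general idea, here is a minimal sketch of pulling a collection from a private automation hub with ansible-galaxy. The hub URL, token, and collection name are placeholders (not values from this post), and the exact API path and authentication flow for your deployment may differ; this is an assumption-laden sketch, not the documented procedure.

```python
# Hypothetical sketch: install a collection from a private automation hub by
# pointing ansible-galaxy at the hub's Galaxy-compatible API endpoint.
# HUB_API, TOKEN, and COLLECTION are placeholders, not values from the article.
import subprocess

HUB_API = "https://hub.example.com/api/galaxy/"       # assumed private automation hub API URL
TOKEN = "REPLACE_WITH_HUB_TOKEN"                      # API token generated in the hub UI
COLLECTION = "my_namespace.my_validated_collection"   # hypothetical collection name

# ansible-galaxy accepts an explicit server and token on the command line, so a
# one-off install does not require editing ansible.cfg; for a permanent setup you
# would normally define the hub under the [galaxy] server_list in ansible.cfg instead.
subprocess.run(
    [
        "ansible-galaxy", "collection", "install", COLLECTION,
        "--server", HUB_API,
        "--token", TOKEN,
    ],
    check=True,
)
```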

Microsoft puts its Cloud for Sovereignty in public preview

Microsoft on Tuesday moved its Cloud for Sovereignty offering from private preview to public preview and said the offering is likely to be made generally available this December.

Microsoft Cloud for Sovereignty, which is aimed at helping government bodies meet specific compliance, security, and policy requirements, was first introduced in July of last year. Since then, the company has released two private releases of the offering.

The public preview version of the offering includes new features such as the Sovereign Landing Zone, support for two country-specific requirements, transparency logs, and automated workload templates.

The Sovereign Landing Zone and its policy initiative, now available on GitHub, instantiate guardrails for sovereign cloud environments for customer workloads, enabling customers to leverage best practices for secure and consistent environments while supporting their efforts to meet evolving local regulations, the company said.

Announcing General Availability for the Magic WAN Connector: the easiest way to jumpstart SASE transformation for your network

Today, we’re announcing the general availability of the Magic WAN Connector, a key component of our SASE platform, Cloudflare One. Magic WAN Connector is the glue between your existing network hardware and Cloudflare’s network — it provides a super simplified software solution that comes pre-installed on Cloudflare-certified hardware, and is entirely managed from the Cloudflare One dashboard.

It takes only a few minutes from unboxing to seeing your network traffic automatically routed to the closest Cloudflare location, where it flows through a full stack of Zero Trust security controls before taking an accelerated path to its destination, whether that’s another location on your private network, a SaaS app, or any application on the open Internet.

Since we announced our beta earlier this year, organizations around the world have deployed the Magic WAN Connector to connect and secure their network locations. We’re excited for the general availability of the Magic WAN Connector to accelerate SASE transformation at scale.

When customers tell us about their journey to embrace SASE, one of the most common stories we hear is:

We started with our remote workforce, deploying modern solutions to secure access to internal apps and Internet resources. But now, we're looking at …

What Is Ultra Ethernet All About?

If you’re monitoring the industry press (or other usual hype factories), you might have heard about Ultra Ethernet, a dazzling new technology that will be developed by the Ultra Ethernet Consortium. What is it, and does it matter to you (TL&DR: probably not)?

As always, let’s start with What Problem Are We Solving?

Tech Bytes: The SD-WAN Prescription For Healthcare Networks (Sponsored)

Today on the Tech Bytes podcast, we talk with sponsor Palo Alto Networks about SD-WAN in healthcare markets. The healthcare sector has stringent requirements around the privacy and security of patient information, but clinics also need reliable and robust performance. We discuss how SD-WAN can help meet all these requirements.

The post Tech Bytes: The SD-WAN Prescription For Healthcare Networks (Sponsored) appeared first on Packet Pushers.

The First Peeks At The DOE Post-Exascale Supercomputers

Other than Hewlett Packard Enterprise, who wants to build the future NERSC-10 supercomputer at Lawrence Berkeley National Laboratory or the future OLCF-6 system at Oak Ridge National Laboratory?

The post The First Peeks At The DOE Post-Exascale Supercomputers first appeared on The Next Platform.

The First Peeks At The DOE Post-Exascale Supercomputers was written by Timothy Prickett Morgan at The Next Platform.

Network Break 449: Amazon Invests $4 Billion In AI Startup; Will Small Modular Reactors Power Public Clouds?

Today's Network Break, with guest host Johna Till Johnson, discusses why Amazon is pouring $4 billion into a generative AI startup, Marvell's response to accusations of an NSA-friendly backdoor in older Cavium products, why Microsoft is investigating small modular nuclear reactors, Meta using public posts to train AI, and more tech news.

The post Network Break 449: Amazon Invests $4 Billion In AI Startup; Will Small Modular Reactors Power Public Clouds? appeared first on Packet Pushers.

AI startup SambaNova updates processor, software

SambaNova Systems, maker of dedicated AI hardware and software systems, has launched a new AI chip, the SN40L, that will be used in the company’s full-stack large language model (LLM) platform, the SambaNova Suite.

First introduced in March, the SambaNova Suite uses custom processors and operating systems for AI inference and training. It's designed to be an alternative to power-hungry and expensive GPUs.

To upgrade the hardware so soon after launch means there ought to be a big jump in performance, and there is: the SN40L serves up to a 5 trillion parameter LLM, with a sequence length of 256K+ possible on a single system node, according to the vendor.

Data storage archive options: Batch, real-time and hierarchical storage management

When it comes to archiving data, there are, generally speaking, three different approaches. Selecting the right system hinges on technical capabilities as well as external factors such as budget constraints. Enterprise storage pros need to balance data-preservation, accessibility, and resource-optimization requirements as they weigh the various archive systems available in the market. Let's take a deeper look into the different types of archive systems.

Traditional batch archive

With a traditional batch archive, data serves its purpose for a certain period before being tucked away in a safe repository, awaiting the possibility of being of some use in the future. The main idea behind this type of archive is to preserve data over an extended timeframe, while keeping costs at a minimum and ensuring that retrieval remains a breeze even years down the line. In this kind of archive system, each collection of data selected for archiving is given one or more identities, stored as metadata alongside the archived data. This metadata plays a pivotal role in locating and retrieving the archived information, with details such as project names, tools used to create the data, the creator's name, and the creation timeframe all forming part of this digital fingerprint. …
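As a concrete illustration of the "digital fingerprint" described above, here is a minimal sketch of an archive metadata record; the field names and sample values are illustrative, not taken from any particular archive product or standard schema.

```python
# Minimal sketch of the metadata stored alongside a batch-archived dataset
# so it can be located and retrieved years later. Field names are illustrative.
from dataclasses import dataclass, field, asdict
from datetime import date
import json

@dataclass
class ArchiveRecord:
    project: str                                      # project name the data belonged to
    created_by: str                                   # creator's name
    created: date                                     # creation timeframe
    tools: list[str] = field(default_factory=list)    # tools used to create the data
    location: str = ""                                # where the archived payload lives

record = ArchiveRecord(
    project="seismic-survey-2021",
    created_by="J. Doe",
    created=date(2021, 6, 30),
    tools=["acquisition-suite", "qc-pipeline"],
    location="tape://vault-02/seismic-survey-2021.tar",
)

# Serialize the fingerprint next to the archived payload so a later search can
# find the data without reading the archive itself.
print(json.dumps(asdict(record), default=str, indent=2))
```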

What IT needs to know about energy-efficiency directives for data centers

Creating energy-efficient and sustainable data centers makes a lot of sense from a business standpoint. Aside from the obvious environmental impact of lower carbon emissions, the potential business benefits include lower operating costs, reduced space requirements, and a positive brand image.

There's another good reason for building more sustainable and energy-efficient data centers: regulations and standards are emerging around the world that will require or recommend such actions.

IT and networking executives and their teams need to get up to speed on a host of sustainability regulations and standards that are going to require a response on their part. Energy efficiency and sustainability are not just issues for facilities teams anymore. They are a concern for IT teams that will be asked to provide metrics, so the need for reporting will become more urgent. They will also need to select more energy-efficient hardware.

Birthday Week recap: everything we announced — plus an AI-powered opportunity for startups

This year, Cloudflare officially became a teenager, turning 13 years old. We celebrated this milestone with a series of announcements that benefit both our customers and the Internet community.

From developing applications in the age of AI to securing against the most advanced attacks that are yet to come, Cloudflare is proud to provide the tools that help our customers stay one step ahead.

We hope you’ve had a great time following along. For anyone looking for a recap of everything we launched this week, here it is:

Monday

  • Switching to Cloudflare can cut emissions by up to 96%: Switching enterprise network services from on-prem to Cloudflare can cut related carbon emissions by up to 96%.
  • Cloudflare Trace: Use Cloudflare Trace to see which rules and settings are invoked when an HTTP request for your site goes through our network.
  • Cloudflare Fonts: Introducing Cloudflare Fonts. Enhance privacy and performance for websites using Google Fonts by loading fonts directly from the Cloudflare network.
  • How Cloudflare intelligently routes traffic: Technical deep dive that explains how Cloudflare uses machine learning to intelligently route traffic through our vast network.
  • Low Latency Live Streaming: Cloudflare Stream’s …