
Category Archives for "Network World Data Center"

Oracle plans second cloud region in Singapore to meet growing demand

Oracle on Tuesday said it is planning to add a second cloud region in Singapore to meet the growing demand for cloud services across Southeast Asia. “Our upcoming second cloud region in Singapore will help meet the tremendous upsurge in demand for cloud services in South East Asia,” Garrett Ilg, president, Japan & Asia Pacific at Oracle, said in a statement. The public cloud services market across Asia Pacific, excluding Japan, is expected to grow from $53.4 billion in 2021 to $153.6 billion in 2026, a compound annual growth rate of 23.5%, according to a report from IDC.

AWS to invest $8.9 billion across its regions in Australia by 2027

Within months of adding a second region in Melbourne, Amazon Web Services (AWS) on Tuesday said it would invest $8.93 billion (AU$13.2 billion) to build out infrastructure across its cloud regions in Australia through 2027. The majority of the investment, about $7.45 billion, will go to the company’s cloud region in Sydney over that period; the remaining $1.49 billion will be used to expand data-center infrastructure in Melbourne, the company said. The $8.93 billion total includes a $495 million investment in network infrastructure to extend AWS cloud and edge infrastructure across Australia, including partnerships with telecom providers to deliver high-speed fiber connectivity between Availability Zones, AWS said.

IBM targets edge, AI use cases with new z16 mainframes

IBM has significantly reduced the size of some of its Big Iron z16 mainframes and given them a new operating system that emphasizes AI and edge computing. The new configurations—which include Telum processor-based, 68-core IBM z16 Single Frame and Rack Mounted models, as well as new IBM LinuxONE Rockhopper 4 and LinuxONE Rockhopper Rack Mount boxes—are expected to offer customers better data-center configuration options while reducing energy consumption. Both new Rack Mount boxes are 18U, compared to the current smallest Single Frame models, which are 42U.

10 things to know about data-center outages

The severity of data-center outages appears to be falling, while the cost of outages continues to climb. Power failures are “the biggest cause of significant site outages.” Network failures and IT system glitches also bring down data centers, and human error often contributes. Those are some of the problems pinpointed in the most recent Uptime Institute data-center outage report, which analyzes types of outages, their frequency, and what they cost both in money and consequences. Unreliable data is an ongoing problem: Uptime cautions that data relating to outages should be treated skeptically, given the lack of transparency of some outage victims and the quality of reporting mechanisms. “Outage information is opaque and unreliable,” said Andy Lawrence, executive director of research at Uptime, during a briefing about Uptime’s Annual Outages Analysis 2023.

Recording your commands on the Linux command line

Recording the commands that you run on the Linux command line can be useful for two important reasons. For one, the recorded commands provide a way to review your command-line activity, which is extremely helpful if something didn't work as expected and you need to take a closer look. In addition, capturing commands can make it easy to repeat them or to turn them into scripts or aliases for long-term reuse. This post examines two ways that you can easily record and reuse commands. Using history to record Linux commands: The history command makes it extremely easy to record commands that you enter on the command line because it happens automatically. The only thing you might want to check is the setting that determines how many commands are retained and, therefore, how long they will stay around for viewing and reuse. The command below will display your command history buffer size. If it's 1,000, as shown, it will retain the last 1,000 commands that you entered.
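The excerpt cuts off before showing that command. As a rough sketch (assuming a bash shell, where the HISTSIZE variable controls how many commands are kept), checking the setting and reviewing recent commands might look like this:

$ echo $HISTSIZE      # how many commands the shell keeps in its history list
1000
$ history | tail -3   # display the three most recent commands

If you want more commands retained, you can set HISTSIZE to a larger value in your ~/.bashrc.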

Data center fires raise concerns about lithium-ion batteries

Fire is to blame for a small but significant number of data-center outages, including a March 28 fire that caused severe damage to a data center in France, and an analysis of global incidents highlights ongoing concerns about the safety of lithium-ion batteries and their risk of combustion. The use of lithium-ion (Li-ion) batteries in data centers is growing. Now commonly used in uninterruptible power supplies, they are expected to account for 38.5% of the data-center battery market by 2025, up from 15% in 2020, according to consulting firm Frost & Sullivan.

Intel announces 144-core Xeon processor

Intel has announced a new processor with 144 cores, designed to handle simple data-center tasks in a power-efficient manner. Called Sierra Forest, the Xeon processor is part of the Intel E-Core (Efficiency Core) lineup that forgoes advanced features such as AVX-512 that require more powerful cores. AVX-512 is Intel Advanced Vector Extensions 512, “a set of new instructions that can accelerate performance for workloads and usages such as scientific simulations, financial analytics, artificial intelligence (AI)/deep learning, 3D modeling and analysis, image and audio/video processing, cryptography and data compression,” according to Intel. Sierra Forest signals a shift for Intel that splits its data-center product line into two branches: the E-Core and the P-Core (Performance Core), the latter being the traditional Xeon data-center design that uses high-performance cores.

Supermicro has a new liquid-cooled server for AI

With data-center servers running hotter and hotter, interest in liquid cooling is ramping up, with vendors announcing servers that feature self-contained systems and businesses with expertise in related technologies jumping in. Liquid cooling is more efficient than traditional air cooling, and Supermicro is using it to cool the hottest processors in a new server designed as a platform to develop and run AI software. The SYS-751GE-TNRT-NV1 server runs hot: it features four NVIDIA A100 GPUs that draw 300W each and are liquid-cooled by a self-contained system. Some liquid-cooling systems rely on water that is piped into the data center; the self-contained system doesn't require that, which makes the servers more widely deployable. The system is quiet, too; its running noise level is 30dB.

10-year server lifespan? That’s what one cloud service provider plans

A trend to extend the lifespan of servers beyond the typical three- to five-year range has companies such as Microsoft looking to add a few years of use to hardware that would otherwise be retired. The latest company to adopt this strategy is Paris-based Scaleway, a European cloud services provider that's sharing details about how it plans to get a decade of use out of its servers through a mix of reuse and repair. Scaleway decided the carbon footprint of new servers is just too large – server manufacturing alone accounts for 15% to 30% of each machine’s carbon impact. Reusing existing machines, rather than buying new ones, could significantly reduce e-waste.

Commercial quantum networks inch closer to primetime

As commercial availability of quantum computers moves closer to reality, researchers and vendors are investing in efforts to create quantum-secured networks. Quantum networks use entangled photons or other particles to ensure secure communications, but they are not, in and of themselves, used for general communication. Quantum networks are expensive and slow, and though nobody can listen in on the messages without breaking the entanglement of the photons, hackers can still try to attack the systems before the messages get into the quantum network or after they leave it. Instead, quantum networks today are largely used for quantum key distribution (QKD), which uses quantum mechanics to secure the transmission of symmetric encryption keys. According to a June report by quantum industry analyst firm IQT Research, the worldwide market for quantum networks will near $1.5 billion in 2027 and grow to more than $8 billion by 2031, with QKD the main revenue driver, followed by a rise in networks that use emerging quantum repeaters to connect quantum computers together and in quantum sensor networks.

Cloud vs on-prem: SaaS vendor 37signals bails out of the public cloud

David Heinemeier Hansson, co-owner and CTO at SaaS vendor 37signals, is quitting the cloud and wants everyone to know about it. In a series of blog posts, Hansson has challenged the cloud business model, rebutted assumptions associated with cloud computing, and argued that the consolidation of power among hyperscalers is not necessarily a good thing. It might seem counterintuitive for a SaaS vendor to be publicly taking potshots at the cloud and suggesting that other companies reconsider their cloud investments. Has Hansson, the creator of Ruby on Rails, gone off the rails? Hansson’s argument is simple: by pulling server workloads off the Amazon AWS infrastructure, purchasing new hardware from Dell, and running his business from a colocation facility, he will save millions of dollars.

Nvidia announces new DPU, GPUs

Nvidia launched its GPU Technology Conference with a mix of hardware and software news, all of it centered around AI. The first big hardware announcement is the BlueField-3 network data-processing unit (DPU), designed to offload network processing tasks from the CPU. BlueField comes from Nvidia's Mellanox acquisition and is a SmartNIC, or intelligent networking card. BlueField-3 has double the number of Arm processor cores of the prior generation, as well as more accelerators in general, and can run workloads up to eight times faster than the prior generation. BlueField-3 can accelerate network workloads across the cloud and on premises for high-performance computing and AI workloads in a hybrid setting.

Oracle ties up with Nvidia to offer AI supercomputing service

Oracle is partnering with Nvidia to offer a new AI supercomputing service, dubbed DGX Cloud and available immediately, using Oracle Cloud Infrastructure's Supercluster. “OCI has excellent performance. They have a two-tier computing fabric and management network,” Nvidia CEO Jensen Huang said during his keynote at the company’s annual GTC conference on Tuesday. Nvidia is working with other cloud providers to provide similar services, but Oracle is its first partner to go live with an offering. “Nvidia's CX7 along with Oracle’s non-blocking remote direct memory access (RDMA) forms the computing fabric,” Huang said. “And BlueField-3 will be the infrastructure processor for the management network. The combination is a state-of-the-art DGX AI supercomputer that can be offered as a multitenant cloud service.”

Counting and modifying lines, words and characters in Linux text files

Linux includes some useful commands for counting when it comes to text files. This post examines some of the options for counting lines and words and making changes that might help you see what you want. Counting lines: Counting lines in a file is very easy with the wc command. Use a command like that shown below, and you'll get a quick response.

$ wc -l myfile
132 myfile

What the wc command is actually counting is the number of newline characters in a file. So, if you had a single-line file with no newline character at the end, it would tell you the file has 0 lines. The wc -l command can also count the lines in any text that is piped to it. In the example below, wc -l counts the number of files and directories in the current directory.
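That piped example is cut off in this excerpt; a minimal sketch of the idea (the counts shown are illustrative):

$ ls | wc -l     # ls writes one name per line when piped, so this counts directory entries
24
$ who | wc -l    # the same trick counts logged-in sessions
3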

Dell offers bare metal cloud via colocation

A new deal between Dell and colocation services provider Cyxtera will enable enterprises to access Dell’s PowerEdge infrastructure for bare-metal deployments in Cyxtera facilities. “Bare metal” cloud service means you get the hardware with no software loaded. Typically, a cloud services provider offers an operating system, usually Linux, and accompanying infrastructure; with bare metal, you just get CPU cores, memory, networking and storage but no OS, and you provide your own environment. Under the deal, enterprises will be able to deploy Dell hardware through Cyxtera’s enterprise bare-metal service, an on-demand offering that connects an enterprise’s existing on-premises infrastructure with the colocation environment.

Using the at command to schedule tasks on Linux

To schedule a command or script to run at some particular time, the at command is perfect and provides many options for specifying the time you want it to run. It will set the task up to be run whenever you specify, and you can view the scheduled tasks or even change your mind and cancel one of them as you see fit. The at command differs from cron in that it sets up a command or script to run only once, while cron allows you to set up commands or scripts to be run on a specified schedule – whether every day, once a week, a couple of times a month or even just once a year. at command syntax: Using the at command is relatively easy, though it has a lot of options, particularly for how you specify the time a task should be run. If you specify a time like the one shown below, the task will be set up to run the next time the clock reaches 15:27 (3:27 PM), whether that's today or tomorrow.
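The example itself is cut off in this excerpt; a minimal sketch of an at session (the script name, job number and date are illustrative) might look like this:

$ at 15:27
warning: commands will be executed using /bin/sh
at> ./backup.sh
at> <EOT>
job 12 at Mon Apr 10 15:27:00 2023
$ atq       # list jobs still waiting to run
12      Mon Apr 10 15:27:00 2023 a user
$ atrm 12   # change your mind and cancel job 12

The <EOT> marker appears when you press Ctrl-D to finish entering the commands to be run.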

Vast Data focuses on metadata cataloging, encryption support and snapshots

Flash storage vendor Vast Data has issued what it claims is its biggest software release, updating and adding features around data catalogs that will allow enterprises to tag data with user-defined information and to query datasets that meld structured, unstructured and semi-structured data. Vast has actually announced two revisions to its software, 4.6 and 4.7; the software itself has no formal name, just a version number. Version 4.6, available now, is a major release, and 4.7, available this spring, will be a minor release, according to Steve Pruchniewski, director of product marketing at Vast. “Major feature releases are core to the product's evolution, where minor feature releases typically contain bug fixes and holdovers from previous feature releases,” Pruchniewski said.

Intel delays next GPU Max until 2025

A significant change to Intel's high-performance computing roadmap gives competitors AMD and Nvidia plenty of time to grab market share. Intel has a pair of processors called CPU Max and GPU Max. Both feature high bandwidth memory (HBM) on the die, which greatly increases performance. The successor to the GPU Max, known as Rialto Bridge, was due later this year or early next year. Instead, Intel cancelled Rialto Bridge, and its successor – Falcon Shores – isn't coming until 2025. Longer term, Intel plans to have one processor, called an XPU, that will combine CPU and GPU cores on one die, but that will come after Falcon Shores.
