COMMISSIONED: Enterprises adopting AI to stay competitive must tailor AI models to their needs. …
Mastering The AI Terrain: Why Optimal Storage Is Essential For Competitive Edge was written by Timothy Prickett Morgan at The Next Platform.
Here’s a question for you: How much of the growth in cloud spending at Microsoft Azure, Amazon Web Services, and Google Cloud in the second quarter came from OpenAI and Anthropic spending money they got as investments out of the treasure chests of Microsoft, Amazon, and Google? …
The Sugar Daddy Boomerang Effect: How AI Investments Puff Up The Clouds was written by Timothy Prickett Morgan at The Next Platform.
COMMISSIONED: Whether you’re using one of the leading large language models (LLMs), emerging open-source models, or a combination of both, the output of your generative AI service hinges on the data and the foundation that supports it. …
GenAI Only As Good As Its Data And Platforms was written by Martin Courtney at The Next Platform.
Intel’s second quarter is pretty much a carbon copy of the first three months of 2024 when it comes to revenues across its newly constituted groups, but with an operating loss that is twice as big. …
The Resurrection Of Intel Will Take More Than Three Days was written by Timothy Prickett Morgan at The Next Platform.
With all of the hyperscalers and major cloud builders designing their own CPUs and AI accelerators, the heat is on those who sell compute engines to these companies. …
Ampere Arm Server CPUs To Get 512 Cores, AI Accelerator was written by Timothy Prickett Morgan at The Next Platform.
As expected, AMD has once again raised its forecast for sales of its Instinct MI300 series GPUs, and as it has broken through $1 billion in revenues for its “Antares” line of compute engines in the second quarter, it is now expecting to surpass $4.5 billion in sales of these devices for all of 2024. …
AMD Breaks $1 Billion In Datacenter GPU Sales In Q2 was written by Timothy Prickett Morgan at The Next Platform.
The companies under the control of Elon Musk – SpaceX, Tesla, xAI, and X (formerly known as Twitter) – all need a hell of a lot of GPUs, and all for their own specific AI or HPC projects. …
So Who Is Building That 100,000 GPU Cluster For xAI? was written by Timothy Prickett Morgan at The Next Platform.
Commissioned: The importance of data has never been more salient in this golden age of AI services. …
Your AI Strategy Called: It Wants You To Free The Data was written by Timothy Prickett Morgan at The Next Platform.
For Mark Zuckerberg, the decision by Meta Platforms – dating back to when it was still known as Facebook – to open up much of its technology, including server and storage designs, datacenter designs, and most recently its Llama AI large language models, came about because the company often found itself trailing competitors when it came to deploying advanced technologies. …
For Meta Platforms, An Open AI Policy Is The Best Policy was written by Jeffrey Burt at The Next Platform.
Training AI models is expensive, and the world can tolerate that to a certain extent so long as the cost of inference for these increasingly complex transformer models can be driven down. …
Stacking Up AMD Versus Nvidia For Llama 3.1 GPU Inference was written by Timothy Prickett Morgan at The Next Platform.
When you are International Business Machines and you do corporate IT deals in 185 countries around the world, political and economic uncertainty is always a problem. …
IBM Lays Its GenAI Foundation With Software And Services was written by Timothy Prickett Morgan at The Next Platform.
A scant three months ago, when Meta Platforms released the Llama 3 AI model in 8B and 70B versions, the numbers corresponding to their billions of parameters, we asked the question we have asked of every open source tool or platform since the dawn of Linux: Who’s going to profit from it, and how are they going to do it? …
Meta Lets Its Largest Llama AI Model Loose Into The Open Field was written by Jeffrey Burt at The Next Platform.
SPONSORED POST: The rapid breakout of Artificial Intelligence is driving business opportunities across verticals – but there’s one sector for which AI presents some formidable challenges, and that’s the datacenter industry itself. …
AI Era Datacenters Need AI-Ready Ethernet Switching was written by Timothy Prickett Morgan at The Next Platform.
In today’s dynamic technological environment, service providers such as cloud service providers (CSPs), managed service providers (MSPs), software-as-a-service (SaaS) providers, and enterprise private cloud operators face a myriad of challenges in the modern datacenter. …
Scaling The Datacenter: Five Best Practices For CSPs was written by Timothy Prickett Morgan at The Next Platform.
There is a seemingly endless list of problems to be solved and issues to be addressed in vendors’ steady march toward quantum computing. …
GPU-Armed Scientists Solve A Quantum Annealing Debate was written by Jeffrey Burt at The Next Platform.
Taiwan Semiconductor Manufacturing Co already has a de facto monopoly – one that it has earned – when it comes to the manufacturing of datacenter compute engines. …
TSMC Seeks To Make Itself More Indispensable Than It Already Is was written by Timothy Prickett Morgan at The Next Platform.
Back in 2012, when AMD was in the process of backing out of the datacenter CPU business and did not really have its datacenter GPU act together at all, the US Department of Energy exhibited the enlightened self-interest that is a strong foundation of both economics and politics, taking a chance and investing in AMD to do research in memory technologies and hybrid CPU-GPU computing at exascale. …
AMD’s Long And Winding Road To The Hybrid CPU-GPU Instinct MI300A was written by Timothy Prickett Morgan at The Next Platform.
Maybe, if you need blazing performance when extracting data from a relational database and chewing on it, that database belongs in a cloud. …
Oracle Goes Massively Parallel, Low Cost With Exadata Exascale was written by Timothy Prickett Morgan at The Next Platform.
Everybody knows that companies, particularly hyperscalers and cloud builders but now increasingly enterprises hoping to leverage generative AI, are spending giant round bales of money on AI accelerators and related chips to create AI training and inference clusters. …
Ongoing Saga: How Much Money Will Be Spent On AI Chips? was written by Timothy Prickett Morgan at The Next Platform.
For quite some time there have been rumors that either Arm Ltd or its parent company, Japanese conglomerate SoftBank, would buy British AI chip and system upstart Graphcore – it is no longer a startup if it is eight years old. …
Can SoftBank Be An AI Supercomputer Player? Will Arm Lend A Hand? was written by Timothy Prickett Morgan at The Next Platform.