Category Archives for "IT Industry"

AMD Finally Makes More Money On GPUs Than CPUs In A Quarter

Pent-up demand for MI308 GPUs in China, which AMD has been trying to get a license to sell since early last year, was finally unlocked when that license was approved, and $360 million in Instinct GPU sales that were not officially part of the pipeline made their way onto the AMD books in Q4 2025.

AMD Finally Makes More Money On GPUs Than CPUs In A Quarter was written by Timothy Prickett Morgan at The Next Platform.

Dassault And Nvidia Bring Industrial World Models To Physical AI

During his more than two decades with Nvidia, Rev Lebaredian has had a ringside seat to the show that has been the evolution of modern AI, from the introduction of the AlexNet deep convolutional neural network, which made waves by drastically lowering the error rate at the 2012 ImageNet challenge, to the introduction of generative AI and now agentic AI, where systems can create AI assistants to help with knowledge work.

Dassault And Nvidia Bring Industrial World Models To Physical AI was written by Jeffrey Burt at The Next Platform.

TACC Explores Mixed Precision And FP64 Emulation For HPC With Horizon

If you want to test out an idea in HPC simulation and modeling and see how it affects a broad array of scientific applications, there is probably no better place than the Texas Advanced Computing Center at the University of Texas.

TACC Explores Mixed Precision And FP64 Emulation For HPC With Horizon was written by Timothy Prickett Morgan at The Next Platform.

Big Blue Poised To Peddle Lots Of On Premises GenAI

If you want to know the state of the art in GenAI model development, you watch what the Super 8 hyperscalers and cloud builders are doing and you also keep an eye on the major model builders outside of these companies – mainly, OpenAI, Anthropic, and xAI as well as a few players in China like DeepSeek.

Big Blue Poised To Peddle Lots Of On Premises GenAI was written by Timothy Prickett Morgan at The Next Platform.

Nvidia’s $2 Billion Investment In CoreWeave Is A Drop In A $250 Billion Bucket

With the hyperscalers and the cloud builders all working on their own CPU and AI XPU designs, it is no wonder that Nvidia has been championing the neoclouds. These players can't afford to try to be everything to everyone – which is the very definition of enterprise computing – and, frankly, they are having trouble coming up with the trillions of dollars to cover the 150 gigawatts to more than 200 gigawatts of datacenter capacity that is estimated to be on the books between 2025 and 2030 for AI workloads.

Nvidia’s $2 Billion Investment In CoreWeave Is A Drop In A $250 Billion Bucket was written by Timothy Prickett Morgan at The Next Platform.