Category Archives for "IT Industry"

Learning From Google’s Cloud Storage Evolution

Making storage cheaper on the cloud does not necessarily mean using tape or Blu-Ray discs to hold data. In a datacenter with enormous bandwidth and consistent latency over a Clos network interconnecting hundreds of thousands of compute and storage servers, a hyperscaler can dial the durability and availability of data up or down, trading storage costs against data access and movement costs, to offer a mix of price and performance that cuts costs.

That, in a nutshell, is what search engine giant and public cloud provider Google is doing with the latest variant of persistent storage…
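
That tradeoff shows up directly in how storage is provisioned. As a minimal sketch using the google-cloud-storage Python client (the project and bucket names here are hypothetical placeholders, not from the article), picking a colder storage class buys a lower at-rest price in exchange for access and retrieval charges:

```python
# A minimal sketch with the google-cloud-storage client; the project
# and bucket names are hypothetical placeholders.
from google.cloud import storage

client = storage.Client(project="my-project")

# A colder class trades a cheaper per-gigabyte at-rest price for
# per-operation and retrieval fees, with the same durability target.
bucket = client.bucket("my-archive-bucket")
bucket.storage_class = "COLDLINE"  # versus "NEARLINE" or "STANDARD"
client.create_bucket(bucket, location="US")
```

Whether the colder class actually saves money then depends on how often the data is read back, which is exactly the access-cost tradeoff described above.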

Learning From Google’s Cloud Storage Evolution was written by Timothy Prickett Morgan at The Next Platform.

The New Intelligence Economy, And How We Get There

Earlier this month, Samsung acquired Viv, the AI platform built by the creators of Siri that seeks to “open up the world of AI assistants to all developers.” The acquisition was largely overshadowed by the more high-profile news of Samsung’s struggles with its Galaxy Note smartphone, but make no mistake, this was a bold and impactful move by Samsung that aggressively launches the company into the future of smart, AI-enabled devices.

Viv co-founder Dag Kittlaus makes a compelling argument for why Samsung’s ecosystem serves as an invaluable launching pad for Viv’s goal of ubiquity – the electronics giant’s…

The New Intelligence Economy, And How We Get There was written by Timothy Prickett Morgan at The Next Platform.

It Takes a Lot of Supercomputing to Simulate Future Computing

The chip industry is quickly reaching the limits of traditional lithography in its effort to cram more transistors onto a piece of silicon at a pace consistent with Moore’s Law. Accordingly, new approaches, including using extreme ultraviolet light sources, are being developed. While these approaches promise continued gains for chipmakers, developing the technology that will enhance future computing is going to take a lot of supercomputing.

Lawrence Livermore National Lab’s Dr. Fred Streitz and his teams at the HPC Innovation Center at LLNL are working with the Dutch semiconductor company ASML to push advances in lithography for next-generation chips. Even as a…

It Takes a Lot of Supercomputing to Simulate Future Computing was written by Nicole Hemsoth at The Next Platform.

The State of Enterprise Machine Learning

For a topic that generates so much interest, it is surprisingly difficult to find a concise definition of machine learning that satisfies everyone. Complicating things further is the fact that much of machine learning, at least in terms of its enterprise value, looks somewhat like existing analytics and business intelligence tools.

To set the course for this three-part series that puts the scope of machine learning into enterprise context, we define machine learning as software that extracts high-value knowledge from data with little or no human supervision. Academics who work in formal machine learning theory may object to a…
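
To make that working definition concrete, consider a toy sketch (scikit-learn on fabricated data, purely illustrative) in which the software recovers structure from data without any human-supplied labels or rules:

```python
# A toy illustration of "knowledge from data with little or no human
# supervision": unsupervised clustering. The data is fabricated.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
# Two synthetic customer segments; the algorithm never sees labels.
segment_a = rng.normal(loc=[1.0, 1.0], scale=0.2, size=(100, 2))
segment_b = rng.normal(loc=[4.0, 4.0], scale=0.2, size=(100, 2))
data = np.vstack([segment_a, segment_b])

# KMeans recovers the two groups from the data alone.
model = KMeans(n_clusters=2, n_init=10).fit(data)
print(model.cluster_centers_)  # roughly [1, 1] and [4, 4]
```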

The State of Enterprise Machine Learning was written by Nicole Hemsoth at The Next Platform.

ARM Carves Path to IoT Driven Cloud Business

Chip design firm ARM is getting into the cloud business. The company, whose designs power almost all of the world’s cell phones, has steadily pushed its designs into new ventures, including servers, as we have covered extensively. But on Tuesday it branched into something completely different.

It is selling cloud services to help a new breed of customers such as appliance makers connect devices to the internet of things in a secure fashion. The ARM mbed cloud is now available for customers that want to create a connected device that is easier to secure, track and get online.

The…

ARM Carves Path to IoT Driven Cloud Business was written by Nicole Hemsoth at The Next Platform.

ARM Predicts Cambrian Server Explosion In The Coming Decades

What happens to the datacenter when a trillion devices embedded in every manner of product and facility are chatting away with each other, trying to optimize the world? There is a very good chance that the raw amount of computing needed to chew on that data at the edge, in the middle, and in a datacenter – yes, we will still have datacenters – will absolutely explode.

The supply chain for datacenters – including the ARM collective – is absolutely counting on exponential growth in sensors, which ARM Holdings’ top brass and its new owner, Japanese conglomerate SoftBank Group, spent…

ARM Predicts Cambrian Server Explosion In The Coming Decades was written by Timothy Prickett Morgan at The Next Platform.

Intel Hints At Future Skylake Xeons As Enterprises Cool

The gap in appetite for new computing technology between enterprises on one side and hyperscalers and cloud builders on the other is widening, but no one is talking about causality quite yet, even though there is probably some link between the rise of clouds and more conservative spending on the part of enterprises.

But there is, perhaps, another cause. It could just be that the growth in traditional computing in the enterprise that is dedicated to things like print, file, web, email, and database serving is now less than the Moore’s Law increases that are enabled by process shrinks at the fabs run by Intel.
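
A back-of-the-envelope calculation shows why that matters; the growth rates below are our own illustrative assumptions, not figures from Intel or from enterprise surveys:

```python
# Illustrative arithmetic only; both growth rates are assumptions.
workload_growth = 1.15  # assume enterprise work grows 15% a year
capacity_growth = 1.30  # assume per-socket capacity grows 30% a year
                        # as process shrinks add cores and features

sockets = 100.0  # normalized size of an enterprise server fleet
for year in range(1, 6):
    sockets *= workload_growth / capacity_growth
    print(f"year {year}: {sockets:.1f} sockets needed")
# When capacity compounds faster than demand, the fleet shrinks every
# year, and so does the appetite for new Xeons.
```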

Intel Hints At Future Skylake Xeons As Enterprises Cool was written by Timothy Prickett Morgan at The Next Platform.

How Long Before Burst Buffers Push Past Supercomputing?

Over the last couple of years, we have been watching how burst buffers might be deployed at some of the world’s largest supercomputer sites. For some background on how these SSD devices boost throughput on large machines and aid in both checkpoint and application acceleration, you can read here, but the real question is how these might penetrate the market outside of the leading supercomputing sites.

There is a clear need for burst buffer technology in other areas where users are matching a parallel file system with SSDs. While that is still an improvement over the disk days, a lot…
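
The appeal is easy to see in simple checkpoint arithmetic; the size and bandwidth numbers below are made-up but plausible assumptions, not figures from any system in the article:

```python
# Rough checkpoint math with assumed numbers, purely illustrative.
checkpoint_tb = 500.0  # memory image to dump, in terabytes
pfs_bw = 1.0           # parallel file system bandwidth, TB/s
burst_buffer_bw = 5.0  # SSD burst buffer bandwidth, TB/s

def checkpoint_minutes(size_tb: float, bw_tb_per_s: float) -> float:
    """Time compute nodes stall while one checkpoint is written."""
    return size_tb / bw_tb_per_s / 60.0

print(f"direct to file system: {checkpoint_minutes(checkpoint_tb, pfs_bw):.1f} min")
print(f"via burst buffer:      {checkpoint_minutes(checkpoint_tb, burst_buffer_bw):.1f} min")
# The buffer absorbs the bursty write and drains to the file system in
# the background, so the application stalls for a fraction of the time.
```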

How Long Before Burst Buffers Push Past Supercomputing? was written by Nicole Hemsoth at The Next Platform.

Turning OpenMP Programs into Parallel Hardware

Systems built from commodity hardware such as servers, desktops and laptops often contain so-called general-purpose processors (CPUs)—processors that specialize in doing many different things reasonably well. This is driven by the fact that users often perform various types of computations; the processor is expected to run an operating system, browse the internet, and even run video games.

Because general-purpose processors target such a broad set of applications, they require hardware that supports all such application areas. Since hardware occupies silicon area, there is a limit to how many of these processor “cores” can be placed—typically between 4 and…

Turning OpenMP Programs into Parallel Hardware was written by Nicole Hemsoth at The Next Platform.

IBM Overclocks Power8 To Take On “Broadwell” Xeon E7

In the absence of Power8+ processor upgrades this year, and with sales of midrange systems taking a bit of a hit in the third quarter, IBM has to do something to push its iron against Xeon E7 and Sparc M7 systems between now and when the Power9 machines are available in the second half of 2017. It also needs to entice customers who are on older Power7 and Power7+ machinery to upgrade now rather than wait the better part of a year to spend money.

To that end, IBM has launched the Power 850C four-socket server, a companion to…

IBM Overclocks Power8 To Take On “Broadwell” Xeon E7 was written by Timothy Prickett Morgan at The Next Platform.

Future Economies of Scale for Quantum Computing

Clustering together commodity servers has allowed the economies of scale that enable large-scale cloud computing, but as we look to the future of big infrastructure beyond Moore’s Law, how might bleeding-edge technologies achieve similar economies of scale and mass production?

To say that quantum computing is a success simply because a few machines manufactured by quantum device maker D-Wave have been sold would not necessarily be accurate. However, what the few purchases of such machines by Los Alamos National Lab, Google, and Lockheed Martin do show is that there is enough interest and potential to get the technology off the ground and…

Future Economies of Scale for Quantum Computing was written by Nicole Hemsoth at The Next Platform.

How Microsoft Fell Hard for FPGAs

Microsoft’s embrace of programmable chips known as FPGAs is well documented. But in a paper released Monday, the software and cloud company provided a look into how it has fundamentally changed the economics of delivering hardware as a service thanks to these once-specialty pieces of silicon.

Field programmable gate arrays, or FPGAs, are chips where the logic and networking functions can be reconfigured after they’ve been manufactured. They are typically larger than similarly functioning chips and traditionally were made for small jobs where the performance advantage outweighed the higher engineering cost associated with designing them.

But thanks to the massive…

How Microsoft Fell Hard for FPGAs was written by Nicole Hemsoth at The Next Platform.