Archive

Category Archives for "IT Industry"

The Emergence Of Data-Centric Computing

As data grows, a shift in computing paradigm is underway. I started my professional career in the 1990s, during the massive shift from mainframe computing to the heyday of client/server computing and enterprise applications such as ERP, CRM, and human resources software. Relational databases like Oracle, DB2, SQL Server, and Informix offered improvements to managing data, and the technique of combining a new class of midrange servers from Sun Microsystems, Digital Equipment Corporation, IBM, and Hewlett-Packard with storage tiers from EMC and IBM reduced costs and complexity compared with traditional mainframes.

However, what remained was that these new applications continued to operate …

The Emergence Of Data-Centric Computing was written by Timothy Prickett Morgan at The Next Platform.

Accelerating Slow Databases That Wear People Down

Todd Mostak, the creator of the MapD GPU-accelerated database and visualization system, built that database because he was a frustrated user of other database technologies, and as a user he is adamant that accelerating databases and visualizing queried data are about more than just being a speed freak.

“Analytics is ultimately a creative exercise,” Mostak tells The Next Platform during a conversation that was supposed to be about benchmark results but that, as often happens here, wandered far and wide. “Analysts start from some place, and where they go is a function of the resources that are …

Accelerating Slow Databases That Wear People Down was written by Timothy Prickett Morgan at The Next Platform.

Applied Micro Finds ARM Server Footing, Reaches Higher

One of the frustrating facts about peddling any new technology is that the early adopters that discover a strategic advantage in that technology want to keep that secret all to themselves. Word of mouth and real-world use cases are big factors in the adoption of any new technology, and anything that hampers them causes adoption to move more slowly than it otherwise might.

But eventually, despite all of the secrecy, there comes a time when the critical mass is reached and adoption proceeds apace. We have been waiting for that moment for a long time now for 64-bit ARM …

Applied Micro Finds ARM Server Footing, Reaches Higher was written by Timothy Prickett Morgan at The Next Platform.

Making The Case For Containers

Linux container technology is IT’s shiny new thing. Containers promise to ease application development and deployment, a necessity in a business environment where getting ahead of application demand can mean the difference between staying in business and not. Containers offer many benefits, but they are not a panacea, and it’s important to understand why, where, and when to use them.

Most IT pros recognize that application containers can provide a technological edge, one that translates into a clear business advantage. Containers unify and streamline application components – including the libraries and binaries upon which individual applications depend. Combining isolation with …
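
The article doesn’t prescribe a toolchain, but the packaging-plus-isolation idea is easy to sketch. Here is a minimal example using the Docker SDK for Python; the image and command are illustrative assumptions, not anything from the article:

```python
# A minimal sketch of running an application in an isolated container with
# the Docker SDK for Python (pip install docker). The image and command
# are illustrative only.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# The image bundles the application together with the exact libraries and
# binaries it depends on; the container runs it in isolated namespaces.
output = client.containers.run(
    "python:3.11-slim",                           # app plus pinned dependencies
    ["python", "-c", "print('isolated hello')"],  # the workload to run
    remove=True,                                  # discard the container on exit
)
print(output.decode())
```

The same image runs identically on a laptop or a datacenter node, which is the unification the excerpt describes.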

Making The Case For Containers was written by Timothy Prickett Morgan at The Next Platform.

Server Encryption With An FPGA Offload Boost

Everyone talks about infrastructure security, but it comes at a heavy cost. While datacenters have been securing their perimeters with firewalls for decades, this is far from sufficient for modern applications.

Back in the early days of the Internet, all traffic was from the client in through the web and application servers to the back-end database that fed the applications – what is known as north-south traffic in the datacenter lingo. But these days, an application is a collection of multiple services that are assembled on the fly from all over the datacenter, across untold server nodes, in what …
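
An FPGA offload is transparent to the application; what it accelerates is the TLS machinery that encrypting all of that service-to-service, east-west traffic requires. As a minimal sketch of that layer (our illustration, not the article’s, and the certificate paths are placeholders), here is a TLS-wrapped service endpoint built with Python’s standard ssl module:

```python
# Minimal sketch of TLS for service-to-service ("east-west") traffic using
# Python's standard ssl module. Certificate paths are placeholders; each
# service in a real deployment would present its own certificate.
import socket
import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="service.crt", keyfile="service.key")

with socket.create_server(("0.0.0.0", 8443)) as listener:
    with context.wrap_socket(listener, server_side=True) as tls_listener:
        conn, addr = tls_listener.accept()  # TLS handshake happens here
        with conn:
            conn.sendall(b"encrypted east-west reply\n")
```

Every handshake and every encrypted byte in that exchange costs CPU cycles, which is precisely the work an FPGA offload takes off the server’s cores.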

Server Encryption With An FPGA Offload Boost was written by Timothy Prickett Morgan at The Next Platform.

Exascale Code Performance and Portability in the Tune of C

Among the many challenges ahead for programming in the exascale era are the portability and performance of codes on heterogeneous machines.

Since future architectures will include new memory and accelerator capabilities, along with advances in general-purpose cores, developing on a solid base that offers flexibility and support for many hardware architectures is a priority. Some contend that the best place to start is with C++, which has been gathering steam in HPC in recent years.

As our own Douglas Eadline noted back in January, choosing a programming language for HPC used to be an easy task. Select …

Exascale Code Performance and Portability in the Tune of C was written by Nicole Hemsoth at The Next Platform.

Amazon Gets Serious About GPU Compute On Clouds

In the public cloud business, scale is everything – hyper, in fact – and having too many different kinds of compute, storage, or networking makes support more complex and investment in infrastructure more costly. So when a big public cloud like Amazon Web Services invests in a non-standard technology, that means something. In the case of Nvidia’s Tesla accelerators, it means that GPU compute has gone mainstream.

It may not be obvious, but AWS tends to hang back on some of the Intel Xeon compute on its cloud infrastructure, at least compared to the largest supercomputer centers and hyperscalers like …
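
For a sense of what mainstream GPU compute means in practice, requesting GPU capacity on AWS is a one-call affair. A hedged sketch with boto3 follows; the AMI ID is a placeholder and the p2.xlarge instance type is our assumption, so check the current GPU instance lineup:

```python
# Sketch of requesting a GPU compute instance on AWS with boto3
# (pip install boto3). The AMI ID is a placeholder and p2.xlarge is an
# assumed GPU instance type, not a recommendation from the article.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

instances = ec2.create_instances(
    ImageId="ami-xxxxxxxx",    # placeholder: a CUDA-capable machine image
    InstanceType="p2.xlarge",  # instance class with an Nvidia Tesla GPU
    MinCount=1,
    MaxCount=1,
)
print(instances[0].id)
```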

Amazon Gets Serious About GPU Compute On Clouds was written by Timothy Prickett Morgan at The Next Platform.

Windows Server 2016: End Of One Era, Start Of Another

What constitutes an operating system changes with the work a system performs and the architecture that defines how that work is done. All operating systems tend to expand out from their initial core functionality, embedding more and more functions. And then, every once in a while, there is a break, a shift in technology that marks a fundamental change in how computing gets done.

It is fair to say that Windows Server 2016, which made its formal debut at Microsoft’s Ignite conference today and which starts shipping on October 1, is at the fulcrum of a profound change where an …

Windows Server 2016: End Of One Era, Start Of Another was written by Timothy Prickett Morgan at The Next Platform.

Exascale Capabilities Underpin Future of Energy Sector

Oil and natural resource discovery and production is an incredibly risky endeavor, with the cost of simply finding a new barrel of oil tripling over the last ten years. Discovery teams want to ensure they are only drilling in the most lucrative locations, which these days means looking to sources of hydrocarbons that are increasingly inaccessible for a bevy of reasons.

Even with renewable resources like wind, there are still major financial risks. Accurately predicting shifting output and siting expensive turbines are two early-stage challenges, and maintaining, monitoring, and optimizing those turbines is an ongoing pressure.

The common thread …

Exascale Capabilities Underpin Future of Energy Sector was written by Nicole Hemsoth at The Next Platform.

Baidu’s New Yardstick for Deep Learning Hardware Makers

When it comes to deep learning innovation on the hardware front, few other research centers have been as forthcoming with their results as Baidu. Specifically, the company’s Silicon Valley AI Lab (SVAIL) has been the center of some noteworthy work on GPU-based deep learning as well as exploratory efforts using novel architectures specifically for ultra-fast training and inference.

It stands to reason that teams at SVAIL don’t simply throw hardware at the wall to see what sticks, even though they seem to have more to toss around than most. Over the last couple of years, they have broken down …
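
Much of that breaking-down reduces to timing a handful of low-level kernels, with dense matrix multiply (GEMM) dominating deep learning training time. Here is a self-contained sketch of such a measurement with NumPy; the matrix sizes are arbitrary, and this merely stands in for whatever harness SVAIL actually uses:

```python
# Self-contained sketch of benchmarking dense matrix multiply (GEMM), the
# kernel that dominates deep learning training time. Sizes are arbitrary.
import time
import numpy as np

m, n, k = 2048, 2048, 2048
a = np.random.rand(m, k).astype(np.float32)
b = np.random.rand(k, n).astype(np.float32)

a @ b  # warm-up run so the timing excludes one-time setup costs

start = time.perf_counter()
c = a @ b
elapsed = time.perf_counter() - start

flops = 2.0 * m * n * k  # one multiply and one add per inner-loop step
print(f"{flops / elapsed / 1e9:.1f} GFLOP/s")
```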

Baidu’s New Yardstick for Deep Learning Hardware Makers was written by Nicole Hemsoth at The Next Platform.

A Rare Tour Of Microsoft’s Hyperscale Datacenters

If you want to study how datacenter design has changed over the past two decades, a good place to visit is Quincy, Washington. There are five different datacenter operators in this small farming community of around 7,000 people, including Microsoft, Yahoo, Intuit, Sabey Data Centers, and Vantage Data Centers. They have located there thanks to Quincy’s proximity to hydroelectric power generated from the Columbia River and to the relatively cool, arid climate, which can be used to great advantage to keep servers, storage, and switches cool.

All of the datacenter operators are pretty secretive about their glass …

A Rare Tour Of Microsoft’s Hyperscale Datacenters was written by Timothy Prickett Morgan at The Next Platform.

Pushing Database Scalability Up And Out With GPUs

What is good for the simulation and the machine learning is, as it turns out, also good for the database. The performance and thermal limits of traditional CPUs have made GPUs the go-to accelerator for these workloads at extreme scale, and now databases, which are thread monsters in their own right, are also turning to GPUs to get a performance and scale boost.

Commercializing GPU databases takes time, and Kinetica, formerly known as GPUdb, is making a bit of a splash ahead of the Strata+Hadoop World conference next week as it brags about the performance and scale of the parallel …
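
The excerpt doesn’t reveal Kinetica’s internals, but the core reason a database benefits from a GPU is easy to illustrate: a columnar filter is a data-parallel scan that spreads naturally across thousands of GPU threads. A sketch using CuPy (our illustration, not Kinetica’s code):

```python
# Illustrative sketch of why a columnar filter maps well onto a GPU: the
# predicate runs on every row in parallel. Uses CuPy (pip install cupy);
# this is not Kinetica's actual implementation.
import cupy as cp

# A ten-million-row column resident in GPU memory.
prices = cp.random.uniform(0, 1000, size=10_000_000).astype(cp.float32)

# WHERE price > 990: one data-parallel comparison across all rows,
# followed by a parallel compaction of the matching values.
mask = prices > 990.0
hits = prices[mask]

print(int(hits.size), "rows matched")
```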

Pushing Database Scalability Up And Out With GPUs was written by Timothy Prickett Morgan at The Next Platform.

IBM Builds A Bridge Between Private And Public Power Clouds

Two years ago, when Big Blue put a stake through the heart of its impartial attitude about the X86 server business, it was also putting a stake in the ground for its Power systems business.

IBM bet that it could make more money selling Power machinery to its existing customer base while at the same time expanding it out to hyperscalers like Google through the OpenPower Foundation and gradually building out a companion public cloud offering of Power machinery on its SoftLayer cloud and through partners like Rackspace Hosting. This is a big bet, and …

IBM Builds A Bridge Between Private And Public Power Clouds was written by Timothy Prickett Morgan at The Next Platform.

Baking Specialization into Hardware Cools CPU Concerns

As Moore’s Law spirals downward, ultra-high bandwidth memory matched with custom accelerators for specialized workloads might be the only saving grace for the pace of innovation we are accustomed to.

With advancements on both the memory and ASIC sides driven by machine learning and other demanding workloads, this could be great news for big datacenters with inefficient legions of machines dedicated to ordinary processing tasks – jobs that could be far more efficient with more tailored approaches.

We have recently described this trend in the context of architectures built on stacked memory with FPGAs and other custom accelerators inside, and we …

Baking Specialization into Hardware Cools CPU Concerns was written by Nicole Hemsoth at The Next Platform.

The Three Great Lies of Cloud Computing

It’s elastic! It’s on-demand! It scales dynamically to meet your needs! It streamlines your operations, gives you persistent access to data, and it’s always, always cheaper. It’s cloud computing, and it’s here to save your enterprise.

And yet, for all the promise of cloud, there are still segments of IT, such as HPC and many categories of big data analytics, that have been resistant to wholesale outsourcing to public cloud resources. At present, cloud computing makes up only 2.4% of the HPC market by revenue, and although Intersect360 Research forecasts its growth at a robust 10.9%, that still keeps …
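
A quick compounding exercise shows why that growth rate still keeps the cloud slice small. The 2.4% share and 10.9% cloud growth come from the text; the 6% growth rate for the overall HPC market is our assumption for illustration:

```python
# Back-of-the-envelope compounding of the figures quoted above. The 2.4%
# share and 10.9% cloud growth come from the text; the 6% overall HPC
# market growth rate is an assumption for illustration.
cloud_share = 0.024
cloud_growth, market_growth = 0.109, 0.06

for year in range(1, 6):
    cloud_share *= (1 + cloud_growth) / (1 + market_growth)
    print(f"year {year}: cloud is {cloud_share:.1%} of HPC spending")

# Even after five years, cloud is only about 3% of HPC spending.
```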

The Three Great Lies of Cloud Computing was written by Nicole Hemsoth at The Next Platform.

Modern Storage Software Erodes Resistant Data Silos

With the record-breaking $60 billion Dell/EMC acquisition now complete, both companies and their customers have more options than ever before to meet evolving storage needs. Joining forces helps the newly minted Dell Technologies combine the best of both worlds to better serve customers by blending EMC storage and support with Dell pricing and procurement.

But there is some trouble in paradise. Even when sold by the same vendor, most storage systems have been designed as secluded islands of data, meaning they aren’t terribly good at talking to each other.

In fact, this silo effect is exacerbated …

Modern Storage Software Erodes Resistant Data Silos was written by Timothy Prickett Morgan at The Next Platform.

The Server At Peak X86

One of the reasons why Dell spent $60 billion on the EMC-VMware conglomerate was to become the top supplier of infrastructure in the corporate datacenter. But even before the deal closed, Dell was on its way – somewhat surprisingly to many – to toppling Hewlett Packard Enterprise as the dominant supplier of X86 systems in the world.

But that computing world is set to change, we think. And perhaps more quickly – some might say jarringly – than any of the server incumbents are prepared to absorb.

After Intel, with the help of a push from AMD a decade ago, …

The Server At Peak X86 was written by Timothy Prickett Morgan at The Next Platform.

The Next Wave of Deep Learning Applications

Last week we described the next stage of deep learning hardware developments in some detail, focusing on a few specific architectures that capture what the rapidly evolving field of machine learning algorithms requires. This week we are focusing on a trend that is moving faster than the devices can keep up with: the codes and application areas that are set to make this market spin in 2017.

It was with reserved skepticism that we listened, not even one year ago, to dramatic predictions about the future growth of the deep learning market – numbers that climbed into the billions despite the fact …

The Next Wave of Deep Learning Applications was written by Nicole Hemsoth at The Next Platform.

So, You Want to Program Quantum Computers?

The jury is still out when it comes to how wide-ranging the application set and market potential for quantum computing will be. Optimistic estimates project that in the 2020s it will be a billion-dollar field, while others expect the novelty will wear off and the one company behind the actual production of quantum annealing machines will go bust.

Ultimately, which direction the market takes with quantum computing will depend on two things. First, applications of sufficient value to warrant the cost of quantum systems have to be in place. Second, and connected to that point, is the …
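
As for what programming such a machine even looks like, a gate-model quantum program is, at bottom, linear algebra on a state vector. A self-contained toy, not tied to any vendor’s hardware or SDK: put one qubit into superposition with a Hadamard gate and sample it.

```python
# Toy state-vector simulation of a single qubit: apply a Hadamard gate,
# then measure. Self-contained; not tied to any vendor's hardware or SDK.
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = np.array([1.0, 0.0])  # the qubit starts in |0>
state = H @ state             # now an equal superposition of |0> and |1>

probs = np.abs(state) ** 2    # Born rule: probability = |amplitude|^2
samples = np.random.choice([0, 1], size=1000, p=probs)
print("P(0) ~", (samples == 0).mean())  # about 0.5
```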

So, You Want to Program Quantum Computers? was written by Nicole Hemsoth at The Next Platform.

When Will Containers Be the Total Package for HPC?

While containers are old news in enterprise circles, high performance computing centers have, by and large, only recently begun to consider packaging up their complex applications. A few centers are known for their rapid progress in this area, but for smaller sites, especially those that serve users from a diverse domain base via medium-sized HPC clusters, progress has been slower – even though containers could zap some serious deployment woes and make collaboration simpler.

When it comes to containers in HPC, there are a couple of noteworthy efforts that go beyond the more enterprise-geared Docker and CoreOS options. These include Shifter out …

When Will Containers Be the Total Package for HPC? was written by Nicole Hemsoth at The Next Platform.