Author Archives: Jeffrey Burt

IBM Combines PowerAI, Data Science Experience in Enterprise AI Push

IBM has spent the past several years putting a laser focus on what it calls cognitive computing, using its Watson platform as the foundation for its efforts in such emerging fields as artificial intelligence (AI) and its successful spinoff, deep learning. Big Blue has leaned on Watson technology, its traditional Power systems, and increasingly powerful GPUs from Nvidia to drive its efforts to not only bring AI and deep learning into the cloud, but also to push AI into the enterprise.

The technologies are part of a larger push in the industry to help enterprises transform their businesses to take

IBM Combines PowerAI, Data Science Experience in Enterprise AI Push was written by Jeffrey Burt at The Next Platform.

Red Hat Stretches Gluster Clustered Storage Under Containers

Red Hat has been aggressive in building out its capabilities around containers. The company last month unveiled its OpenShift Container Platform 3.6, its enterprise-grade Kubernetes container platform for cloud native applications that added enhanced security features and greater consistency across hybrid and multi-cloud deployments.

A couple of weeks later, Red Hat and Microsoft expanded their alliance to make it easier for organizations to adopt containers. Red Hat last year debuted OpenShift 3.0, which was based on the open source Kubernetes orchestration system and Docker containers, and the company has since continued to roll out enhancements to the platform.

The

Red Hat Stretches Gluster Clustered Storage Under Containers was written by Jeffrey Burt at The Next Platform.

IBM Brings Analytics To The Data For Faster Processing

Data analytics is a rapidly evolving field, and IBM and other vendors over the past several years have built numerous tools to address segments of it. But now Big Blue is shifting its focus to give data scientists and developers the technologies they need to more easily and quickly analyze data and derive insights that they can apply to their business strategies.

“We have [created] a ton of different products that solve parts of the problem,” Rob Thomas, general manager of IBM Analytics, tells The Next Platform. “We’re moving toward a strategy of developing platforms for analytics. This trend

IBM Brings Analytics To The Data For Faster Processing was written by Jeffrey Burt at The Next Platform.

Anaconda Teams With Microsoft In Machine Learning Push

Microsoft is embedding Anaconda’s Python distribution into its Azure Machine Learning products, the latest move by the software vendor to expand its capabilities in the fast-growing artificial intelligence space and an example of Anaconda extending its reach beyond high performance computing and into AI.

The two companies announced the partnership this week at the Strata Data Conference in New York City, with the news dovetailing with other announcements around AI that Microsoft officials made this week at the company’s own Ignite 2017 show. The vendors said they will offer Anaconda for Microsoft, which they described as a subset of the Anaconda

Anaconda Teams With Microsoft In Machine Learning Push was written by Jeffrey Burt at The Next Platform.

Unifying Massive Data at Cloud Scale

Enterprises continue to struggle with the issue of data: how to process and move the massive amounts that are coming in from multiple sources, how to analyze the different types of data to extract the most value from it, and how to store and unify it across various environments, including on-premises infrastructure and cloud environments. A broad array of major storage players, such as Dell EMC, NetApp, and IBM, are building out their offerings to create platforms that can do a lot of those things.

MapR Technologies, which made its bones with its commercial Hadoop distribution, is moving in a similar direction.

Unifying Massive Data at Cloud Scale was written by Jeffrey Burt at The Next Platform.

HPE Looks Ahead To Composable Infrastructure, Persistent Memory

Over the past several years, the server market has been roiled by the rise of cloud providers that run the applications created by companies and by services offered by hyperscalers that augment or replace such applications. This is a tougher and lumpier market, to be sure.

The top-tier cloud providers like Amazon, Microsoft, and Google not only have become key drivers in server sales but also have turned to original design manufacturers (ODMs) from Taiwan and China for lower cost systems to help populate their massive datacenters. Overall, global server shipments have slowed, and top-tier OEMs are working to

HPE Looks Ahead To Composable Infrastructure, Persistent Memory was written by Jeffrey Burt at The Next Platform.

Unifying Oil and Gas Data at Scale

The oil and gas industry has been among the most aggressive in pursuing internet of things (IoT), cloud and big data technologies to collect, store, sort and analyze massive amounts of data in both the drilling and refining sectors to improve efficiencies and decision-making capabilities. Systems are increasingly becoming automated, sensors are placed throughout processes to send back data on the various systems, and software has been put in place to crunch the data into useful information.

According to a group of researchers from Turkey, the oil and gas industry is well suited to embrace all the new

Unifying Oil and Gas Data at Scale was written by Jeffrey Burt at The Next Platform.

Logistics in Application Path of Neural Networks

Accurately forecasting resource demand within the supply chain has never been easy, particularly given the constantly changing nature of the data over time.

What may have been true in measurements around demand or logistics one minute might be entirely different an hour, day or week later, which can throw off a short-term load forecast (STLF) and lead to costly over- or under-estimations, which in turn can lead to too much or too little supply.
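As a toy illustration of that risk (this is not a model from any of the efforts described here; the numbers and the three-reading window are made up), a naive forecast built only from recent history misses a sudden demand shift entirely:

```python
def naive_stlf(history, window=3):
    """Forecast the next load as the mean of the last `window` observations."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# Demand looks stable over the most recent intervals...
stable = [100, 102, 98, 101, 99]
forecast = naive_stlf(stable)      # ~99.3

# ...but jumps in the very next interval, so the forecast badly
# under-estimates the supply that is actually needed.
actual_next = 150
print(forecast, actual_next)
```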

To improve such forecasts, there are multiple efforts underway to create new models that can more accurately predict load needs, and while they have

Logistics in Application Path of Neural Networks was written by Jeffrey Burt at The Next Platform.

A Health Check For Code And Infrastructure In The Cloud

As businesses continue their migration to the cloud, the issue of monitoring the performance and health of their applications gets more challenging as they try to track them across on-premises environments and both private and public clouds. At the same time, as they become more cloud-based, they have to keep an eye on the entire stack, from the customer-facing applications to the underlying infrastructure they run on.

Since its founding eight years ago, New Relic has steadily built upon its first product, a cloud-based application performance management (APM) tool that is designed to assess how well the

A Health Check For Code And Infrastructure In The Cloud was written by Jeffrey Burt at The Next Platform.

Dell EMC Upgrades Flash in High-End Storage While Eyeing NVMe

When Dell acquired EMC in its massive $60 billion-plus deal last year, it boasted that Dell was inheriting a boatload of new technologies that would help propel forward its capabilities and ambitions with larger enterprises.

That included offerings ranging from VMware’s NSX software-defined networking (SDN) platform to Virtustream and its cloud technologies for running mission-critical applications from the likes of Oracle, SAP, and Microsoft off-premises. Dell was also acquiring EMC’s broad and highly popular storage portfolio, in particular the high-end VMAX, XtremIO, and newer ScaleIO lineups, as well as its Isilon storage arrays for high performance workloads.

Dell

Dell EMC Upgrades Flash in High-End Storage While Eyeing NVMe was written by Jeffrey Burt at The Next Platform.

Red Hat Is The Gatekeeper For ARM In The Datacenter

If any new hardware technology is going to get traction in the datacenter, it has to have the software behind it. And as the dominant supplier of commercial Linux, Red Hat’s support of ARM-based servers gives the upstart chip makers like Applied Micro, Cavium, and Qualcomm the leverage to help pry the glasshouse doors open and get a slice of the server and storage business that is so utterly dominated by Intel’s Xeon processors today.

It is now or never for ARM in the datacenter, and that means Red Hat has to go all the way and not just support

Red Hat Is The Gatekeeper For ARM In The Datacenter was written by Jeffrey Burt at The Next Platform.

Red Hat Gears Up OpenShift For Developers

During the five years that Red Hat has been building out its OpenShift cloud applications platform, much of the focus has been on making it easier to use for customers looking to adapt to an increasingly cloud-centric world for both new and legacy applications. Just as it did with the Linux operating system through Red Hat Enterprise Linux and related middleware and tools, the vendor has worked to make it easier for enterprises to embrace OpenShift.

That has included a major reworking of the platform with the release of version 3.0 last year, which ditched Red Hat’s in-house technologies for

Red Hat Gears Up OpenShift For Developers was written by Jeffrey Burt at The Next Platform.

Swiss Army Knife File System Cuts Through Petabytes

Petabytes are in the future of every company, and luckily, the future is always being invented by the IT ecosystem to handle it.

Those wrestling with tens to hundreds of petabytes of data today are constantly challenged to find the best ways to store, search and manage it all. Qumulo was founded in 2012 and came out of the chute two years ago with the idea of a software-based file system with built-in analytics that enable the system to increase capacity as the amount of data grows. QSFS, now called Qumulo Core, also does it all: fast with big

Swiss Army Knife File System Cuts Through Petabytes was written by Jeffrey Burt at The Next Platform.

Machine Learning Storms Into Climate Research

The fields where machine learning and neural networks can have positive impacts seem almost limitless. From healthcare and genomics to pharmaceutical development, oil and gas exploration, retail, smart cities and autonomous vehicles, the ability to rapidly and automatically find patterns in massive amounts of data promises to help solve increasingly complex problems and speed up discoveries that will improve lives, create a healthier world and make businesses more efficient.

Climate science is one of those fields that will see significant benefits from machine learning, and scientists in the field are pushing hard to see how the technology can help them

Machine Learning Storms Into Climate Research was written by Jeffrey Burt at The Next Platform.

Red Hat Tunes Up OpenShift For Legacy Code In Kubernetes

When Red Hat began building out its OpenShift cloud application platform more than five years ago, the open source software vendor found itself in a similar situation as others in the growing platform-as-a-service (PaaS) space: they were all using technologies developed in-house because there were no real standards in the industry that could be used to guide them.

That changed about three years ago, when Google officials decided to open source the technology – called Borg – they were using internally to manage the search giant’s clusters and make it available to the wider community. Thus was born Kubernetes,

Red Hat Tunes Up OpenShift For Legacy Code In Kubernetes was written by Jeffrey Burt at The Next Platform.

ARM Pioneer Sophie Wilson Also Thinks Moore’s Law Coming to an End

Intel might have its own thoughts about the trajectory of Moore’s Law, but many leaders in the industry have views that vary slightly from the tick-tock we keep hearing about.

Sophie Wilson, designer of the original Acorn Micro-Computer in the 1970s and later developer of the instruction set for ARM’s low-power processors that have come to dominate the mobile device world, has such thoughts. And when Wilson talks about processors and the processor industry, people listen.

Wilson’s message is essentially that Moore’s Law, which has been the driving force behind chip development in particular and the computer industry

ARM Pioneer Sophie Wilson Also Thinks Moore’s Law Coming to an End was written by Jeffrey Burt at The Next Platform.

Memory And Logic In A Post Moore’s Law World

The future of Moore’s Law has become a topic of hot debate in recent years, as the challenge of continually shrinking transistors and other components has grown.

Intel, AMD, IBM, and others continue to drive the development of smaller electronic components as a way of ensuring advancements in compute performance while driving down the cost of that compute. Processors from Intel and others are moving now from 14 nanometer processes down to 10 nanometers, with plans to continue on to 7 nanometers and smaller.

For more than a decade, Intel had relied on a tick-tock manufacturing schedule to keep up with

Memory And Logic In A Post Moore’s Law World was written by Jeffrey Burt at The Next Platform.

Keeping The Blue Waters Supercomputer Busy For Three Years

After years of planning, and delays following a massive architectural change, the Blue Waters supercomputer at the National Center for Supercomputing Applications at the University of Illinois finally went into production in 2013, giving scientists, engineers and researchers across the country a powerful tool to run and solve the most complex and challenging applications in a broad range of scientific areas, from astrophysics and neuroscience to biophysics and molecular research.

Users of the petascale system have been able to simulate the evolution of space, determine the chemical structure of diseases, model weather, and trace how virus infections propagate via air

Keeping The Blue Waters Supercomputer Busy For Three Years was written by Jeffrey Burt at The Next Platform.

Tuning Up Knights Landing For Gene Sequencing

The Smith-Waterman algorithm has become a linchpin in the rapidly expanding world of bioinformatics, the go-to computational model for DNA sequencing and local sequence alignments. With the growth in recent years in genome research, there has been a sharp increase in the amount of data around genes and proteins that needs to be collected and analyzed, and the 36-year-old Smith-Waterman algorithm is a primary way of sequencing the data.

The key to the algorithm is that rather than examining an entire DNA or protein sequence, Smith-Waterman uses a technique called dynamic programming in which the algorithm looks at segments of
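The dynamic-programming recurrence at the heart of the algorithm is compact enough to sketch. Below is a minimal, illustrative Python version of the Smith-Waterman scoring matrix; the function name and the match, mismatch, and gap values are arbitrary choices for this example, and it leaves out the traceback step and the vectorization that a Knights Landing port would rely on.

```python
def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-1):
    """Return the best local alignment score between sequences a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    # H[i][j] holds the best score of an alignment ending at a[i-1], b[j-1].
    H = [[0] * cols for _ in range(rows)]
    best = 0
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            up   = H[i - 1][j] + gap        # gap opened in sequence b
            left = H[i][j - 1] + gap        # gap opened in sequence a
            H[i][j] = max(0, diag, up, left)  # local alignment: never below zero
            best = max(best, H[i][j])
    return best

# Best local alignment score between two short DNA fragments.
print(smith_waterman_score("GGTTGACTA", "TGTTACGG"))
```

Because every cell depends only on its upper, left, and upper-left neighbors, these inner loops are exactly what optimized implementations vectorize and parallelize on wide-vector chips such as Knights Landing.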

Tuning Up Knights Landing For Gene Sequencing was written by Jeffrey Burt at The Next Platform.

3D Stacking Could Boost GPU Machine Learning

Nvidia has staked its growth in the datacenter on machine learning. Over the past few years, the company has rolled out features in its GPUs aimed at neural networks and related processing, notably with the “Pascal” generation GPUs, which include features explicitly designed for the space, such as 16-bit half precision math.
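As a rough illustration of what “16-bit half precision” means numerically, the sketch below uses NumPy’s float16 on the CPU as a stand-in for GPU FP16 (no Nvidia-specific API is implied, and the array contents are arbitrary): each value takes half the storage of single precision, at the cost of roughly three significant decimal digits instead of about seven.

```python
import numpy as np

weights32 = np.array([0.1234567, -1.9876543, 3.1415927, 0.0001234], dtype=np.float32)
weights16 = weights32.astype(np.float16)   # half the bytes per value

print(weights32)                            # ~7 significant decimal digits survive
print(weights16)                            # ~3 significant decimal digits survive
print(weights32.nbytes, weights16.nbytes)   # 16 bytes vs 8 bytes for 4 values
```

For neural network training and inference, that loss of precision is usually tolerable, while the halved memory footprint and bandwidth translate into higher throughput.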

The company is preparing its upcoming “Volta” GPU architecture, which promises to offer significant gains in capabilities. More details on the Volta chip are expected at Nvidia’s annual conference in May. CEO Jen-Hsun Huang late last year spoke to The Next Platform about what he called the upcoming “hyper-Moore’s Law”

3D Stacking Could Boost GPU Machine Learning was written by Jeffrey Burt at The Next Platform.