Because Microsoft has shifted to a more gradual upgrade cadence for Windows Server, many of the features that will become available with Windows Server 2019 have already been in use in live corporate networks. Here are half a dozen of the best.
Enterprise-grade hyperconverged infrastructure (HCI)
With the release of Windows Server 2019, Microsoft rolls up three years of updates for its HCI platform. That’s because the gradual upgrade schedule Microsoft now uses includes what it calls Semi-Annual Channel releases – incremental upgrades delivered as they become available. Then, every couple of years, it creates a major release called the Long-Term Servicing Channel (LTSC) version, which includes the upgrades from the preceding Semi-Annual Channel releases.
Microsoft is set to make Windows Server 2019 generally available in the second half of the year, opening up access to its preview build through its Insiders program now and targeting data centers with new features to handle hybrid cloud setups and hyperconverged infrastructure. The next version of Windows Server also adds new security features and enhances support for containers and Linux.
If you want to check out the release for yourself, sign up for the Insiders program.
As spring finally rolls around and the frost melts away (except here in New England), the change in seasons not only brings us ultraviolet-B-induced vitamin D, but also shines some healthy sunlight on the clutter the long winter leaves behind, both at home and in the data center. Spring brings an opportunity to reevaluate data center cleaning habits and to ask whether a more practical, day-to-day strategy would benefit data center managers in the long run. Instead of enduring the spring cleaning process, data center managers should rethink their current strategy, which often leads to seasonal operational overhauls, and instead consider a more efficient method that draws on granular operational data and analytics. So, with the spring equinox upon us, here are a few tips to help data center managers reap the advantages of year-round cleaning rather than reaching for that old and tired broom.
IBM has launched the latest effort to bring the nature of the cloud to the on-premises data center with Cloud Private for Data. It's an integrated data science, engineering and development platform designed to help companies gain insights from data sources such as IoT, online commerce, and mobile data. Cloud Private for Data builds on IBM Cloud Private, a private cloud platform IBM introduced in November that brought Kubernetes into the data center. Cloud Private for Data expands on that greatly, adding IBM Streams for data ingestion, IBM Data Science Experience, Information Analyzer, Information Governance Catalogue, DataStage, Db2, and Db2 Warehouse. All run on the Kubernetes platform, allowing services to be deployed “in minutes,” IBM claimed, and to scale up or down automatically as needed.
It wasn’t that long ago that Amazon CTO Werner Vogels routinely said that any strategy that included on-premises data centers and the public cloud was really just a path to public cloud. Yet today, AWS touts architectures that include both. Microsoft has gone a step further with Azure Stack so that the public cloud vs. data center experience is as seamless as possible for its customers. Google, meanwhile, continues to invest in technologies that admit that some services will stay in private data centers (but you might as well make nice APIs for them, while also making it easier for on-premises business logic to consume public cloud services).
As data centers are called upon to handle an explosion of unstructured data fed into a variety of cutting-edge applications, the future for FPGAs looks bright. That’s because FPGAs, or field-programmable gate arrays, are essentially chips that can be programmed, after manufacturing, to act as custom accelerators for workloads including machine learning, complex data analysis, video encoding, and genomics – applications that have far-reaching consequences for communications, networking, health care, the entertainment industry and many other businesses. Such applications lend themselves to parallel processing, an important feature of FPGAs, which can also be reconfigured on the fly to handle new features as the nature of these workloads evolves.
What would your organization do if your cloud provider were to go out of business? What happens if your cloud provider suddenly stops offering critical services that your organization requires for its business to function properly? Businesses need to start asking these important questions and develop plans to address these scenarios. The cloud is a young market that continues to grow, and more small players are offering their services. According to Gartner, cloud system infrastructure services (IaaS) are expected to grow from $45.8 billion in revenue in 2018 to $72.4 billion in 2020. As the market matures, it's only natural that some of these organizations will disappear or stop offering certain services. In 2013, Nirvanix stopped offering its cloud services and gave customers only two weeks’ notice to move their data off its platform.
Several things make bare-metal cloud providers appealing compared with traditional cloud providers, which operate in a virtualized environment. Bare-metal providers give users more control, more access to hardware, more performance, and the ability to pick their own operating environment. There's another interesting angle, as articulated by Martin Blythe, a research fellow with Gartner. He maintains that bare-metal providers appeal to small and mid-sized businesses (SMBs) because those providers are often small, local players, and SMBs looking for something more economical than hosting their own data center often want to keep that data center nearby.
Hyperconvergence is winning over enterprises that are drawn to its potential to ease management, streamline the deployment of new workloads, and optimize infrastructure costs. As much as 20 percent of business-critical applications currently deployed on three-tier IT infrastructure will transition to hyperconverged infrastructure by 2020, predicts Gartner, which recently gave the technology its own magic quadrant.
The magic quadrant is Gartner’s signature format for tech market analysis, and in prior years, the research firm tackled hyperconvergence as part of its integrated-systems research.
Ava Robotics, a startup with strong technical ties to iRobot, just announced its telepresence robot. Her name is Ava, and she’s likely to win a lot of hearts. For one thing, Ava is quite perceptive, using video technology from Cisco and integrating with Cisco Spark (which provides tools for team messaging, online meetings, and whiteboarding). Ava is also quite friendly. She allows her users to participate in remote meetings, wander down hallways at other facilities while chatting with colleagues, and enjoy face-to-face discussions with people who may physically be thousands of miles away.
Telepresence robots provide a lot of benefits to companies that are spread across many locations — especially those spanning continents — or with staff who work from home. They make work relationships considerably more productive — even for individuals who may have never met in person. Carrying on casual conversations and checking remote data centers and manufacturing facilities (sometimes safer than being there in person) can make huge differences in how staffs coordinate and get important work done.
VMware has expanded its portfolio of cloud tools to help enterprises improve the manageability of their public cloud and on-premises environments. At the same time, VMware announced the first global expansion of VMware Cloud on AWS, its joint hybrid cloud service with Amazon Web Services. Complexity is on the rise for enterprises as they expand their use of cloud computing – which often is not limited to a single cloud provider. VMware estimates that nearly two-thirds of companies will use two or more cloud service providers in addition to their on-premises data centers.
Cloud services, particularly infrastructure- and platform-as-a-service, are well established, but in some cases customers demand more — more control, more access to hardware, more performance, and the ability to pick their own operating environment. In those cases, they are looking to bare-metal services, a niche that is growing quickly. As the name implies, bare metal means no software, just CPUs, memory, and storage. Customers provide all of the software from the operating system on up. That means a dedicated CPU, full access to the hardware, and freedom to run custom operating systems. According to a 2016 Markets and Markets report, the bare-metal cloud market is expected to grow from $871.8 million in 2016 to $4.7 billion in 2021, an estimated compound annual growth rate of 40.1 percent.
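As a sanity check on those Markets and Markets figures, the implied compound annual growth rate can be recomputed directly (a minimal sketch; treating 2016 to 2021 as five compounding periods is an assumption):

```python
# Recompute the CAGR implied by the figures cited above:
# $871.8 million in 2016 growing to $4.7 billion in 2021.
start, end, years = 871.8e6, 4.7e9, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # prints "40.1%", matching the reported growth rate
```

The arithmetic checks out: a fivefold-plus increase over five years works out to roughly 40 percent compounded annually.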
Cisco this week expanded its Tetration Analytics system to let users quickly detect software vulnerabilities and more easily manage the security of the key components in their data centers. Introduced in 2016, the Cisco Tetration Analytics system gathers information from hardware and software sensors and analyzes it using big-data analytics and machine learning to offer IT managers a deeper understanding of their data center resources. The idea behind Tetration includes the ability to dramatically improve enterprise security monitoring, simplify operational reliability, and speed application migrations to software-defined networking.
A startup called ZincFive is set to launch a modular uninterruptible power supply (UPS) for data center computers using venerable nickel-zinc technology, which it claims is more efficient than lithium-ion. Nickel-zinc batteries were invented by Thomas Edison in 1901 but fell out of favor as newer designs emerged, owing to limitations such as a low number of charge cycles and an inability to hold a charge for long. On the plus side, the batteries could deliver a stronger charge and didn’t use the toxic metals that make other batteries difficult to recycle. And they are not flammable, something lithium-ion batteries certainly can’t claim.
It’s been 160 years since the world’s first submarine cable linked a remote corner of Trinity Bay, Newfoundland, with Valentia Island on the west coast of Ireland in 1858. That telegraph cable failed after three weeks, but a new method for transoceanic communications had been established, and today submarine cables are a critical piece of digital infrastructure that’s fast expanding in prevalence and prominence globally – though not yet quickly enough to meet voracious demand for capacity. Between 2013 and 2017, the subsea cable industry added an average of 32 percent more capacity annually on major submarine cable routes, according to the industry magazine SubTel Forum. Still, the industry needs to do more. “It will have to increase activity to stay ahead of demand,” SubTel Forum said in its annual report this year.
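To put that growth rate in perspective, compounding 32 percent a year over the 2013–2017 window implies capacity roughly tripled (a rough sketch; treating the window as four annual compounding periods is an assumption):

```python
# Compound 32% year-over-year capacity growth across 2013-2017
# (assumption: four annual compounding periods).
annual_growth = 1.32
periods = 4  # 2013 -> 2017
multiple = annual_growth ** periods
print(f"{multiple:.2f}x")  # roughly a threefold increase in capacity
```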
In-house IT hardware spending has been on hold thanks to executives flip-flopping on whether to move to cloud computing, not because they've actually shifted to cloud services. The problem has been mere inertia, caused in companies by "decision-making around the cloud," says Morgan Stanley in a financial research note published this week. The financial services firm suggests that once enterprises complete their cloud assessments, their checkbooks will open once more.
In fact, Morgan Stanley, which advises people on industry investments, says investors could see double-digit earnings growth from the IT hardware sector in 2018. And it has upgraded its fiscal view from "cautious" to "attractive."
With the release of Microsoft Windows Server 2016 a couple of years ago, Microsoft directly entered the hyperconverged infrastructure (HCI) platform space that has been served by organizations like Nutanix, Scale, Cisco, HP, Dell, and others — only Microsoft comes at it with a fully software-defined platform rather than hardware and appliances.
For about as long as there have been personal computers, there has been an aftermarket of system optimization software. Even MS-DOS, which was about as basic as an operating system gets, had QEMM to get the most out of your 640K of memory. These days, there is a healthy market of Windows optimization utilities to speed up your PC. For servers, though, it gets a little more complicated. Actually, it gets very complicated. Not only does each server have to operate at peak efficiency on its own, but it then has to interact with the network, with other servers, and potentially with a public cloud service provider.
And usage models change over time. There might be peak use times when certain processes are not run, such as backups, and slow times of day when other tasks can be run. So an optimal configuration at one point in the day is not optimal at a different time of the day.