For the past seven years, I’ve conducted Uptime Institute’s Annual Data Center Industry Survey (over 1,000 end-user respondents from around the globe, conducted by email). Every year, some trend jumps out as the main theme. Maybe it’s because I’m turning 40 this year, but my takeaway from 2017 is that enterprise data center professionals need to relax and reevaluate what’s important to their organizations. Over the course of the survey, I’ve watched our respondents wrestle with uncertainties as the IT profession continues to evolve. But the data from this year’s survey shows that many of the industry’s concerns are not coming to pass, while chronic management problems go untended.
Welcome to the relaunch of my blog. We are undergoing a slight change in direction here at Network World, and with it a change in direction for this blog. Instead of covering Microsoft issues, I will be focused on data center issues, a change I am looking forward to because I love all things big iron. So, on to the show.
-----------------------------------------------
The state of Florida is not the first place you think of when it comes to tech. IBM had that legendary Boca Raton facility, and there are a few firms here and there, but it pales in comparison to California, Oregon, Washington and Texas. The state is looking to change that in a unique way. Instead of luring tech firms, it’s looking to lure data centers. The state legislature has passed, and Governor Rick Scott has signed, legislation granting sales tax exemptions for large data center projects. The law goes into effect July 1, 2017.
At its Discover conference this week, HPE is pulling back the curtains on firmware security and advances in software-defined IT aimed at reducing costs, increasing system flexibility for its users and helping the company stay ahead of competitors in next-generation infrastructure. There is plenty of competition in the market for converged and hyperconverged data center systems, but at the moment HPE has the lead in composable infrastructure, a term gaining currency in the system management world. Composable infrastructure allows data center managers to deploy infrastructure resources using software commands, notes Patrick Moorhead, founder of Moor Insights and Strategy.
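To make that idea concrete, here is a minimal sketch of what composing infrastructure through software might look like. The resource pool, the compose() helper and the profile fields are hypothetical illustrations of the pattern, not HPE's OneView or Synergy API.

```python
# Hypothetical illustration of composable infrastructure: logical servers are
# carved out of shared resource pools by software, not by physical rewiring.
from dataclasses import dataclass


@dataclass
class ResourcePool:
    cpu_cores: int
    storage_tb: int
    nics: int


@dataclass
class ComposedNode:
    name: str
    cpu_cores: int
    storage_tb: int
    nics: int


def compose(pool: ResourcePool, profile: dict) -> ComposedNode:
    """Allocate a logical server from the pool based on a declarative profile."""
    for key in ("cpu_cores", "storage_tb", "nics"):
        if getattr(pool, key) < profile[key]:
            raise RuntimeError(f"pool exhausted: not enough {key}")
        setattr(pool, key, getattr(pool, key) - profile[key])
    return ComposedNode(name=profile["name"], cpu_cores=profile["cpu_cores"],
                        storage_tb=profile["storage_tb"], nics=profile["nics"])


pool = ResourcePool(cpu_cores=512, storage_tb=200, nics=64)
db_node = compose(pool, {"name": "db-01", "cpu_cores": 32, "storage_tb": 20, "nics": 4})
print(db_node)
print(pool)  # remaining capacity returns to the pool's bookkeeping
```

The point of the pattern is that the same pool can be recomposed for a different workload minutes later, entirely through software.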
When there isn’t much else to choose between brands, customer service becomes an important differentiator, and in financial services the situation has become acute. As regulators continue to make it easier for customers to switch providers, financial institutions must spend as much time keeping existing account holders happy as they do wooing new ones. Issuing apps and making it easier for customers to bank and source products online is a good start, but account holders will soon notice and defect if such moves are really a thinly disguised attempt to reduce costs and close branches.
Bandwidth, storage space and computing power (CPU, RAM and so on) on your web servers represent a distinct and noteworthy cost for any company with a major online presence. As traffic to a company’s website increases, most opt to throw money at the problem: more servers, caching systems, more bandwidth. But these are Band-Aids, temporary solutions that will only suffice for so long before yet another round of "throw money at the problem" is required to keep up with ever-growing web traffic. The real problem is simple: Your web pages are just plain too big. Way too big. Enormously large. The average website is 2.9 MB in size (as of May 15, 2017). And that's just the average, an average that is growing. Fast.
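A rough back-of-the-envelope calculation shows why page weight translates directly into infrastructure cost. The traffic volume and per-GB transfer price below are illustrative assumptions, not figures from the article.

```python
# Rough, illustrative math: how page weight drives bandwidth cost.
avg_page_mb = 2.9              # average page weight cited above (May 2017)
monthly_pageviews = 5_000_000  # assumed traffic for a mid-sized site
cost_per_gb = 0.08             # assumed egress price, USD per GB

monthly_transfer_gb = avg_page_mb * monthly_pageviews / 1024
monthly_cost = monthly_transfer_gb * cost_per_gb
print(f"~{monthly_transfer_gb:,.0f} GB/month, ~${monthly_cost:,.0f} in egress")

# Trimming the same pages to 1.5 MB cuts that bill roughly in half.
trimmed_cost = (1.5 * monthly_pageviews / 1024) * cost_per_gb
print(f"at 1.5 MB/page: ~${trimmed_cost:,.0f}")
```

Under those assumptions, slimming pages saves as much as a hardware refresh would, without buying anything.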
While you may associate web-scale networking with cloud giants like Facebook, Google and Amazon, it’s not just an architecture for the largest enterprises anymore. The industry has looked at data centers like theirs and asked the question: “What are they doing that we can mimic at a smaller scale?” Through analysis of the way these organizations ran, the term “web-scale” was born, referring specifically to the hyperscale website companies that have built private, efficient and scalable cloud environments. Since then, it has become a growing model for organizations to adopt on their journey toward evolving for the future.
The new APC Back-UPS BE600M1 provides instant battery power to your critical electronics when the power goes out, keeping you connected and available both personally and professionally. Designed specifically to enhance the features that matter most, including more runtime, more battery backup outlets and a USB port for charging convenience, the BE600M1 is also smaller and lighter than the previous model. APC's BE600M1 offers guaranteed surge and lightning protection for attached devices. When the power goes out, it will keep critical devices running, including home networking equipment, allowing you to maintain your internet connection, work productively, avoid the loss of valuable data and safely shut down equipment. It currently averages 4.5 out of 5 stars from over 4,400 people on Amazon (read reviews), where its list price of $74.99 has been reduced 22% to $58.73.
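For a sense of how long a small UPS like this can carry a home network, here is a simple runtime estimate. The battery energy, efficiency and load figures are assumptions for illustration; check APC's published runtime charts for real numbers.

```python
# Illustrative runtime estimate for a small standby UPS.
battery_wh = 80.0            # assumed usable battery energy in watt-hours
inverter_efficiency = 0.85   # assumed DC-to-AC conversion efficiency
load_watts = 25.0            # assumed draw of a cable modem plus Wi-Fi router

runtime_hours = battery_wh * inverter_efficiency / load_watts
print(f"Estimated runtime at {load_watts:.0f} W: {runtime_hours:.1f} hours")
```

The takeaway is that a modest networking load can ride through a multi-hour outage, while a desktop PC on the same unit would drain it in minutes.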
Software is now dominating IT spending. My research shows that the combination of SaaS and on-premises software is now a $650 billion market that has seen steady growth of 6 percent per annum over the past five years. Today, almost all areas of IT are sold at least partially as software, including applications, security, storage and network infrastructure. Software is agile, enables rapid innovation and is a key component of digital transformation. This is one reason enterprise agreements (EAs) for software have become increasingly popular with corporate buyers. Enterprise agreements are software site licenses issued to a large company, bringing consistency to pricing and allowing for widespread use of the application throughout the company. EAs have become a common option for almost every large software company today.
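As a quick sanity check on those figures, compounding 6 percent a year backward from $650 billion shows how much the market has expanded. The calculation is illustrative and assumes steady annual growth.

```python
# Back out the implied market size five years ago from the figures above.
current_market_b = 650   # $650B today
annual_growth = 0.06     # 6% per year
years = 5

earlier_market_b = current_market_b / (1 + annual_growth) ** years
print(f"Implied market ~5 years ago: ${earlier_market_b:.0f}B")  # roughly $486B
```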
It was a busy holiday weekend, and Great Britain’s national flag carrier was forced to ground all flights out of London’s two main airports, Heathrow and Gatwick, which affected the airline’s operations around the world. Oh, and the incident also affected British Airways’ call centers and online booking sites, making the situation even more frustrating for stranded passengers. Most operations have now been restored, the airline says, but more than 1,000 flights were canceled and 75,000 passengers stranded. But here’s the thing: The problems weren’t due to some evil cyber attack or ransomware assault. Nope, it was just another “global IT system failure,” reportedly British Airways’ sixth such incident in the last year alone!
Microsoft has apparently firmed up its plans for a DNA-based storage device that it expects to be commercially available within about three years. The software giant originally unveiled its research into DNA as an archival storage medium last year, describing the technology as able to hold "a big data center compressed into a few sugar cubes. Or all the publicly accessible data on the Internet slipped into a shoebox. That is the promise of DNA storage -- once scientists are able to scale the technology and overcome a series of technical hurdles," the company said in a 2016 blog post.
A major British Airways IT crash has highlighted the importance for businesses of testing backup systems and disaster recovery procedures to ensure that they work as planned. The airline experienced what CEO Alex Cruz described as "a major IT systems failure" that, he said, affected all check-in and operational systems. The failure on Saturday, May 27, resulted in the delay or cancellation of hundreds of flights, leaving thousands of passengers stranded at London's Heathrow Airport on a holiday weekend. Things were still not back to normal two days later. Cruz described the cause of the failure as "a power supply issue," without going into detail.
Shortly after taking the helm at Polycom, CEO Mary McDowell discussed her strategy for the company moving forward. One of her focus areas is broadening the company's technology partner ecosystem. Polycom has a great partnership with Microsoft and is the only vendor whose products interoperate natively with Skype for Business/Office 365. As lucrative as this partnership has been for Polycom, McDowell recognizes that not everyone will be using Microsoft for their collaboration needs. + Also on Network World: Polycom brings a wide variety of video solutions to Microsoft Unified Communications +
Also, Polycom will be directing more resources into endpoint innovation. The infrastructure business at Polycom has been in decline for years because customers are choosing to leverage the power and ubiquity of the cloud. Polycom has been a technology leader since its inception, but the transition of video from on-premises to the cloud has shrunk the company's addressable market. Hence the change in strategy.
The Internet isn’t fast enough, nor its bandwidth capacious enough, for data-intensive emergency traffic during disaster responses such as hurricanes and earthquakes, scientists say. Video streams of flood scenes, say, along with laser mapping, could theoretically help responders quickly allocate resources, but that data gets bogged down along with other responder traffic, video chats and social media during an incident. Multi Node Label Routing (MNLR) is a new protocol intended to solve this reliability problem by routing responder data through a “high-speed lane of online traffic,” says an article in Rochester Institute of Technology’s (RIT) University News. Researchers at the school have developed the technology.
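The article doesn't spell out MNLR's mechanics, but the basic idea of a reserved fast lane for responder traffic can be illustrated with a toy priority queue. The sketch below is a generic simulation of that concept, not the MNLR protocol itself.

```python
# Toy illustration of a "fast lane": labeled responder packets are always
# forwarded ahead of best-effort traffic. This is not MNLR itself.
import heapq

EMERGENCY, BEST_EFFORT = 0, 1   # lower value = higher forwarding priority

queue = []

def enqueue(priority, seq, payload):
    """Queue a packet; seq breaks ties so arrival order is preserved."""
    heapq.heappush(queue, (priority, seq, payload))

enqueue(BEST_EFFORT, 1, "video chat frame")
enqueue(EMERGENCY, 2, "flood-scene stream chunk")
enqueue(BEST_EFFORT, 3, "social media post")
enqueue(EMERGENCY, 4, "lidar mapping block")

while queue:
    _, _, payload = heapq.heappop(queue)
    print("forwarding:", payload)   # emergency traffic drains first
```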
While in Israel late last year, I caught up with Shaked Zin and Avi Shulman, co-founders of security company PureSec. PureSec was in a bit of a conundrum. It was doing important work, but in a space that was still nascent: serverless computing. As such, it was having a hard time both articulating its value proposition and getting investors to understand and commit to its story. I found this conundrum interesting. Serverless computing is, after all, pretty high on the hype cycle. Ever since Amazon Web Services (AWS) introduced the notion of serverless via its Lambda offering a few years ago, vendors have been rushing to commercialize their own serverless offerings.
A couple of weeks ago, someone asked me to define the term digital transformation. I didn’t want to give a long technical answer, so instead I gave the one-word answer of “speed.” In the digital era, market leaders will be defined by which organization can adapt to market trends the fastest. This means the whole company must move with speed: business leaders need to make decisions fast, employees need to adapt to new processes quickly, and the IT department must make changes to the infrastructure with speed. + Also on Network World: Automation: Disrupt or be disrupted +
However, IT moving faster does not mean trying to execute the same manual processes 10 percent faster, as that would just lead to more errors. Nor does it mean throwing more people at the problem by adding to the IT staff. IT in the digital era means a complete re-think of operations with automation at the heart of the strategy.
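As a small example of what "automation at the heart of the strategy" can mean in practice, the sketch below applies one declarative, idempotent change across a fleet instead of repeating a manual step per device. The device names and the apply_config helper are hypothetical.

```python
# Hypothetical sketch: one declarative change pushed to a fleet, instead of
# the same manual CLI session repeated (slightly differently) on every box.
desired_ntp_servers = ["10.0.0.10", "10.0.0.11"]

def apply_config(device: str, ntp_servers: list[str]) -> bool:
    """Converge a device onto the desired NTP config (idempotent placeholder)."""
    print(f"{device}: ensuring NTP servers {ntp_servers}")
    return True  # in real tooling: report whether a change was actually made

fleet = [f"leaf-{i:02d}" for i in range(1, 21)]
results = {dev: apply_config(dev, desired_ntp_servers) for dev in fleet}
print(f"converged {sum(results.values())}/{len(fleet)} devices")
```

The same declarative intent can be re-run any time, which is what removes the error-prone repetition that manual processes can't escape.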
Veeam Software has been busy at its VeeamON user conference in New Orleans this week. During the event, the company talked about how it supports the "always-on enterprise" and how it is helping enterprises make the transition to supporting the "digital life." The company's new Veeam Availability Suite v10 is designed to, in the company's words, "provide non-stop business continuity, digital transformation agility and analytics and visibility."
Veeam Availability Suite v10
Here's what the company has to say about this new version of its software. This platform protects:
Physical servers and Network Attached Storage (NAS).
Tier-1 applications and mission-critical workloads with NEW Veeam CDP (continuous data protection), bringing recovery SLAs of seconds using continuous replication to the private or managed cloud.
Native object storage support, freeing up costly primary backup storage with policy-driven automated data management to reduce long-term retention and compliance costs. This includes broad cloud object storage support with Amazon S3, Amazon Glacier, Microsoft Azure Blob and any S3/Swift compatible storage.
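The general pattern behind offloading long-term retention to object storage can be shown with a plain Amazon S3 lifecycle rule. This is a generic sketch of tiering backups into Glacier, unrelated to Veeam's own integration; the bucket name, prefix and retention periods are illustrative.

```python
# Sketch of policy-driven tiering on plain Amazon S3: recent backups stay in
# standard storage, age into Glacier, and eventually expire.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-bucket",           # hypothetical bucket name
    LifecycleConfiguration={
        "Rules": [{
            "ID": "tier-and-expire-backups",
            "Filter": {"Prefix": "backups/"},  # apply only to backup objects
            "Status": "Enabled",
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            "Expiration": {"Days": 2555},      # ~7-year retention, assumed
        }]
    },
)
```

The appeal for backup vendors is that the policy, not an operator, decides when data moves off costly primary storage.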
The company goes on to describe what's new in the release.
Thousand-megabit broadband is a turning point for internet delivery speeds. Newer tech, such as virtual reality, and incumbents, such as video streaming, will benefit. Right now, though, only about 17 percent of the U.S. population has access to those super-fast speeds, which are primarily delivered by fiber, according to Viavi Solutions’ latest Gigabit Monitor report. Although gigabit is kicking in, it’s not going to be particularly simple to implement at the networking level, internet metrics company Ookla said earlier this month. Upgraded wired installs will likely handle the throughput better than existing, now commonly used Wi-Fi, among other things, the company said.
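To see why gigabit is a turning point, compare transfer times for a sizable download at gigabit speed versus a more typical tier. The file size and the 100 Mbps baseline below are illustrative assumptions.

```python
# Illustrative comparison of download times at gigabit vs. a typical tier.
file_gb = 50                      # assumed download size (e.g., a game or VR title)
file_bits = file_gb * 8 * 1e9

for name, mbps in [("100 Mbps", 100), ("1 Gbps", 1000)]:
    seconds = file_bits / (mbps * 1e6)
    print(f"{name}: {seconds / 60:.1f} minutes")
```

Under those assumptions, a roughly hour-long download drops to under ten minutes, which is the kind of difference applications like VR streaming are counting on.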