Archive

Category Archives for "Network World SDN"

Microsoft launches undersea, free-cooling data center

A free supply of already-cooled deep-sea water is among the benefits of locating pre-packaged data centers underwater, believes Microsoft, which recently announced the successful launch of a submarine-like data center off the coast of the Orkney Islands in Scotland.

The shipping-container-sized, self-contained server room, called Project Natick, was submerged earlier this month on a rock shelf 117 feet below the water's surface. It also has the benefit of potentially taking advantage of bargain-basement real estate near population centers — there's no rent in the open sea.

"Project Natick is an out-of-the-box idea to accommodate exponential growth in demand for cloud computing infrastructure near population centers," John Roach writes on Microsoft's website.

More signs the Qualcomm Centriq is in trouble

Last month there were rumors that Qualcomm was looking to exit the data center business and abandon the Centriq processor, an ARM-based 48-core chip designed to take on Intel in the enterprise server market. The news seemed surprising, given Qualcomm had put years of work into the chip and had only just launched it a few months earlier.

Now Bloomberg adds further fuel to the fire with a report that the company is preparing to lay off almost 280 employees, most of them in the data center group. Bloomberg got wind of the layoffs through filings with the state governments in North Carolina and California, which require advance notice of significant layoffs.

Network professionals should think SD-Branch, not just SD-WAN

Earlier this year, fellow industry analyst Lee Doyle wrote a blog post on the software-defined branch (SD-Branch) market hitting $3 billion by 2022. Doyle defines the SD-Branch as having SD-WAN, routing, network security, and LAN/Wi-Fi functions all in one platform with integrated, centralized management. An SD-Branch can be thought of as the next step after SD-WAN: the latter transforms the transport, while the former focuses on things in the branch, such as optimizing user experience and improving security.

I don't often critique other analysts' work, as their opinion is theirs and not everyone agrees. In this case, however, I don't think "all in one platform" should be a requirement. The integrated, centralized management hits the nail on the head, but the software should act as a management overlay, so that even though the infrastructure isn't a single box, it's managed like one.

HPE adds GreenLake Hybrid Cloud to enterprise service offerings

With its new GreenLake Hybrid Cloud offering, HPE's message to the enterprise is simple: Your cloud, your way.

HPE is adding Microsoft Azure and Amazon Web Services capabilities to its GreenLake pay-per-use offerings, providing a turnkey, managed service to deploy public and on-premises clouds.

The company debuted the new HPE GreenLake Hybrid Cloud service Tuesday at its Discover conference in Las Vegas, saying that it can manage enterprise workloads in public and private clouds using automation and remote services, eliminating the need for new skilled staff to oversee and manage cloud implementations.

IDG Contributor Network: When will your company ditch its data centers?

Agility and speed are of paramount importance for most organizations as they try to innovate and differentiate themselves from the competition. The need for flexibility and rapid scalability is driving more and more companies into the cloud, as traditional data centers are no longer proving to be competitive, agile or robust enough.

It should come as no surprise that Cisco predicts 94 percent of workloads and compute instances will be processed by cloud data centers by 2021. But deciding when to take the leap, weighing the costs and risks, and developing a successful strategy is easier said than done. Let's take a closer look at why companies are ditching those data centers and how they can make the transition as smooth as possible.

Cisco’s buying July Systems to bolster its Wi-Fi application options

Cisco took a step toward improving its mobile application family by announcing its intent to buy privately held mobile firm July Systems for an undisclosed price.

July Systems, founded in 2001, is best known for its flagship product, Proximity MX, which offers what Cisco calls "an enterprise-grade location platform" with features such as instant customer activation, data-driven behavioral insights, a contextual rules engine and APIs.

The platform works with multiple location technologies, such as Wi-Fi, Bluetooth beacons or GPS, to sense the user's device with or without an app installed. Proximity MX can engage the user via SMS, email or push notifications, or trigger a notification to the business user or system via API, SMS or email, July says.

What is Cohesity and why did it just pull in $250M in venture money?

Normally, venture funding stories don't get much play here, but when a company scores $250 million, for a grand total of $410 million raised, one has to wonder what all the hoopla is about, especially given some of the spectacular flameouts we've seen over the years.

But Cohesity isn't vapor; it's shipping a product that it claims helps solve a problem that has plagued enterprises forever: data siloing.

Founded in 2013 and led by Nutanix co-founder Mohit Aron, Cohesity just racked up a $250 million investment led by SoftBank Group's Vision Fund, with participation from Cisco Investments, Hewlett Packard Enterprise, Morgan Stanley Expansion Capital, and Sequoia Capital. Those are some big names, to say the least.

Rackspace introduces data center colocation services

The effort around data center reduction has been to draw down everything, from hardware to facilities. Rackspace has an interesting new twist, though: Put your hardware in our data centers.

The company announced a new data center colocation business this week, offering space, power, and network connectivity to customers who provide their own hardware. The facilities are in 10 locations around the world.

It's not a bad idea. Servers are the cheapest part of the equation compared with facility costs such as the physical building, power, and cooling.

'Lift and shift' to the cloud

The new business, dubbed Rackspace Colocation, is positioned as a way for enterprises to kick off their cloud journey by getting out of their self-managed data centers to lower their expenses as they move to the cloud.

Data freshness, not speed, most important for IoT

The age of sensor data is more important than how long it takes that information to travel around Internet of Things and Location of Things environments, say some experts. Scientists are, in fact, rethinking the network because of it.

"It's not enough to transmit data quickly. That data also needs to be fresh," says MIT in a news release.

The university has been working on better ways to ensure that sensors, which distribute readings for analysis, provide the most salient stuff. It's not easy, because you can't just send everything at the same time (an obvious solution) — there isn't enough bandwidth.
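
The metric at the heart of this line of research is often called the age of information: how long ago the most recently delivered reading was generated. As a minimal sketch of the idea in Python (not MIT's actual algorithm; the sensor names and the one-update-per-slot constraint are hypothetical), a bandwidth-limited scheduler can transmit whichever sensor's data has grown stalest rather than whichever can send fastest:

```python
import time

class Sensor:
    """Tracks when a sensor's reading was last delivered for analysis."""
    def __init__(self, name):
        self.name = name
        self.last_delivered = time.monotonic()

    def age(self):
        # Age of information: seconds since the freshest delivered
        # reading was generated. Lower is fresher.
        return time.monotonic() - self.last_delivered

def pick_next_to_transmit(sensors):
    """With bandwidth for only one update per slot, send the stalest data."""
    return max(sensors, key=lambda s: s.age())

sensors = [Sensor("temperature"), Sensor("vibration"), Sensor("position")]
stalest = pick_next_to_transmit(sensors)
stalest.last_delivered = time.monotonic()  # its data is now fresh again
```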

Automation critical to scalable network security

Securing the business network has been, and continues to be, one of the top initiatives for engineers. Suffering a breach can have catastrophic consequences for a business, including lawsuits, fines, and brand damage from which some companies never recover.

To combat this, security professionals have deployed a number of security tools, including next-generation firewalls (NGFWs) such as Cisco's Firepower, one of the most widely deployed in the industry.

Managing firewalls becomes increasingly difficult

Managing a product like Firepower has become increasingly difficult, though, because the speed at which changes need to be made has increased. Digital businesses operate at a pace never seen before in the business world, and infrastructure teams need to keep up. If they can't operate at this accelerated pace, the business will suffer. And firewall rules continue to grow in number and complexity, making it nearly impossible to update them manually.
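
To make the scale problem concrete, here is a minimal sketch of the kind of validation an automation layer runs on every change, written in Python with an entirely hypothetical rule format (this is not Firepower's API): each candidate rule is checked against the existing ruleset for duplicates and contradictions before it is applied, a check no one can perform reliably by hand across thousands of rules.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    # Hypothetical five-tuple-style rule; real NGFW rules carry far
    # more context (zones, applications, users, schedules).
    src: str
    dst: str
    port: int
    action: str  # "allow" or "deny"

def add_rule(ruleset, candidate):
    """Reject duplicates and contradictions instead of appending blindly."""
    for rule in ruleset:
        same_match = (rule.src, rule.dst, rule.port) == (
            candidate.src, candidate.dst, candidate.port)
        if same_match and rule.action == candidate.action:
            return False  # exact duplicate: nothing to do
        if same_match and rule.action != candidate.action:
            raise ValueError(f"{candidate} contradicts existing {rule}")
    ruleset.append(candidate)
    return True

rules = [Rule("10.0.1.0/24", "10.0.9.5", 443, "allow")]
add_rule(rules, Rule("10.0.1.0/24", "10.0.9.5", 443, "allow"))  # skipped
```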

End-to-end data, analytics key to solving application performance problems

As someone who used to work in corporate IT, I can attest that, in general, workers and IT are at odds most of the time. Part of the problem is that the tools IT uses have never provided the right information to help the technical people understand what the user is experiencing.

That is why help desks are often referred to as "the no help desk" or "helpless desk" by internal employees. Users call the help desk when an application isn't performing the way it should, while IT is looking at a dashboard where everything is green and indicates things should be working.

Traditional network management tools don't provide the right information

The main reason for this mismatch is that traditional network management tends to look at the IT environment through the lens of infrastructure instead of what the user experiences. Looking at specific infrastructure components doesn't provide any view of the end-to-end environment, leading to a false sense of how things are running.

IBM launches new availability zones worldwide for hybrid enterprise clouds

CIOs and data center managers who run large hybrid clouds worldwide have a good chance of hearing IBM knock on their doors in the next few months.

That's because IBM is opening 18 new "availability zones" for its public cloud across the U.S., Europe, and Asia-Pacific. An availability zone is an isolated physical location within a cloud data center that has its own separate power, cooling and networking to maximize fault tolerance, according to IBM.

Along with uptime service level agreements and high-speed network connectivity, users have gotten used to accessing corporate databases wherever they reside, but proximity to cloud data centers is important. Distance to data centers can have an impact on network performance, resulting in slow uploads or downloads.

Supermicro is the latest hardware vendor with a security issue

Security researchers with Eclypsium, a firm created by two former Intel executives that specializes in rooting out vulnerabilities in server firmware, have uncovered vulnerabilities affecting the firmware of Supermicro servers.

The good news is these vulnerabilities can be exploited only via malicious software already running on a system, so the challenge for an attacker is getting the malicious code onto the servers in the first place. The bad news is that once it is there, the vulnerabilities are easily exploitable and can give malware the same effect as having physical access to this kind of system.

"A physical attacker who can open the case could simply attach a hardware programmer to bypass protections. Using the attacks we have discovered, it is possible to scale powerful malware much more effectively through malicious software instead of physical access," Eclypsium said in a blog post announcing its findings.

IoT has an obsolescence problem

The Internet of Things (IoT) is a long way from becoming a mature technology. From wearable devices to industrial sensors and consumer conveniences, IoT vendors and users are still trying to figure out what the technology does best as it grows into a $9 trillion market by 2020 (according to some estimates).

And yet IoT is somehow already faced with a huge and growing problem of obsolescence. The problem, ironically, lies in the "things" themselves.

Apple Watch: A premature antique

Don't believe me? Consider the solid-gold Apple Watch Edition, launched in 2015 and sold for $10,000 to as much as $17,000 a pop. A traditional watch at that price point would be expected to last decades, perhaps even generations as it turns into a family heirloom. But with the announcement of Apple Watch OS 5 at the company's Worldwide Developers Conference this week, the original version of these fancy timepieces can no longer keep up. They simply won't run the latest version of the operating system due out this fall, and they won't have the features of brand-new Apple Watches that cost a tiny fraction of that amount.

What is digital twin technology? [and why it matters]

Digital twin technology has moved beyond manufacturing and into the merging worlds of the Internet of Things, artificial intelligence and data analytics.

As more complex "things" become connected with the ability to produce data, having a digital equivalent gives data scientists and other IT professionals the ability to optimize deployments for peak efficiency and create other what-if scenarios.

What is a digital twin?

The basic definition of a digital twin: it's a digital representation of a physical object or system. The technology behind digital twins has expanded to include larger items such as buildings, factories and even cities, and some have said people and processes can have digital twins, expanding the concept even further.
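
In code terms, the core pattern is simply a software object kept in sync with its physical counterpart's telemetry, so that what-if questions can be put to the model instead of the machine. A minimal Python sketch (the pump, its fields, and the naive thermal estimate are all hypothetical, purely to illustrate the pattern):

```python
class PumpTwin:
    """A digital representation of one physical pump, fed by its sensors."""
    def __init__(self, serial):
        self.serial = serial
        self.rpm = 0
        self.bearing_temp_c = 0.0

    def ingest(self, reading):
        # Keep the twin in sync with telemetry from the real device.
        self.rpm = reading["rpm"]
        self.bearing_temp_c = reading["bearing_temp_c"]

    def what_if(self, rpm):
        # Ask the model, not the machine: a crude estimate that assumes
        # bearing temperature scales with speed, standing in for a real
        # physics model.
        scale = rpm / self.rpm if self.rpm else 1.0
        return self.bearing_temp_c * scale

twin = PumpTwin("PUMP-0042")
twin.ingest({"rpm": 1500, "bearing_temp_c": 61.0})
print(twin.what_if(1800))  # projected temperature at higher speed: 73.2
```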

IDG Contributor Network: Living on the edge: 5 reasons why edge services are critical to your resiliency strategy

When it comes to computing, living on the edge is currently all the rage. Why? Edge computing is a way to decentralize computing power and move processing closer to the endpoints where users and devices access the internet and where data is generated. This allows for better control of the user experience and for data to be processed faster at the edge of the network, on devices such as smartphones and IoT devices.

As enterprise organizations look to extend their corporate digital channel strategies involving websites with rich media and personalized content, it is vital to have a strong resiliency strategy.

Deploying a combination of cloud and edge services can help by reducing unplanned downtime, improving security and performance, extending the benefits of multi-cloud infrastructure, speeding application development and delivery, and improving user experience.

Comparing files and directories with diff and comm

There are a number of ways to compare files and directories on Linux systems. The diff, colordiff, and wdiff commands are just a sampling of the commands you're likely to run into. Another is comm. The comm command (think "common") lets you compare the contents of individual files in side-by-side columns.

Where diff gives you a display showing the lines that are different and where the differences are located, comm offers some different options, with a focus on common content. Let's look at the default output and then some other features.

Here's some diff output, displaying the lines that are different in the two files and using < and > signs to indicate which file each line came from.
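
The sample output is cut off above, but comm's column scheme is simple: lines found only in the first file go in column one, lines found only in the second file in column two, and lines common to both in column three. Here is a quick Python sketch of that logic (an illustration of the scheme, not a substitute for the real command, which merge-walks two sorted files):

```python
def comm(lines_a, lines_b):
    """Mimic comm's three columns: unique to A, unique to B, common to both.

    Like the real command, this expects each input to be sorted.
    """
    a, b = set(lines_a), set(lines_b)
    for line in sorted(a | b):
        if line in a and line in b:
            print("\t\t" + line)  # column 3: lines common to both files
        elif line in a:
            print(line)           # column 1: lines only in the first file
        else:
            print("\t" + line)    # column 2: lines only in the second file

comm(["apple", "banana", "cherry"], ["apple", "blueberry", "cherry"])
# column 1: banana   column 2: blueberry   column 3: apple, cherry
```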

Network-intelligence platforms, cloud fuel a run on faster Ethernet

2018 is shaping up to be a banner year for all things Ethernet.

First of all, the ubiquitous networking technology is off to a strong start in the data center, where in the first quarter alone the switching market recorded its strongest year-over-year revenue growth in over five years and 100G Ethernet port shipments more than doubled year over year, according to a report by Dell'Oro Group researchers.

The 16 percent switching growth was "driven by the large-tier cloud hyperscalers such as Amazon, Google, Microsoft and Facebook but also by enterprise customers," said Sameh Boujelbene, senior director at Dell'Oro.

IDG Contributor Network: A digital-first enterprise needs SD-WAN

Since the advent of the internet and IP, networking technology has not seen a seismic shift of the magnitude occurring in enterprise networks today. As organizations move from on-premises application hosting to a cloud-based approach, they are inundated with the inherent challenges of legacy network solutions. The conventional network architectures in most of today's enterprises were not built to handle the workloads of a cloud-first organization. Moreover, the increasing use of broadband to connect to multi-cloud-based applications has escalated concerns around application performance, agility, and network security.

Software-defined WAN (SD-WAN) has gained immense traction among CIOs lately. Gartner forecasts that SD-WAN will grow at a 59% compound annual growth rate through 2021 to become a $1.3 billion market. This is because there are a myriad of payoffs in moving to SD-WAN. Primarily, SD-WAN enables easier access to cloud and SaaS-based applications for geographically distributed branch offices and a mobile workforce, and it brings a number of other important benefits to digital-first organizations.

IBM strengthens mainframe cloud services with CA’s help

IBM continues to mold the Big Iron into a cloud and devops beast.

This week IBM and its long-time ally CA teamed up to link IBM's Cloud Managed Services on z Systems, or zCloud, with cloud workload-development tools from CA, with the goal of better-performing applications for private, hybrid or multicloud operations.

IBM says zCloud offers customers a way to move critical workloads into a cloud environment with the flexibility and security of the mainframe. In addition, the company offers the IBM Services Platform with Watson, which provides another level of automation within zCloud to assist clients with their moves to cloud environments.
