Archive

Category Archives for "Network World SDN"

IDG Contributor Network: How NFV and interconnection can help you think outside the ‘box’

Networking used to be all about specialized “boxes,” but that era is fading fast. By a specialized box, I mean a piece of hardware built to perform an individual function. Physical firewalls, routers, servers, load balancers, and the like are all examples of these boxes, and they are still everywhere. But new technology is seriously disrupting the old ways of doing things.

Virtualization has made it possible to separate the software functionality of all those boxes from the specific appliance-type hardware it runs on. Network functions virtualization (NFV) software can replicate an appliance’s function on a more cost-effective commodity server, which is easy to obtain and deploy and can host the software for numerous functions at once. People like the simplicity, lower cost, agility, and speed that come with this change.
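To make the idea concrete, here is a minimal sketch (not from the original article) of replacing one of those boxes, a hardware load balancer, with a software function on a commodity Linux server. It assumes Docker is installed and uses the stock nginx image purely as a stand-in for a virtual network function.

```sh
# A minimal NFV-style illustration: run a software load balancer on a
# commodity Linux server instead of buying a dedicated hardware box.
# Assumes Docker is installed; 'nginx' is just an illustrative function.
docker run -d --name virtual-lb -p 80:80 nginx

# The same commodity server can host several such functions at once,
# each in its own container, which is the point of the approach.
```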

Product–services bundles boost AI

The infrastructure required to run artificial intelligence algorithms and train deep neural networks is so dauntingly complex that it’s hampering enterprise AI deployments, experts say.

“55% of firms have not yet achieved any tangible business outcomes from AI, and 43% say it’s too soon to tell,” says Forrester Research about the challenges of transitioning from AI excitement to tangible, scalable AI success.

“The wrinkle? AI is not a plug-and-play proposition,” the analyst group says. “Unless firms plan, deploy, and govern it correctly, new AI tech will provide meager benefits at best or, at worst, result in unexpected and undesired outcomes.”

BrandPost: SD-WAN: A Modern Approach to Connectivity for Digital Businesses

Digital transformation has crossed the chasm from visionary aspiration to practical implementation, and in the process it is disrupting technologies across the business landscape.

As enterprises and governments transform their operations, many are finding their legacy wide area networks (WANs) cannot meet today’s digital-driven bandwidth demands. Rather than giving them a competitive edge and supporting business growth, their networks are stifling innovation and impeding flexibility.

To address this challenge, software-defined WANs (SD-WANs) are emerging as a smart way to streamline connections among enterprise sites.

Enterprise WANs are under pressure to keep pace with the cloud revolution, which plays a critical role in digital transformation. “Companies worldwide are aggressively consolidating their data centers, implementing new data models, and shifting development to agile, mobile-first, cloud-based models,” IDC Group Vice President and IT executive advisor Joseph Pucciarelli writes in the Winter 2018 issue of CIO's Digital Magazine.

The cloud continues to drive network evolution

It’s fair to say that there has never been a bigger driver of network evolution than the cloud. The reason is that the cloud is a fundamentally different compute paradigm: it allows changes to applications, data, and architecture to be made almost instantly. Cloud-native infrastructure is what enables mobile app developers to roll out new versions daily if they so choose.

The cloud is network-centric

Another fact about the cloud is that it is a network-centric compute model, so a poorly performing network leads to equally poorly performing applications. A lack of network agility means DevOps teams need to sit around twiddling their thumbs while network operations makes changes to the network.

Nvidia packs 2 petaflops of performance in a single compact server

At its GPU Technology Conference this week, Nvidia took the wraps off a new DGX-2 system it claims is the first to offer multi-petaflop performance in a single server, greatly reducing the footprint needed to reach true high-performance computing (HPC).

The DGX-2 comes just seven months after the DGX-1 was introduced, although it won’t ship until the third quarter. However, Nvidia claims it has 10 times the compute power of the previous generation, thanks to twice the number of GPUs, much more memory per GPU, faster memory, and a faster GPU interconnect.

The DGX-2 uses the Tesla V100 GPU, the top of the line for Nvidia’s HPC and artificial intelligence cards, and with the DGX-2 Nvidia has doubled the memory per GPU to 32GB. Nvidia claims the DGX-2 is the world’s first single physical server with enough computing power to deliver two petaflops, a level of performance usually delivered by hundreds of servers networked into clusters.
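The two-petaflop claim is easy to sanity-check; assuming the roughly 125 teraflops of tensor-operation throughput Nvidia quotes for each V100 and the 16 GPUs in the DGX-2 (twice the DGX-1's eight), the arithmetic works out to:

\[
16\ \text{GPUs} \times 125\ \text{TFLOPS (tensor) per V100} = 2000\ \text{TFLOPS} = 2\ \text{PFLOPS}
\]

Note that this is tensor-operation throughput for deep-learning workloads, not general-purpose double-precision performance.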

Simplifying Linux with … fish?

No, the title for this post is not a mistake. I’m not referring to the gill-bearing aquatic craniate animals that lack limbs with digits or the shark-shaped magnetic Linux emblem that you might have stuck to your car. The “fish” that I’m referring to is a Linux shell and one that’s been around since 2005. Even so, it’s a shell that a lot of Linux users may not be familiar with.

The primary reason is that fish isn't generally installed by default. In fact, on some distributions, the repository that provides it is one your system probably doesn't access. If you type "which fish" and your system responds simply with another prompt, you might be missing out on an interesting alternative shell. And, if your apt-get or yum command can't find what you're looking for, you will probably have to use commands like those shown below to get fish loaded onto your system.
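Here is a rough sketch of what those commands might look like; the repository shown (the fish-shell PPA) and the package names are assumptions that vary by distribution and release, and the PPA step can be skipped if fish is already in the repositories your system uses.

```sh
# Debian/Ubuntu-style systems; the PPA is only needed if fish isn't
# already available in your configured repositories.
sudo add-apt-repository ppa:fish-shell/release-3
sudo apt-get update
sudo apt-get install fish

# Fedora/RHEL/CentOS-style systems
sudo dnf install fish        # on older releases: sudo yum install fish

# Then give it a try
fish
```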

BrandPost: How network automation moves AI from science fiction to reality

Artificial intelligence (AI) has become a buzzword, and what once was realized only in sci-fi movies is now a burgeoning reality in IT processes. There are significant savings to be had, both in time and money, as well as an increase in mission delivery. However, before organizations can take advantage of advancements like AI today, they must take a few key steps. One area is the network. Let’s explore how enterprises can begin to evolve their network technology to leverage AI capabilities in the near future.

Automation

Network automation is a meaningful step toward AI that can provide enhanced mission delivery today. By leveraging automation capabilities within the network, immediate efficiencies can be realized.
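For a sense of what that first automation step can look like in practice, here is a minimal sketch (not from the original post) that replaces one manual, repetitive chore, nightly configuration backups, with a script. The hostnames, the admin user, and the 'show running-config' command are illustrative assumptions for devices reachable over SSH.

```sh
#!/bin/sh
# Nightly config backup: one small, repeatable task taken off the operator's plate.
# Hostnames, the 'admin' user, and the 'show running-config' command are
# illustrative; substitute whatever your devices actually use.
mkdir -p backups
for host in switch01 switch02 switch03; do
    ssh admin@"$host" "show running-config" > "backups/$host-$(date +%F).cfg"
done
```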

BrandPost: Mobile user engagement apps: Trends & requirements

The mobile engagement app has emerged as a way to acquire, retain, and monetize loyal user bases. When designed properly, everyone gains from the app. Users are more satisfied, productive, and even safer. Businesses can enjoy larger and more predictable revenue streams. Executed poorly, mobile apps can have low download rates and become abandoned, forgotten, or deleted.

To learn more about how businesses are using these apps and their plans for the future, we surveyed companies across all industries. A high percentage of organizations have already determined they need an engagement app. To date, most of the apps in use are being developed in-house; commercial off-the-shelf versions are up and coming, but not yet well known. We learned there is still lots of room for improvement and that an important requirement of the apps is to track location.

BrandPost: 802.11ax enhancements: What’s all the hype about?

Devin Akin, the Principal Wi-Fi Architect for Divergent Dynamics, recently gave a great webinar presentation on the upcoming release of the new 802.11ax standard.

When any new technology is introduced, there is a tendency for companies to over-rotate and get caught up in the hype. Devin is anti-hype; he balances the discussion with education and shares the detailed realities of the new underlying technology. As with past introductions of 802.11 enhancements, it is important to pay attention to the standard ratification date, silicon production schedules, and, in particular, 11ax client introductions.

What’s So Different About 802.11ax?

BrandPost: An efficient network: The Fabric of a complex paperless hospital system

As CTO of a large regional hospital system, I know that the network is vital to everything. I consider the network to be in the same critical category as electricity, oxygen, and water. It needs to be available 24x7x365 to support Concord Hospital’s half a million patients and more than 5,000 network users across our 30 locations.

Because we don’t do anything on paper, the network can never go down. Extreme Fabric Connect has solved that problem for us. It provides a secure, self-healing, highly available network to serve our patients effectively in our complex paperless environment.

All the Benefits of MPLS, None of the Headaches

BrandPost: The cloud payoff: Ensuring hybrid works for your enterprise

As more and more enterprises move to hybrid cloud, interesting relationships are forming among enterprises, internet and cloud exchanges, and colocation providers to satisfy IT strategies built around hybrid cloud. In its Strategic Roadmap for Data Center Infrastructure, Gartner notes that “by 2019, 80% of enterprises will have an IT strategy that includes multiple Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) providers.” That is up from only 10% in 2015, while “by the end of 2018, 10% of enterprises will close their on-premises data centers entirely.”

IoT could help at-risk seniors

The internet of things is also, in part, the internet of people, particularly in the plans of an Ontario-based chain of retirement homes and long-term care facilities called Schlegel Villages.

The company, which is based in Kitchener, Ontario, designs its facilities to be less institutional-looking and more friendly, preferring to call them “villages.” But it’s got a problem to deal with, as at-risk seniors can sometimes become confused and attempt to leave. According to Schlegel’s IT director, Chris Carde, it’s a serious issue.

What is the Open Compute Project?

The Open Compute Project began in 2011 when Facebook published the designs of some homebrew servers it had built to make its data centers run more efficiently.

Facebook hoped that other companies would adopt and adapt its initial designs, pushing down costs and improving quality – and they have: sales of hardware built to Open Compute Project designs topped $1.2 billion in 2017, double the previous year, and are expected to reach $6 billion by 2021.

Those figures, from IHS Markit, exclude hardware spending by OCP board members Facebook, Intel, Rackspace, Microsoft and Goldman Sachs, which all use OCP to some degree. The spend is still a small part of the overall market for data-center systems, which Gartner estimated was worth $178 billion in 2017, but IHS expects OCP’s part to grow 59 percent annually, while Gartner forecasts that the overall market will stagnate, at least through 2019.

10 hot storage companies to watch

Innovations such as software-defined storage (SDS), hyper-converged infrastructure (HCI), and blockchain have investors flocking to enterprise storage startups, and this market shows no signs of slowing down.

Collectively, the 10 startups featured in this roundup have raised more than $736 million in VC funding. This total is even more impressive when you factor in the two startups not included in that calculation: one is entirely self-funded, while the other has a unique business model and an equally unique source of non-VC funding, an ICO, or Initial Coin Offering.

According to research firm IDC, the worldwide enterprise storage market expanded by 13.7 percent year over year to just under $13.6 billion in the fourth quarter of 2017. Other research firms believe the growth rate will accelerate in the near term. Research and Markets, for instance, predicts that one fast-growing segment of the overall enterprise storage market, cloud storage, will expand to become a $92.5 billion market by 2022.

How we chose 10 hot storage startups to watch

The hardest thing about compiling a startup roundup isn’t choosing 10 hot startups. Rather, it’s eliminating the many promising startups that could easily end up being more successful than any one of my top picks.

It’s a challenge that comes with the territory. After all, the success or failure of any given startup will be due to many factors, plenty of which are impossible to measure. However, in our data-driven era, I’ve been experimenting with ways to improve my hit rate.

As a journalist, I’ve been writing about startups since the height of the dotcom bubble, easily covering hundreds, if not thousands, of startups along the way. As a writer, content marketer, and strategist, I’ve worked in, consulted with, and devised go-to-market strategies for dozens and dozens more.

Cisco emboldens disaggregation strategy

The notion of disaggregation – separating the operating system and applications from the underlying hardware – has always been a conundrum for Cisco. In a nutshell, why would the company risk losing all of the millions of dollars in development and the key networking features tied up in current Cisco hardware and software packages?

But in the new world of all-things software in which Cisco plans to be king, the disaggregation strategy is gaining momentum. This week the company took things a step further, announcing a variety of disaggregation steps that enterprise and service-provider customers could be interested in.

To understand big data, convert it to sound

Humans are far better at identifying data pattern changes audibly than they are graphically in two dimensions, say researchers exploring a radical concept. They think that servers full of big data would be far more understandable if the numbers were all moved off the computer screens or hardcopies and sonified, or converted into sound.

That's because when listening to music, nuances can jump out at you — a bad note, for example. And researchers at Virginia Tech say the same thing may apply to number crunching: data-set anomaly spotting, or comprehension overall, could be enhanced.

The team behind a project to prove this is testing the theory with a recently built 129-loudspeaker array installed in a giant immersive cube in Virginia Tech’s performance space/science lab, the school's Moss Arts Center.
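As a toy illustration of the general idea (not the Virginia Tech researchers' setup), a handful of data points can be mapped to pitches and played back so that an outlying value is heard as an off-sounding note. This sketch assumes the SoX audio tool, which provides the `play` command, is installed.

```sh
# Toy sonification: play one short sine tone per data point, mapping larger
# values to higher pitches so the anomaly (12) sounds out of place.
for value in 3 4 4 5 4 12 4 3; do
    freq=$((200 + value * 60))            # simple linear value-to-pitch mapping
    play -q -n synth 0.15 sine "$freq"    # synthesize and play a 0.15-second tone
done
```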

IDG Contributor Network: Building the smart city: 8 things that matter

What is a smart city? The answer depends on whom you ask.

Solution providers will tell you it’s smart parking, smart lighting, or anything to do with technology. City officials may tell you it’s about conducting city business online, such as searching records or applying for permits. City residents may tell you it’s the ease of getting around, or about crime reduction.

Everyone is right. A smart city, built properly, will have different value for different stakeholders. They may not even think of their city as a “smart” city. They know it only as a place they want to live in, work in, and be a part of.

Technology is top of mind in the smart city. But it is just one layer of many in the smart city ecosystem. Each layer has a different but equally important role. No one layer is more relevant than another. For example, technology “powers” the smart city, but it is data that leads to insights and new services. However, to create relevant services, cities must be able to innovate the right solutions. To get the outcomes that matter consistently and at scale, civic leaders must establish sensible technology and data policies.

Patches for Meltdown and Spectre aren’t that bad after all

Internal tests from a leading industry vendor have shown that fixes applied to servers running Linux or Windows Server aren’t as detrimental as initially thought, with many use cases seeing no impact at all.

The Meltdown and Spectre vulnerabilities, first documented in January, seemed like a nightmare for virtualized systems, but those fears appear overblown. There are a lot of qualifiers, starting with what you are doing and what generation of processor you are using.

The tests were done on servers running Xeon processors from the Haswell-EP (released in 2014), Broadwell-EP (released in 2016), and Skylake-EP (released in 2017) generations. Haswell and Broadwell are the same microarchitecture with minor tweaks; the big change was that Broadwell was a die shrink. Skylake, though, was a whole new architecture, and as it turns out, that made the difference.
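If you want to see which mitigations a Linux server is actually running with before worrying about their cost, recent kernels report that state under sysfs. A quick check, assuming a kernel new enough to expose these files (roughly 4.15 and later, plus patched distro kernels), looks like this:

```sh
# List the kernel's view of each CPU vulnerability and its mitigation status.
# Older kernels without the Meltdown/Spectre reporting won't have these files.
grep . /sys/devices/system/cpu/vulnerabilities/*
# Typical output includes lines such as:
#   /sys/devices/system/cpu/vulnerabilities/meltdown:Mitigation: PTI
```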