Think about the most successful, widely scaled networks that let us function in today's world. No, I’m not talking about internet service providers; I mean the Really, Really Big Networks, the ones without which modern civilization would be very different. The telephone system. Intermodal containerized shipping. Air traffic control. They all have one vitally important enabling element that made them scalable: a Control Layer that is not intrinsic to the electronic or physical streams that make up the network traffic. For phones, it’s Signalling System 7, which has managed the world of voice calls for decades. For intermodal shipping, it’s container manifests. For aviation, it’s ATC. And they truly do run the globe.
If there’s one problem just about every IT professional can relate to, it is the pain of a storage migration. Aging is part of life not only for us IT veterans, but also for the storage systems we manage. Despite the fact that we’ve been moving data off old storage for decades, the challenge of moving data from one storage resource to another without disrupting the business remains one of the most time-consuming and stressful projects for an IT team. Many of the IT professionals I speak with tell me that their migrations are scheduled over months and can even take a year to plan and execute. It’s no surprise, then, that IT professionals named migrations the number two issue facing their departments in a recent survey. Only performance presents a bigger challenge for today’s IT professionals.
The digital economy never sleeps and it never stops moving. The same can be said for many small business owners who work around the clock to keep up with rising customer demands and skyrocketing competition. The speed of business in the digital economy has required those business owners to step outside of their comfort zone and into the complex and confusing world of small business IT. Although this is a difficult step for many, they can’t deny how important it is to the survival of the company. It’s the only way they can ensure their business operates effectively. A thriving, always-available network is simply the lifeblood of any modern business.
Optical data can be too fast for its own good. While the speeds obtained are great for carrying information over distances and into chips, when the light-carried data lands there, it’s often moving too fast to be thoroughly processed and analyzed. Data sometimes needs to be slowed down for intense number-crunching and routing. Solutions to this apparent dichotomy have been attempted. They include the obvious one: speeding up microprocessors themselves. However, there’s a problem with that: faster chips built on electronics create more heat, generate interference and use more energy, all bad for data centers.
Using sound waves to speed up networks
Scientists say sound waves, though, could present a solution. They say one should convert the light zooming into the chip to sound, creating a kind of acoustic buffer (sound waves travel far slower than light waves), then process the data and turn it back into zippy light again, to be sent on its way.
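To get a feel for how much breathing room that light-to-sound conversion buys, here is a back-of-the-envelope sketch in Python. The waveguide length, refractive index and acoustic velocity below are rough illustrative assumptions, not figures taken from the research:

```python
# Rough, illustrative numbers only -- not taken from the research described above.
C_VACUUM = 3.0e8      # speed of light in vacuum, m/s
N_WAVEGUIDE = 3.5     # assumed refractive index of an on-chip optical waveguide
V_SOUND = 8.4e3       # assumed acoustic velocity in the chip material, m/s
LENGTH = 1e-3         # a 1 mm stretch of waveguide, in metres

t_light = LENGTH / (C_VACUUM / N_WAVEGUIDE)   # transit time as light
t_sound = LENGTH / V_SOUND                    # transit time as sound

print(f"As light: {t_light * 1e9:.3f} ns to cross 1 mm")
print(f"As sound: {t_sound * 1e9:.1f} ns to cross 1 mm")
print(f"Roughly {t_sound / t_light:,.0f}x more time for the chip to work with")
```

Even with these crude numbers, the same millimetre of chip holds the data for thousands of times longer as sound than as light, which is the whole point of the acoustic buffer.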
I've worked at my fair share of large corporations in my life, and like most of you, I've experienced more network and server outages than I can shake a stick at. Sometimes these outages are small and only mildly disruptive (a file server going down for a few minutes). Other times, an outage can cause massive, widespread work stoppages (such as when an email server goes offline for multiple hours — or days). These outages are, at least for the company, bad things. If your employees can no longer communicate, work all but grinds to a halt. One hour of total downtime multiplied by the average hourly pay of your employees can equal a pretty big amount of lost moolah.
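The back-of-the-envelope math in that last sentence is easy to run yourself. The headcount, hourly rate and outage length below are made-up example figures:

```python
# Back-of-the-envelope cost of downtime; every input is a made-up example figure.
employees = 500        # people idled by the outage
avg_hourly_pay = 35.0  # average fully loaded hourly pay, in dollars
outage_hours = 1.0     # length of the outage

lost_labor = employees * avg_hourly_pay * outage_hours
print(f"Lost labor alone: ${lost_labor:,.2f}")  # $17,500.00 for these inputs
```

And that is only idle labor; it ignores lost sales, missed deadlines and the overtime spent catching up afterward.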
Intent-based systems have been all the rage since Cisco announced its “Network Intuitive” solution earlier this year. For Cisco customers, that solution is certainly interesting. But what about businesses that want an alternative to Cisco? Or companies that want to run a multi-vendor environment? Over a year before Cisco’s launch, a start-up called Apstra shipped a closed-loop, intent-based solution. It was designed to be multi-vendor in nature, with support not only for Cisco but also for Arista, Juniper, HP and others, including white-box switches. Apstra operates as an overlay to networks built on any of the leading vendors to deliver intent-based networking in heterogeneous environments.
Does the company you work for (or own) retain data on customers? Odds are pretty high that it does, at least in some form (often fairly extensively). It's often attractive to do so for both marketing and functionality purposes. But here's the thing: storing that data is probably a bad business decision. One that could cost your business a huge amount of money and, even worse, the trust of your most valuable customers.
Storage costs
Just from the IT infrastructure point of view: As your business grows and the amount of data you store on each customer slowly expands (it always does), your cost for storing that data also grows. Rather quickly. Even if your data center is already well equipped, this is a not-insignificant recurring expense (failing drives, energy costs, other equipment needs, etc.).
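A toy model makes the point. Every number below is a hypothetical placeholder rather than an industry figure; the compounding effect is the part to notice:

```python
# Toy model of how per-customer data growth compounds into a recurring storage bill.
# Every number here is a hypothetical placeholder, not an industry figure.
customers = 100_000
gb_per_customer = 2.0        # data held per customer today
annual_data_growth = 0.30    # assume the data you keep grows 30% a year
cost_per_gb_year = 0.10      # assumed all-in cost: drives, power, backups, admin

for year in range(1, 6):
    total_gb = customers * gb_per_customer * (1 + annual_data_growth) ** year
    print(f"Year {year}: {total_gb / 1_000:,.0f} TB stored, "
          f"roughly ${total_gb * cost_per_gb_year:,.0f} per year to keep")
```

Swap in your own customer counts and growth rate; the shape of the curve, not the exact dollar figure, is what matters.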
Cybersecurity remains a hot topic with nearly every IT and business leader I speak with. In particular, there seems to be an intensified focus on network security. Security is typically deployed in layers (network, compute and application), and I expect that model to continue in the short term, but given that many of the building blocks of digitization, such as IoT and the cloud, are network-centric, there should be a stronger focus on leveraging the network and network-based security to protect the organization. Nowhere in the network has there been more change than in the wide area network (WAN), so it stands to reason that as legacy WANs evolve into software-defined WANs, the WAN must play an increasingly critical role in securing the enterprise. Below are my top five recommendations for better securing your organization with an SD-WAN.
Nvidia and server makers Dell EMC, HPE, IBM and Supermicro announced enterprise servers featuring Nvidia’s Tesla V100 GPU. The question is, can servers designed for machine learning stem the erosion of enterprise server purchases as companies shift to PaaS, IaaS and cloud services? The recent introduction of hardened industrial servers for IoT may indicate that server makers are looking for growth in vertical markets. There are very compelling reasons for moving enterprise workloads to Amazon, Google, IBM and other hosted infrastructures. The scalability of on-demand resources, operating efficiency at cloud scale and security are just three of many. For instance, Google has 90 engineers working on security alone, whereas most enterprises are understaffed.
Google has partnered with Scale Computing, developer of infrastructure software for hyper-converged systems, to make it easier to deploy Google Cloud Platform as a backup for your own data center. The two companies have created a platform called Cloud Unity, which integrates Scale’s HC3 software environment with Google Compute Engine. HC3 is a cluster software product that merges server, storage and virtualization into a single appliance for a simpler converged infrastructure.
With HC3, you can build a cluster using Google's infrastructure instead of buying your own, thus creating a backup of your own data center in Google’s data centers. Cloud Unity creates an SD-WAN connection to your existing Scale environment, so the Google-hosted cloud version of your data center appears as just another cluster on the same LAN. It uses VXLAN encapsulation between your site and Google’s data center.
Companies get into IoT for multiple reasons. In the case of Arrow Electronics, the Internet of Things (IoT) was thrust upon it. Now, however, the company is trailblazing into IoT platform services. Last year, Arrow did nearly $24 billion in revenue, largely split between two businesses: electronic components and enterprise computing solutions. Historically, these two divisions had little overlap, but IoT has bridged the two units.
A similar story is occurring at organizations everywhere as IoT converts disconnected things into computing peripherals. For example, hospital refrigerators that store blood and medicine are increasingly integrated into intelligent building management and communications systems.
It’s been about a year and a half since I asked the question in this blog, “Is the Cisco 6500 Series invincible?” Enough time has passed that I should revisit that question, especially since people in the industry have been talking, tweeting and writing about the demise of the venerable Catalyst 6500 chassis family for years. But don’t worry: the King is not dead, because Cisco is having none of that!
Aside from being a major revenue stream for Cisco, the 6500 chassis family remains a solid platform that the company has made extensible by creating the Catalyst 6800 family.
Network virtualization is a process of abstraction that separates the network from its underlying physical hardware, allowing for the customization of network infrastructures through aggregation and provisioning measures. Among the potential benefits of network virtualization are faster provisioning of networks, easier management of networks, and more efficient use of resources.
It’s an old cliché: If you fail to plan, you better plan to fail. That seems to apply to a new study by CompTIA, which finds that only 34 percent of businesses surveyed plan their IT infrastructure beyond one year. The reasons are legitimate: the disruption brought about by the migration to cloud computing and Internet of Things (IoT) deployments. Both are seriously disruptive and can make long-term planning a challenge. To stay flexible as they undergo a digital transformation, businesses are reluctant to plan beyond one year out.
The report, titled "Planning a Modern IT Architecture," also found some of the usual problems dogging IT shops. Four in 10 companies said they lacked the budget for heavy investment in new architecture, and one-third said they don’t have enough knowledge of emerging technologies and new trends to formulate an integration plan.
You can’t escape the IoT momentum these days. The Internet of Things is being used for everything from saving the rhino from poaching to leveraging stray dogs to fight crime. (No, really, I’m not kidding… check the links!) But even as vendors spend billions to try to grab IoT market share, it’s not always clear exactly how their business customers are supposed to benefit from IoT. (The challenges can be equally hard to understand.) According to a thoughtful new report from Forrester, the answer lies in three fundamental business scenarios.
In yet another sign of how the Internet of Things (IoT) is rearranging the international corporate landscape, Japanese manufacturing giant Hitachi is reorganizing to challenge the global leaders in the IoT market. The conglomerate is combining three of its Bay Area divisions into a single $4 billion unit responsible for growing Hitachi's IoT operations in more than 130 countries. The new IoT-centric operation will combine Hitachi Data Systems (data center infrastructure), Hitachi Insight Group (big data software) and Pentaho (analytics) into a wholly owned subsidiary called Hitachi Vantara, which will employ some 7,000 people (about a third of Hitachi’s IT workforce) out of its Santa Clara, California, headquarters.
The ping command sends one or more requests to a system asking for a response. It's typically used to check that a system is up and running, verify an IP address, or prove that the sending system can reach the remote one (i.e., verify the route). The ping command is also one that network intruders often use as a first step in identifying systems on a network that they might next want to attack. In this post, we're going to take a quick look at how ping works and then examine options for configuring systems to ignore these requests.
How ping works
The name "ping" came about because the ping command works in a way that is similar to sonar echo-location, which used sound propogation for navigation. The sound pulses were called "pings." The ping command on Unix and other systems sends an ICMP ECHO_REQUEST to a specified computer, which is then expected to send an ECHO_REPLY. The requests and replies are very small packets.To read this article in full or to leave a comment, please click here
I’ve been spending a lot of time the past few weeks reviewing SD-WAN vendor cloud offerings. Maybe it’s because some of the announcements in the area triggered a bunch of questions from my customers. Maybe it’s because a lot of folks seem to be waking up to the importance of connecting their SD-WAN into the cloud. Regardless, what’s become increasingly apparent to me is how vastly vendor implementations differ. At first glance, the cloud would seem to be just like any other site. Add an SD-WAN node as you would with any other location, let it connect into the SD-WAN, and voila! Job done. Oh, how I wish it was that simple. SD-WAN cloud configurations are like that sweet, devilish 5-year-old who can terrorize your home while looking delightfully cherubic. Managing cloud implementations requires different tools than the rest of the SD-WAN. Routing into the IaaS cloud is rarely simple. Properly configuring the cloud (setting up the VPCs, installing the SD-WAN nodes, provisioning the IPsec connectivity) takes time. It’s why SD-WAN vendors have made a point of introducing cloud-specific implementations, and the sketch below gives a feel for the plumbing involved.
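To make that concrete, here is a minimal sketch of the kind of IaaS-side plumbing a cloud deployment needs, assuming AWS and the boto3 library; a real SD-WAN vendor wraps steps like these, plus the node image and the IPsec tunnel details, in its own automation, so treat the CIDR blocks and region below as arbitrary examples:

```python
# A minimal sketch of the IaaS-side plumbing for the cloud end of an SD-WAN fabric.
# Assumes AWS credentials are configured; CIDRs and region are arbitrary examples.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# 1. Carve out a VPC for the cloud end of the SD-WAN fabric.
vpc_id = ec2.create_vpc(CidrBlock="10.50.0.0/16")["Vpc"]["VpcId"]

# 2. A subnet to hold the (hypothetical) SD-WAN node instance.
subnet_id = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.50.1.0/24")["Subnet"]["SubnetId"]

# 3. A virtual private gateway for the IPsec side of the connection.
vgw_id = ec2.create_vpn_gateway(Type="ipsec.1")["VpnGateway"]["VpnGatewayId"]
ec2.attach_vpn_gateway(VpcId=vpc_id, VpnGatewayId=vgw_id)

print(f"VPC {vpc_id}, subnet {subnet_id}, VGW {vgw_id} ready for the SD-WAN node")
```

Even this stripped-down version hints at why the cloud is not "just another site": every one of these objects also needs routing, security groups and tunnel parameters before traffic flows.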
What if every package shipped contained a $0.20 tracker chip that could report when and approximately where the package was opened? That's a service that internet-of-things wireless network operator Sigfox thinks its partners could offer over the next year. The company demonstrated a prototype wireless module embedded in a cardboard envelope at its partner meeting in Prague on Tuesday; opening the envelope triggered the sending of a text message. Ripping open the envelope, Sigfox scientific director Christophe Fourtet showed off what he described as "an ultra-thin battery, ultra-thin contacts, and an ultra-low-cost module, a few tens of cents." Seconds later, his phone buzzed to report delivery of the package.
Your Linux users may not be raging bulls, but keeping them happy is always a challenge as it involves managing their accounts, monitoring their access rights, tracking down the solutions to problems they run into, and keeping them informed about important changes on the systems they use. Here are some of the tasks and tools that make the job a little easier.
Configuring accounts
Adding and removing accounts is the easier part of managing users, but there are still a lot of options to consider. Whether you use a desktop tool or go with command-line options, the process is largely automated. You can set up a new user with a command as simple as adduser jdoe, and a number of things will happen. John’s account will be created using the next available UID and likely populated with a number of files that help to configure his account. When you run the adduser command with a single argument (the new username), it will prompt for some additional information and explain what it is doing.
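If you’re curious what "next available UID" means in practice, here is a rough Python illustration of the bookkeeping adduser does for you. It assumes a conventional /etc/passwd and regular user accounts starting at UID 1000, which is typical but distribution-dependent:

```python
# Rough illustration of the "next available UID" step that adduser handles for you.
# Assumes a conventional /etc/passwd with regular users starting at UID 1000.
FIRST_REGULAR_UID = 1000

def next_free_uid(passwd_path: str = "/etc/passwd") -> int:
    used = set()
    with open(passwd_path) as passwd:
        for line in passwd:
            fields = line.split(":")   # name:passwd:UID:GID:gecos:home:shell
            if len(fields) > 2 and fields[2].isdigit():
                used.add(int(fields[2]))
    uid = FIRST_REGULAR_UID
    while uid in used:                 # walk forward to the first unused UID
        uid += 1
    return uid

print(f"adduser would likely pick UID {next_free_uid()} for the next account")
```

The real tools also consult settings such as /etc/login.defs and the group database, so treat this as a sketch of the idea rather than a reimplementation.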