New, faster Internet protocol for disasters proposed

The Internet isn’t fast enough, nor is its bandwidth capacious enough, to carry the data-intensive emergency traffic generated during disaster response to events such as hurricanes and earthquakes, scientists think. Video streams of flood scenes, say, along with laser mapping, could in theory help responders allocate resources quickly, but that data gets bogged down along with the other responder traffic, video chats and social media generated during an incident.

Multi Node Label Routing (MNLR) is a new protocol intended to solve this reliability problem by routing responder data through a “high-speed lane of online traffic,” says an article in Rochester Institute of Technology’s (RIT) University News. Researchers at the school developed the technology.

IDG Contributor Network: Big Data will enable better network and application intelligence in 5G

We are fortunate to live in an exciting time when multiple technological leaps are occurring. Specifically, I am thinking of the mobile industry’s transition from 4G to 5G, and the cross-industry IT paradigm shift to the Big Data approach. The 5G standards community is already planning to support the collection and transmission of massive amounts of data; this is one of its key requirement pillars for supporting the IoT. What is left, however, is for the 5G community to ensure that the other component of Big Data, namely support for network and application intelligence, is also baked into the 5G architecture. Otherwise, 5G may become simply a pipe for Big Data passing between devices and the cloud infrastructure.

IDG Contributor Network: Postcards from the network edge

I was recently invited to participate on a panel at a major IT conference, where questions from the audience provided an interesting window into the top issues that networking professionals are dealing with as part of their organizations’ digital transformation.

Every enterprise, it seems, is planning a cloud strategy. On closer inspection, most are already using the cloud in the form of SaaS ERP and CRM applications like Salesforce and NetSuite. These applications have performed well enough on top of traditional, legacy networks. However, newer, more multi-dimensional cloud applications are forcing businesses to look for ways to make their networks more agile. One of these is Microsoft Office 365. Microsoft is aggressively investing in its infrastructure to provide a superior experience for users. Nevertheless, the enterprise network, and more specifically the wide area network (WAN), remains one of the biggest impediments to providing an on-premises-caliber quality of experience for cloud applications. Finding the most efficient exit path to Office 365 and the best-performing server are usually the culprits.

InfoVista ABCs of APIs Webinar Q&A: SDN+NFV-Driven Evolution of Performance Assurance Architectures

Thanks to all who joined us for the InfoVista ABCs of APIs Webinar: SDN+NFV-Driven Evolution of Performance Assurance Architectures, where we discussed APIs and the award-winning, multi-vendor proof of concept for multi-operator networking-as-a-service featuring LSO concepts within the context of SDN and NFV. After the webinar, we took questions from the audience but unfortunately ran out of time before we could answer them all...

Patent Troll Battle Update: Doubling Down on Project Jengo

Project Jengo Doubles In Size
[Image: Jengo Fett by Brickset (Flickr)]

We knew the case against patent trolls was the right one, but we have been overwhelmed by the response to our blog posts on patent trolls and our program for finding prior art on the patents held by Blackbird Tech, which we’ve dubbed Project Jengo. As we discuss in this post, your comments and contributions have allowed us to expand and intensify our efforts to challenge the growing threat that patent trolls pose to innovative tech companies.

We’re SIGNIFICANTLY expanding our program to find prior art on the Blackbird Tech patents

In a little over a week since we started the program, we’ve received 141 separate prior art submissions. But we know there’s an opportunity to find a lot more.

We’ve been impressed with the exceptionally high quality of the submissions. The Cloudflare community of users and readers of our blog are an accomplished bunch, so we have a number of searches that were done by expert engineers and programmers. In one case that stood out to us, someone wrote in about a project they personally had worked on as an engineer back in 1993, which they are convinced is conclusive prior art.

HPC Center NERSC Eases Path to Optimization at Scale

The National Energy Research Scientific Computing Center (NERSC) application performance team knows that for many users, “optimization is hard.” They’ve thought a lot about how to distill the application optimization process for users in a way that would resonate with them.

One of the analogies they use is the “Ant Farm.” Optimizing code is like continually “running a lawnmower over a lawn to find and cut down the next tallest blade of grass,” where the blade of grass is analogous to the code bottleneck that consumes the greatest amount of runtime. One of the challenges is that each bottleneck…
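
The “lawnmower” process the NERSC team describes is essentially profile-guided, one-hotspot-at-a-time tuning. Below is a minimal sketch of that loop as a shell session, assuming a Linux host with the perf profiler installed and a hypothetical application binary called ./my_app; NERSC’s own tooling and recommended workflow may differ.

    # Profile the full workload; perf.data records where the runtime is spent.
    perf record -o perf.data -- ./my_app

    # The top rows of the report are the "tallest blades of grass":
    # the functions consuming the largest share of runtime.
    perf report --stdio -i perf.data | head -n 20

    # Optimize that hotspot, rebuild, and profile again. Each pass usually
    # exposes a new tallest blade, which is why the mowing never really ends.

The same loop works with any profiler (CrayPat, VTune, gprof, and so on); the point of the analogy is that each pass targets whatever currently dominates the runtime.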

HPC Center NERSC Eases Path to Optimization at Scale was written by Nicole Hemsoth at The Next Platform.

Zyklon Season

The ASERT research team has recently done some work reverse engineering a family of malware called “Zyklon H.T.T.P.” that is written using the .Net framework. Zyklon (German for “cyclone”) is a large, multi-purpose trojan that includes support for a variety of malicious activities, including several […]

Amazon’s 2-For-1 Dash Button Deal Actually Nets You Two Free Dash Buttons – Deal Alert

Right now, if you buy a Dash Button, Amazon will give you a second one for free in honor of National Pet Week. And they're still giving you the $4.99 credit after the first time you use it, so that's really two Dash Buttons for free (a Dash Button typically costs $4.99).

Amazon Dash is a simple Wi-Fi connected gadget that lets you order your favorite things with just the push of a button. Keep it by your washing machine, your pet food, or in the bathroom closet. When you notice you're running low, just press the button and Amazon ships the item right out. Each button gets tied to a specific product from Amazon's library of over 300 brands, in categories such as household supplies, beverage & grocery, health & personal care, beauty products, pets, kids & baby, and more. Access this deal on Amazon.

The complexity of password complexity

Deploying password quality checking on your Debian-based Linux servers can help ensure that your users assign reasonable passwords to their accounts, but the settings themselves can be a bit misleading. For example, setting a minimum password length of 12 characters does not mean that your users' passwords will all have twelve or more characters. Let's stroll down Complexity Boulevard, see how the settings work, and examine some settings worth considering.

First, if you haven't done this already, install the password quality checking library with this command:

    apt-get -y install libpam-pwquality

Most of the settings we're going to look at live in a couple of configuration files, shown in the sketch below.
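
As a quick illustration, here is a minimal sketch of where those settings usually live and why minlen alone can be misleading. It assumes the stock Debian/Ubuntu packaging of libpam-pwquality; the paths and example values below are the usual ones, but verify them on your own system.

    # Settings are typically read from /etc/security/pwquality.conf and applied
    # through the pam_pwquality.so line in /etc/pam.d/common-password.
    grep -v '^#' /etc/security/pwquality.conf    # show the uncommented (active) settings

    # Example pwquality.conf entries:
    #   minlen  = 12    # a minimum length *score*, not necessarily 12 typed characters
    #   dcredit = -1    # negative value: require at least one digit
    #   ucredit = -1    # require at least one uppercase letter
    #   lcredit = -1    # require at least one lowercase letter
    #   ocredit = -1    # require at least one "other" (special) character
    #
    # With positive credit values, each character class adds points toward the
    # length score, so a password can satisfy minlen = 12 with fewer than twelve
    # actual characters. Negative values turn the credits into hard requirements.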
