Archive

Category Archives for "Networking"

National pen test execution standard would improve network security

As the number of cyber attacks increases, the demand for penetration tests – to determine the strength of a company’s defense – is also going up. People are worried about their companies’ networks and computer systems being hacked and their data being stolen. Plus, many regulatory standards such as PCI and HITRUST require these tests to be performed at least annually. The demand for these tests is only going to increase as attackers get more sophisticated, and it’s essential these tests catch all possible vulnerabilities.

Benefits and gaps of penetration tests

Penetration tests are live tests of computer networks, systems, or web applications to find potential vulnerabilities. The tester actually attempts to exploit the vulnerabilities and documents the detailed results for the client, including how severe each vulnerability is and the steps that should be taken to resolve it.
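The reporting workflow described above – recording each vulnerability with a severity and a recommended fix – can be sketched as a minimal data structure. This is an illustrative sketch only; the names and severity levels are assumptions, not part of any pen-test standard:

```python
from dataclasses import dataclass, field

@dataclass
class Finding:
    """One vulnerability discovered during a penetration test."""
    title: str
    severity: str     # e.g. "critical", "high", "medium", "low"
    exploited: bool   # whether the tester successfully exploited it
    remediation: str  # recommended steps to resolve it

@dataclass
class PenTestReport:
    client: str
    findings: list = field(default_factory=list)

    def add(self, finding: Finding) -> None:
        self.findings.append(finding)

    def by_severity(self, severity: str) -> list:
        """Return findings at a given severity, for triage."""
        return [f for f in self.findings if f.severity == severity]

# Example: document one exploited finding and its fix.
report = PenTestReport(client="Example Corp")
report.add(Finding(
    title="SQL injection in login form",
    severity="critical",
    exploited=True,
    remediation="Use parameterized queries; validate all user input.",
))
print(len(report.by_severity("critical")))  # → 1
```

A national execution standard would, in effect, standardize the fields and severity scale such a report must contain, so results are comparable across testers.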


Worth Reading: Should I Write a Book?

Erik Dietrich (of Expert Beginner fame) published another great blog post explaining when and why you should write a book. For the attention-challenged, here’s my CliffsNotes version:

  • Realize you have no idea what you’re doing (see also: the Dunning-Kruger effect).
  • Figure out why you’d want to spend a significant amount of your time on a major project like writing a book.
  • It will take longer (and be more expensive) than you expect, even after accounting for Hofstadter’s law.

Understanding RSVP EROs

In our last post we covered the basics of getting an RSVP LSP set up. This was a tedious process, at least compared to what we saw with LDP setting up LSPs. So I think it’s worthwhile to spend some time talking about RSVP and what it offers that merits its consideration as a label distribution protocol. First off – let’s talk about naming. When talking about MPLS, RSVP is typically just called RSVP – but without the MPLS context, that might be a little confusing. That’s because RSVP itself initially had nothing to do with MPLS. RSVP was originally a means to reserve resources for flows across a network. The protocol was then extended to support setting up MPLS LSPs; in this use case, it is often referred to as “RSVP-TE” or “MPLS RSVP-TE”. For the sake of my sanity, when I reference RSVP going forward I’ll be referring to the RSVP that’s used to set up MPLS LSPs.

So now let’s talk about some differences between LDP and RSVP. The first thing that everyone points out is that LDP is tightly bound to the underlying IGP. While this is an accurate statement, it doesn’t mean that RSVP Continue reading
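For readers who want a concrete anchor for the comparison, the kind of RSVP LSP setup referenced above looks roughly like this in a Junos-style configuration. This is a minimal sketch; the interface name, LSP name, and egress loopback address are made up for illustration:

```
protocols {
    rsvp {
        interface ge-0/0/0.0;        /* enable RSVP signaling on the link */
    }
    mpls {
        label-switched-path to-R3 {
            to 3.3.3.3;              /* loopback of the egress router */
        }
        interface ge-0/0/0.0;
    }
}
```

Unlike LDP, where enabling the protocol is enough for labels to flow, each RSVP LSP is an explicitly configured, unidirectional object – which is both the source of the tedium and the hook for traffic engineering.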

Event-Driven Automation: The TL;DR No One Told You About

Event-driven automation is an umbrella term, much like "coffee" (also see here; it turns out I’ve used coffee anecdotes way too much). How many times do you go to a popular chain and just blurt out "coffee"? At 5am it might be a nonsensical, mysterious noise automagically leaving one’s mouth, but once we decide it’s bean time, we get to the specifics.

There are multiple tools that give you different capabilities. Some are easier to get started with than others, and some are feature-rich, returning a high yield of capability for the time invested.

Friendly dictator advice: try not to get wrapped up in the message bus used or the data-encapsulation methodology. Nerdy fun, but fairly pointless when it comes to convincing any person or organisation to make a fundamental shift.

Event-Driven is about receiving a WebHook and annoying people on Slack

This is a terrible measure and one we needed to have dropped yesterday. In more programming languages than I can remember, I’ve written the infamous "Hello World" and played with variables, struct instances, and objects named the infamous "foo" and the much-revered "bar". Using an automation platform to receive an HTTP post and update a support Continue reading
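The "webhook in, Slack message out" pattern being poked at above can be sketched in a few lines of Python. The payload shape, field names, and channel are illustrative assumptions, not any particular monitoring tool’s schema:

```python
import json

def handle_webhook(raw_body: str) -> dict:
    """Turn an incoming webhook payload into a chat notification.

    Assumes a hypothetical payload with "device", "severity", and
    "event" keys; real monitoring tools each define their own schema.
    """
    event = json.loads(raw_body)
    text = f"[{event['severity'].upper()}] {event['device']}: {event['event']}"
    # In a real system this dict would be POSTed to a chat webhook URL.
    return {"channel": "#netops", "text": text}

alert = handle_webhook(
    '{"device": "edge-router-1", "severity": "warning", "event": "BGP neighbor down"}'
)
print(alert["text"])  # → [WARNING] edge-router-1: BGP neighbor down
```

It really is this small, which is exactly the point: it demonstrates plumbing, not event-driven automation as a practice.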

What is a digital twin? [And how it’s changing IoT, AI and more]

Digital twin technology has moved beyond manufacturing and into the merging worlds of the Internet of Things, artificial intelligence and data analytics. As more complex “things” become connected and able to produce data, having a digital equivalent gives data scientists and other IT professionals the ability to optimize deployments for peak efficiency and to create other what-if scenarios.

What is a digital twin?

A digital twin is a digital representation of a physical object or system. The technology behind digital twins has expanded to include large items such as buildings, factories and even cities, and some have said people and processes can have digital twins, expanding the concept even further. The idea first arose at NASA: full-scale mockups of early space capsules, used on the ground to mirror and diagnose problems in orbit, eventually gave way to fully digital simulations.
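The core idea – a software object that mirrors the reported state of a physical device and can answer what-if questions without touching the device – can be sketched minimally as follows. The class, field names, and telemetry shape are illustrative assumptions:

```python
class DigitalTwin:
    """Software mirror of a physical device's last reported state."""

    def __init__(self, device_id: str):
        self.device_id = device_id
        self.state: dict = {}

    def sync(self, telemetry: dict) -> None:
        """Update the twin from a telemetry reading sent by the device."""
        self.state.update(telemetry)

    def what_if(self, **overrides) -> dict:
        """Evaluate a hypothetical state without changing the real one."""
        return {**self.state, **overrides}

# A pump reports telemetry; we test a scenario against its twin.
pump = DigitalTwin("pump-42")
pump.sync({"rpm": 1500, "temp_c": 60})
scenario = pump.what_if(rpm=2000)
print(scenario)           # → {'rpm': 2000, 'temp_c': 60}
print(pump.state["rpm"])  # → 1500 (real state unchanged)
```

Production digital twins layer physics models, history, and analytics on top of this, but the mirror-plus-simulation split is the essence.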

Software-Defined and Cloud-Native Foundations for 5G Networks

AvidThink (formerly SDxCentral Research) has put together a research brief that explains the infrastructure changes required, and the role that software-defined and cloud-native technologies will play in the 5G world, including supporting network slicing.

DRaaS options grow, but no one size fits all

AutoNation spent years trying to establish a disaster recovery plan that inspired confidence. It went through multiple iterations, including failed attempts at a full on-premises solution and a solution completely in the cloud. The Fort Lauderdale, Fla.-based auto retailer, which operates 300 locations across 16 states, finally found what it needed with a hybrid model featuring disaster recovery as a service.

“Both the on-premises and public cloud disaster recovery models were expensive, not tested often or thoroughly enough, and were true planning and implementation disasters that left us open to risk,” says Adam Rasner, AutoNation’s vice president of IT and operations, who was brought on two years ago in part to revamp the disaster recovery plan. The public cloud approach sported a hefty price tag: an estimated $3 million if it were needed in the wake of a three-month catastrophic outage. “We were probably a little bit too early in the adoption of disaster recovery in the cloud,” Rasner says, noting that the cloud providers have matured substantially in recent years.

DARPA explores new computer architectures to fix security between systems

Solutions are needed to replace the archaic air-gapping of computers used to isolate and protect sensitive defense information, the U.S. government has decided. Air-gapping, still in common use, is the practice of physically isolating data-storing computers from other systems, computers, and networks. In theory it can’t be compromised because there are no links into the machines; they’re simply disconnected.

However, many say air-gapping is no longer practical as the cloud and the internet take hold of massive swaths of data and communications. “Keeping a system completely disconnected from all means of information transfer is an unrealistic security tactic,” says the Defense Advanced Research Projects Agency (DARPA) on its website, announcing an initiative to develop completely new hardware and software that will allow defense communications to take place securely among myriad existing systems, networks, and security protocols.