Archive

Category Archives for "Security"

ICSA Labs Certifies NSX Micro-segmentation Capabilities


VMware NSX has achieved ICSA Labs Corporate Firewall Certification.

With the release of NSX for vSphere® 6.3, VMware has introduced several key security features, such as Application Rule Manager and Endpoint Monitoring, which provide deep visibility into the application and enable a rapid zero-trust deployment. NSX 6.3 has also achieved Corporate Firewall Certification in independent testing performed by ICSA Labs, a leading third-party testing and certification body and an independent division of Verizon.

VMware NSX for vSphere 6.3 has been tested against an industry-accepted standard to which a consortium of firewall vendors, end users and ICSA Labs contributed, and met all the requirements of the Baseline and Corporate modules of the ICSA Modular Firewall Certification Criteria version 4.2.
NSX is the only true micro-segmentation platform to achieve ICSA Firewall certification — with the NSX Distributed Firewall providing kernel-based, distributed stateful firewalling, and the Edge Services Gateway providing services such as North-South firewalling, NAT, DHCP, VPN, load balancing and high availability. VMware NSX provides security controls aligned to the application and enables a Zero-Trust model, independent of network topology.
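
For readers who want to see what the Distributed Firewall looks like from an automation standpoint, here is a minimal sketch that reads the DFW rule set over the NSX Manager REST API. It assumes the NSX-v 6.x endpoint; the manager address and credentials are placeholders to replace with your own.

```python
# Minimal sketch: read the NSX-v Distributed Firewall configuration from
# NSX Manager. The endpoint path follows the NSX-v 6.x REST API; the host
# and credentials below are illustrative placeholders.
import requests

NSX_MANAGER = "https://nsx-manager.example.com"   # placeholder NSX Manager address
AUTH = ("admin", "changeme")                      # placeholder credentials

# The DFW configuration (sections and rules) is returned as an XML document.
resp = requests.get(
    f"{NSX_MANAGER}/api/4.0/firewall/globalroot-0/config",
    auth=AUTH,
    verify=False,   # lab NSX Managers often use self-signed certificates
    timeout=30,
)
resp.raise_for_status()
print(resp.text)    # XML describing the distributed firewall rule set
```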

The ICSA Firewall Certification criteria focus on several key firewall aspects, including stateful services, logging and persistence. ICSA also validates Continue reading

Announcing LinuxKit: A Toolkit for building Secure, Lean and Portable Linux Subsystems


Last year, one of the most common requests we heard from our users was to bring a Docker-native experience to their platforms. These platforms were many and varied: from cloud platforms such as AWS, Azure and Google Cloud, to server platforms such as Windows Server, to the desktop platforms their developers used such as OS X and Windows 10, to mainframes and IoT platforms – the list went on.

We started working on support for these platforms, and we initially shipped Docker for Mac and Docker for Windows, followed by Docker for AWS and Docker for Azure. Most recently, we announced the beta of Docker for GCP. The customizations we applied to make Docker native for each platform have furthered the adoption of the Docker editions.

One of the issues we encountered was that for many of these platforms, the users wanted Linux container support but the platform itself did not ship with Linux included. Mac OS and Windows are two obvious examples, but cloud platforms do not ship with a standard Linux either. So it made sense for us to bundle Linux into the Docker platform to run in these places.

What we needed to bundle was a secure, lean and portable Linux Continue reading

Introducing Moby Project: a new open-source project to advance the software containerization movement


Since Docker democratized software containers four years ago, a whole ecosystem has grown around containerization, and in this compressed time period it has gone through two distinct phases of growth. In each of these two phases, the model for producing container systems evolved to adapt to the size and needs of the user community as well as the project and the growing contributor ecosystem.

The Moby Project is a new open-source project to advance the software containerization movement and help the ecosystem take containers mainstream. It provides a library of components, a framework for assembling them into custom container-based systems and a place for all container enthusiasts to experiment and exchange ideas.

Let’s review how we got where we are today. In 2013-2014, pioneers started to use containers and collaborate on a monolithic open-source codebase – Docker and a few other projects – to help the tools mature.

Docker Open Source

Then in 2015-2016, containers were massively adopted in production for cloud-native applications. In this phase, the user community grew to support tens of thousands of deployments that were backed by hundreds of ecosystem projects and thousands of contributors. It was during this phase that Docker evolved its production model to an open, component-based approach. In Continue reading

Mirai, Bitcoin, and numeracy

Newsweek (the magazine famous for outing the real Satoshi Nakamoto) has a story about how a variant of the Mirai botnet is mining bitcoin. They fail to run the numbers.

The story repeats a claim by McAfee that 2.5 million devices were infected with Mirai at some point in 2016. If they were all mining bitcoin, how much money would the hackers be earning?

I bought security cameras and infected them with Mirai. A typical example of the CPU running on an IoT device is an ARM926EJ-S processor.


As this website reports, such a processor running at 1.2 GHz can mine at a rate of 0.187 megahashes/second. That's a bit fast for an IoT device; most are slower, some are faster, but we'll just use this as the average.


According to this website, the current hash rate of all miners is around 4 million terahashes/second.


Bitcoin blocks are mined every 10 minutes, with the current (April 2017) reward set at 12.5 bitcoins per block, giving roughly 1800 bitcoins/day in reward.

The current price of bitcoin is $1191.



Okay, let's plug all these numbers in (the sketch after this list runs the full calculation):
  •  total Mirai hash-rate = 2.5 million bots times 0.187 megahash/sec = 0.468 terahashes/second
  • Continue reading
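
Here is that calculation as a small Python sketch, using only the figures quoted above (the article's numbers, not fresh measurements):

```python
# Back-of-the-envelope check: how much would 2.5 million Mirai bots earn
# mining bitcoin? All inputs are the figures quoted in the post.
bots = 2.5e6                   # infected devices (McAfee's estimate)
hashrate_per_bot = 0.187e6     # hashes/second for an ARM926EJ-S at 1.2 GHz
network_hashrate = 4e18        # ~4 million terahashes/second for the whole network
btc_per_day = 1800             # 12.5 BTC per block, one block every ~10 minutes
btc_price_usd = 1191           # price per bitcoin at the time of writing

botnet_hashrate = bots * hashrate_per_bot          # ≈ 0.468 terahashes/second
share_of_network = botnet_hashrate / network_hashrate
usd_per_day = share_of_network * btc_per_day * btc_price_usd

print("Botnet hash rate: %.3f TH/s" % (botnet_hashrate / 1e12))
print("Share of total network: %.2e" % share_of_network)
print("Earnings: roughly $%.2f per day for the whole botnet" % usd_per_day)
```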

Changing Internet Standards to Build A Secure Internet

We’ve been working with registrars and registries in the IETF on making DNSSEC easier for domain owners, and over the next two weeks we’ll be starting out by enabling DNSSEC automatically for .dk domains.

DNSSEC: A Primer

Before we get into the details of how we've improved the DNSSEC experience, we should explain why DNSSEC is important and the role it plays in keeping the web safe.

DNSSEC’s role is to verify the integrity of DNS answers. When DNS was designed in the early 1980s, only a few researchers and academics were on the internet. They all knew and trusted each other, and couldn’t imagine a world in which someone malicious would try to operate online. As a result, DNS relies on trust to operate. When a client asks for the address of a hostname like www.cloudflare.com, without DNSSEC it will trust basically any server that returns the response, even if it wasn’t the same server it originally asked. With DNSSEC, every DNS answer is signed so clients can verify answers haven’t been manipulated in transit.
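
To see this in practice, here is a minimal sketch using the third-party dnspython library: it sends a query with the DNSSEC OK (DO) bit set and checks whether a validating resolver returned the AD (Authenticated Data) flag. The resolver address is just an example.

```python
# Minimal sketch using dnspython: query for an A record with the DNSSEC OK
# (DO) bit set, then check whether the validating resolver set the
# AD (Authenticated Data) flag on its answer.
import dns.flags
import dns.message
import dns.query

# Ask for www.cloudflare.com, requesting DNSSEC records alongside the answer.
query = dns.message.make_query("www.cloudflare.com", "A", want_dnssec=True)

# Send the query to a validating resolver (8.8.8.8 is just an example).
response = dns.query.udp(query, "8.8.8.8", timeout=5)

if response.flags & dns.flags.AD:
    print("The resolver validated this answer with DNSSEC")
else:
    print("No DNSSEC validation (the zone may be unsigned)")
```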

The Trouble With DNSSEC

If DNSSEC is so important, why do so few domains support it? First, for a domain to Continue reading

Notes on the FCC and Privacy in the US

I’ve been reading a lot about the repeal of the rules putting the FCC in charge of privacy for access providers in the US recently—a lot of it rising to “the end is near” levels of hysteria. As you have probably been reading these stories as well, I thought it worthwhile to take a moment and point out two pieces that seem to be the most balanced and thought-through out there.

Essentially—yes, privacy is still a concern, and no, the sky is not falling. The first is by Nick Feamster, who I’ve worked with in the past, and has always seemed to have a reasonable take on things. The second is by Shelly Palmer, who I don’t always agree with, but in this case I think his analysis is correct.

Last week, the House and Senate both passed a joint resolution that prevents the new privacy rules from the Federal Communications Commission (FCC) from taking effect; the rules were released by the FCC last November, and would have bound Internet Service Providers (ISPs) in the United States to a set of practices concerning the collection and sharing of data about consumers. The rules were widely heralded Continue reading

Research: McAfee Labs Threats Report April 2017

Network professionals are the front line in cyber defence, defining and operating the perimeter. While the perimeter is only a first layer of static defence, it's well worth understanding the wider threat landscape you are defending against. Many companies publish regular threat reports; this one is from McAfee.

McAfee Labs Threats Report – April 2017 – Direct Link

Landing page is https://secure.mcafee.com/us/security-awareness/articles/mcafee-labs-threats-report-mar-2017.aspx

Note: Intel has spun McAfee out to a private VC firm in the last few weeks, so it's possible that we will see a resurgence of the McAfee brand. I'm doubtful that McAfee can re-emerge, but let's wait and see.


Some points I observed when reading this report:

  • McAfee wants to tell you about its cloud-based threat intelligence (which all security vendors have now, table stakes)
  • The pitch is pretty much identical to any other cloud threat intelligence.
  • The big six security companies – Check Point, Cisco, Fortinet, Intel Security, Palo Alto Networks and Symantec – have formed the Cyber Threat Alliance (…to prevent the startups from competing with them?)
  • Big section on Mirai botnet and how it works.
  • Good summary of the different network packet attack modes in Mirai. Nicely laid out with Continue reading

MAC Flooding Attack, Port Security and Deployment Considerations

This article is the 4th in the Layer 2 security series. We will be discussing a very common Layer 2 attack, MAC flooding, and its mitigation, port security MAC limiting. If you didn’t read the previous 3 articles – DHCP Snooping, Dynamic ARP Inspection, and IP Source Guard – I recommend that you take a quick […]
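
As a quick way to picture the attack, here is a toy Python model (purely illustrative, not real switch code) of a CAM table filling up under a flood of spoofed source MACs, and of how a port-security MAC limit blunts it:

```python
# Toy model of a switch CAM table under a MAC flooding attack, and of the
# effect of a port-security MAC limit. Purely illustrative; real switches do
# this in hardware and the numbers are made up for the example.
import random

CAM_CAPACITY = 8192        # CAM tables typically hold a few thousand entries
PORT_SECURITY_LIMIT = 5    # max MACs learned on a port when MAC limiting is on

def flood(frames, port_security=False):
    cam = {}                    # learned MAC -> port mapping
    learned_on_port = 0
    for _ in range(frames):
        mac = "%012x" % random.getrandbits(48)   # attacker spoofs a random source MAC
        if port_security and learned_on_port >= PORT_SECURITY_LIMIT:
            continue            # violation: frame dropped (or the port is err-disabled)
        if len(cam) < CAM_CAPACITY:
            cam[mac] = "attacker-port"
            learned_on_port += 1
    return len(cam), len(cam) >= CAM_CAPACITY

entries, full = flood(100000)
print("No port security: %d CAM entries, table full: %s" % (entries, full))
# Once the table is full, frames to unknown destinations are flooded out every
# port, letting the attacker sniff traffic meant for other hosts.

entries, full = flood(100000, port_security=True)
print("With MAC limiting: %d CAM entries, table full: %s" % (entries, full))
```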

The post MAC Flooding Attack, Port Security and Deployment Considerations appeared first on Cisco Network Design and Architecture | CCDE Bootcamp | orhanergun.net.

University finds Proactive Security and PCI Compliance through NSX

The University of Pittsburgh is one of the oldest institutions in the country, dating back to 1787, when it was founded as the Pittsburgh Academy. The University has produced the pioneers of the MRI and the television, winners of Nobel and Pulitzer prizes, Super Bowl and NBA champions, and best-selling authors.

As with many businesses today, the University continues to digitize its organization to keep up with the demands of over 35,000 students, 5,000 faculty, and 7,000 staff across four campuses. While the first thing that comes to mind may be core facilities such as classrooms, this also includes keeping up with the evolving technology on the business side of things, such as point-of-sale (POS) systems. When a student buys a coffee before studying or a branded sweatshirt for mom using their student ID, those transactions must be facilitated and secured by the University.

What does it mean to secure financial transactions? For one, just as with a retail store operation, the University must achieve PCI compliance to facilitate financial transactions for its customers. Among other requirements, PCI demands that the data used by these systems be completely isolated from other IT operations. However, locking everything down Continue reading

Technology Short Take #81

Welcome to Technology Short Take #81! I have another collection of links, articles, and thoughts about key data center technologies, and hopefully I’ve managed to include something here that will prove useful or thought-provoking. Enjoy!

Networking

Enterprise Ready Software from Docker Store

Docker Store is the place to discover and procure trusted, enterprise-ready containerized software – free, open source and commercial.

Docker Store is the evolution of Docker Hub, the world’s largest container registry, catering to millions of users. As of March 1, 2017, we crossed 11 billion pulls from the public registry! Docker Store leverages the public registry’s massive user base and ensures our customers – developers, operators and enterprise Docker users – get what they ask for. The Official Images program was developed to create a set of curated and trusted content that developers could use as a foundation for building containerized software. Building on those lessons learned and best practices, Docker recently launched a certification program that enables ISVs around the world to take advantage of Store to offer great software, packaged to operate optimally on the Docker platform.
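
As a quick illustration of how that curated content gets consumed, here is a minimal sketch using the Docker SDK for Python; it assumes a local Docker Engine and uses the alpine Official Image purely as an example.

```python
# Minimal sketch using the Docker SDK for Python (the "docker" package) and a
# local Docker Engine: pull an Official Image and run a throwaway container
# from it. "alpine" is used purely as an example of trusted, curated content.
import docker

client = docker.from_env()

# Pull an Official Image from the public registry.
image = client.images.pull("alpine", tag="latest")

# Run a short-lived container from it; the container is removed afterwards.
output = client.containers.run(image, ["echo", "built on an Official Image"], remove=True)
print(output.decode().strip())
```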


The Docker Store is designed to bring Docker users and ecosystem partners together with:

  • Certified Containers with ISV apps that have been validated against Docker Enterprise Edition, and come with cooperative support from Docker and the ISV
  • Enhanced search and discovery capabilities of containers, including filtering support for platforms, categories and OS.
  • Self service publisher workflow and interface to facilitate Continue reading

Developer-Ready Infrastructure: NSX and Pivotal

Organizations across industries are embarking on their journey of Digital Transformation. Time-to-market has become crucial to the bottom line, and companies need to accelerate their application and service delivery to go from concept to production in record time.

Organizations are embracing containers, microservice-based architectures, and Continuous Integration and Delivery tools as they try to completely change how they develop, deploy and deliver applications.

However, moving from monolithic application architectures to microservices-based ones is no small feat.

Many of these organizations leverage Pivotal’s expertise to deliver a modern application development environment. Pivotal’s flagship cloud-native platform, Pivotal Cloud Foundry, provides a modern, app-centric environment that lets developers focus on delivering applications with speed and frequency. To find out more, check out Pivotal Cloud Foundry and the now generally available Pivotal Cloud Foundry 1.10.

Pivotal Cloud Foundry abstracts the underlying IaaS layer so that developers get a modern, self-service application development environment without worrying about the infrastructure. The BOSH vSphere CPI plugin does a good job of consuming pre-created networks.
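
To make “pre-created networks” concrete, here is a hedged sketch of how one such network (an NSX-v logical switch) might be provisioned programmatically. The endpoint follows the NSX-v 6.x API; the manager address, transport zone ID and credentials are placeholders.

```python
# Sketch: create an NSX-v logical switch (virtual wire) that a BOSH/PCF
# deployment could then consume as a pre-created network. The endpoint path
# follows the NSX-v 6.x API; host, scope ID and credentials are placeholders.
import requests

NSX_MANAGER = "https://nsx-manager.example.com"   # placeholder NSX Manager address
AUTH = ("admin", "changeme")                      # placeholder credentials
TRANSPORT_ZONE = "vdnscope-1"                     # placeholder transport zone (scope) ID

payload = """<virtualWireCreateSpec>
  <name>pcf-deployment-network</name>
  <description>Logical switch pre-created for Pivotal Cloud Foundry</description>
  <tenantId>pcf</tenantId>
</virtualWireCreateSpec>"""

resp = requests.post(
    f"{NSX_MANAGER}/api/2.0/vdn/scopes/{TRANSPORT_ZONE}/virtualwires",
    data=payload,
    headers={"Content-Type": "application/xml"},
    auth=AUTH,
    verify=False,   # lab NSX Managers often use self-signed certificates
    timeout=30,
)
resp.raise_for_status()
print("Created logical switch:", resp.text)   # response body is the new virtualwire ID
```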

However, the truth is – “someone” always needs to do some provisioning – networks need to be carved out, load-balancers need to be configured, NAT rules need to be defined, reachability needs to Continue reading