VMware NSX Micro-segmentation – Horizon 7

Organizations that embark on the journey of building out virtual desktop environments are taking traditionally external endpoints and bringing them into the data center.  These endpoints are now closer to, and often reside on the same networking infrastructure as, the backend application servers they may access.  They run Windows or even Linux desktop operating systems and are accessed by multiple end users.  Malicious attacks that would traditionally take place outside the data center, should an end user find their desktop or laptop machine infected, can now take place on their virtual desktops inside the data center.  With physical equipment, it’s easy to isolate the compromised desktop or laptop and remediate the attack.  Securing virtual desktop environments requires a different approach, but not an unattainable one.  Securing an end-user computing deployment is one of the primary security use cases for VMware NSX, which can provide a layered approach to securing virtual desktop workloads in the data center.

The NSX platform covers several business cases for securing an end-user computing deployment.  Each of these use cases helps provide a multi-layered approach to ensure end-user endpoints are as secure as possible in the Continue reading

IDG Contributor Network: Admin automation: the serverless low-hanging fruit

If we could go back in time and start using public cloud in 2009, we’d probably be better off today. The AWS beta started in 2006 and was entirely API-driven, without either a console or a command-line interface to make interacting with the service as easy as it is now. Three years later, it was more mature. Early adopters started to solve real problems with it, padding their resumes and bringing value to their organizations in ways that seemed impossible before.

Serverless computing in 2018 is about where cloud computing was in 2009. But what exactly does serverless mean, and what are some easy ways to get started with it?

Function-as-a-Service: making serverless architectures possible

As cool as the technology is, serverless computing is a terrible name because (spoiler alert) there are, in fact, servers under the hood. The name comes from the idea that developers don’t have to worry about the server, or even a container, as a unit of compute any more, as public cloud services like AWS Lambda, IBM OpenWhisk, Google Cloud Functions and Azure Functions handle the details.
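The “function” in Function-as-a-Service really is just a function. A minimal sketch, assuming the AWS Lambda Python handler convention (the event fields and greeting logic here are invented for illustration):

```python
import json

def handler(event, context):
    # 'event' carries the trigger payload; 'context' holds runtime metadata.
    # The platform provisions compute, invokes this per event, and tears
    # everything down afterwards -- there is no server for you to manage.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Locally you can invoke it like any Python function:
print(handler({"name": "serverless"}, None))
```

The same handler shape works unchanged whether it is triggered by an HTTP request, a queue message, or a scheduled event, which is what makes the model so easy to start with.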

Breakthroughs in magnetism will change storage and computing

If you thought storage was trending toward solid-state media and that magnetic drives were on their way out, you may want to pause a moment. A slew of scientific breakthroughs in magnetism as it relates to storage and computing were announced last year.

The multiple eureka moments could change how we compute and how Internet of Things devices operate, and might, in one case, introduce magnet-driven neural networks – computing that mimics how the brain processes things.

3D magnets

First on the list was last November’s announcement of the invention of 3D nano-magnets that shift data transfers from traditional two dimensions to three. This kind of add-on could significantly increase storage and processing capacity, say its inventors at the University of Cambridge in an article published by Sinc.

IDG Contributor Network: 5 things analytics could tell you about your network in 2018

1. Whether your users are happy (without having to talk to them)

It’s not always cool to admit, but the ultimate goal of every networker is to have happy users. Like many other thankless jobs, we only hear about problems. When we do, we react. But that isn’t ideal. What we really want is to know about problems as they develop, before users complain. They don’t even have to know – but we do.

A Network Management System (NMS) has been the traditional go-to solution to sniff out these sorts of problems. But most were designed for just one view of a certain part of the network, using antiquated technology that doesn’t provide any sort of predictive problem-solving based on what the user is actually experiencing. It’s like trying to figure out San Francisco traffic from the status of the traffic signals: just because the signals are working properly doesn’t mean the drivers (users) are having a good experience.

How Cisco’s newest security tool can detect malware in encrypted traffic

Cisco’s Encrypted Traffic Analytics (ETA), a software platform that monitors network packet metadata to detect malicious traffic even when it’s encrypted, is now generally available.

The company initially launched ETA in June 2017 alongside its intent-based networking strategy, and it has been in private preview since then. Today Cisco rolled ETA out beyond the enterprise switches it was originally designed for, making it available on current- and previous-generation data center network hardware too.

Reaction: Why Safe Harbors will Fail

Copyright law, at least in the United States, tends to be very strict. You can copy some portion of a work under “fair use” rules, but for most works you must ask permission before sharing content created by someone else. But what about content providers? If one of a provider’s users uploads a “song cover,” for instance—essentially a remake of a popular song, not intended to create commercial value for the individual user—should the provider be required to take the content down as a violation of copyright? Content providers argue they should not. For instance, in a recent article published by the EFF—

Platform safe harbors have been in the crosshairs of copyright industry lobbyists throughout 2017. All year EFF has observed them advancing their plans around the world to weaken or eliminate the legal protections that have enabled the operation of platforms as diverse as YouTube, the Internet Archive, Reddit, Medium, and many thousands more. Copyright safe harbor rules empower these platforms by ensuring that they are free to host user-uploaded content, without manually vetting it (or, worse, automatically filtering it) for possible copyright infringements. Without that legal protection, it would be impossible for Continue reading

Quantum Computing Enters 2018 Like It Is 1968

The quantum computing competitive landscape continues to heat up in early 2018, but today it looks a lot like the semiconductor landscape did 50 years ago.

The silicon-based integrated circuit (IC) entered its “medium-scale” integration phase in 1968. Transistor counts ballooned from ten transistors on a chip to hundreds of transistors on a chip within a few short years. After a while, there were thousands of transistors on a chip, then tens of thousands, and now we have, fifty years later, tens of billions.
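The growth the paragraph above describes can be put in rough numbers: going from ten transistors to tens of billions in fifty years implies a doubling roughly every year and a half. A quick back-of-envelope check (the endpoint of twenty billion is an assumed round figure):

```python
import math

start, end = 10, 20_000_000_000   # ~10 transistors in 1968, tens of billions today
years = 50

doublings = math.log2(end / start)     # how many times the count doubled
doubling_time = years / doublings      # implied years per doubling
print(f"{doublings:.1f} doublings -> one doubling every {doubling_time:.1f} years")
```

The implied pace, a doubling about every 1.6 years, lands close to the classic Moore’s Law cadence.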

Quantum computing is a practical application of quantum physics using individual subatomic particles chilled to

Quantum Computing Enters 2018 Like It Is 1968 was written by Timothy Prickett Morgan at The Next Platform.

Enhancing NSX with Check Point vSEC

While VMware NSX enables micro-segmentation of the software-defined data center, it mostly polices traffic at layers 3 and 4, with only limited application-level (layer 7) support.  Sometimes additional layers of protection are needed for use cases such as a secure DMZ or meeting regulatory compliance requirements like PCI, in which case partner solutions can be added to the platform, with traffic steered into the supplemental solution before reaching the vSwitch (virtual wire).  The resulting combination retains high throughput thanks to the scale-out nature of NSX while also providing deep traffic analysis from the partner solution.

The usual enemy of deep traffic inspection in the data center is bandwidth.  NSX addresses this issue in two ways: micro-segmentation security policy is zero trust – only traffic explicitly permitted out of a VM can pass – and steering policy to third-party solutions can be designed so that bulk protocols such as storage and backup bypass them.  That leaves a more manageable amount of traffic for Check Point vSEC to apply IPS, anti-virus and anti-malware protection to, including Check Point’s SandBlast Zero-Day Protection against zero-day attacks.
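The policy logic described above can be sketched as a toy model – this is illustrative Python, not the NSX or vSEC API, and the rule tuples and port numbers are invented for the example:

```python
# Default-deny micro-segmentation plus steering that lets bulk
# protocols bypass deep inspection.

ALLOW_RULES = [
    ("desktop", "app-server", 443),   # HTTPS from desktops to app servers
    ("desktop", "storage", 2049),     # NFS from desktops to storage
]
BULK_PORTS = {2049, 3260}             # e.g. NFS, iSCSI: skip deep inspection

def evaluate(src, dst, port):
    """Return (permitted, inspect) for a flow under a zero-trust policy."""
    permitted = (src, dst, port) in ALLOW_RULES      # anything else is dropped
    inspect = permitted and port not in BULK_PORTS   # steer only non-bulk traffic
    return permitted, inspect

print(evaluate("desktop", "app-server", 443))  # permitted, steered to inspection
print(evaluate("desktop", "storage", 2049))    # permitted, bypasses inspection
print(evaluate("desktop", "app-server", 22))   # dropped by default-deny
```

The point of the sketch is the ordering: the zero-trust allowlist shrinks the traffic first, and the bypass list shrinks what the inspection engine must handle second.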

The connection between vSEC and NSX enables dynamic threat tagging, where traffic from a VM reaches Continue reading

Today’s Mobile Networks Demand Virtual Functionality & Data Analytics

Next-Generation Mobile Network

Telecom operators need new network monitoring tools.  Although mobile core networks are increasingly virtualized – through powerful and flexible technologies such as Network Functions Virtualization (NFV) and Software-Defined Networking (SDN) – network monitoring and analytics functions have only recently started to be virtualized. Stand-alone virtual probes still require mirroring of all the traffic, which in turn impacts performance... Read more →

Using Your Own Private Registry with Docker Enterprise Edition


One of the things that makes Docker really cool, particularly compared to using virtual machines, is how easy it is to move around Docker images. If you’ve already been using Docker, you’ve almost certainly pulled images from Docker Hub. Docker Hub is Docker’s cloud-based registry service and has tens of thousands of Docker images to choose from. If you’re developing your own software and creating your own Docker images though, you’ll want your own private Docker registry. This is particularly true if you have images with proprietary licenses, or if you have a complex continuous integration (CI) process for your build system.

Docker Enterprise Edition includes Docker Trusted Registry (DTR), a highly available registry with secure image management capabilities, built to run either inside your own data center or on your own cloud-based infrastructure. In the next few weeks, we’ll go over how DTR is a critical component of delivering a secure, repeatable and consistent software supply chain.  You can get started with it today through our free hosted demo or by downloading and installing the free 30-day trial. The steps to get started with your own installation are below.
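One practical detail when working with a private registry such as DTR: images are addressed by prefixing the repository with the registry’s hostname, which is what `docker tag` and `docker push` key off. A small sketch of that naming rule (the `dtr.example.com` hostname and repository names are hypothetical):

```python
def qualified_image_name(registry, repository, tag="latest"):
    """Build the fully qualified image name that `docker tag` and
    `docker push` expect.

    With no registry prefix, Docker defaults to Docker Hub; prefixing a
    hostname (and optional port) targets a private registry such as DTR.
    """
    if registry:
        return f"{registry}/{repository}:{tag}"
    return f"{repository}:{tag}"

# Targeting a hypothetical DTR instance:
print(qualified_image_name("dtr.example.com", "engineering/api", "1.0"))
# Versus a Docker Hub image, where no registry prefix is needed:
print(qualified_image_name("", "library/alpine"))
```

In practice you would tag a local image with the qualified name and push it, e.g. `docker tag myimage dtr.example.com/engineering/api:1.0` followed by `docker push dtr.example.com/engineering/api:1.0`.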

Setting Up Docker Enterprise Edition

Docker Trusted Registry runs on Continue reading