Today's Heavy Networking aims to separate the hype from the substance on hot tech topics including SD-whatever, cloud native, container networking, service meshes, microsegmentation, and more. Our guests are Avi Freedman and Jon Mendoza.
Check Point GAiA is the next-generation secure operating system for all Check Point appliances, open servers, and virtualized gateways. In this tutorial we will create a network infrastructure that supports using a Gaia QEMU VM as a personal firewall on Ubuntu Linux. We will also go through the entire installation of Gaia on the QEMU VM. This firewall appliance can be used for up to 15 days under the free trial Gaia license (no registration needed).
Hardware: Asus K55VM laptop:
- Intel(R) Core(TM) i7-3610QM CPU @ 2.30GHz
- RAM - 2 x Kingston DDR3 8192MB
- HDD - ST1000LM024 HN-M101MBB 1000GB
Software:
Host - Kubuntu Linux 18.04.1 LTS with QEMU emulator version 3.0.0 and the KVM module installed
Guest 1 - Checkpoint GAiA R80.10, OS build 462, OS kernel version 2.6.18-92cpx86_64
Guest 2 - Windows 7 Home Premium, x86, with Smart Console R80.10 Build 991140073 installed
Credentials - username/password:
- Gaia web portal: admin/check123point
- Gaia expert mode: check123point
- Windows 7: no password
Continue reading
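The full tutorial walks through the bridge/tap networking and the QEMU invocation step by step; the sketch below shows only the launch step, wrapping the stock qemu-img and qemu-system-x86_64 CLIs from Python. The ISO filename, disk size, memory allocation, and the pre-created tap0 interface are assumptions to adjust for your own download and hardware.

```python
# Minimal sketch: create a disk image and boot the Gaia R80.10 installer under
# QEMU/KVM. The ISO path, disk size, and memory figures are assumptions; the
# tap0 interface is expected to exist already (e.g. bridged to the LAN).
import subprocess

DISK = "gaia-r80.10.qcow2"
ISO = "Check_Point_R80.10_T462_Gaia.iso"   # assumed filename of the downloaded installer

# 40 GB sparse qcow2 disk for the Gaia installation
subprocess.run(["qemu-img", "create", "-f", "qcow2", DISK, "40G"], check=True)

# Boot from the ISO with KVM acceleration; an emulated e1000 NIC is attached
# to the pre-created tap0 device for the guest's management interface.
subprocess.run([
    "qemu-system-x86_64",
    "-enable-kvm",
    "-m", "4096",
    "-smp", "2",
    "-drive", f"file={DISK},format=qcow2",
    "-cdrom", ISO,
    "-boot", "d",
    "-netdev", "tap,id=net0,ifname=tap0,script=no,downscript=no",
    "-device", "e1000,netdev=net0",
], check=True)
```

Once the installer finishes, subsequent boots drop the -cdrom and -boot options and the Gaia web portal becomes reachable on the tap/bridge network with the credentials listed above.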
This is part 3 of a six-part series based on a talk I gave in Trento, Italy. To start from the beginning, go here.
After Cloudbleed, lots of things changed. We started to move away from memory-unsafe languages like C and C++ (there’s a lot more Go and Rust now). And every SIGABRT or crash on any machine results in an email to me and a message to the team responsible. And I don’t let the team leave those problems to fester.
Making 1.1.1.1
So Cloudbleed was a terrible time. Let’s talk about a great time: the launch of our public DNS resolver 1.1.1.1. That launch is a story of an important Cloudflare quality: audacity. Google had launched 8.8.8.8 years ago and had taken the market for public DNS resolvers by storm. Their address is easy to remember, and their service is very fast.
But we thought we could do better. We thought we could be faster, and we thought we could be more memorable. Matthew asked us to get the address 1.1.1.1 and launch a secure, privacy-preserving, public DNS resolver in a couple of months. Continue reading
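To see the resolver in action, here is a minimal sketch that sends a lookup directly to 1.1.1.1. It assumes the third-party dnspython package (2.x, which provides resolve()) is installed; the queried hostname is just an example.

```python
# Minimal sketch: query Cloudflare's 1.1.1.1 resolver directly with dnspython
# (a third-party package: `pip install dnspython`; resolve() is the 2.x API).
import dns.resolver

resolver = dns.resolver.Resolver(configure=False)  # ignore the system resolv.conf
resolver.nameservers = ["1.1.1.1", "1.0.0.1"]      # Cloudflare's public resolver addresses

answer = resolver.resolve("example.com", "A")      # example hostname
for record in answer:
    print(record.address)
```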
Memory module for the Apollo Guidance Computer (Mike Stewart). The AGC weighed 70 pounds and had 2,048 words of RAM in erasable core memory and 36,864 words of ROM in core rope memory. It flew to the moon.
$10.9B: Apple's Q1 services revenue; 5 million: homes on Airbnb; 2.5-5%: base64 gzipped files close to original; 60MB/s: Dropbox per Kafka broker throughput limit; 12%: Microsoft's increased revenues; 9: new datasets; 900 million: installed iPhones; $5.7B: 2018 game investment; way down: chip growth;
Quotable Quotes:
Daniel Lemire: Most importantly, I claim that most people do not care whether they work on important problems or not. My experience is that more than half of researchers are not even trying to produce something useful. They are trying to publish, to get jobs and promotions, to secure grants and so forth, but advancing science is a secondary concern.
Enterprises that are betting the massive amounts of data that they are collecting will deliver the insights they need to make better and faster businesses are finding that they could just as easily be overwhelmed by them. …
In the last 25 years, half the world has been connected to the Internet and the almost infinite opportunities it has to offer. Most of these 3.5 billion connected individuals are people who are largely economically empowered, literate, and reside in urban or accessible areas. However, the other half of the world has yet to get online and access what the Internet has to offer.
The biggest barrier to widespread connectivity is the high cost of infrastructure. With many telecom companies unwilling or unable to build infrastructure in far-flung and rural areas, large swathes of the world have remained in media darkness. Evidently, most of those who are excluded from digital ecosystems are people who are largely at the bottom of the pyramid and reside in rural or inaccessible areas. They are people who have not been connected by the mainstream Internet Service Providers (ISPs) – and who may have to wait a long time to be connected.
So who will take the responsibility of connecting them?
A research collaboration between the National Cancer Institute (NCI) and Lawrence Livermore National Laboratory (LLNL) is demonstrating the value of using machine learning to overcome daunting computational challenges. …
This is the text I prepared for a talk at Speck&Tech in Trento, Italy. I thought it might make a good blog post. Because it is 6,000 words, I've split it into six separate posts.
Here's part 1:
I’ve worked at Cloudflare for more than seven years. Cloudflare itself is more than eight years old. So, I’ve been there since it was a very small company. About twenty people in fact. All of those people (except one, me) worked from an office in San Francisco. I was the lone member of the London office.
Today there are 900 people working at Cloudflare spread across offices in San Francisco, Austin, Champaign, IL, New York, London, Munich, Singapore and Beijing. In London, my “one-person office” (which was my spare bedroom) is now almost 200 people, and in a month we’ll move into new space opposite Big Ben.
The original Cloudflare London "office"
The numbers tell a story about enormous growth. But it’s growth that’s been very carefully managed. We could have grown much faster (in terms of people); we’ve certainly raised enough money to do so.
I ended up at Cloudflare because I gave a really good talk at a conference. Well, Continue reading
As the number of cyber attacks increases, the demand for penetration tests – to determine the strength of a company’s defense – is also going up. People are worried about their companies’ networks and computer systems being hacked and data being stolen. Plus, many regulatory standards such as PCI and HITRUST require these tests to be performed on at least an annual basis. The demand for these tests is only going to increase as attackers get more sophisticated. And it’s essential these tests catch all possible vulnerabilities.
Benefits and gaps of penetration tests
Penetration tests involve live tests of computer networks, systems, or web applications to find potential vulnerabilities. The tester actually attempts to exploit the vulnerabilities and documents the details of the results for their client. They document how severe the vulnerabilities are and recommend the steps that should be taken to resolve them.
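As a toy illustration of the "live test" idea (not a substitute for a real penetration test, and only for hosts you are authorized to probe), the sketch below checks a handful of common TCP ports using nothing but Python's standard library; the target host and port list are placeholders.

```python
# Toy illustration of one small piece of a penetration test: checking which
# common TCP ports accept connections. Only run this against hosts you are
# authorized to test; the target and port list here are placeholders.
import socket

TARGET = "scanme.example"              # placeholder hostname
PORTS = [22, 80, 443, 3389, 8080]      # a few commonly exposed services

for port in PORTS:
    try:
        with socket.create_connection((TARGET, port), timeout=1.0):
            print(f"{TARGET}:{port} is open")
    except OSError:
        print(f"{TARGET}:{port} is closed or filtered")
```

A real engagement goes much further, of course: identifying the services behind open ports, attempting to exploit them, and documenting severity and remediation steps as described above.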
Welcome to Technology Short Take #110! Here’s a look at a few of the articles and posts that have caught my attention over the last few weeks. I hope something I’ve included here is useful for you also!
Networking
Via Kirk Byers (who is himself a fantastic resource), I read a couple of articles on network automation that I think readers may find helpful. First up is a treatise from Mircea Ulinic on whether network automation is needed. Next is an older article from Patrick Ogenstad that provides an introduction to ZTP (Zero Touch Provisioning).
The folks over at Cilium took a look at a recent CNI benchmark comparison and unpacked it a bit. There’s some good information in their article.
I first ran into Forward Networks a few years ago at Fall ONUG in New York. At the time, I was recommending that they explore integration with NSX. Fast-forward to this year, and the company announces support for NSX and (more recently) support for Cisco ACI. The recent announcement of their GraphQL-based Network Query Engine (NQE)—more information is available in this blog post—is also pretty interesting to me.
Servers/Hardware
Patrick Kennedy over at Serve The Home shares the Continue reading
This is a cool paper on a number of levels. Firstly, the main result that catches my eye is that it’s possible to build a distributed systems ‘debugger’ that can suggest protocol-level fixes. E.g. say you have a system that sometimes sends acks before it really should, resulting in the possibility of state loss. Nemo can uncover this and suggest a change to the protocol that removes the window. Secondly, it uses an obscure (from the perspective of most readers of this blog) programming language called Dedalus. Dedalus is a great example of how a different programming paradigm can help us to think about a problem differently and generate new insights and possibilities. (Dedalus is a temporal logic programming language based on datalog). Now, it would be easy for practitioner readers to immediately dismiss the work as not relevant to them given they won’t be coding systems in Dedalus anytime soon. The third thing I want to highlight in this introduction therefore is the research strategy:
Nemo operates on an idealized model in which distributed executions are centrally simulated, record-level provenance of these Continue reading