Cylance touts the predictive advantage of its technology, which it says allows a company to protect endpoints from threats that may not exist for years to come.
No doubt about it: the prospect of adding another zero to the end of your top network speeds is exciting. And the reward of the immediately noticeable performance improvement never gets old. Speed makes a noticeable, and not just measurable, difference. And with the massive increase in the amount of data servers need to process, 100G is soon going to be a necessity for many organizations.
But increasing network speed is about more than pushing more bits across a wire. Faster networks enable you to squeeze more out of your physical rack space. You need fewer servers, fewer network connections, and – dare I say it – fewer switches. It’s true. A faster network lets you pack more computing into the same space.
Whether you plan to do a forklift upgrade to 100G or intend to replace one switch at a time, there are some key things you need to know to avoid getting locked into one switch vendor or losing backward compatibility with your existing equipment. In this post, I’m going to give you my top 5 tips for making the transition to 100G networking a smooth one.
First, a little background. Continue reading
Interested in getting Google Cloud Certified? INE now offers a Bootcamp to help get you there. We still have a few spots left in our August and September Google Cloud Bootcamps! Visit our Bootcamps Site to learn more.
The new edge offering bundles SD-WAN, wired, and wireless networking technologies, along with unified security and policy enforcement.
Let me tell you a story. It’s 2014 and I had read so many articles about Docker (as the project was called then), how awesome it is and how it makes the lives of developers so much easier. Being a developer myself, I decided to try it out. Back in the day, I was working on some Django applications. Those apps were really simple: just a webserver and a database. So I went straight ahead to docker-compose. I read in the docs that I should create a docker-compose.yml file and then just docker-compose up. There was an error message here and there, but I was able to get the containers running without any big issues. And that was it. One command to run my application. I was sold on containers.
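To make that setup concrete, here is a minimal docker-compose.yml sketch for a Django app with just a webserver and a database. The image tag, service names, port, and password are illustrative assumptions of mine, not the exact file from back then:

version: "3"

services:
  db:
    # A single Postgres container for the app's database.
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: example   # demo value only, not for real use

  web:
    # The Django app itself; assumes a Dockerfile sits in this directory.
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"
    depends_on:
      - db

With a file like this in place, docker-compose up builds the web image, starts both containers, and wires them together on a shared network, which is exactly the one-command experience that sold me.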
I was so excited that I started talking about Docker and docker-compose to everyone, everywhere: in the office breakroom, to my dad, at a meetup, to a crowd of 50 at a local conference. It wasn’t always easy, since some people pushed back or didn’t fully understand, but I definitely made some converts. We even put together a workshop series with my friends Peter Schiffer and Continue reading

At Cloudflare, we believe that getting new products and features into the hands of customers as soon as possible is the best way to get great feedback. The thing about releasing products early and often is that sometimes they might not initially be ready for your entire user base. You might want to provide access only to particular sets of customers: power users, those who have expressed interest in participating in a beta, or the customers who need a new feature the most.
As I have been meeting with many of the users who were in our own Workers beta program, I’ve seen (somewhat unsurprisingly) that many of our users share the same belief that they should be getting feedback from their own users early and often.
However, I was surprised to learn about the difficulty that many beta program members had in creating the necessary controls to quickly and securely gate new or deprecated features when testing and releasing updates.
Below are some ideas and recipes I’ve seen implemented inside of Cloudflare Workers to ensure the appropriate customers have access to the correct features.
First, a brief Continue reading
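To give a flavor of the kind of recipe the post goes on to describe, here is a rough sketch of gating a beta feature inside a Cloudflare Worker. The cookie name, the hard-coded allow list, and the /beta origin path are hypothetical examples of mine, not code from the post or from Cloudflare’s own beta program:

// Hypothetical allow list; a real deployment might consult Workers KV,
// a signed token, or an account flag instead of a hard-coded set.
const BETA_USERS = new Set(['user-123', 'user-456'])

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  // Pull a user identifier out of a cookie (the name is illustrative).
  const cookie = request.headers.get('Cookie') || ''
  const match = cookie.match(/beta-user-id=([^;]+)/)
  const userId = match ? match[1] : null

  if (userId && BETA_USERS.has(userId)) {
    // Beta users are routed to an origin path that serves the new feature.
    const url = new URL(request.url)
    url.pathname = '/beta' + url.pathname
    return fetch(new Request(url.toString(), request))
  }

  // Everyone else gets the current production behaviour unchanged.
  return fetch(request)
}

The appeal of doing this at the edge is that the gate sits in front of the origin, so widening the rollout is just a change to the check in the Worker rather than a redeploy of the application.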
Edge computing can mean different things to different people, as is the case with any new phenomenon in the IT sector. …
Crating All Of The Data That The Edge Creates was written by Daniel Robinson.
The managed hybrid cloud service initially supports Amazon Web Services (AWS), Microsoft Azure, and Azure Stack.
Pathology laboratories are big data environments. However, these big data are often hidden behind expert humans who manually, and with great care, visually parse large, complex, and detailed datasets to provide critical diagnoses. …
Augmenting Pathology Labs With Big Data And Machine Learning was written by James Cuff.
Colt services now available in the United States include enterprise bandwidth of up to 100 Gb/s.
A recent Gartner report predicts that more than 20 percent of global enterprises will have deployed serverless technologies by 2020, compared with less than 5 percent today.