Using the External AWS Cloud Provider for Kubernetes

In 2018, after finding a dearth of information on setting up Kubernetes with AWS integration/support, I set out to try to establish some level of documentation on this topic. That effort resulted in a few different blog posts, but ultimately culminated in this post on setting up an AWS-integrated Kubernetes cluster using kubeadm. Although originally written for Kubernetes 1.15, the process described in that post is still accurate for newer versions of Kubernetes. With the release of Kubernetes 1.22, though, the in-tree AWS cloud provider—which is what is used/described in the post linked above—has been deprecated in favor of the external cloud provider. In this post, I’ll show how to set up an AWS-integrated Kubernetes cluster using the external AWS cloud provider.
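To give a flavor of what changes with the external provider, the kubeadm side of such a cluster boils down to telling the kubelet its cloud provider is "external" and letting the AWS cloud controller manager (deployed separately, typically as a DaemonSet, once the cluster is up) handle the AWS integration. The sketch below illustrates the idea; the node name and version are placeholders, not values from the post:

```yaml
# Sketch of a kubeadm configuration for the external AWS cloud provider.
# The kubelet is started with cloud-provider=external; the AWS cloud
# controller manager is deployed afterward to provide the AWS integration.
apiVersion: kubeadm.k8s.io/v1beta3
kind: InitConfiguration
nodeRegistration:
  # Placeholder: the node name must match the EC2 instance's private DNS name
  name: ip-10-11-12-13.us-west-2.compute.internal
  kubeletExtraArgs:
    cloud-provider: external
---
apiVersion: kubeadm.k8s.io/v1beta3
kind: ClusterConfiguration
kubernetesVersion: v1.22.0
```

This contrasts with the in-tree provider, where `cloud-provider: aws` was passed to the kubelet, API server, and controller manager directly.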

In addition to the post I linked above, there were a number of other articles I published on this topic:

Most of the information in these posts, if not all of it, is found in the latest iteration, but I wanted to include these links here for some additional context. Also, Continue reading

Tech Bytes: Why Fortinet Zero Trust Works For You

Today on the Tech Bytes podcast we’re talking Zero Trust Network Access, or ZTNA, with sponsor Fortinet. As organizations grapple with controlling end user access to applications and services, particularly when those end users and applications could be anywhere, Fortinet is here to make the case that it’s the right platform for ZTNA. Our guest […]

The post Tech Bytes: Why Fortinet Zero Trust Works For You appeared first on Packet Pushers.

Grafana Cloud


Grafana Cloud is a cloud hosted version of Grafana, Prometheus, and Loki. The free tier makes it easy to try out the service and has enough capability to satisfy simple use cases. In this article we will explore how metrics based on sFlow streaming telemetry can be pushed into Grafana Cloud.

The diagram shows the elements of the solution. Agents in host and network devices are configured to stream sFlow telemetry to an sFlow-RT real-time analytics engine instance. The Grafana Agent queries sFlow-RT's REST API for metrics and pushes them to Grafana Cloud.
docker run -p 8008:8008 -p 6343:6343/udp --name sflow-rt -d sflow/prometheus
Use Docker to run the pre-built sflow/prometheus image, which packages sFlow-RT with the sflow-rt/prometheus application. Configure sFlow agents to stream data to this instance.
Create a Grafana Cloud account. Click on the Agent button on the home page to get the configuration settings for the Grafana Agent.
Click on the Prometheus button to get the configuration to forward metrics from the Grafana Agent.
Enter a name and click on the Create API key button to generate configuration settings that include a URL, username, and password that will be used in the Grafana Agent configuration.
server:
Continue reading
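Putting the steps above together, the Grafana Agent configuration ends up looking roughly like the sketch below. The scrape target assumes sFlow-RT is running on the same host with the sflow-rt/prometheus application exposing metrics on port 8008; the `remote_write` URL, username, and API key are placeholders for the values generated in the Grafana Cloud UI, and depending on the Agent version the top-level key may be `prometheus:` rather than `metrics:`:

```yaml
# Sketch of a Grafana Agent config (agent.yaml); credentials are placeholders.
server:
  log_level: info

metrics:
  global:
    scrape_interval: 15s
  configs:
    - name: sflow
      scrape_configs:
        - job_name: sflow-rt
          # Metrics endpoint exposed by the sflow-rt/prometheus application
          metrics_path: /prometheus/metrics/ALL/ALL/txt
          static_configs:
            - targets: ['localhost:8008']
      remote_write:
        - url: https://prometheus-us-central1.grafana.net/api/prom/push  # from the Prometheus settings page
          basic_auth:
            username: "123456"        # placeholder: your Grafana Cloud instance ID
            password: "YOUR_API_KEY"  # placeholder: the generated API key
```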

Introducing SSL/TLS Recommender

Seven years ago, Cloudflare made HTTPS availability for any Internet property easy and free with Universal SSL. At the time, few websites — other than those that processed sensitive data like passwords and credit card information — were using HTTPS because of how difficult it was to set up.

However, as we all started using the Internet for more and more private purposes (communication with loved ones, financial transactions, shopping, healthcare, etc.) the need for encryption became apparent. Tools like Firesheep demonstrated how easily attackers could snoop on people using public Wi-Fi networks at coffee shops and airports. The Snowden revelations showed the ease with which governments could listen in on unencrypted communications at scale. We have seen attempts by browser vendors to increase HTTPS adoption such as the recent announcement by Chromium for loading websites on HTTPS by default. Encryption has become a vital part of the modern Internet, not just to keep your information safe, but to keep you safe.

When it was launched, Universal SSL doubled the number of sites on the Internet using HTTPS. We are building on that with SSL/TLS Recommender, a tool that guides you to stronger configurations for the backend connection Continue reading

Red Hat Ansible Automation Platform 2: Migration strategy considerations

Red Hat Ansible Automation Platform 2 introduces an updated architecture, new tools and an improved but familiar experience to automation teams. However, there are multiple considerations for your planning and strategy to migrate your current deployment to Ansible Automation Platform 2.

This document provides guidance to all of the stakeholders responsible for planning and executing an Ansible Automation Platform migration, along with factors to address in your migration strategy.

This document does not provide a one-size-fits-all approach for migration. Various factors unique to your organization will impact the effort required, stakeholders involved and delivery plan.

What to consider before migrating

We understand that many factors specific to your needs affect your migration assessment and planning. This section highlights critical factors to determine your migration readiness and what approach will best suit your organization.

Assess your current environment

There will be configurations unique to your environment, and it’s crucial to perform a thorough technical assessment. We recommend including the following:

  • Analyze your current Ansible Automation Platform installation, including current deployment patterns, integrations and any complexities relevant to the migration.

  • Determine changes needed in your environment to meet the Ansible Automation Platform 2 technical requirements.

  • Assess stakeholders’ readiness to plan and execute Continue reading

Dynamic Process Isolation: Research by Cloudflare and TU Graz

Last year, I wrote about the Cloudflare Workers security model, including how we fight Spectre attacks. In that post, I explained that there is no known complete defense against Spectre — regardless of whether you're using isolates, processes, containers, or virtual machines to isolate tenants. What we do have, though, is a huge number of tools to increase the cost of a Spectre attack, to the point where it becomes infeasible. Cloudflare Workers has been designed from the very beginning with protection against side channel attacks in mind, and because of this we have been able to incorporate many defenses that other platforms — such as virtual machines and web browsers — cannot. However, the performance and scalability requirements of edge compute make it infeasible to run every Worker in its own private process, so we cannot rely on the usual defenses provided by the operating system kernel and address space separation.

Given our different approach, we cannot simply rely on others to tell us if we are safe. We had to do our own research. To do this we partnered with researchers at Graz University of Technology (TU Graz) to study the impact of Spectre on our environment. The Continue reading

Handshake Encryption: Endgame (an ECH update)

Privacy and security are fundamental to Cloudflare, and we believe in and champion the use of cryptography to help provide these fundamentals for customers, end-users, and the Internet at large. In the past, we helped specify, implement, and ship TLS 1.3, the latest version of the transport security protocol underlying the web, to all of our users. TLS 1.3 vastly improved upon prior versions of the protocol with respect to security, privacy, and performance: simpler cryptographic algorithms, more handshake encryption, and fewer round trips are just a few of the many great features of this protocol.

TLS 1.3 was a tremendous improvement over TLS 1.2, but there is still room for improvement. Sensitive metadata relating to application or user intent is still visible in plaintext on the wire. In particular, all client parameters, including the name of the target server the client is connecting to, are visible in plaintext. For obvious reasons, this is problematic from a privacy perspective: Even if your application traffic to crypto.cloudflare.com is encrypted, the fact you’re visiting crypto.cloudflare.com can be quite revealing.

And so, in collaboration with other participants in the standardization community and members of Continue reading

Privacy Pass v3: the new privacy bits

In November 2017, we released our implementation of a privacy preserving protocol to let users prove that they are humans without enabling tracking. When you install Privacy Pass’s browser extension, you get tokens when you solve a Cloudflare CAPTCHA which can be used to avoid needing to solve one again... The redeemed token is cryptographically unlinkable to the token originally provided by the server. That is why Privacy Pass is privacy preserving.

In October 2019, Privacy Pass reached another milestone. We released Privacy Pass Extension v2.0, which includes a new service provider (hCaptcha) and a way to redeem tokens not only against CAPTCHAs on Cloudflare challenge pages but also against hCaptcha CAPTCHAs on any website. When you encounter an hCaptcha CAPTCHA on any website, including ones not behind Cloudflare, you can redeem a token to pass the CAPTCHA.

We believe Privacy Pass solves an important problem — balancing privacy and security for bot mitigation — but we think there’s more to be done in terms of both the codebase and the protocol. We improved the codebase by redesigning how the service providers interact with the core extension. At the same time, we made progress on the Continue reading

Very quietly, Oracle ships new Exadata servers

You have to hand it to Larry Ellison, he is persistent. Or maybe he just doesn’t know when to give up. Either way, Oracle has shipped the latest in its Exadata server appliances, making some pronounced boosts in performance. Exadata was the old Sun Microsystems hardware Oracle inherited when it bought Sun in 2010. It has since discontinued Sun’s SPARC processor but soldiered on with servers running x86-based processors, all of them Intel despite AMD’s surging acceptance in the enterprise. When Oracle bought Sun in 2010, it was made clear they had no interest in low-end, mass-market servers. In that regard, the Oracle Exadata X9M platforms deliver. The new Exadata X9M offerings, designed entirely around Oracle’s database software, include Oracle Exadata Database Machine X9M and Exadata Cloud@Customer X9M, which Oracle says is the only platform that runs Oracle Autonomous Database in customer data centers.

Edge computing: The architecture of the future

As technology extends deeper into every aspect of business, the tip of the spear is often some device at the outer edge of the network, whether a connected industrial controller, a soil moisture sensor, a smartphone, or a security cam. This ballooning internet of things is already collecting petabytes of data, some of it processed for analysis and some of it immediately actionable. So an architectural problem arises: You don’t want to connect all those devices and stream all that data directly to some centralized cloud or company data center. The latency and data transfer costs are too high. That’s where edge computing comes in. It provides the “intermediating infrastructure and critical services between core datacenters and intelligent endpoints,” as the research firm IDC puts it. In other words, edge computing provides a vital layer of compute and storage physically close to IoT endpoints, so that control devices can respond with low latency — and edge analytics processing can reduce the amount of data that needs to be transferred to the core.

Edge computing: 5 potential pitfalls

Edge computing is gaining steam as an enterprise IT strategy, with organizations looking to push storage and analytics closer to where data is gathered, as in IoT networks. But it has its challenges. Its potential upsides are undeniable, including improved latency as well as reduced WAN bandwidth and transmission costs. As a result, enterprises are embracing it. Revenues in the edge-computing market were $4.68 billion in 2020 and are expected to reach $61.14 billion by 2028, according to a May 2021 report by Grand View Research.
