RFC 9386, documenting IPv6 deployment status in late 2022, was published a few weeks ago. It claims over a billion IPv6-capable users and IPv6 deployment close to 50% in major countries.
Web content is a different story: while 40% of the top 500 sites are IPv6-enabled, you can reach only ~20% of web sites over IPv6. Considering that Cloudflare's free proxying includes IPv6 enabled by default, that proves (once again) how slowly things change in IT.
On this episode of the Hedge, Mike Dvorkin joins Russ White to talk about the cloud, tradeoffs, rethinking the cloud value proposition, and the road to becoming an architect. A key point—it is harder to fix hardware in production than it is to fix software in production.
On today's Heavy Networking we dive into the frameworks commonly used by service providers to tackle network slicing and traffic engineering challenges. We'll also talk about their pros and cons, and the approach Cisco is seeing its customers adopt as providers create virtual networking products for their customers. Cisco is our sponsor for today's show.
The post Heavy Networking 678: How Cisco Accelerates The IP/Optical Automation Journey (Sponsored) appeared first on Packet Pushers.
Part one of this blog post series explored Day Zero Ops, which covers all the planning for how you want your environment to look. Before you can implement, you must plan. Oftentimes, engineers try implementing without proper planning, which results in tech debt later on. Once you’ve set up a robust plan, it’s time to […]
The post Build Your K8s Environment For The Real World Part 2 – Day One Ops appeared first on Packet Pushers.
The other day I realized that I had become the "old man" at Tech Field Day. Not so much that I'm ready for AARP, but more that I've been there longer than anyone else but Stephen. The realization was a long time coming, but what pushed me to understand it was when someone asked a question about a policy we had and I not only knew the reason why we did it but also remembered a time before we had it.
As I spent time thinking about how I've graduated from being the new guy to the old mentor, I thought about the inflection point when the changeover happened.
The first part of the demarcation between mentor and mentee, in my eyes, is where the knowledge lies. When you're first starting out, you're the one who needs to understand things. You ask lots and lots of questions and try to understand how things are done and why you do them that way. That focus on knowledge acquisition is one marker of someone in need of mentorship.
For those trying to mentor these eager employees, don't make the mistake of getting frustrated at Continue reading
The tech world is awash with generative AI, which, for a company like Nvidia, is a good thing. …
There’s Still A Long Way To Go With Generative AI was written by Jeffrey Burt at The Next Platform.
When 400GbE was still an emerging technology, Mark Nowell explained its basics in an update session of the Data Center Fabric Architectures webinar, starting with 400GbE optics.
Cloud computing revolutionized how a business can establish its digital presence. Nowadays, by leveraging cloud features such as scalability, elasticity, and convenience, businesses can deploy, grow, or test an environment in every corner of the world without worrying about building the required infrastructure.
Unlike the traditional model, in which customers had to notify the service provider to set up resources in advance, in the on-demand model cloud providers expose application programming interfaces (APIs) that customers use to deploy resources themselves. This allows customers to access a virtually unlimited pool of resources on demand and pay only for what they use, without worrying about infrastructure setup and deployment complexities.
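As a minimal, purely illustrative sketch of what such an on-demand API call can look like (assuming AWS and the boto3 SDK, which the post does not specify; the image ID is a placeholder):

```python
# Minimal sketch: launching a compute resource on demand through a cloud API.
# AWS/boto3 and the AMI ID are assumptions for illustration only.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)

print("Launched instance:", response["Instances"][0]["InstanceId"])
```

The same pattern applies to any cloud provider: a single authenticated API call replaces the ticket-and-wait provisioning workflow of the traditional model.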
One such resource, a load balancer service, is typically used to expose an endpoint to your clients. Since a cloud provider's bandwidth might be far higher than what your cluster can handle, a huge traffic spike or unplanned growth might overwhelm your cluster and render your services unresponsive.
To solve this issue, you can use proactive monitoring and metrics to find usage patterns and gain insight into your system's overall health and performance.
In this hands-on tutorial, I will Continue reading
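As a rough illustration of the proactive-monitoring idea described above (assuming a Prometheus server scraping per-service request rates; the server URL, metric name, and threshold are assumptions, not taken from the post), you could poll for traffic spikes like this:

```python
# Rough sketch: poll Prometheus for the request rate hitting a load-balanced service
# and flag spikes before they overwhelm the cluster.
# The Prometheus URL, metric name, and threshold below are illustrative assumptions.
import requests

PROMETHEUS_URL = "http://prometheus.example.local:9090/api/v1/query"
QUERY = 'sum(rate(http_requests_total{service="frontend"}[5m]))'
SPIKE_THRESHOLD = 500.0  # requests per second; tune to your cluster's capacity

resp = requests.get(PROMETHEUS_URL, params={"query": QUERY}, timeout=10)
resp.raise_for_status()

for series in resp.json()["data"]["result"]:
    rate = float(series["value"][1])  # instant vector value is [timestamp, value]
    if rate > SPIKE_THRESHOLD:
        print(f"Traffic spike: {rate:.0f} req/s -- consider scaling or rate limiting")
    else:
        print(f"Request rate normal: {rate:.0f} req/s")
```

Wiring a check like this into an alerting rule is what turns the metrics into the "proactive" part: you learn about the spike before your services become unresponsive.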
Google is a big company with thousands of researchers and tens of thousands of software engineers, who all hold their own opinions about what AI means to the future of business and the future of their own jobs and ours. …
Missing The Moat With AI was written by Timothy Prickett Morgan at The Next Platform.
The following sponsored blog post was written by Palo Alto Networks. We thank Palo Alto Networks for being a sponsor. ChatGPT is the fastest-growing consumer application in history, with 100 million monthly active users just two months after launch. While these AI apps can significantly boost productivity and creative output, they also pose a serious […]
The post ChatGPT and AI-based Tools Require Strict Scrutiny appeared first on Packet Pushers.