According to the company’s market research, just about every demographic wants more data privacy: young, old, male, female, urban, rural. Public polling backs that up, though the results vary based on how the question is asked. One recent survey found that “93 percent of Americans would switch to a company that prioritizes data privacy if given the option.”
In a blog post on March 3, Google announced that it would be removing third-party cookies from its Chrome browser, a decision that, given Chrome’s market share, would effectively end their use across much of the web. Google also pledged to avoid building any other technology for tracking individuals as they browse the web.
A long-standing, generally accepted norm in the computing field distinguishes between software interfaces and implementations: Programmers should have to write their own implementing code, but they should be free to reimplement other developers’ program interfaces.
The traditional approach to statistical disclosure control (SDC) for privacy protection is utility-first. Since the 1970s, national statistical institutes have been using anonymization methods with heuristic parameter choice and suitable utility preservation properties to protect data before release.
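As a deliberately simplified illustration of that utility-first tradition, here is a minimal Python sketch of one classic heuristic, k-anonymity via generalization: quasi-identifiers are coarsened (ages bucketed, ZIP codes truncated) until every record is indistinguishable from at least k−1 others. The records, band width, and truncation depth are invented for the example, not taken from any statistical institute’s actual practice.

```python
from collections import Counter

def generalize(record, age_band=10, zip_digits=3):
    """Coarsen quasi-identifiers: bucket the age, truncate the ZIP code."""
    age, zipcode = record
    low = (age // age_band) * age_band
    banded = f"{low}-{low + age_band - 1}"
    masked = zipcode[:zip_digits] + "*" * (len(zipcode) - zip_digits)
    return (banded, masked)

def is_k_anonymous(records, k):
    """True if every generalized quasi-identifier combination occurs >= k times."""
    counts = Counter(generalize(r) for r in records)
    return all(c >= k for c in counts.values())

records = [(34, "60615"), (36, "60614"), (38, "60612"), (71, "60601")]
print(is_k_anonymous(records, 2))
```

Here the lone 70-something record forms an equivalence class of one, so the release fails 2-anonymity; a real workflow would generalize further (wider age bands, shorter ZIP prefixes) until the utility/privacy trade-off is acceptable.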
Shared libraries encourage code reuse, promote consistency across teams, and ultimately improve product velocity and quality. But application developers are still left to choose the right libraries, figure out how to correctly configure them, and wire everything together.
When October 5 came, no vulnerability advisory had been published and I still had not received a CVSS score or CVE identifier for the issue, so I reached out to their PSIRT again. This time they replied that the release had been postponed until October 14 due to a delay in QA.
Organizations relying on traditional signature-based tools to detect security threats would likely have missed roughly three-quarters of malware samples that hit their networks and systems last quarter, a new analysis shows.
Automation is surely one of the best things to come to the networking world. The ability to consistently apply a set of changes across a wide array of network devices has increased the speed at which network engineers can respond to customer requests, increased the security of the network, and reduced the number of hours required to build and maintain large-scale systems. There are downsides to automation as well, particularly when operators begin to rely on automation to solve problems that really should be solved someplace else.
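The consistency win is easy to see in miniature. The sketch below renders one change from a template and applies the identical result to every device in an inventory; the device names are hypothetical, and the stubbed-out push step stands in for SSH, NETCONF, or a vendor API in a real deployment.

```python
# One change, rendered once, applied identically everywhere.
TEMPLATE = "ntp server {server}\nlogging host {syslog}"

INVENTORY = ["edge-rtr-1", "edge-rtr-2", "core-sw-1"]  # hypothetical devices

def render(template, **params):
    """Fill in the change template with site-specific parameters."""
    return template.format(**params)

def push(device, config):
    """Placeholder for the actual transport (SSH/NETCONF/API); records the action."""
    return f"{device}: applied {len(config.splitlines())} lines"

change = render(TEMPLATE, server="192.0.2.10", syslog="192.0.2.20")
results = [push(d, change) for d in INVENTORY]
print(results)
```

Because every device receives the same rendered change, there is no per-device drift to audit later; the flip side, as the episode discusses, is that a bad template is now applied everywhere just as consistently.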
In this episode of the Hedge, Andrew Wertkin from Bluecat Networks joins Tom Ammon and Russ White to discuss the naïve reliance on automation.
The reorganization was announced by Intel’s relatively new chief executive officer, who was not only Intel’s first chief technology officer (in 2001) but also the first general manager (in 2005) of its Digital Enterprise Group, the company’s first implementation of the Data Center Group.
The 2020 calendar year will long be remembered as an annus horribilis for most, except for a handful of technology companies that reaped the rewards of a global shift to remote work with successful initial public offerings (IPOs).
In 2021, the high-end TV landscape is just as confusing to new buyers as ever. There’s a bunch of new televisions to consider, a raft of technical-sounding features — 8K, HDR, Ultra HD 4K, 120Hz and HDMI 2.1 — and a stable of familiar brand names competing for your dollar.
I’m teaching a webinar on router internals through Pearson (Safari Books Online) on the 23rd of July. From the abstract—
A network device—such as a router, switch, or firewall—is often seen as a single “thing,” an abstract appliance that is purchased, deployed, managed, and removed from service as a single unit. While network devices do connect to other devices, receiving and forwarding packets and participating in a unified control plane, they are not seen as a “system” in themselves.
What is the first thing almost every training course in routing protocols begins with? Building adjacencies. What is considered the “deep stuff” in routing protocols? Knowing packet formats and processes down to the bit level. What is considered the place where the rubber meets the road? How to configure the protocol.
I’m not trying to cast aspersions on widely available training, but I sense we have this all wrong—and it’s a sense I’ve had ever since my first book was released in 1999. It’s always hard for me to put my finger on why I consider this way of thinking about network engineering less than optimal, or why we approach training this way.
This, however, is one thing I think is going on here—
The typical program aims to counter the inherent complexity of the decision by providing in-depth information. By providing such extremely detailed and complex information, these interventions try to enable people to make perfect decisions.
We believe that by knowing ever-deeper reaches of detail about a protocol, we are not only more educated engineers, but we will be able to make better decisions in the design and troubleshooting spaces.
To some degree, we think we are managing the …
If you are new to the security world, it is fair to ask yourself, “Isn’t access to data and systems always conditional? Isn’t it always granted to someone who has access to the credentials (ID and password)?”
Could artificial intelligence be better at designing chips than human experts? A group of researchers from Google’s Brain Team attempted to answer this question and came back with interesting findings.
The CIS Controls are a prioritized set of Safeguards to mitigate the most prevalent cyber-attacks against systems and networks. They are mapped to and referenced by multiple legal, regulatory, and policy frameworks.
In this article, we look at the key differences between the most popular cloud technology delivery models: Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), and Infrastructure-as-a-Service (IaaS).
Bluecat, in cooperation with an outside research consultant, just finished a survey and study on the lack of communication, and the divisions, between cloud and networking teams in deployments that support business operations. Dana Iskoldski joins Tom Ammon and Russ White to discuss the findings of the study and to make some suggestions about how we can improve communication between the two teams.
Please find a copy of the study at http://bluecatnetworks.com/hedge.
Electric vehicles are expected to account for 58% of global passenger vehicle sales by 2040. The software and electrical components markets are also likely to face increased pressure and new challenges as they develop secure designs and equipment for these futuristic vehicles.
In a world that is constantly evaluating costs, it is little wonder that there is an increasing demand for cost-effective solutions to business problems. In the real world, this means ‘free,’ and in the digital marketplace, it means ‘open source.’
I often feel like I’m “behind” on what I need to get done. Being a bit metacognitive, however, I often find this feeling is more related to not organizing things well, which means I often feel like I have so much to do “right now” that I just don’t know what to do next—hence “processor thrashing on process scheduler.” Todd Palino joins this episode of the Hedge to talk about the “Getting Things Done” technique (or system) of, well … getting things done.
It’s not unusual in the life of a network engineer to go entire weeks, perhaps even months, without “getting anything done.” This might seem odd for those who do not work in and around the odd combination of layer 1, layer 3, layer 7, and layer 9 problems network engineers must span and understand, but it’s normal for those in the field. For instance, a simple request to support a new application might require the implementation of some feature, which in turn requires upgrading several thousand devices, leading to the discovery that some number of these devices simply do not support the new software version, requiring a purchase order and change management plan to be put in place to replace those devices, which results in … The chain of dominoes, once it begins, never seems to end.
Or, as those who have dealt with these problems many times might say, it is more complicated than you think. This is such a useful phrase, in fact, it has been codified as a standard rule of networking in RFC1925 (rule 8, to be precise).
Take, for instance, the problem of sending documents through electronic mail—in the real world, there are various …
Ten years ago this spring, two million Tunisians came together on Facebook to transform their pent-up fury at their country’s authoritarian government into a four-week revolution that toppled the longtime dictatorship of Zine El Abidine Ben Ali.
As it turns out, the internet is not that exceptional after all. It may be one of the greatest inventions since the printing press, as the cliché goes, and it has unquestionably revolutionized communication and commerce. But as David Pierce observed in Protocol shortly after the January 6 Capitol riot, “Everything is IRL” now. “[T]he barriers between online and offline life have disappeared completely.”
The one-two punch of the recent and upcoming executive orders on supply chains and cybersecurity may well be this jump-start and set the foundation for a significant and much-needed shift in US grand strategy.
Domain name abuse is one of the most dangerous and under-regulated issues in digital business security today. Many of the largest companies in the world still lack basic domain security protocols, making them prime targets for bad actors.
The Internet Engineering Task Force (IETF) is a collaborative body that has developed internetworking specifications for more than five decades, successfully shaping the global marketplace of digital network equipment and services.
The network monitoring world is rife with formats for packets being measured—every tool has its own format. What would make things a lot better for network engineers is a standard data representation for packet analysis, no matter what format packets are captured in. Jordan Holland joins Russ White and Tom Ammon on this episode of the Hedge to discuss the problem and nprint, a standard packet analysis format and tools for converting from other formats.
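The core idea behind a standard representation like nprint is to map every packet into a fixed-width vector of header bits, with a sentinel for bits that are absent, so any downstream tool can consume the same features regardless of how the packets were captured. A minimal sketch of that idea follows; the 24-bit width and the −1 sentinel are arbitrary choices for illustration, not nprint’s actual layout.

```python
def header_bits(raw: bytes, width_bits: int):
    """Expand header bytes into a fixed-width bit vector, padding missing
    positions with -1 so every packet yields the same number of features."""
    bits = [(byte >> (7 - i)) & 1 for byte in raw for i in range(8)]
    bits = bits[:width_bits]
    return bits + [-1] * (width_bits - len(bits))

# A 2-byte "header" mapped into a 24-bit slot: 16 real bits, 8 marked absent.
vec = header_bits(b"\x45\x00", 24)
print(vec[:4], vec[-4:])
```

The payoff is that packets with different header lengths, or captured by different tools, all land in the same feature space, which is exactly what generic analysis pipelines need.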
Ultimately, the incident stemmed from a ransomware infection in which a well-known threat actor exfiltrated volumes of corporate data in just two hours, then issued demands that included a threat to lock down and encrypt the company’s network.
Last week, someone began posting classified notices on LinkedIn for different design consulting jobs at Geosyntec Consultants, an environmental engineering firm based in the Washington, D.C. area. Those who responded were told their application for employment was being reviewed and that they should email Troy Gwin — Geosyntec’s senior recruiter — immediately to arrange a screening interview.
Seagate has been working on dual-actuator hard drives—drives with two independently controlled sets of read/write heads—for several years. Its first production dual-actuator drive, the Mach.2, is now “available to select customers,” meaning that enterprises can buy it directly from Seagate, but end-users are out of luck for now.
TCP and QUIC are the two primary transport protocols in use on the Internet today—QUIC carries a large part of the HTTP traffic that makes the web work, while TCP carries most everything else that expects reliability. Why can’t we apply the lessons from QUIC to TCP so we can merge these two protocols, unifying Internet transport? TCPLS is just such an attempt at merging the most widely used reliable transport protocols.
Fear sells. Fear of missing out, fear of being an imposter, fear of crime, fear of injury, fear of sickness … we can all think of times when people we know (or worse, people in the throes of the madness of crowds) have made really bad decisions because they were afraid of something. Bruce Schneier has documented this a number of times. For instance: “it’s smart politics to exaggerate terrorist threats” and “fear makes people deferential, docile, and distrustful, and both politicians and marketers have learned to take advantage of this.” Here is a paper comparing the risk of death in a bathtub to death because of a terrorist attack—bathtubs win.
But while fear sells, the desire to appear unafraid also sells—and it conditions people’s behavior much more than we might think. For instance, we often say of surveillance “if you have done nothing wrong, you have nothing to hide”—a bit of meaningless bravado. What does this latter attitude—“I don’t have anything to worry about”—cause in terms of security?
Several attempts at researching this phenomenon have come to the same conclusion: average users will often intentionally not use things they see someone they perceive as paranoid using.
UMN computer science researchers, curious whether bad code could be slipped into the Linux kernel, abused the submission process, faking supportive reviews, to insert code deep in the operating system’s codebase.
Security researchers Thursday disclosed a new critical vulnerability affecting Domain Name System (DNS) resolvers that could be exploited by adversaries to carry out reflection-based denial-of-service attacks against authoritative nameservers.
We are launching a new Privacy Breakdown of Mobile Phones “playlist” on Surveillance Self-Defense, EFF’s online guide to defending yourself and your friends from surveillance by using secure technology and developing careful practices.
Email campaign builders (marketers) are flying blind. ESPs are genuinely timely about rolling out new products for their marketers, but there is a colossal gap in bringing data science and MLOps into the email campaign-building workflow.
The NSFNET followed the CSNET, connecting the campuses of several colleges and supercomputing systems with a 56K core in 1986. The NSFNET was the first large-scale implementation of Internet technologies in a complex environment of many independently operated networks, and forced the Internet community to iron out technical issues arising from the rapidly increasing number of computers and address many practical details of operations, management, and conformance. The NSFNET eventually became the “seed” of the commercialized core of the Internet, playing an outsized role in the current design of routing, transport, and other Internet technologies.
In this episode of the History of Networking, Dennis Jennings joins Donald Sharp and Russ White to discuss the origins and operation of the NSFNET.
You can find out more about Dennis and the NSFNET in the following links.
It’s easy to assume automation can solve anything and that it’s cheap to deploy—that there are a lot of upsides to automation, and no downsides. In this episode of the Hedge, Terry Slattery joins Tom Ammon and Russ White to discuss something we don’t often talk about, the Return on Investment (ROI) of automation.
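The ROI question can at least be framed with back-of-the-envelope arithmetic: hours saved per change, times changes per year, weighed against the cost of building and maintaining the automation. All of the numbers below are made up for illustration.

```python
def automation_roi(build_cost, yearly_maintenance, hours_saved_per_change,
                   changes_per_year, hourly_rate, years):
    """Return fractional ROI over a horizon; > 0 means the project pays for itself."""
    savings = hours_saved_per_change * changes_per_year * hourly_rate * years
    cost = build_cost + yearly_maintenance * years
    return (savings - cost) / cost

# Hypothetical numbers: $50k to build, $10k/yr to maintain, 2 engineer-hours
# saved per change, 300 changes/yr, $75/hr loaded rate, 3-year horizon.
roi = automation_roi(build_cost=50_000, yearly_maintenance=10_000,
                     hours_saved_per_change=2, changes_per_year=300,
                     hourly_rate=75, years=3)
print(f"{roi:.2f}")
```

Even this toy model makes the episode’s point: the maintenance term never goes away, and if the automation is brittle enough that changes-per-year or hours-saved drop, the ROI can quietly go negative.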