A storm is brewing between Kim Cameron and Ben Adida. I do not mean to step into the middle of their dispute over the Google Wifi surveillance fiasco. Both have good arguments to make.
I did notice, however, that Cameron makes a significant error in his blog post today, “Misuse of network identifiers was done on purpose”.
SSID and MAC addresses are the identifiers of your devices. They are transmitted as part of the WiFi traffic just like the payload data is. And they are not “publically broadcast” any more than the payload data is.
Actually, in my view the SSID and MAC are broadcast more publicly than the payload, since they are part of the handshake required both to negotiate a connection and to avoid collisions. It is like the front door of a house that has a unique number on it for everyone to see, so anyone can tell whether it is the right door or not.
There is an argument to be made that doors should be able to be hidden, but the implementation of WiFi thus far does not offer any real way to hide them. So I say the identifiers are more public than the data, which lives behind the door.
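To make this concrete, consider what an 802.11 beacon frame actually carries. The following is a minimal sketch in Python, assuming a bare management frame with no radiotap header; the network name "home-net" and the MAC are made up for illustration. The point is that both sit in the frame in the clear, readable by anyone in radio range with no decryption at all.

```python
def parse_beacon(frame: bytes):
    """Extract transmitter MAC and SSID from a raw 802.11 beacon frame.

    Minimal layout assumed (no radiotap header):
      bytes 0-1   frame control, bytes 2-3 duration
      bytes 4-9   addr1 (destination, broadcast)
      bytes 10-15 addr2 (transmitter MAC, sent in the clear)
      bytes 16-21 addr3 (BSSID)
      bytes 22-23 sequence control
      bytes 24-35 fixed params (timestamp, interval, capability)
      bytes 36+   tagged params; tag 0 is the SSID, also in the clear
    """
    mac = ":".join(f"{b:02x}" for b in frame[10:16])
    tags = frame[36:]
    ssid = None
    i = 0
    while i + 2 <= len(tags):
        tag_id, tag_len = tags[i], tags[i + 1]
        if tag_id == 0:  # SSID information element
            ssid = tags[i + 2 : i + 2 + tag_len].decode(errors="replace")
            break
        i += 2 + tag_len
    return mac, ssid

# Build a minimal beacon frame for the hypothetical network "home-net".
src = bytes.fromhex("001122334455")
frame = (
    bytes.fromhex("8000") + b"\x00\x00"   # frame control (beacon), duration
    + b"\xff" * 6 + src + src             # addr1 (broadcast), addr2, addr3
    + b"\x00\x00"                         # sequence control
    + b"\x00" * 12                        # timestamp, interval, capability
    + bytes([0, 8]) + b"home-net"         # tagged param: SSID element
)
print(parse_beacon(frame))  # -> ('00:11:22:33:44:55', 'home-net')
```

No association, no key, no payload access: the identifiers fall out of the frame header before any "door" is opened.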
The identifiers are persistent and last for the lifetime of the devices. Their collection, cataloging and use is, in my view, more dangerous than the payload data that was collected.
That simply is not accurate.
The MAC and SSID can be easily changed. I just changed them on my network. I just changed them again. Did you notice? No, of course not. Anyone can change their MAC to whatever they want, and the SSID is even easier to change. I do not recommend fiddling with the SSID if you have a good one, since it can be seen anyway and does not offer much information to an attacker; my only recommendation is to think about what you are advertising with it. Changing the MAC is nice because you can hide the true identity of your device. Attackers would think it is a Cisco, for example, when it really is a D-Link or SMC.
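As a sketch of how little is fixed about a MAC address, here is a short Python routine that generates a random locally administered unicast address, the kind commonly used for spoofing. The Linux commands in the trailing comment, and the interface name wlan0, are assumptions about how you might apply it on a typical system.

```python
import random

def random_mac() -> str:
    """Generate a random locally administered unicast MAC.

    Bit 0x02 of the first octet marks the address as locally
    administered (not a vendor-assigned OUI); bit 0x01 must be
    clear so the address is unicast, not multicast.
    """
    first = (random.randint(0, 255) | 0x02) & 0xFE
    rest = [random.randint(0, 255) for _ in range(5)]
    return ":".join(f"{b:02x}" for b in [first] + rest)

print(random_mac())
# On Linux you could then apply it roughly like this (assumed
# interface name; requires root):
#   ip link set dev wlan0 down
#   ip link set dev wlan0 address <new-mac>
#   ip link set dev wlan0 up
```

Because the first three octets of a vendor-assigned MAC identify the manufacturer, a randomized address also strips away the Cisco-versus-D-Link clue mentioned above.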
I am no fan of Google’s response on this issue. A defense centered around “our engineers did not know what they were doing” or “we didn’t realize that Wifi scanning would collect all this data” is hard to believe. Who in the world writes surveillance software without factoring in the risk to privacy and security? That seems rather obvious, which might be why Congress is getting involved.
“As we have said before, this was a mistake. Google did nothing illegal and we look forward to answering questions from these congressional leaders,” a Google spokesperson said in an email.
A mistake? The first thing anyone should do when turning on WiFi capture tools is verify they will not be in violation of ethics and law. Their defense reads as though they still do not think the surveillance violated any law. This could have a major impact on the security industry, and it raises the question of what constitutes ethical data capture.
Google really needs to explain themselves in terms of the basics. They are being asked “why did you just drive to all the homes in town and take photos through the windows?” If they answer “we had no idea we actually were taking pictures of anything” you are right to be suspicious. That’s what cameras do…
Over the years Google has told me their organization is flat and all the engineers are just expected to do the right thing. They throw their hands up at the idea of someone telling anyone else what is or is not allowed. They balk at the idea of anything organized or top down. They are open to passive education and seminars but nothing strict like what is needed for compliance — thou shalt not wiretap without authorization, thou shalt disable SSLv2. Compliance means agreeing with someone else on what Google can and can not do; that is a hard pill to swallow for an engineering culture that prides itself on being better than everyone else.
Although it is easy to see why they are trying to emulate an academic model they grew out of (all schools tend to struggle with the issue of how to secure an open learning environment) they are clearly not a learning institution. Aside from modeling the “academic peer review” process into a search algorithm, they have completely left the innocence of a real campus behind. Their motives are not anywhere close to the same.
Collecting personal data for academic research in a not-for-profit organization could make sense somewhere to someone as a legal activity. Taking profits from collecting personal data to fund surveillance software that runs around the world to collect personal data…uh, hello?
A naive political structure for security management and a lax attitude toward consumer privacy are what will really come into focus now at Google. They need a security executive: someone to explain and translate right and wrong (compliance) from the outside world into Google-ese, so their masses of engineers can make informed decisions about how to change or come to understand the laws they run up against, instead of violating them first and only then trying to figure out whether they should care.
New regulations and lawsuits may help. I am especially curious to see if this will alter surveillance laws. Google of course may plan just to dismiss the results as more outsider/non-Google concepts. The big banks are known to have a capital offset fund to pay fines rather than comply. Large utilities are known to have a “designated felon” who gets to go to jail on their behalf rather than comply. It is definitely worth noting that executives at BP said the $87 million OSHA fine for safety violations last year did nothing for them. It did not register as enough pain to change their culture of risk and save employee lives — it certainly did not prevent the Gulf spill after the loss of a $500 million rig. The spill itself is said to so far have cost BP $930 million and cost the government $87 million. The damage to the economy and environment is yet to be determined. Will BP change?
Although Cameron is wrong on technical points, he is right in principle. The Google Wifi surveillance issue has exposed what appears to be a systemic failure by Google management to instill privacy and security due diligence in their engineering practices.