It was this fear that led governments (most notably the US) to float the idea of criminalizing the use of encryption software, or of requiring that the government hold a key in escrow (as with the Clipper chip).
A few years ago the UK passed a law ("RIPA section 49") requiring suspects to hand over encryption keys when requested or face fines and up to two years in jail. They have since charged suspects under it.
A great piece on the controversy over whether or not encryption is harmful is also available here.
Cryptography is a tool and can be used for good or for ill. Personally, I don't believe in a system where the government holds keys in escrow without unprecedented transparency around who is accessing keys (and why!), and I don't believe such a system would ever be workable. Make cryptography illegal? Well, the 'bad guys' are already breaking the law, so only law-abiding citizens would be disadvantaged.
Oh, and I'm more than happy for criminals to remain a lazy, overconfident and superstitious cowardly lot!
While doing some recent reading on Digital Forensics I came across a particularly interesting older case where a Russian hacker was caught by the FBI and charged with computer intrusion and fraud. While this doesn't sound like anything too out of the ordinary, what caught my attention was some of the details.
The FBI alleged that Ivanov and other international hackers gained unauthorized access to computers at CTS Network Services (an ISP) and used them to attack other e-commerce companies, including two credit card processors, stealing customer financial information and using it in the usual fraud schemes. Nothing too out of the ordinary so far.
Once the FBI had identified their culprit, in order to make the arrest they lured him and an accomplice to the US on the pretext of offering them jobs as IT security consultants. When the pair arrived, the FBI had them remotely connect to their machines back in Russia as a demonstration of their skills for their prospective new employer. But not all was as it seemed: the FBI were keylogging the machines the Russians used in the US, and used the captured credentials to connect to the Russian computers and extract the evidence they needed (without a search warrant) to prosecute Ivanov and his accomplice.
Do the ends justify the means? The Russian Federal Security Service, or FSB, didn't think so, and started criminal proceedings against the FBI agents for unauthorized access to computer information. Meanwhile, back in the States, the agents involved were awarded the Director's Award for Excellence, as the case was the first in the Bureau's history to “utilize the technique of extra-territorial seizure.”
The assistant US District Attorney commented that he "wouldn't call it hacking" when discussing the agents' actions, and a federal judge agreed, rejecting motions that sought to suppress the evidence obtained from the computers. Ivanov was eventually sentenced to three years in prison.
Do, in this case, the ends justify the means? Or is it simply the beginning of a slippery slope allowing state-sanctioned hacking in the name of justice?
This case is an older one and was 'pre-9/11', so I wonder what effect the PATRIOT Act has had in the intervening years...
Recently Nick gave a great presentation at the AISA Risk Management Special Interest Group (RMSIG) in Sydney.
Some of the points that came out of his presentation** that I found rather interesting follow:
- Most InfoSec-related cases are brought under the tort of negligence
- Damages cannot be recovered under negligence for pure economic loss
- No cases have yet been tried in Australia under the tort of negligence for InfoSec breaches ~ although cases have been settled before going to court
- The highest privacy breach payout in Australia is around $8000 ~ leaving privacy breaches more damaging to reputation than financially (barring lost revenue from reputational damage of course!)
- The Trade Practices Act Section 52 is the key area for Australian InfoSec professionals to pay attention to when verifying legal liability ~ it has fewer hurdles than proving negligence and can be 'creatively' applied by the courts.
- The ALRC has recommended a new tort of "serious invasion of privacy" and recommended compulsory disclosure laws in Australia.
The Rule specifies that what is “reasonable” will depend on the size and complexity of the business, the nature and scope of its activities, and the sensitivity of the information at issue. This standard recognizes that there cannot be “perfect” security, and that data breaches can occur despite the maintenance of reasonable precautions to prevent them. The formal acknowledgement from someone outside of IT that "perfect" security cannot exist is interesting to see.
Nick gave a great talk, and I do recommend his book.
**Any errors or omissions of information in this post are my fault and not Nick's. I am no lawyer! So go seek your legal advice from someone who is!
I'm currently studying Digital Forensics, and a recent bit of Google-inspired research led me to one of the big stories of late last year (which I vaguely remembered), where a Microsoft forensic tool designed for use by law enforcement, called COFEE (Computer Online Forensic Evidence Extractor), was leaked on the internet.
Given the prevalence of computer-based crime and the level of skill required to perform proper forensic analysis, it makes sense for Microsoft (or someone else) to develop a simple-to-use wrapper for what was apparently a number of common forensic tools available elsewhere on the internet.
The reaction to the leak seems to have been mixed, ranging from Microsoft claiming it wasn't bothered by the release of the software (although noting it is licensed for use by law enforcement only) to someone developing a counter-forensic tool called (of course...) DECAF. What was the thinking behind creating this counter to COFEE? One of the developers said:
"We saw Microsoft released COFEE and that it got leaked, and we checked it out," the man said. "And just like any kid's first day at the fair, when you walk up to that cotton-candy machine and it smells so good and you see it, it's all fluffy – just so good. You get up there and you grab it and you bite into it, it's nothing in your mouth.
"That's the same thing we did with COFEE. So, knowing that and knowing that forensics is a pretty important factor, and that a lot of other pretty good forensic tools are getting overlooked, we decided to put a stop to COFEE."
This argument seems fairly disingenuous, as COFEE hardly seems to have been aimed at replacing any existing tools, but simply at making them easier for a less-well-trained law enforcement operator to use in order to gather crucial forensic evidence. The fact the tool was released by Microsoft probably had more to do with the creation of a counter-tool than noble thoughts of 'better tools being overlooked'.
No matter what the task, there is almost always a 'better tool', whose use might not be desirable because of cost, complexity or the expert knowledge required to operate it. Much of the history of software innovation has been about making complex tasks easier so more people can perform them, Windows being the prime example as it took desktop computers from the realm of geeky hobbyists to mainstream use in businesses and homes. While simplifying (or as some may call it, 'dumbing down') tasks may grate on the nerves of some, it is an inevitable and, in many ways, desirable end goal.
I'll post a review once I have a chance to have a good read.
Legal ≠ Secure. When a Legal department is asked for input, they are purely concerned with determining whether whatever is being presented to them contravenes the law. Most of the time the law will state something along the lines of "due care must be taken not to disclose data" rather than "you must use a minimum of 128-bit encryption to encrypt the data and the transmission".
What is due care? Well, that's up to the judge to decide after the lawsuit has begun. Lawyers aren't normally Information Security professionals (well, none I know!) and in fact often suffer from the same mindset as most non-IT professionals in that they tend to lump all things IT into the same basket*. As far as they're concerned, if someone in IT said we've done our best to secure something, they'll assume we've done due diligence and sign off, not really making the distinction of whether the 'IT guy' (or gal!) is technical or non-technical, a programmer, sysadmin or IT security expert. It may only be later, during the court case, when a prosecuting expert testifies that using DES to encrypt those passwords wasn't a good idea.**
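To make the DES-for-passwords point concrete, here's a minimal sketch (my own illustration, not anything from the case law above) contrasting a fast, unsalted hash with a salted, deliberately slow key-derivation function. It uses only Python's standard library; the iteration count of 600,000 is an assumed ballpark figure, not a legal standard.

```python
import hashlib
import hmac
import os

def weak_hash(password: str) -> str:
    """Weak: a fast, unsalted hash. Identical passwords produce identical
    digests, and commodity hardware can try billions of guesses per second."""
    return hashlib.md5(password.encode()).hexdigest()

def strong_hash(password: str) -> tuple[bytes, bytes]:
    """Better: PBKDF2 with a random salt. The salt defeats precomputed
    lookup tables; the high iteration count slows brute-force attempts."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the key and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, digest)
```

The difference is exactly the kind of thing a lawyer signing off on "due care" wouldn't spot, but a prosecution expert certainly would.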
When an IT Security Professional is asked for input, they generally have a pretty good grasp of the legal requirements (well, the good ones will!) and can always see Legal for clarification. They are the ones who can ensure, from a technology standpoint, that the company is obeying both the letter and the spirit of the law.
You wouldn't ask an IT Professional to organize your legal defence, so don't ask a lawyer to vet the security of your applications. While lawyers have their part to play in ensuring that the law is upheld, Legal ≠ Secure.
*In fairness to lawyers, I probably lump them all into the same basket too, not really paying attention to the difference between a patent lawyer and an ambulance chaser.
**If you're a lawyer reading this and don't understand this comment, go ask a friendly IT Security Professional!