
Why bug bounty schemes have not led to secure software


Computer Weekly speaks to Katie Moussouris, security entrepreneur and bug bounty pioneer, about the life of security researchers, bug bounties and the artificial intelligence revolution
Governments should make software companies liable for developing insecure computer code. So says Katie Moussouris, the white hat hacker and security expert who first persuaded Microsoft and the Pentagon to offer financial rewards to security researchers who found and reported serious security vulnerabilities.
Bug bounty schemes have since proliferated and have now become the norm for software companies, with some, such as Apple, offering awards of $2m or more to those who find critical security vulnerabilities.
Moussouris likens security vulnerability research to working for Uber, only with lower pay and less job security. The catch is that people only get paid if they are the first to find and report a vulnerability. Those who put in the work but get results second or third get nothing.
“Intrinsically, it is exploitative of the labour market. You are asking them to do speculative labour, and you are getting something quite valuable out of them,” she says.
Some white hat hackers, motivated by helping people fix security problems, have managed to make a living by specialising in finding medium-risk vulnerabilities that may not pay as well as the high-risk bugs, but are easier to find.
But most security researchers struggle to make a living as bug bounty hunters.
“Very few researchers are capable of finding those elite-level vulnerabilities, and very few of the ones that are capable think it is worth their while to chase a bug bounty. They would rather have a nice contract or a full-time role,” she says.

Ethical hacking comes with legal risks
It’s not just the lack of a steady income. Security researchers also face legal risks from anti-hacking laws, such as the UK’s Computer Misuse Act and the US’s draconian Computer Fraud and Abuse Act.
When Moussouris joined Microsoft in 2007, she persuaded the company to announce that it would not prosecute bounty hunters if they found online vulnerabilities in Microsoft products and reported them responsibly. Other software companies have since followed suit.
The UK government has now recognised the problem and promised to introduce a statutory defence for cyber security researchers who spot and share vulnerabilities to protect them from prosecution.
Another issue is that many software companies insist on security researchers signing a non-disclosure agreement (NDA) before paying them for their vulnerability disclosures.
This runs counter to the best practices for security disclosures that Moussouris has championed through the International Organization for Standardization (ISO).
When a software company pays a bounty to the first person to report a vulnerability in return for signing an NDA, it creates an incentive for anyone else who finds the same flaw, and cannot claim the reward, to disclose it publicly, increasing the risk that a bad actor will exploit it for criminal purposes.
Worse, some companies use NDAs to keep vulnerabilities hidden but don’t take steps to fix them, says Moussouris, whose company, Luta Security, manages and advises on bug bounty and vulnerability disclosure programmes.
“We often see a big pile of unfixed bugs,” she says. “And some of these programmes are well funded by publicly traded companies that have plenty of cyber security employees, application security engineers and funding.”
Some companies appear to regard bug bounties as a replacement for secure coding and proper investment in software testing.
