Software without bugs? Dream on
- Translation
Not every software company has had to deal with bugs as consequential as the ones in Toyota's cars (discussed on Habré), but every day it becomes clearer: every software company ships products with hidden security defects. There are practically no exceptions. According to Veracode, a software testing provider that prepared a report for the RSA conference in San Francisco, about 60 percent of the applications it tested over the previous 18 months failed their first test cycle. And according to Roger Oberg, Veracode's senior vice president of marketing, these were applications from vendors that care enough about security to engage Veracode in the first place.
Veracode's data is not unique. A study published by WhiteHat Security last year found that 82 percent of corporate websites had contained vulnerabilities of "high, critical, or urgent" severity at some point in the past, and that 63 percent still had such vulnerabilities at the time of the study.
Granted, studies published by security consultants often have an element of self-promotion. But their results cannot be dismissed. Just skim the headlines to notice how often vulnerabilities turn up in most software products from reputable vendors. Independent developers should not assume their products are any different, simply because errors* are very hard to avoid.
A game for developers: kill the hamster**
Don't assume that only complex and subtle bugs get exploited. Every year the SANS Institute and the government-sponsored Common Weakness Enumeration (CWE) project publish a list of the 25 most common and dangerous software errors. As in previous years, the 2010 list contained few newcomers, aside from unintentional disclosure of sensitive information in error messages and unrestricted upload of files with dangerous types. But it is full of such elementary mistakes as race conditions, buffer overflows, and mishandled array indices. These are eternal errors, dating back to the dawn of programming, yet their prevalence in 2010 is astounding.
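To see how elementary these errors can be, here is a minimal Java sketch of a check-then-act race condition, one of the perennial entries on that list, together with one idiomatic fix. The class and method names are hypothetical, invented purely for illustration:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

class HitCounter {
    private final Map<String, Integer> hits = new HashMap<>();
    private final ConcurrentHashMap<String, Integer> safeHits = new ConcurrentHashMap<>();

    // Broken under concurrency: two threads can both read the same value
    // between get() and put(), so one of the increments is silently lost.
    void recordRacy(String page) {
        Integer n = hits.get(page);            // check ...
        hits.put(page, n == null ? 1 : n + 1); // ... then act: not atomic
    }

    // One idiomatic fix: let the map do the read-modify-write atomically.
    void recordSafe(String page) {
        safeHits.merge(page, 1, Integer::sum);
    }
}
```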
Moreover, the record shows that even following best practices can produce bugs. In 2006, Joshua Bloch of Google wrote on a blog that he had found a bug in the binary search algorithm from Jon Bentley's famous book Programming Pearls, first published in 1986. Bloch was not out to embarrass Bentley; as it turned out, Bloch himself had implemented a binary search for the JDK containing exactly the same bug, and the oversight went unnoticed for about nine years.
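The bug in question, as Bloch described it, is an integer overflow: the midpoint calculation (low + high) / 2 produces a negative index once low + high exceeds Integer.MAX_VALUE, which can happen with arrays of a billion or more elements. A minimal Java sketch of the search, with the fix Bloch proposed:

```java
// Binary search over a sorted array. The midpoint line carries the
// overflow bug Bloch described, shown alongside the corrected version.
static int binarySearch(int[] a, int key) {
    int low = 0;
    int high = a.length - 1;
    while (low <= high) {
        // Broken original: int mid = (low + high) / 2;
        // On huge arrays low + high can exceed Integer.MAX_VALUE, making
        // mid negative and a[mid] throw ArrayIndexOutOfBoundsException.
        int mid = (low + high) >>> 1; // unsigned shift keeps the sum non-negative
        int midVal = a[mid];
        if (midVal < key)
            low = mid + 1;
        else if (midVal > key)
            high = mid - 1;
        else
            return mid;    // key found
    }
    return -(low + 1);     // key not found (the JDK's convention)
}
```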
Can developers do better? Software testing services such as Veracode's can certainly help, but the approach is no panacea. In some cases the application's architecture or its programming language can render such testing all but useless.
Open source developers love to cite "Linus's Law," which states that "given enough eyeballs, all bugs are shallow." In other words, the transparency of the open source development process means that bugs in open source code will be found and fixed faster than in proprietary software.
However, Microsoft security program manager Shawn Hernan disputes this claim, and not without reason. As Hernan points out, the fact that programmers can inspect the code for bugs does not mean they actually do; moreover, practice shows that only full-time, paid developers are motivated enough to spend time reviewing someone else's code. If that is true, and it seems so to me, then only the software vendors with the deepest pockets (and, accordingly, the largest teams) can really benefit from Linus's Law.
Be open
But none of the above means that software security is a lost cause. Far from it; the answer lies in recognizing that only so much can be done at the code level. Every developer is responsible for shipping the best-quality code possible, and "possible" is the operative word. Beyond that, the focus of any developer's security strategy should be not the development process itself, but how to handle security incidents when they inevitably happen.
Long gone are the days when updates*** were delivered on CDs and floppy disks. Users now expect fixes to appear quickly, almost as fast as vulnerabilities are discovered. That may often be impractical, but vendors delay the release of critical updates at their own risk.
Keep in mind that how a vendor distributes updates can be a problem in itself. There was a time when Microsoft shipped updates as soon as they were ready. But customers complained about the exorbitant load on IT staff, who had to constantly test and deploy them. In response, Microsoft switched to its current model, the monthly "Patch Tuesday." This approach has been criticized as well, mainly by those who say that Patch Tuesday leads to "hack Wednesday"****, when attackers prey on machines that have not yet installed the latest updates.
Customers will always be unhappy about security failures and the need to patch them. The only way out for developers is to be as open and candid as possible about their software's security flaws, and to make every effort to mitigate the problem for customers who may be affected, even before the update is released.
The alternative, fostering a culture of silence and secrecy around security failures, is a direct path to failure. The Toyota situation is an extreme case in point. Closer to home, web developers are eagerly awaiting HTML5, which they expect to free them from the seemingly endless stream of bugs that plague plugins like Adobe Reader and Flash, bugs that often go unfixed for weeks or even longer.
The more research like Veracode's and WhiteHat Security's gets published, the better customers will understand that security failures are a fact of life. Once that understanding takes hold, customers will demand not only patches but also a more diligent search for security vulnerabilities. Before long, companies that do not regularly identify and disclose security threats will no longer look like makers of first-class applications; they will look like companies with something to hide.
Translator's notes
* - "error" in the text renders the original's "bug"
** - in the original, whack-a-mole: a game in which an animal pops out of holes and you have to hit it with a mallet; I do not know its established Russian name, so I use my everyday term, "kill the hamster"
*** - "update" in the text means a patch, i.e. a bug fix, not new functionality
**** - more precisely, "Exploit Wednesday"
P.S. I agree with the author's view; otherwise I would not have translated the article. In my subjective opinion, it applies to all kinds of bugs, not just security ones.