
Monday, 24 January 2022

A security researcher's lot is not a happy one, but it should be


The life of a security researcher is not all beer and pizza. In most cases the days are long and very few people seem to appreciate what you are doing. From the standpoint of a security researcher, they are the good guys, trying to help push an agenda of security. They spend countless hours finding the holes in code and hardware before the “bad guys” do. Sure, there are bug bounty programs that pay fairly well, and some researchers work for larger firms, but it is not all about the money or the recognition.

Having attended more than a few DEF CONs and spoken at length with security researchers, I was more than surprised to find that the most common corporate response to a reported bug or vulnerability is to check whether the researcher violated the EULA or copyright. Even the initial calls to disclose a bug seem more like a bad divorce than a friendly exchange: both sides have lawyers there to protect their interests, and threats are often issued by the company that is getting the bad news.

Just look at a recent blog post (now taken down) by Oracle CSO Mary Ann Davidson. She asks customers and researchers to please stop analyzing Oracle’s code, claiming that doing so creates more work and is simply wrong. She also notes that code scans are not always accurate. On top of that, she states that if Oracle feels any reverse engineering was performed, the company sends a letter reminding the researcher that such conduct violates the EULA. I am sure that all of the malicious individuals out there looking to steal data would never break an EULA, so you are safe as long as you follow it as well.

You would think this is the exception, but sadly it is not. I have heard more than a few horror stories about disclosing bugs and vulnerabilities. Charlie Miller even had his Apple developer account revoked and ended up banned after proving a flaw in Apple’s App Store. The times when companies are “cool” about it are relatively rare. This culture has spawned the rise of bug bounty groups, where a researcher can file a bug and let the intermediary deal with passing it on. It provides a level of anonymity and protection from litigation that most researchers do not otherwise get.

While it is a good thing that we are seeing more bug bounties, there is plenty of room for improvement in this area. In a perfect world, companies would welcome researchers, as they provide an extra set of eyes to find things that might be missed by staff busy working to get a product out the door. There are already agreed-upon rules of responsible disclosure, where a researcher agrees not to say anything until a predetermined time has passed and a fix is in place (although, given the current state of patching, that does not always help). This means the cause of the issue does not really lie with the people looking for holes in the code. It resides with the corporations that do not want to deal with the time and money it costs to fix something.

In a conversation with a developer who will remain nameless, I was told that the “most frustrating” thing about security researchers was that they find things that might never be found by the bad guys. I was rather shocked to hear this and pushed for more information. The developer went on to say that most bugs were tiny holes that had little chance of being found by hackers. He then described hackers as a bunch of kids in their moms’ basements trying to break things, and said he felt the bigger danger was from researchers and the press talking about flaws in software.

This mentality, along with the money it takes to patch something properly, is what is slowing the progress of security. Corporations do not want to spend the money paying for bugs, fund the development time to patch them, or take the reputation hit from disclosure (which also equals money). Oracle already has its hands full with Java, so it does not want any more bad press from researchers finding holes in its other code. Apple has maintained the mythology that it is immune to bugs and malware for so long that it does not want anything that might harm that image. The story is the same for many large companies out there.

The perception of “hackers” needs to change as well. These are not kids in a basement banging out scripts. The level of sophistication we have seen in criminal organizations targeting common software is very concerning. These are large criminal organizations (sometimes with government help) that can do massive damage. Researchers are not part of these groups; they are looking to help, not to get sued for their efforts. Until something gets past these hurdles at the corporate level, researchers will continue to find the response to a reported bug cold at the very best, and that is only going to benefit the larger criminal organizations.
