Eskil Steenberg Hald: Security talk at BSC 2025
<RANT>
I take security seriously, but I've had it with the security orthodoxy. I really want secure software, but we can't let the security orthodoxy go unquestioned in the software community.
The security orthodoxy assumes that security is the most important aspect of computers. It NEVER EVER IS. Computers exist to get things done. If security were more important, then here is a hot tip: don't plug the computer into either a power socket or a network port. But most people still do that, because most people think that the benefits of running computers are more important, and they are willing to take some risks when it comes to security to do things like make society run.
The only goal of research should be the quest for truth. Yet the security orthodoxy claims that it alone is able to judge what constitutes good software, and it does so from the narrowest of perspectives. Real software development is not research, it is engineering, and engineering has LOADS of things to consider; security is only one of them, and many times security doesn't matter. Software development is about balancing the reduction of bugs against the production of useful functionality for the user. It's always a balancing act. Despite what some people think, software can never be proven to work. You can sometimes prove that it reflects a specification, but you can't prove that there aren't errors in the specification, or that the intention behind the specification isn't flawed to begin with. There is no way to write software that is guaranteed not to produce bugs; all we can do is make the right prioritizations to minimize them. Screaming and crying about how the world isn't perfect is childish and stands in the way of actually making things better.
This utter failure to understand the reality of software development is very evident when listening to the process advice preached by the security orthodoxy. Essentially, security advice boils down to this: put so many checks on the process that making any progress becomes so cumbersome that no one will ever do anything. I know of so many examples of hacks, and of much-needed rewrites that never happened, stemming from developers doing anything in their power to avoid onerous security reviews and lengthy recertifications. Writing secure software is as much about keeping the developers motivated to maintain it as it is about employing reasonable security practices.
I have worked in game development for many years, and I can confidently tell anyone out there that if you are making a new multiplayer game, spending much of your time on preventing cheats is a complete waste. The vast majority of games fail to find an audience, so focus all your energy on trying to make a game that anyone cares enough about to even consider cheating in. Only take security seriously once it's a problem where engaging with it has some kind of return on investment.
As a C programmer and a member of WG14, for once I would like to see a security expert consider that if all the most trusted software, like Linux, OpenSSL, Apache, Python, curl and MySQL, is written in C, then perhaps, just maybe, the people who wrote it aren't complete idiots. And perhaps, just maybe, C has some property that, beyond the obvious shortcomings, makes it the most successful language in history for writing secure code. Maybe it would be worthwhile trying to figure out what that is. Research topic, everyone! But the security orthodoxy doesn't consider the real world. Case in point: the Linux kernel has over 2000 filed, unaddressed security vulnerabilities. You may think that says something bad about Linux, but if Linux really had 2000 exploitable vulnerabilities then no one would run Linux, and whoever was brave enough to do so would be hacked instantly. Obviously that's not true. In the real world Linux is very secure. What it really says is that the security orthodoxy has dreamed up 2000 bullshit issues that don't matter in the real world. Yes, all bugs in security-critical software are C bugs, because no software written in any other language is good enough to be widely used and trusted for security-critical tasks. The survivor bias of buffer overruns is overwhelming. Speaking of which: buffer overruns are not bugs. Buffer overruns are the symptoms of bugs. And by the way, buffer overruns are not seriously hard bugs. ABA bugs, time-travel bugs, UB-elimination bugs, lockless bugs and aliasing bugs are hard; the people who find those deserve our adoration. Your fear of buffer overruns just proves that you don't know the C memory model.
Security researchers review code, and as such they only see the bugs that are left behind, not the ones caught during development. They see a bug in a design, but they don't see the bugs that a design avoids. As such they want "vulnerability mitigation", trying to somehow make bugs fail more gracefully. Software developers, on the other hand, know that the bugs that get found and fixed are the ones that fail hard and fast, not the ones that almost work and don't make much noise. So while the "mitigation strategies" proposed by the security orthodoxy make some bugs less serious, they often result in more bugs, not fewer.
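The fail-hard-versus-fail-quietly trade-off above can be sketched in C. This is an invented illustration (both accessor functions are hypothetical): one version crashes the moment a caller passes a bad index, so the buggy caller gets found and fixed in development; the "mitigated" version never crashes, but quietly returns the wrong element and lets the bug survive into production.

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical sketch: two ways to handle an out-of-range index. */

/* Fails hard and fast: fires the first time the bad caller runs in
   testing, so the underlying bug gets found and fixed. */
int get_fail_fast(const int *arr, size_t len, size_t i)
{
    assert(i < len);   /* crashes loudly in development builds */
    return arr[i];
}

/* "Mitigated": never crashes, but silently returns the wrong element,
   so the buggy caller makes no noise and is never fixed. */
int get_mitigated(const int *arr, size_t len, size_t i)
{
    if (i >= len)
        i = len - 1;   /* hides the symptom; the bug lives on */
    return arr[i];
}
```

The clamped version is "safer" in the narrow sense that it never reads out of bounds, but the program now computes wrong answers without anyone noticing, which is the author's point about mitigation producing more bugs, not fewer.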
Security research is a subsection of QA. Their job is to find bugs. That's it. They are not secret agents, or cool hackers from some bad TV show. Finding bugs is good, and there are many good security researchers who quietly do this work, but it's a job, not a superhero identity. Finding a security bug does not make you cooler than finding any other QA bug. I find many bugs every day, and when I do, I fix them and move on. What I don't do is come up with a cool name for them, register a domain, and then give talks at conferences about it. If you find a security bug in some important software, that's great, but you do not deserve to be more famous than the people who wrote the software to begin with. It's the same bullshit that makes anyone with a badge think they are the lead in an action movie where only they can save humanity. You are not cooler than a nurse, a teacher, a firefighter or anyone else who keeps society working. The security orthodoxy doesn't know how to do software development better than software developers; if they did, they would be software developers, so let's stop pretending they are.
All this wouldn't be so bad, if it wasn't for the fact that we are living in a security hellscape, and we desperately need better security.
I'm talking about the fact that every person is walking around with a tracking device that is owned and operated by surveillance capitalists whose very existence depends on stealing our data. Our computers have massive security risks because they, by design, rely entirely on online services. I'm talking about how the entire population is being fed slop to divert our attention away from reality and towards endless consumption. I'm talking about how large sections of society have lost contact with what is real, because they are being manipulated for attention, control and profit. I'm talking about how most people store their data with someone else. You should not need the internet, passwords, secure connections, remote authentication, a monthly fee and the consent of a corporation to access your own data. What does privacy even mean, when people voluntarily install microphones in their bedrooms directly connected to a retail empire? The frog has been cooked.
Much of today's software stack depends on hundreds of source packages that are downloaded, without review, from corporately controlled repos and compiled into systems, any of which could be compromised. Most of these are protected only by two-factor authentication. The two factors are a password and a phone, where one can be reset with the other, and the second can be defeated with a SIM swap that is widely available for purchase on the dark web. Every time a service asks to use your phone number as an authentication device, it should be required to display the current going rate of a SIM swap on the dark web.
Large service providers routinely lock people and companies out of their accounts, data and livelihoods, without any recourse. This happens without explanation, for fear of revealing their methods, on the advice of the security orthodoxy. The same security orthodoxy that preaches public security vulnerability disclosure when it's someone else's software being affected. It's like corporatized ransomware, only at least real ransomware hackers offer human tech support and the ability to buy our data back once they have stolen it. Windows updates, which are meant to protect me from viruses, have lost me far more data than any hacker has.
The security orthodoxy, worrying about the potential hacking of voting machines, somewhat misses the point when the entire electorate has been hacked with disinformation to begin with.
It has been more than 10 years since Edward Snowden revealed what many of us suspected: that the US and many other governments engage in widespread hacking and surveillance of the internet, and that all the major service providers have given them the back-door keys to your data. In all that time, the security orthodoxy has not been able to present even one feasible alternative future to a world where we are perpetually beholden to the whims of a few mega-corporations that, without any transparency, bend to any government no matter how reprehensible. But that's just the thing: the security orthodoxy doesn't make things better. They don't create; they just complain about the people who do. They complain, and people listen and obey without questioning their orthodoxy.
They say use managed languages, because performance doesn't matter. "Computers are fast now." Any time you hear that, I want you to correct them: performance equals efficiency, battery life, hardware installation costs, power consumption and CO2 emissions. Slow software is literally cooking our planet. Sure, if something that used to take 1 millisecond now takes 2, what's the harm? Your phone having half the battery life is the harm. Whoever convinced the world that every computer, without exception, needed to be slowed down to mitigate transient-execution CPU attacks is responsible for billions of dollars of increased power consumption and untold tonnage of CO2 released into the atmosphere, and should be considered on par with the Exxon Valdez or Deepwater Horizon disasters.
Off topic: you know who does deserve our adoration? Compiler optimizer engineers. They should be treated like rock stars; for every 0.1% they eke out, every major cloud provider should send them a million bucks. They should be given parades for their service to humanity. Forget about climate campaigners; little children should grow up with glossy posters of compiler engineers on their walls, saying "Daddy, when I grow up I want to save the world by optimizing IR too."
The threat modelling of the security orthodoxy sucks so badly that one has to wonder if it's a deliberate tactic to keep us unsafe. Maybe they keep bringing up ridiculous fantasy exploits, like Rowhammer, or an attacker's ability to get hold of our credentials by submerging our memory chips in liquid nitrogen, to make us forget that in the real world, dominated by corporations that employ a lot of the security orthodoxy, we are all pwned by default.
</RANT>
Manuscript of a talk given at the Better Software Conference by Eskil Steenberg Hald, 2025.
#programming
2025-07-31 · bsj38381, lojikil
1 Comment
lojikil · 2025-07-31 at 08:07:
So I'm a security researcher (it's in my current team's title and mandate), and I've spent most of my career in the space, but I absolutely agree. A friend of mine has a quip I like: "Sure, you found a break; I'll listen when you can tell me how to never have that happen again."
I think we over-index on the "cool" breaking side of security, and forget about all the really interesting (but less punchy, less flashy) stuff you can do on the defense side of the house. A recent example is the cupsd vulnerability: neat break, absolutely no real impact for most (really, a majority of) users.