A Microsoft engineer noticed something was off in a piece of software he worked on. He soon discovered someone was probably trying to gain access to computers all over the world.
It’s almost impossible to spot by people looking directly at the code. I’m honestly surprised this one was discovered at all. People are still trying to deconstruct this exploit to figure out how the RCE worked.
And supply chain attacks are effectively impossible to eliminate as an attack vector by a developer-user of a N-level dependency. Not having dependencies or auditing every dependency is unreasonable in most cases.
There are sysadmins who discover major vulnerabilities through troubleshooting
The key is the number of people involved
So obscure projects are fucked.
No one cares about obscure projects from an attack perspective. What you should be worried about is the dependency chain.
True, but we do know how it got into xz in the first place: human error and bad practice. We wouldn’t have to reverse engineer the exploit if xz didn’t allow binary commits altogether. It’s a very convoluted exploit: hiding “junk” in binary test files, then using awk and other commands to cut around that junk, combining the pieces into a payload and executing it. Our reliance on binary blobs is a double-edged sword.
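As a rough illustration of the carving trick described above (all file names, offsets, and the payload here are invented for the sketch; this is not the actual xz payload or its real deobfuscation pipeline):

```shell
# Illustrative sketch only: a payload hidden inside "junk" bytes of a
# supposedly corrupt binary test file, carved back out with ordinary tools.
# Every name and offset below is made up for demonstration.

printf 'echo compromised\n' > payload.sh              # 17-byte hidden script
head -c 64 /dev/urandom > junk1.bin                   # opaque filler bytes
head -c 64 /dev/urandom > junk2.bin
cat junk1.bin payload.sh junk2.bin > bad-corrupt.bin  # the "corrupt test data"

# Later, an innocuous-looking build step can cut the payload back out
# by byte range and run it:
tail -c +65 bad-corrupt.bin | head -c 17 > extracted.sh
sh extracted.sh                                       # executes the hidden code
```

To a reviewer, `bad-corrupt.bin` looks like any other broken-archive fixture; only the obfuscated extraction step reveals that specific bytes inside it are meaningful.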
Also true, because human error is impossible to snuff out completely. It can be reduced, however, if companies donated to the projects they use. For example, Microsoft depends on xz and doesn’t donate anything to it. It’s free as in freedom, not cost. FOSS devs aren’t suppliers; the software comes as is. If you want improvements in the software your massive company relies on, then donate. Otherwise don’t expect anything — they aren’t your slaves.
You can’t test an archive program without binaries
Generate the binaries during test execution from known (version-controlled) inputs — plaintext files and the like. Don’t check binaries into source control, especially not intentionally corrupt ones whose contents other maintainers and observers can’t verify.
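A minimal sketch of that approach (gzip stands in for the project’s real compressor, and the file names are illustrative): the only thing checked into git is plaintext, and every binary fixture — even a deliberately corrupt one — is derived from it at test time:

```shell
# Only fixture.txt would live in version control; the binaries below are
# regenerated on every test run, so anyone can see exactly what they contain.
printf 'hello fixture\n' > fixture.txt
gzip -c fixture.txt > fixture.gz            # valid archive, built at test time

gzip -dc fixture.gz > roundtrip.txt         # round-trip test
cmp -s fixture.txt roundtrip.txt && echo "roundtrip ok"

# Even the "corrupt input" case can be deterministic: truncate the archive
# at a known byte count instead of committing an opaque pre-corrupted blob.
head -c 10 fixture.gz > corrupt.gz
gzip -dc corrupt.gz >/dev/null 2>&1 || echo "corrupt input rejected"
```

The point is auditability: a truncation rule like `head -c 10` documents exactly how the corrupt fixture relates to known input, whereas a committed binary blob documents nothing.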
Exactly this. Couldn’t have said it better myself.
laughs in Gentoo
Right now the greatest level of supply chain security that I know of is formal verification, source-reproducible builds, and full-source bootstrapping build systems. There was a neat FPGA bootstrapping project (the whole toolchain to program the FPGA could be built on the FPGA) at last year’s FOSDEM conference, and I have to admit the idea of a physically verifiable root of trust is super exciting to me, but also out of reach for 98% of projects (though more possible by the day).
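Of those, reproducible builds are the easiest to start checking today. A toy sketch of the check (build.sh here is a trivial stand-in for a real, pinned build script): build twice from the same inputs and require byte-identical output:

```shell
# Stand-in for a real, pinned build script. A reproducible build must emit
# byte-identical artifacts from identical inputs: no timestamps, no random
# seeds, no host-specific paths baked into the output.
cat > build.sh <<'EOF'
#!/bin/sh
printf 'artifact-v1' > out.bin
EOF

sh build.sh && sha256sum out.bin > build1.sha256   # record first build's digest
sh build.sh && sha256sum -c build1.sha256          # second build must match
```

Independent parties can then each run the build and compare digests — if they diverge, something (the toolchain, the environment, or an attacker) influenced the output.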