A lot of open source advocates, myself included, used to think that if the code was open source, then "1,000 eyes" would find and squash all bugs and security holes. Then one day, about 15 years ago, I read a paper by Ken Thompson, delivered as his 1983 ACM Turing Award lecture, titled "Reflections on Trusting Trust".
https://www.cs.cmu.edu/~rdriley/487/...stingTrust.pdf
The necessity of keeping in mind what Thompson wrote was brought home to me after I read a comment by a blockchain enthusiast who thought that the blockchain being open source would protect it from security compromises:

CuNNTs, 13 hours ago
The compiled machine code is open sourced and independently reviewed by thousands of expert programmers. SHA-256 implementation is not complex. The most significant hazards are found elsewhere - the disruption of the infrastructure. This is where true de-fi is so important.... we are not there yet; the more we move in that direction, the more the establishment will move to destroy it.

Ken wrote this:

MORAL
The moral is obvious. You can't trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code. In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well-installed microcode bug will be almost impossible to detect.
Ken had created a C compiler that, once the second stage of infection was complete, showed no trace of a "Trojan Horse" anywhere in its source code; yet using that compiler binary to compile perfectly clean source code would still produce an application containing the Trojan Horse.
A commercial software company, for instance, could deliberately create such a compiler binary and include it with its programming tools. It could even supply the perfectly clean C compiler source code, and that source code could be compiled by the binary C compiler to produce another C compiler that matches the original binary, BUT any code compiled by it, no matter how clean the source, would contain the Trojan Horse.
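To make the trick concrete, here is a minimal toy sketch. This is not Thompson's actual code: the function names, the patterns being matched, and the idea of "compiling" by copying text are all my own simplifications. A poisoned compiler only needs to recognize two things: the login program, so it can plant a backdoor, and the compiler's own clean source, so it can re-plant both tricks in any compiler built from that source.

#include <stdio.h>
#include <string.h>

/* Toy "code generation": a real compiler would emit machine code; here we
 * just copy text to the output so the sketch stays self-contained. */
static void emit(const char *code, FILE *out) {
    fputs(code, out);
}

/* A compromised compiler in miniature.  Everything is compiled normally,
 * except for two special cases detected by crude pattern matching. */
static void evil_compile(const char *source, FILE *out) {
    if (strstr(source, "int login(")) {
        /* Case 1: compiling the login program -> silently add a backdoor. */
        emit("/* injected: also accept the secret password \"backdoor\" */\n", out);
    } else if (strstr(source, "void compile(")) {
        /* Case 2: compiling the (perfectly clean) compiler source ->
         * re-insert both of these special cases into the new compiler,
         * so the Trojan survives a rebuild from clean source. */
        emit("/* injected: code that reproduces these two cases */\n", out);
    }
    emit(source, out);  /* everything else passes through untouched */
}

int main(void) {
    /* A perfectly clean login.c still comes out backdoored. */
    evil_compile("int login(const char *user, const char *pw) { /* clean */ }\n",
                 stdout);
    return 0;
}

In Thompson's paper the injected text is, of course, working code, and the second case is written as a self-reproducing ("quine") construction, so the rebuilt compiler carries the complete injection machinery even though its source shows nothing.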
Could that happen to the g++ compiler? From Wikipedia:

When it was first released in 1987 by Richard Stallman, GCC 1.0 was named the GNU C Compiler since it only handled the C programming language.[1] It was extended to compile C++ in December of that year. Front ends were later developed for Objective-C, Objective-C++, Fortran, Ada, D and Go, among others.[7] The OpenMP and OpenACC specifications are also supported in the C and C++ compilers.
I doubt that Stallman wrote all of those front ends to g++ himself, and the compiler has been ported to many other operating systems and platforms.
Could a back door exist in every program compiled by g++? Or in all the microcode residing on the CPU, the GPU and the PROMs on the motherboard? The GPL requires that anyone who receives a copy of the g++ binary can ask for the source code as well, but if the compiler were a second-stage product it could compile its own perfectly clean source code and still produce an infected g++ binary.
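That last point is the crux, so here is one more toy sketch. The struct fields and stage names are mine, purely for illustration: model the shipped compiler binary and the shipped source as data, rebuild from the clean source, then rebuild again with what you just built, and see what the usual bootstrap comparison tells you.

#include <stdbool.h>
#include <stdio.h>

/* The Trojan lives only in the binary ... */
struct compiler_binary { bool infected; };
/* ... while the source you are given to audit really is clean. */
struct compiler_source { bool contains_trojan; };

/* "Compile the compiler": an infected parent re-inserts the Trojan even
 * though the source being compiled contains no trace of it. */
static struct compiler_binary
compile_compiler(struct compiler_binary parent, struct compiler_source src) {
    struct compiler_binary child;
    child.infected = src.contains_trojan || parent.infected;
    return child;
}

int main(void) {
    struct compiler_source clean_src = { .contains_trojan = false };
    struct compiler_binary vendor_cc = { .infected = true };  /* what you were shipped */

    /* Stage 2: rebuild the compiler from the clean, auditable source. */
    struct compiler_binary stage2 = compile_compiler(vendor_cc, clean_src);
    /* Stage 3: rebuild once more with the compiler you just built. */
    struct compiler_binary stage3 = compile_compiler(stage2, clean_src);

    printf("source clean: %s, stage2 infected: %s, stage3 infected: %s\n",
           clean_src.contains_trojan ? "no" : "yes",
           stage2.infected ? "yes" : "no",
           stage3.infected ? "yes" : "no");
    /* Stage 2 and stage 3 come out identical, so "rebuild it yourself and
     * compare" passes - yet both binaries are infected. */
    return 0;
}

Rebuilding from audited source only helps if the binary doing the rebuilding is itself trustworthy, which is exactly what you cannot establish here.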
Do you trust Stallman and the folks who helped him create g++? Do you have a choice? Whether you think your system is clean or not depends on how cynical you are. I am very cynical.