The 2010 Nall Report (PDF) shows that 70% of non-commercial and 60% of commercial accidents are caused by human error, not equipment failure. The machines rarely fail, and that is because of the strict standards placed on certifying aircraft and aircraft components.
When an airworthiness certificate is issued for an aircraft, the manufacturer provides a parts list of the components that make up that aircraft. If the aircraft contains any component that is not on the parts list, it can be declared not airworthy (there are some circumstances where an aircraft may be flown with missing or broken parts, for example to get it to a repair shop).
For example, Diamond DA40s have two Garmin G1000 panels (display screens); if you replace one of those panels with the display screen used in a Boeing 787, that aircraft is no longer airworthy.
Is the other panel capable of displaying the same screen? I'm sure it is, but the point is that a DA40 is certified to have G1000 panels. It could still be flown with the 787 panel, but it would be considered experimental.
I propose that secure software standards should be all-or-nothing: either the software--and all of its dependencies--is compliant, or it is not. Not owning the library or the database would not be an excuse for failing to meet the standard.
The application developer must pin the exact dependency versions that will be used with the application, so that no new security holes are introduced simply because a newer version of some dependency was installed.
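As a minimal sketch of how such pinning could be enforced at runtime--assuming a Python application, with made-up package names and versions--the application could refuse to start if any installed dependency deviates from its pin:

    # pin_check.py -- refuse to run if installed dependencies deviate from pins.
    # The package names and versions below are hypothetical examples.
    from importlib.metadata import PackageNotFoundError, version

    PINNED = {
        "requests": "2.31.0",
        "sqlalchemy": "2.0.25",
    }

    def verify_pins(pins=PINNED):
        """Return a list of pin violations (an empty list means compliant)."""
        problems = []
        for name, wanted in pins.items():
            try:
                installed = version(name)
            except PackageNotFoundError:
                problems.append(f"{name}: not installed (pinned {wanted})")
                continue
            if installed != wanted:
                problems.append(f"{name}: installed {installed}, pinned {wanted}")
        return problems

    if __name__ == "__main__":
        issues = verify_pins()
        if issues:
            raise SystemExit("Dependency pins violated:\n" + "\n".join(issues))
        print("All pinned dependencies match.")

Under the all-or-nothing rule, a single mismatch would make the whole application non-compliant, not just the offending library.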
I would go even further and require that all software running in the same environment as the application--starting from the OS and device drivers, all the way up to things like SSH, the shell, netstat, NTPD, etc.--must also be standards-compliant.
If software that is not standards-compliant is installed in that environment, then every application in it would be considered non-compliant.
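To make the environment-wide rule concrete, here is a rough sketch, assuming a Debian-style host and a hypothetical allowlist of approved package versions (a real standard would have to publish such a list), that flags every installed package falling outside the allowlist:

    # env_audit.py -- flag installed packages that are not on the approved list.
    # APPROVED is a hypothetical allowlist for illustration only.
    import subprocess

    APPROVED = {
        "openssh-server": {"1:9.2p1-2+deb12u3"},
        "ntpsec": {"1.2.2+dfsg1-1+deb12u1"},
    }

    def installed_packages():
        """Return {package: version} for every installed Debian package."""
        out = subprocess.run(
            ["dpkg-query", "-W", "-f", "${Package} ${Version}\n"],
            capture_output=True, text=True, check=True,
        ).stdout
        return dict(line.split(" ", 1) for line in out.splitlines() if line)

    def audit(approved=APPROVED):
        """List every installed package/version the allowlist does not approve."""
        return [
            f"{pkg} {ver}"
            for pkg, ver in installed_packages().items()
            if ver not in approved.get(pkg, ())
        ]

    if __name__ == "__main__":
        violations = audit()
        # Under the all-or-nothing rule, a single violation taints every
        # application running in this environment.
        print("environment compliant" if not violations else "environment NON-compliant")
        for v in violations:
            print(" ", v)

On any real machine this tiny allowlist would flag nearly everything installed, which is exactly the point: the proposal only works once the whole stack has been brought under the standard.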
I know that existing security standards such as PCI DSS and OWASP's guidelines do require that all security patches be installed on the system, and that a bug report be filed if potential security holes in third-party software are found during an audit; however, I feel this is the biggest shortcoming of these standards.
It gives the application owner an out: they are allowed to simply wait for the third party to plug the security hole. For open-source software, I think it is unacceptable for the user community (well, the application developers, to be exact) to essentially do nothing.
If you are benefiting from that free software, do your share and contribute some fixes. Isn't having many more eyes looking at the code one of the biggest things open-source advocates tout?
So why not bring that third-party open-source software into compliance and provide your changes upstream for inclusion in the main release? Likewise, it would be in commercial vendors' best interest to make their software standards-compliant so that people will keep paying for it.
Many will argue that this bar is set very high. But there are people out there who write software that has to be absolutely bullet-proof, every day. They do it because they know that someone will literally lose their life if their code is not solid.
I say it's about time software developers as a whole wrote code knowing that lives will be ruined if they don't. Whether we like to admit it or not, when people's personal information is stolen, it ruins their lives.
Cross-posted from Home+Power