
Google’s Rust belts bugs out of Android in Safe Coding push


Google says its effort to prioritize memory-safe software development over the past six years has substantially reduced the number of memory safety vulnerabilities in its Android operating system.

In a report scheduled for publication on Wednesday, Google reveals the percentage of Android vulnerabilities attributable to memory safety issues has fallen from 76 percent in 2019 to an expected 24 percent by the end of 2024, which is significantly less than the industry norm of 70 percent.

That’s a meaningful reduction in the risk profile of Android code, which Android security team member Jeff Vander Stoep and Google senior software engineer Alex Rebert attribute to the adoption of Safe Coding, a set of software development practices that aims to prevent vulnerabilities from being introduced in the first place through memory safe programming languages (including Rust), static analysis, and API design.


“The shift from previous generations to Safe Coding can be seen in the quantifiability of the assertions that are made when developing code,” said Vander Stoep and Rebert.

“Instead of focusing on the interventions applied (mitigations, fuzzing), or attempting to use past performance to predict future security, Safe Coding allows us to make strong assertions about the code’s properties and what can or cannot happen based on those properties.”

Developing software in memory safe programming languages, such as C#, Go, Java, Python, Rust, and Swift, is a big part of Safe Coding. Memory safety flaws such as buffer overflows represent the majority of serious security vulnerabilities in large codebases, and that realization has led to a broad push across the public and private sectors to produce fewer memory safety bugs.

Largely, this goal has been pursued by recommending against the use of C and C++, which require manual memory management and invite memory safety vulnerabilities unless developers are extraordinarily diligent. (This has not been lost on the C/C++ community, where memory safety has also become a priority.)
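To make the class of bug concrete, here is a minimal illustrative sketch in Rust (not taken from Google's report) of how the language treats an out-of-bounds access that C would compile into silent memory corruption: a constant out-of-bounds index fails the build, and a runtime index is bounds-checked rather than trusted.

    fn main() {
        let buf = [0u8; 4];

        // A constant out-of-bounds index is caught at compile time:
        // uncommenting the next line fails the build instead of
        // corrupting whatever sits next to `buf` on the stack.
        // let oops = buf[7];

        // A runtime index is bounds-checked; `get` returns None
        // instead of reading past the end of the array.
        let idx = 7;
        match buf.get(idx) {
            Some(byte) => println!("buf[{idx}] = {byte}"),
            None => println!("index {idx} is out of bounds; no memory is touched"),
        }
    }

Indexing with buf[idx] would also be safe, but by panicking at runtime; either way, the program never scribbles over adjacent memory the way an unchecked C array access can.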

For Google, the international memory safety crusade has meant more Rust development in Android, among other projects. Rust provides memory safety guarantees without compromising performance, at least most of the time, and adopting it has improved productivity.

“Safe Coding improves code correctness and developer productivity by shifting bug finding further left, before the code is even checked in,” said Vander Stoep and Rebert.

“We see this shift showing up in important metrics such as rollback rates (emergency code revert due to an unanticipated bug). The Android team has observed that the rollback rate of Rust changes is less than half that of C++.”

The good news for organizations with a lot of unsafe legacy code is that rewriting old code in new languages probably isn’t necessary. As Vander Stoep and Rebert point out, vulnerabilities have a half-life: they decay over time as code evolves. “For example, based on the average vulnerability lifetimes, 5-year-old code has a 3.4x (using lifetimes from the study) to 7.4x (using lifetimes observed in Android and Chromium) lower vulnerability density than new code,” they said.
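Those ratios follow from modeling vulnerability density as a simple exponential decay, an assumption made here for illustration rather than taken from the report. The sketch below back-solves the half-lives implied by a 3.4x and a 7.4x drop over five years.

    // Assumes density(t) = density(0) * 0.5^(t / half_life) and
    // solves for half_life given a reduction factor over `years`.
    fn implied_half_life(years: f64, reduction: f64) -> f64 {
        years * 2f64.ln() / reduction.ln()
    }

    fn main() {
        for &factor in &[3.4, 7.4] {
            println!(
                "a {factor}x drop over 5 years implies a half-life of roughly {:.1} years",
                implied_half_life(5.0, factor)
            );
        }
    }

Under that simple model, the quoted figures are consistent with vulnerability density halving roughly every 1.7 to 2.8 years.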

That’s not to say old bugs miraculously become unexploitable. Rather, the overall density of vulnerabilities diminishes – a statistical win but not a guarantee of safety.

Even so, if organizations focus on making old code interoperable with memory safe code and write new code in memory safe languages, the natural decay of existing vulnerabilities tends to make large codebases safer over time without onerous rewrites. If you stop introducing new memory safety bugs, the old ones become less of an issue, at least in aggregate.
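In practice, making old code interoperable often means keeping the existing C while putting a safe wrapper in front of it, so new callers never handle raw pointers. Below is a minimal sketch of that pattern; legacy_checksum is a hypothetical C function invented for illustration, and the snippet would need the corresponding C object file to actually link.

    use std::os::raw::{c_uchar, c_ulong};

    extern "C" {
        // Hypothetical existing C function, with the assumed signature:
        //   unsigned long legacy_checksum(const unsigned char *buf, unsigned long len);
        fn legacy_checksum(buf: *const c_uchar, len: c_ulong) -> c_ulong;
    }

    /// Safe Rust wrapper: the slice type guarantees the pointer/length
    /// pair is valid, so callers never touch raw pointers themselves.
    pub fn checksum(data: &[u8]) -> u64 {
        // SAFETY: `data` points to `data.len()` initialized bytes that
        // stay alive for the duration of the call.
        unsafe { legacy_checksum(data.as_ptr(), data.len() as c_ulong) as u64 }
    }

The remaining unsafety is confined to one small, auditable spot, while everything newly written stays in safe code.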

“Adopting Safe Coding in new code offers a paradigm shift, allowing us to leverage the inherent decay of vulnerabilities to our advantage, even in large existing systems,” said Vander Stoep and Rebert.

“The concept is simple: Once we turn off the tap of new vulnerabilities, they decrease exponentially, making all of our code safer, increasing the effectiveness of security design, and alleviating the scalability challenges associated with existing memory safety strategies such that they can be applied more effectively in a targeted manner.” ®
