By making one weird change in aapt2, we sped up our build by 45 seconds. Developers love that stuff.
This is the second in a three-part series about adventures with aapt2, Android's resource compiler / optimizer. You can read the first post and get more context here.
ProGuard is an optimizer that many Android apps use. It can do nifty things like removing unused code and resources, inlining methods that have no real reason to exist separately, and even obfuscating symbols so you can pretend nobody will ever figure out what your clever code is doing. In modern Android, the R8 shrinker fills a similar role, and is driven by ProGuard configuration files.
However, R8 / ProGuard can't always figure out whether something is used, and sometimes optimizes more aggressively than you'd like. Configuration directives can be used to tell it to keep things that would otherwise be removed. aapt2 has options that let it emit configuration files for resource-related code.
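For instance, a custom view named in a layout XML file is only constructed via reflection at inflation time, so the shrinker can't see the usage on its own. The rules aapt2 emits (via its `--proguard` output option) look roughly like this; the class name here is made up for illustration:

```
# Keep the constructor that LayoutInflater invokes reflectively,
# since no Java/Kotlin code references it directly.
-keep class com.example.widget.FancyButton {
    <init>(android.content.Context, android.util.AttributeSet);
}
```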
I used a profiler to look at the performance of aapt2 on our codebase, and it turned out that a significant chunk of the increased time was being spent in one function, aapt::proguard::CollectLocations(), which is part of the machinery that generates these rules. In particular, it was spending a lot of time generating rules for the --proguard-conditional-keep-rules option, which keeps resource IDs only when they match certain usage patterns known to appear in layouts.
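Conditional keep rules tie the fate of a class to the resource that references it: the class survives shrinking only if the corresponding R field does. A sketch of what such a rule pair looks like, with a hypothetical layout name and view class:

```
# Only keep the custom view if the layout that references it is itself kept.
-if class **.R$layout { int checkout_form; }
-keep class com.example.widget.FancyButton {
    <init>(android.content.Context, android.util.AttributeSet);
}
```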
It turned out that we didn't have that option turned on in our codebase (we use other tools for optimizations like this), so the extra work being done here was thrown away anyway. I wrote up my findings and sent a patch upstream, which I think is always the polite thing to do when you discover an easily fixable issue. This immediately sped up our round-trip build time by about 45 seconds. Many thanks to the folks at Google for quickly accepting it upstream!
But I was still not happy with how long developers had to wait for aapt2 to do its thing... Stay tuned for the next post in this series, which covers another big performance optimization: reducing the number of languages we include in dev builds.