
Android: An Effective Approach to Building Caches

by Vladyslav Kasprov, November 1st, 2023

Too Long; Didn't Read

Enhancing App Performance with Data Caching: A Step-by-Step Guide for Implementing Runtime and Persistent Cache in Your Android App

Do you want to cache some data in your app, but you’re unsure how to implement it in your codebase? I’ve been in this situation myself and would like to tell you how I tackled this problem and give a few examples of how you could do this.


In one of my previous Android apps, I found a few GET endpoints that were called very frequently. The network calls were initiated from many different places – view models, use cases, repositories, and so on. Each caller made its own network call, used the response locally, and didn’t share it with the other callers. This was very inefficient. Do you have a similar situation in your codebase?


This resulted in unnecessary network calls and battery drain. It was noticeable to users because we were operating in emerging markets, where network traffic is costly and phones aren’t the most capable. We were also putting a redundant load on the backend, and as our user base kept growing, our requests started to be blocked by the backend rate limiter. To fix this, I cached the data in runtime memory and created a global single source of truth to read and mutate it. This allowed me to share the data and replace most network calls with a cache read.


This was a good start, but there was another problem: fetching of the data wasn’t synchronized. If multiple callers needed the data at the same time but the cache was empty – for example, on login or app launch – they still made individual network calls. To fix this, I added a synchronous fetch to the existing solution. This allowed me to synchronize the calls and share the data when it was requested by multiple callers simultaneously.


Finally, there was a problem that this data was needed on app launch, but it wasn’t available in runtime memory after a process restart. I had to make a network call to fetch it and display a loading spinner in the meantime. This slowed down the app launch and even made it fail if the network call failed. To fix this, I added persistence to the existing solution. This allowed me to use a cached version of the data on app launch and unblock it from waiting for a network call.


Do you also have data in your app that you can benefit from caching?

Impact

As a result, the number of requests per second (RPS) to the cached endpoints has decreased by 66%. This was measured on the backend side. Given that these were the most frequently accessed endpoints, this has significantly cut down the cost of running our backend services.


Secondly, the app now launches faster and always successfully since it doesn’t depend on a network call. Instead, it reads from persistent storage, which is much more reliable. Previously, if the network call failed, an error UI with a retry prompt was displayed.

Runtime Cache

As the first step, I cached the data in runtime memory and created a global single source of truth to read and mutate it. This allowed me to share the data and replace most network calls with a cache read. Previously, each caller made an individual network call and used the data only locally. Now, callers can access the data through a shared repository.


Internally, I stored the data in a reactive data holder – MutableStateFlow, which allowed me to make it observable. This turned out to be very handy since my screens were already implemented using MVVM and could react to the data changes.
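As a rough illustration, a repository built around MutableStateFlow could look something like the sketch below. ExampleData and ExampleRepository are hypothetical names, and the method names simply follow the notes further down; treat this as a sketch, not the exact original implementation.

```kotlin
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.filterNotNull
import kotlinx.coroutines.flow.update

// Hypothetical model; replace it with your own data class.
data class ExampleData(val value: String)

// Runtime cache acting as the single source of truth for ExampleData.
class ExampleRepository {

    // null represents the "empty cache" state.
    private val state = MutableStateFlow<ExampleData?>(null)

    // Returns the cached data, or null if nothing has been cached yet.
    fun get(): ExampleData? = state.value

    // Emits the current value and every distinct change to it.
    fun observe(): Flow<ExampleData> = state.filterNotNull()

    // Atomically mutates the cached data.
    fun update(transform: (ExampleData?) -> ExampleData?) {
        state.update(transform)
    }
}
```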


For example, imagine the following scenario. The data is used in Screen A and Screen B, the user navigates forward from Screen A to Screen B, changes the data in Screen B, and then returns to Screen A. In this case, Screen A will have to be manually reloaded unless it can observe the data and react to its changes.
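For illustration, a hypothetical ScreenAViewModel could simply collect the repository’s flow from the sketch above, so Screen A re-renders automatically when Screen B changes the data:

```kotlin
import androidx.lifecycle.ViewModel
import androidx.lifecycle.viewModelScope
import kotlinx.coroutines.flow.SharingStarted
import kotlinx.coroutines.flow.stateIn

// Hypothetical view model for Screen A: the UI re-renders automatically
// whenever the shared cache changes, e.g. after Screen B updates it.
class ScreenAViewModel(repository: ExampleRepository) : ViewModel() {
    val data = repository.observe()
        .stateIn(viewModelScope, SharingStarted.WhileSubscribed(5_000), initialValue = null)
}
```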

Notes

  • All operations are thread-safe because value and update are thread-safe.
  • In this example, null is the initial and empty value, but you can adjust the code to have other behavior.
  • observe() emits only distinct values because MutableStateFlow skips values equal to the current one.
  • If you prefer RxJava, you can use BehaviorSubject instead of MutableStateFlow.

Runtime Cache with Synchronous Fetch

As the second step, I added the ability to synchronously fetch the data by calling updateAndGet. This is needed when multiple callers try to fetch the data simultaneously, for example, on login or app launch.


Previously, if multiple callers tried to fetch data at the same time, this could result in a race condition and multiple network calls. Now, the fetch operation is synchronized, only one network call will happen, and the response will be shared with all the awaiting callers.


Furthermore, by passing an update strategy, callers can specify whether fresh data should be fetched or a cached version can be returned. You can implement other strategies based on your use case.
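Here is a sketch of how this could look, reusing ExampleData from the earlier snippet. ExampleRemoteDataSource and the UpdateStrategy values are assumptions rather than the exact original API; the key idea is that concurrent callers are serialized with a Mutex.

```kotlin
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.filterNotNull
import kotlinx.coroutines.sync.Mutex
import kotlinx.coroutines.sync.withLock

// Hypothetical remote source; you need to write your own implementation.
interface ExampleRemoteDataSource {
    suspend fun fetch(): ExampleData
}

// Example update strategies; add others (e.g. time-based expiry) as needed.
enum class UpdateStrategy { IF_EMPTY, ALWAYS }

class ExampleRepository(
    private val remoteDataSource: ExampleRemoteDataSource,
) {
    private val state = MutableStateFlow<ExampleData?>(null)
    private val mutex = Mutex()

    fun get(): ExampleData? = state.value

    fun observe(): Flow<ExampleData> = state.filterNotNull()

    // Mutations are serialized with the mutex.
    suspend fun update(transform: (ExampleData?) -> ExampleData?) = mutex.withLock {
        state.value = transform(state.value)
    }

    // Fetches the data according to the strategy. Concurrent callers are
    // serialized by the mutex, so at most one network call is in flight,
    // and its result is shared with everyone waiting on the lock.
    suspend fun updateAndGet(strategy: UpdateStrategy = UpdateStrategy.IF_EMPTY): ExampleData =
        mutex.withLock {
            val cached = state.value
            if (strategy == UpdateStrategy.IF_EMPTY && cached != null) {
                cached
            } else {
                remoteDataSource.fetch().also { state.value = it }
            }
        }
}
```

With something like this in place, all call sites on login or app launch can call repository.updateAndGet() and receive the result of a single shared request.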

Notes

  • All operations are thread-safe because they are synchronized with Mutex.
  • You need to write your own implementation of ExampleRemoteDataSource.

Runtime & Persistent Cache with Synchronous Fetch

Finally, along with caching the data in runtime memory, I persisted it as well. This is needed on app launch when the data isn’t available in runtime memory after a process restart but is required to display the first screen.


Previously, on app launch, before showing the first screen UI, I had to make a network call and display a loading spinner until it was completed. If it failed, I had to display an error UI with a retry prompt. This meant that the user couldn’t interact with the app until that network call had been successfully completed.


Now, on app launch, I simply load the previously cached data from the persistent storage. This made the app launch faster and always successful.
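Below is one possible sketch of the final version. It reuses ExampleData, ExampleRemoteDataSource, and UpdateStrategy from the earlier snippets and adds a hypothetical ExampleLocalDataSource; the constructor kicks off the initial load from persistent storage, and observe and get behave as described in the notes that follow.

```kotlin
import kotlinx.coroutines.CompletableDeferred
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.SupervisorJob
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.filterNotNull
import kotlinx.coroutines.flow.onStart
import kotlinx.coroutines.launch
import kotlinx.coroutines.sync.Mutex
import kotlinx.coroutines.sync.withLock

// Hypothetical local source; back it with Room, DataStore, or JSON in SharedPreferences.
interface ExampleLocalDataSource {
    suspend fun load(): ExampleData?
    suspend fun save(data: ExampleData)
}

class ExampleRepository(
    private val remoteDataSource: ExampleRemoteDataSource,
    private val localDataSource: ExampleLocalDataSource,
    scope: CoroutineScope = CoroutineScope(SupervisorJob() + Dispatchers.Default),
) {
    private val state = MutableStateFlow<ExampleData?>(null)
    private val mutex = Mutex()
    private val initialLoad = CompletableDeferred<Unit>()

    init {
        // Warm the runtime cache from persistent storage on construction.
        scope.launch {
            mutex.withLock {
                if (state.value == null) {
                    state.value = localDataSource.load()
                }
            }
            initialLoad.complete(Unit)
        }
    }

    // Returns null until the initial load (or a later update) has produced a value.
    fun get(): ExampleData? = state.value

    // Waits for the initial load before emitting, then emits distinct non-null values.
    fun observe(): Flow<ExampleData> = state
        .onStart { initialLoad.await() }
        .filterNotNull()

    // Every mutation is written through to persistent storage.
    suspend fun update(transform: (ExampleData?) -> ExampleData?) = mutex.withLock {
        val updated = transform(state.value)
        state.value = updated
        if (updated != null) localDataSource.save(updated)
    }

    suspend fun updateAndGet(strategy: UpdateStrategy = UpdateStrategy.IF_EMPTY): ExampleData {
        initialLoad.await()
        return mutex.withLock {
            val cached = state.value
            if (strategy == UpdateStrategy.IF_EMPTY && cached != null) {
                cached
            } else {
                remoteDataSource.fetch().also {
                    state.value = it
                    localDataSource.save(it)
                }
            }
        }
    }
}
```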

Notes

  • All operations are thread-safe because they are synchronized with Mutex.
  • The initial value is loaded from the persistent storage during ExampleRepository construction. Until the load has been completed, observe delays emissions and get returns null.
  • It’s up to you how to implement persistence, so write your own implementation of ExampleLocalDataSource. You can store the data in a database using Room or simply serialize it into JSON and use SharedPreferences.

Conclusion

This three-step refactoring process allowed me to safely and incrementally introduce caching into the existing codebase. Each step delivered measurable value to the company and the users, such as fewer network calls and a faster and more reliable app launch.


Can caching benefit your app as well? If the answer is yes, but you’re unsure where to start, consider my story as a possible guide and example.