How I Built a Privacy-First Expense Tracker with 100% On-Device AI

Hello everyone,

I’m excited to share a project I’ve been passionately building: PennyWise AI, a free, open-source, and privacy-first expense tracker for Android.

The guiding mission was to build a powerful financial tool around one non-negotiable rule: the user’s financial data must never leave their device.

You can check out the project on GitHub: https://github.com/sarim2000/pennywiseai-tracker

This “privacy-by-design” principle led me down a fascinating technical path, forcing me to solve two major problems without relying on a server.

The Tech Stack

  • UI: 100% Jetpack Compose with Material Design 3.
  • Architecture: MVVM, Clean Architecture principles.
  • Async: Kotlin Coroutines and Flows.
  • DI: Hilt.
  • Database: Room for on-device storage (a minimal entity/DAO sketch follows this list).
  • The “Magic”: Google’s MediaPipe for on-device LLM.
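To make the stack concrete, here is a minimal sketch of how a transaction might be stored with Room and observed as a Flow from the Compose UI. The entity and DAO names are my own illustration, not the repository’s actual schema:

import androidx.room.*
import kotlinx.coroutines.flow.Flow

// Illustrative only: entity/DAO names are assumptions, not the repo's actual schema
@Entity(tableName = "transactions")
data class TransactionEntity(
    @PrimaryKey(autoGenerate = true) val id: Long = 0,
    val amount: Double,
    val merchant: String,
    val category: String? = null,
    val timestamp: Long
)

@Dao
interface TransactionDao {
    @Insert
    suspend fun insert(transaction: TransactionEntity)

    // A Flow keeps the Compose UI reactively updated as new transactions are saved
    @Query("SELECT * FROM transactions ORDER BY timestamp DESC")
    fun observeAll(): Flow<List<TransactionEntity>>
}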

Challenge 1: Building a Scalable, Offline SMS Parsing Engine

One of the key features is automatically logging transactions from bank SMS alerts. This is a common feature, but most apps do the parsing on their servers. To do it offline, I built a flexible parsing engine.

I used a factory pattern. A SmsReaderWorker (running via WorkManager) grabs new messages and passes them to a BankParserFactory. This factory maps a sender ID to a specific parser class.

// In SmsReaderWorker.kt
for (sms in messages) {
    // Factory returns the correct parser for the SMS sender
    val parser = BankParserFactory.getParser(sms.sender)
    if (parser != null) {
        val parsedTransaction = parser.parse(sms.body, sms.sender, sms.timestamp)
        // ... save to the local Room DB
    }
}
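For completeness, scheduling the worker itself is standard WorkManager code. The interval, constraints, and unique-work name below are my illustration, not necessarily what the app uses:

import android.content.Context
import androidx.work.*
import java.util.concurrent.TimeUnit

// Hypothetical scheduling sketch; the real app may use a different trigger or interval
fun scheduleSmsSync(context: Context) {
    val request = PeriodicWorkRequestBuilder<SmsReaderWorker>(15, TimeUnit.MINUTES)
        .setConstraints(
            Constraints.Builder()
                .setRequiresBatteryNotLow(true)
                .build()
        )
        .build()

    WorkManager.getInstance(context).enqueueUniquePeriodicWork(
        "sms_sync",                      // unique name so repeated calls don't stack jobs
        ExistingPeriodicWorkPolicy.KEEP, // keep the existing schedule if one is already queued
        request
    )
}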

This architecture makes the app easy to extend. The initial launch supports over 15 banks in the Indian market, and because each bank is just another parser class behind the factory, support for banks in any country can be added without touching the core logic.
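To show what that plugin point looks like, here is a simplified sketch of the parser interface and factory. The exact interface in the repository may differ, and ParsedTransaction and the two bank parsers below are placeholders:

// Simplified sketch; ParsedTransaction and the concrete parsers are placeholders
interface BankParser {
    fun canHandle(sender: String): Boolean
    fun parse(body: String, sender: String, timestamp: Long): ParsedTransaction?
}

object BankParserFactory {
    // Each supported bank is one entry in this list
    private val parsers: List<BankParser> = listOf(
        HdfcBankParser(),
        SbiBankParser()
        // Adding a new bank means adding one parser class here
    )

    fun getParser(sender: String): BankParser? =
        parsers.firstOrNull { it.canHandle(sender) }
}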

Challenge 2: On-Device AI without Compromise

The app’s standout feature is a chat interface where you can ask natural language questions like, “What did I spend on food last month?”

Sending financial data to a cloud LLM API (like OpenAI or Gemini) was a privacy non-starter. The solution was Google’s MediaPipe LlmInference Task.

I’m using a Gemma 2B model that is downloaded and runs entirely on the user’s device. All inference happens locally.

// A look at the LlmInference service
override suspend fun initialize(modelPath: String): Result<Unit> = withContext(Dispatchers.IO) {
    try {
        val options = LlmInference.LlmInferenceOptions.builder()
            .setModelPath(modelPath)
            .build()
        llmInference = LlmInference.createFromOptions(context, options)
        Result.success(Unit)
    } catch (e: Exception) {
        Result.failure(e)
    }
}
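Once the engine is initialized, answering a chat question is a single call. The prompt construction below is my own illustration of how this could work; only generateResponse() comes from the MediaPipe API:

// Sketch of answering a question from locally aggregated spending data
suspend fun answer(question: String, spendingSummary: String): String =
    withContext(Dispatchers.Default) {
        val prompt = """
            You are a personal finance assistant. Answer using only this data:
            $spendingSummary
            Question: $question
        """.trimIndent()
        // Blocking inference call; everything runs on-device, nothing is sent to a server
        llmInference.generateResponse(prompt)
    }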

This was a true game-changer. It allows PennyWise to offer a genuinely smart, next-gen feature while making the powerful promise that your financial data physically never leaves your phone.

Why Open Source?

For an app that handles sensitive data, transparency is the ultimate currency. I want users and developers to be able to look under the hood, verify the privacy claims, and contribute to making the app better.

If you’re an Android developer who believes in building privacy-first tools, I’d love for you to check out the repository.

Thanks for reading!

Star us on GitHub: https://github.com/sarim2000/pennywiseai-tracker
