AI that never lets your data leave your device.
Most finance apps send your transactions to cloud servers for AI processing. PocketVault runs everything locally.
Local LLM inference
PocketVault integrates llama.cpp for on-device large language model inference. Choose from multiple model tiers (fast, balanced, powerful) based on your device's capabilities.
- Choose from multiple model tiers
- Adapts to your device's capabilities
- No internet required for AI features
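The tier selection described above can be sketched as a simple capability check. This is an illustrative example, not PocketVault's actual logic; the tier names come from the copy, but the memory thresholds and the `pick_tier` function are assumptions.

```python
# Hypothetical sketch: pick the largest model tier that fits in free RAM.
# Thresholds are illustrative, not PocketVault's real values.
TIERS = [
    ("powerful", 8.0),   # needs ~8 GB free RAM
    ("balanced", 4.0),   # needs ~4 GB
    ("fast", 1.5),       # needs ~1.5 GB
]

def pick_tier(free_ram_gb: float) -> str:
    """Return the first (largest) tier whose memory requirement fits."""
    for name, required_gb in TIERS:
        if free_ram_gb >= required_gb:
            return name
    return "fast"  # smallest tier as the fallback
```

Ordering the tiers from largest to smallest means the first match is always the most capable model the device can run.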
LLM Engine
Semantic matches
Semantic vector search
Search transactions by meaning using all-MiniLM-L6-v2 ONNX embeddings. Search "coffee purchases" and find Starbucks, Dunkin', and local cafes. Hybrid search combines semantic + keyword matching.
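Hybrid search of this kind is typically a weighted blend of two scores: cosine similarity between embedding vectors and plain keyword overlap. The sketch below shows the idea with toy vectors standing in for the all-MiniLM-L6-v2 embeddings; the `hybrid_score` function and the 0.7 weight are assumptions, not PocketVault's implementation.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def keyword_score(query: str, text: str) -> float:
    """Fraction of query words that appear in the text."""
    q = set(query.lower().split())
    t = set(text.lower().split())
    return len(q & t) / len(q) if q else 0.0

def hybrid_score(query, query_vec, doc, doc_vec, alpha=0.7):
    """Weighted blend of semantic and keyword relevance (alpha is illustrative)."""
    return alpha * cosine(query_vec, doc_vec) + (1 - alpha) * keyword_score(query, doc)
```

The blend is what lets "coffee purchases" surface Starbucks and Dunkin': the semantic term catches meaning even when no words overlap, while the keyword term boosts exact matches.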
Smart categorization
The AI learns from your patterns and suggests categories for new transactions. Suggestions are personalized to your spending and improve the more you use the app.
Learns
Adapts from your spending patterns
Suggests
Auto-categorizes new entries
Improves
Gets smarter over time
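One simple way to "learn from your patterns" is to remember which category the user has chosen for each merchant and suggest the most frequent one next time. This is a minimal sketch of that idea; the `CategorySuggester` class and its method names are hypothetical, not PocketVault's API.

```python
from collections import Counter, defaultdict

class CategorySuggester:
    """Illustrative sketch: suggest the category most often chosen
    for a merchant in the user's own history."""

    def __init__(self):
        # merchant -> Counter of categories the user has picked for it
        self.history = defaultdict(Counter)

    def record(self, merchant: str, category: str) -> None:
        """Learn from a categorization the user confirmed."""
        self.history[merchant.lower()][category] += 1

    def suggest(self, merchant: str, default: str = "Uncategorized") -> str:
        """Suggest the most frequent past category, or a default."""
        counts = self.history.get(merchant.lower())
        return counts.most_common(1)[0][0] if counts else default
```

Because the counts grow with every confirmed transaction, suggestions naturally get better over time without any data leaving the device.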
Natural language queries
Ask "How much did I spend on food last month?" and get streaming answers powered by on-device RAG that pulls context from your transaction history. The AI can also take actions — create budgets, tag transactions, or generate reports through tool-calling.
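The retrieval step of an on-device RAG pipeline can be sketched as: embed the question, rank transactions by similarity, and paste the top matches into the prompt sent to the local LLM. The `build_rag_prompt` function, the prompt template, and the `embed` callback are all assumptions for illustration, not PocketVault's actual pipeline.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def build_rag_prompt(question, transactions, embed, top_k=3):
    """Rank transactions by similarity to the question and assemble a
    prompt whose context is the top-k matches. `embed` stands in for
    the on-device embedding model (hypothetical signature)."""
    q_vec = embed(question)
    ranked = sorted(
        transactions,
        key=lambda t: cosine(q_vec, embed(t["text"])),
        reverse=True,
    )
    context = "\n".join(t["text"] for t in ranked[:top_k])
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
```

The LLM then answers grounded in the retrieved transactions; tool-calling works similarly, with the model emitting a structured action (e.g. "create budget") instead of free text.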
Take control of your financial privacy.
Try the finance app that respects your data. Free to start. Premium from $1/month. No account required.