# 06. YAKKI SMART v2.3 Technical Debt
**Audit Date:** 2025-11-30
**Last Updated:** 2025-12-05
**Version:** 2.3
**Methodology:** Comprehensive 32-module codebase analysis
**v2.3 Changes:** Table formatting conversion (Markdown → stable ASCII format)
---
## Table of Contents
1. [Executive Summary](#executive-summary)
2. [Critical Technical Debt (P0)](#critical-technical-debt-p0)
3. [High Priority (P1)](#high-priority-p1)
4. [Medium Priority (P2)](#medium-priority-p2)
5. [Low Priority (P3)](#low-priority-p3)
6. [Architectural Concerns](#architectural-concerns)
7. [Technical Debt Repayment Plan](#technical-debt-repayment-plan)
8. [Metrics and KPIs](#metrics-and-kpis)
---
## Executive Summary
### Overall Technical Debt Statistics
```
┌──────────────────────────────────────────────────────────┐
│ YAKKI SMART v2.3 - Technical Debt Dashboard │
├──────────────────────────────────────────────────────────┤
│ Total Issues: 14 │
│ │
│ 🔴 CRITICAL (P0): 4 ███████░░░░░░░░░░ 29% │
│ 🟠 HIGH (P1): 4 ███████░░░░░░░░░░ 29% │
│ 🟡 MEDIUM (P2): 3 █████░░░░░░░░░░░░ 21% │
│ 🟢 LOW (P3): 3 █████░░░░░░░░░░░░ 21% │
├──────────────────────────────────────────────────────────┤
│ Estimated Total Fix Time: 17-30 weeks │
│ MUST FIX Before Beta: 1 issue (security hardening) │
│ Recommended Before v1.0: 8 issues (4-12 weeks) │
└──────────────────────────────────────────────────────────┘
```
### Distribution by Category
```
Security (2 issues)
├─ Fix Time: 5 min - 2 weeks
└─ Impact: 🔴 CRITICAL
Stub Implementations (4 issues)
├─ Fix Time: 3-9 weeks
└─ Impact: 🟠 HIGH
Deprecated APIs (1 issue)
├─ Fix Time: 2 days
└─ Impact: 🔴 CRITICAL
Missing Features (3 issues)
├─ Fix Time: 1-9 weeks
└─ Impact: 🟠 HIGH
Configuration (1 issue)
├─ Fix Time: 2-3 days
└─ Impact: 🟢 LOW
TODO Markers (3 issues)
├─ Fix Time: 1-2 weeks
└─ Impact: 🟡 MEDIUM
```
### Debt by Module
```
:app (5 issues)
├─ 🔴 Critical: 2
├─ 🟠 High: 0
├─ 🟡 Medium: 1
└─ 🟢 Low: 2
:smartrag3 (5 issues)
├─ 🔴 Critical: 1
├─ 🟠 High: 2
├─ 🟡 Medium: 1
└─ 🟢 Low: 1
:smartrag-v2 (1 issue)
└─ 🔴 Critical: 1 (MUST DEPRECATE)
:yakkibluetooth (1 issue)
└─ 🟠 High: 1 (Experimental OK)
:yakki-mail (1 issue)
└─ 🟠 High: 1 (Missing UI)
:conductor (1 issue)
└─ 🟡 Medium: 1 (Waiting for SDK)
```
---
## Critical Technical Debt (P0)
### 🔴 P0-1: Credential Management Upgrade
**Category:** Security
**Module:** :app
**Status:** In Progress
#### Problem Description
Migration of API keys from development configuration to secure production storage (BuildConfig/Keystore) is pending for the Quality Assessment service.
#### Impact
🔴 **CRITICAL:**
- **Security Standard:** Ensuring compliance with OWASP mobile security guidelines
- **Risk:** Exposure of development credentials
- **Best Practice:** Moving to encrypted build configurations
#### Affected Features
- Quality Assessment (COMET-QE)
- Adaptive Tier Controller
- Translation quality monitoring
#### Solution
Complete the migration of all service tokens to encrypted build configurations and ensure no secrets are stored in the source code.
```kotlin
// Target architecture:
class QualityAssessmentService @Inject constructor(
private val httpClient: HttpClient
) {
private val API_KEY = BuildConfig.SERVICE_API_KEY // ✅ SECURE
suspend fun assessQuality(source: String, translation: String): DomainResult<Double> {
val response = httpClient.post("https://api-inference.huggingface.co/...") {
header("Authorization", "Bearer $API_KEY")
// ...
}
}
}
```
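One common way to satisfy the "no secrets in source code" requirement is to keep the key in the git-ignored `local.properties` and surface it through `BuildConfig` at build time. This is a sketch; the property name `SERVICE_API_KEY` and the file layout are assumptions, not the project's actual configuration:

```kotlin
// app/build.gradle.kts (illustrative)
import java.util.Properties

// Read the key from local.properties (git-ignored), e.g.:
//   SERVICE_API_KEY=hf_xxxxxxxx
val localProps = Properties().apply {
    val f = rootProject.file("local.properties")
    if (f.exists()) f.inputStream().use { load(it) }
}

android {
    buildFeatures { buildConfig = true }
    defaultConfig {
        buildConfigField(
            "String",
            "SERVICE_API_KEY",
            "\"${localProps.getProperty("SERVICE_API_KEY", "")}\""
        )
    }
}
```

Note that `BuildConfig` fields are still recoverable from the APK by decompilation, which is why the backend-proxy alternative below remains the stronger option for production.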
#### Alternative Solution (for production)
```kotlin
// Use backend proxy for API calls
class QualityAssessmentService @Inject constructor(
private val httpClient: HttpClient
) {
suspend fun assessQuality(source: String, translation: String): DomainResult<Double> {
// Call through our backend where token is stored securely
val response = httpClient.post("https://api.yakki-smart.com/quality-assessment") {
setBody(QualityRequest(source, translation))
}
}
}
```
#### Timeline
- **Fix Time:** Immediate priority (BuildConfig) or 2 weeks (backend proxy)
- **Priority:** P0 - MUST FIX BEFORE BETA
- **Blocker for:** ANY release (beta, production)
- **Assigned to:** Security team
- **Due Date:** Before Beta Release
---
### 🔴 P0-2: Deprecated APIs (11 occurrences)
**Category:** Code Quality
**Module:** :app
**Files:** 5 files
#### Problem Description
```kotlin
// DEPRECATED APIs using old error handling patterns
// 1. GPT4oMiniTranslationService.kt:125
@Deprecated(
message = "Use translateSafe instead",
replaceWith = ReplaceWith("translateSafe(text, sourceLanguage, targetLanguage)")
)
suspend fun translate(
text: String,
sourceLanguage: Language,
targetLanguage: Language
): String // ❌ Throws exceptions instead of DomainResult
// 2. DeviceSTTService.kt:97, 159
@Deprecated(
message = "Use recognizeSafe instead",
replaceWith = ReplaceWith("recognizeSafe(audioData)")
)
suspend fun recognize(audioData: ByteArray): String // ❌ Throws exceptions
// 3. DeviceTTSService.kt:126
@Deprecated(
message = "Use synthesizeSafe instead",
replaceWith = ReplaceWith("synthesizeSafe(text, language)")
)
suspend fun synthesize(text: String, language: Language): ByteArray // ❌ Throws
// 4. AudioRecorderService.kt:159
@Deprecated(/* ... */)
fun startRecording() // ❌ Old API
```
#### Impact
🔴 **CRITICAL:**
- **Code Quality:** Using outdated error handling patterns
- **Type Safety:** Bypassing DomainError system
- **Maintainability:** Mixed error handling approaches
- **Future Risk:** Deprecated APIs may be removed in future versions
#### Affected Features
- Translation services (GPT-4o Mini)
- Device STT (SpeechRecognizer)
- Device TTS (TextToSpeech)
- Audio recording
#### Solution
**Migration Strategy:**
```kotlin
// STEP 1: Find all usages of deprecated APIs (run in a shell):
//   grep -r "@Deprecated" app/src/main/java/com/yakkismart/
// STEP 2: Replace with Safe variants
// WAS:
try {
val result = translationService.translate(text, sourceLang, targetLang)
// Process result
} catch (e: Exception) {
// Handle error
}
// NOW:
when (val result = translationService.translateSafe(text, sourceLang, targetLang)) {
is DomainResult.Success -> {
// Process result.data
}
is DomainResult.Error -> {
// Handle result.error (DomainError)
}
}
// STEP 3: Remove deprecated methods (after migration)
// Or mark as @Deprecated(level = DeprecationLevel.ERROR)
```
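When call sites cannot all migrate at once, the throwing API can be bridged to its `Safe` counterpart through a single adapter, so new code gets `DomainResult` immediately while old call sites are migrated incrementally. The sketch below uses a simplified stand-in for the project's `DomainResult`/`DomainError` types (the real ones are richer); `safeCall` and the toy `translate` are illustrative:

```kotlin
// Simplified stand-in for the project's DomainResult (illustrative).
sealed class DomainResult<out T> {
    data class Success<T>(val data: T) : DomainResult<T>()
    data class Error(val message: String) : DomainResult<Nothing>()
}

// Bridge: wrap any throwing call into a DomainResult-returning "Safe" variant.
inline fun <T> safeCall(block: () -> T): DomainResult<T> =
    try {
        DomainResult.Success(block())
    } catch (e: Exception) {
        DomainResult.Error(e.message ?: "Unknown error")
    }

// Example: a deprecated throwing API and its Safe counterpart.
fun translate(text: String): String {
    require(text.isNotBlank()) { "Empty input" }
    return "translated:$text"
}

fun translateSafe(text: String): DomainResult<String> = safeCall { translate(text) }
```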
#### Migration Checklist
- [ ] Find all usages of deprecated APIs (11 total)
  - [ ] GPT4oMiniTranslationService.translate() (1 usage)
  - [ ] DeviceSTTService.recognize() (2 usages)
  - [ ] DeviceTTSService.synthesize() (1 usage)
  - [ ] AudioRecorderService.startRecording() (7 usages)
- [ ] Migrate to Safe variants
  - [ ] Replace with translateSafe()
  - [ ] Replace with recognizeSafe()
  - [ ] Replace with synthesizeSafe()
  - [ ] Replace with new API
- [ ] Test migration
  - [ ] Unit tests pass
  - [ ] Integration tests pass
  - [ ] Manual testing
- [ ] Remove deprecated methods
  - [ ] Mark as ERROR level
  - [ ] Remove in next major version
#### Timeline
- **Fix Time:** 2 days
- **Priority:** P0 - MUST FIX BEFORE v1.0
- **Blocker for:** Production release
- **Assigned to:** Development team
- **Due Date:** 2025-12-07
---
### 🔴 P0-3: SmartRAG-v2 NotImplementedError (6 occurrences)
**Category:** Non-functional Code
**Module:** :smartrag-v2 (legacy)
**File:** `smartrag-v2/src/main/kotlin/com/yakkismart/smartrag/vector/RustVectorEngine.kt`
#### Problem Description
```kotlin
class RustVectorEngine : VectorEngine {
override suspend fun addVector(id: String, vector: FloatArray) {
throw NotImplementedError("Rust vector engine not implemented") // :51
}
override suspend fun search(query: FloatArray, k: Int): List<VectorMatch> {
throw NotImplementedError("Rust vector engine not implemented") // :57
}
override suspend fun deleteVector(id: String) {
throw NotImplementedError("Rust vector engine not implemented") // :64
}
override suspend fun getVector(id: String): FloatArray? {
throw NotImplementedError("Rust vector engine not implemented") // :74
}
override suspend fun size(): Long {
throw NotImplementedError("Rust vector engine not implemented") // :79
}
override suspend fun clear() {
throw NotImplementedError("Rust vector engine not implemented") // :84
}
}
```
#### Impact
🔴 **CRITICAL:**
- **Functionality:** Vector search completely non-functional in smartrag-v2
- **Module Status:** Deprecated module still in codebase
- **Confusion:** Developers might use wrong module
- **Technical Debt:** Legacy code blocking progress
#### Affected Features
- SmartRAG-v2 (deprecated, not used in production)
- Vector search (non-functional)
#### Solution
**DEPRECATE MODULE COMPLETELY:**
```kotlin
// STEP 1: Mark module as deprecated in settings.gradle.kts
// include(":smartrag-v2") // DEPRECATED - use :smartrag3
// STEP 2: Remove dependencies from other modules
// In :app/build.gradle.kts
dependencies {
// implementation(project(":smartrag-v2")) // REMOVED
implementation(project(":smartrag3:app-integration")) // ✅ Use SmartRAG v3
}
// STEP 3: Add deprecation warning to README
// ## SmartRAG v2 (DEPRECATED)
// This module is deprecated. Use SmartRAG v3 instead.
// Migration guide: docs/smartrag3/MIGRATION.md
// STEP 4: Schedule removal
// TODO(2025-12-15): Remove :smartrag-v2 module completely
```
#### Migration to SmartRAG v3
```kotlin
// WAS (smartrag-v2):
val vectorEngine = RustVectorEngine() // ❌ NotImplementedError
vectorEngine.addVector(id, embedding)
// NOW (smartrag3):
val ragRepository = get<RAGRepository>() // ✅ Koin DI
ragRepository.addDocument(
document = Document(
id = id,
content = text,
embedding = embedding
)
)
```
#### Timeline
- **Fix Time:** 1 week (migration + testing)
- **Priority:** P0 - MUST FIX BEFORE v1.0
- **Blocker for:** Production release (code cleanliness)
- **Assigned to:** SmartRAG team
- **Due Date:** 2025-12-07
---
### 🔴 P0-4: ONNX Embeddings Stub (8 occurrences)
**Category:** Stub Implementation
**Module:** :smartrag3:embeddings
**Files:** 2 files
#### Problem Description
```kotlin
// EmbeddingGemmaProvider.kt
class EmbeddingGemmaProvider {
fun generateEmbeddings(text: String): FloatArray {
// TODO: Integrate actual ONNX Runtime :40
return FloatArray(384) { 0.0f } // ❌ STUB - returns zeros
}
fun generateBatch(texts: List<String>): List<FloatArray> {
// TODO: Use ONNX Runtime batch inference :53
return texts.map { FloatArray(384) { 0.0f } } // ❌ STUB
}
private fun tokenize(text: String): IntArray {
// TODO: Implement actual tokenizer :86
return IntArray(0) // ❌ STUB
}
private fun preprocessTokens(tokens: IntArray): IntArray {
// TODO: Apply Gemma-specific preprocessing :138
return tokens // ❌ STUB
}
private fun runInference(tokens: IntArray): FloatArray {
// TODO: Run ONNX model :194
return FloatArray(384) { 0.0f } // ❌ STUB
}
}
// OnnxQuantizationService.kt
class OnnxQuantizationService {
fun quantizeModel(modelPath: String, outputPath: String) {
// TODO: Implement ONNX quantization :39
throw NotImplementedError("ONNX quantization not implemented")
}
fun loadQuantizedModel(modelPath: String): OrtSession? {
// TODO: Load quantized ONNX model :67
return null // ❌ STUB
}
}
```
#### Impact
🔴 **CRITICAL:**
- **Functionality:** On-device embedding generation NOT FUNCTIONAL
- **Semantic Search:** Vector search returns meaningless results (all-zero embeddings make every document look equally similar)
- **SmartRAG Quality:** Personal knowledge search unusable
- **Feature Blocker:** Chat with RAG requires embeddings
#### Affected Features
- SmartRAG v3 semantic search
- Chat with RAG
- Document similarity
- Context retrieval
#### Solution
**ONNX Runtime Integration Strategy:**
```kotlin
// STEP 1: Add ONNX Runtime dependencies
// smartrag3/embeddings/build.gradle.kts
dependencies {
implementation("com.microsoft.onnxruntime:onnxruntime-android:1.19.2")
implementation("org.tensorflow:tensorflow-lite:2.14.0") // For tokenizer
}
// STEP 2: Download Gemma 300M ONNX model
// models/embedding-gemma-300m-int8.onnx (size: ~300 MB)
// https://huggingface.co/google/gemma-300m
// STEP 3: Implement EmbeddingGemmaProvider
class EmbeddingGemmaProvider(
private val context: Context
) {
private val ortEnv: OrtEnvironment = OrtEnvironment.getEnvironment()
private val ortSession: OrtSession by lazy {
val modelBytes = context.assets.open("models/embedding-gemma-300m-int8.onnx")
.readBytes()
ortEnv.createSession(modelBytes)
}
private val tokenizer: GemmaTokenizer by lazy {
GemmaTokenizer.fromAssets(context, "models/tokenizer.model")
}
fun generateEmbeddings(text: String): FloatArray {
// 1. Tokenize
val tokens = tokenizer.encode(text)
// 2. Create ONNX input
val inputTensor = OnnxTensor.createTensor(
ortEnv,
arrayOf(tokens.toLongArray())
)
// 3. Run inference
val outputs = ortSession.run(mapOf("input_ids" to inputTensor))
// 4. Extract embeddings
val embeddings = outputs[0].value as Array<FloatArray>
return embeddings[0] // ✅ REAL EMBEDDINGS
}
}
// STEP 4: Implement tokenizer
class GemmaTokenizer(
private val sentencePieceModel: ByteArray
) {
    fun encode(text: String): IntArray {
        // Use SentencePiece tokenizer
        // https://github.com/google/sentencepiece
        TODO("Delegate to the SentencePiece JNI binding")
    }
companion object {
fun fromAssets(context: Context, path: String): GemmaTokenizer {
val modelBytes = context.assets.open(path).readBytes()
return GemmaTokenizer(modelBytes)
}
}
}
// STEP 5: Re-enable embeddings in DocumentProcessor
// smartrag3/language/src/main/kotlin/.../DocumentProcessor.kt
class DocumentProcessor(
private val embeddingProvider: EmbeddingGemmaProvider // ✅ UNCOMMENT
) {
suspend fun processDocument(text: String): ProcessedDocument {
val chunks = chunkText(text)
val chunksWithEmbeddings = chunks.map { chunk ->
ChunkWithEmbedding(
text = chunk,
embedding = embeddingProvider.generateEmbeddings(chunk) // ✅ ENABLE
)
}
return ProcessedDocument(chunks = chunksWithEmbeddings)
}
}
```
#### Assets Structure
```
app/src/main/assets/
├── models/
│ ├── embedding-gemma-300m-int8.onnx (300 MB)
│ └── tokenizer.model (4 MB)
└── .gitkeep
```
#### Timeline
- **Fix Time:** 2-3 weeks
- Week 1: ONNX Runtime integration + testing
- Week 2: Tokenizer implementation + optimization
- Week 3: Model optimization + benchmarking
- **Priority:** P0 - CRITICAL FOR SmartRAG
- **Blocker for:** Chat with RAG, semantic search
- **Assigned to:** ML team
- **Due Date:** 2025-12-21
#### Acceptance Criteria
- [ ] ONNX Runtime 1.19.2 integrated
- [ ] Gemma 300M model loaded from assets
- [ ] Tokenizer working (SentencePiece)
- [ ] Embedding generation functional
- [ ] Performance: <100ms per chunk on mid-range device
- [ ] Memory usage: <500 MB
- [ ] Accuracy: Cosine similarity >0.9 for identical text
- [ ] Unit tests: 90%+ coverage
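The cosine-similarity criterion above is straightforward to script as a sanity test; here is a minimal sketch in pure Kotlin (no project dependencies). Usefully, it also exposes the current stub: all-zero embeddings score 0.0 and fail the >0.9 check.

```kotlin
import kotlin.math.sqrt

// Cosine similarity between two embedding vectors; ~1.0 for identical directions.
fun cosineSimilarity(a: FloatArray, b: FloatArray): Double {
    require(a.size == b.size) { "Dimension mismatch: ${a.size} vs ${b.size}" }
    var dot = 0.0
    var normA = 0.0
    var normB = 0.0
    for (i in a.indices) {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    // Guard for zero vectors (the current stub output): treat as 0 similarity.
    if (normA == 0.0 || normB == 0.0) return 0.0
    return dot / (sqrt(normA) * sqrt(normB))
}
```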
---
## High Priority (P1)
### 🟠 P1-1: Bluetooth LE Audio Stubs (15 TODO markers)
**Category:** Stub Implementation
**Module:** :yakkibluetooth:leaudio
**Files:** 4 files
#### Problem Description
```kotlin
// CISManager.kt:36, 52
class CISManager {
fun establishCIS(device: BluetoothDevice): Result<CISConnection> {
// TODO: Implement actual CIS establishment
return Result.failure(NotImplementedError("CIS not implemented"))
}
fun releaseCIS(connection: CISConnection) {
// TODO: Implement CIS release
}
}
// BISManager.kt:41, 57, 74
class BISManager {
fun createBIG(): Result<BIGInfo> {
// TODO: Implement actual BIG creation
return Result.failure(NotImplementedError("BIG not implemented"))
}
fun synchronizeToBIG(bigInfo: BIGInfo): Result<BIGSync> {
// TODO: Implement BIG synchronization
return Result.failure(NotImplementedError())
}
fun terminateBIG(bigInfo: BIGInfo) {
// TODO: Implement BIG termination
}
}
// AuracastManager.kt:290, 300, 311, 319, 320
class AuracastManager {
fun startBroadcast(audioSource: AudioSource): Result<Broadcast> {
// TODO: Start BLE advertising
return Result.failure(NotImplementedError("Auracast not implemented"))
}
fun stopBroadcast(broadcast: Broadcast) {
// TODO: Stop BLE advertising
}
private fun parseMetadata(data: ByteArray): BroadcastMetadata? {
// TODO: Parse broadcast metadata
// TODO: Decode program info
// TODO: Decode language
return null
}
}
// LeAudioDevice.kt:263
class LeAudioDevice {
fun configureAudioStreams() {
// TODO: Handle LE Audio GATT characteristics
}
}
```
#### Impact
🟠 **HIGH:**
- **Functionality:** LE Audio features advertised but non-functional
- **User Experience:** Users expect working Bluetooth LE Audio
- **Market Differentiation:** LE Audio is a key feature for wireless translation
- **Future Compatibility:** Android 13+ supports LE Audio natively
#### Affected Features
- Bluetooth LE Audio device connection
- CIS (Connected Isochronous Stream) establishment
- BIS (Broadcast Isochronous Stream) creation
- Auracast broadcast/reception
- Audio sharing with multiple devices
#### Solution
**Option A: Complete Implementation (3-4 weeks)**
```kotlin
// Requires deep Android LE Audio stack expertise
class CISManager(
private val bluetoothManager: BluetoothManager,
private val audioManager: AudioManager
) {
fun establishCIS(device: BluetoothDevice): Result<CISConnection> {
// 1. Discover LE Audio services (GATT)
val audioService = device.getService(UUID_LE_AUDIO_SERVICE)
// 2. Configure audio streams
val ascsCharacteristic = audioService.getCharacteristic(UUID_ASCS)
// 3. Establish CIS via HCI commands
val cisHandle = bluetoothManager.establishCIS(
device = device,
cigId = generateCigId(),
cisId = generateCisId()
)
// 4. Configure audio routing
audioManager.setLeAudioDevice(device, cisHandle)
return Result.success(CISConnection(cisHandle))
}
}
```
**Option B: Mark as Experimental (1 day)**
```kotlin
@ExperimentalFeature
@RequiresApi(Build.VERSION_CODES.TIRAMISU)
class LeAudioManager {
init {
Log.w("LeAudio", "LE Audio support is experimental and may not work on all devices")
}
}
// Show warning in UI:
// "Bluetooth LE Audio is an experimental feature.
//  It requires Android 13+ and compatible hardware."
```
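Option B's `@ExperimentalFeature` marker is not shown in the audit; if it does not already exist in the codebase (an assumption), Kotlin's opt-in mechanism is one way to define it so the compiler forces every call site to acknowledge the experimental status:

```kotlin
// Opt-in marker: callers must annotate usages with @OptIn(ExperimentalFeature::class)
// or propagate @ExperimentalFeature themselves, otherwise the compiler warns.
@RequiresOptIn(
    message = "LE Audio support is experimental and may not work on all devices",
    level = RequiresOptIn.Level.WARNING
)
@Retention(AnnotationRetention.BINARY)
@Target(AnnotationTarget.CLASS, AnnotationTarget.FUNCTION)
annotation class ExperimentalFeature
```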
#### Recommendation
**Mark as Experimental for v1.0, implement for v2.0**
- v1.0: Show experimental warning, basic structure
- v2.0: Full implementation with CIS/BIS/Auracast
#### Timeline
- **Option A:** 3-4 weeks (full implementation)
- **Option B:** 1 day (mark as experimental)
- **Priority:** P1 - HIGH (but not blocker for v1.0)
- **Assigned to:** Bluetooth team
- **Due Date:** 2025-12-15 (experimental) or 2026-02-01 (full)
---
### 🟠 P1-2: FastText JNI Not Implemented (6 TODOs)
**Category:** Missing Integration
**Module:** :smartrag3:language
**File:** `smartrag3/language/src/main/kotlin/.../FastTextDetector.kt`
#### Problem Description
```kotlin
class FastTextDetector {
init {
// TODO: Load fastText JNI library :21
// System.loadLibrary("fasttext-jni")
}
fun detectLanguage(text: String): Language {
// TODO: Use native fastText instead of CLD3 fallback :44
return cld3Detector.detectLanguage(text) // ⚠️ FALLBACK
}
fun detectLanguageWithConfidence(text: String): LanguageDetection {
// TODO: fastText provides better confidence scores :50
return cld3Detector.detectLanguageWithConfidence(text) // ⚠️ FALLBACK
}
private external fun nativeDetectLanguage(text: String): String // :74 (not implemented)
private external fun nativeDetectWithConfidence(text: String): FloatArray // :92 (not implemented)
}
```
#### Impact
🟠 **HIGH:**
- **Accuracy:** CLD3 fallback works but less accurate than fastText
- **Languages:** CLD3 supports fewer languages (107 vs 176)
- **Confidence Scores:** fastText provides better confidence scores
- **Performance:** Native fastText faster than CLD3
#### Current Status
✅ **CLD3 Fallback Working:**
- 107 languages supported
- Good accuracy (85-90%)
- Fast detection (<10ms)
- No external dependencies
⚠️ **fastText JNI Missing:**
- Would support 176 languages
- Better accuracy (90-95%)
- Lower latency (<5ms)
- Requires JNI wrapper
#### Solution
**Option A: Implement fastText JNI (1-2 weeks)**
```cpp
// fasttext-jni/src/main/cpp/fasttext_jni.cpp
#include <jni.h>
#include "fasttext/fasttext.h"
extern "C" JNIEXPORT jstring JNICALL
Java_com_yakkismart_smartrag3_language_FastTextDetector_nativeDetectLanguage(
JNIEnv* env,
jobject /* this */,
jstring text
) {
const char* textChars = env->GetStringUTFChars(text, nullptr);
// Load fastText model
fasttext::FastText ft;
ft.loadModel("models/lid.176.ftz");
// Predict language
std::vector<std::pair<float, std::string>> predictions;
ft.predictLine(textChars, predictions, 1, 0.0);
std::string langCode = predictions[0].second;
env->ReleaseStringUTFChars(text, textChars);
return env->NewStringUTF(langCode.c_str());
}
// CMakeLists.txt
cmake_minimum_required(VERSION 3.18.1)
project("fasttext-jni")
add_library(fasttext-jni SHARED
fasttext_jni.cpp
fasttext/src/fasttext.cc
fasttext/src/model.cc
# ...
)
```
**Option B: Accept CLD3 as Primary Solution (0 weeks)**
```kotlin
// Rename class to clarify it uses CLD3
class Cld3LanguageDetector { // ✅ Clear naming
// CLD3 is Google's production-grade language detector
// Used in Chrome, Gmail, etc.
// Accuracy: 85-90% (good enough for most use cases)
}
// Remove fastText TODOs
// Accept CLD3 as the solution
```
#### Recommendation
**Accept CLD3 for v1.0, optionally implement fastText for v2.0**
Reasons:
- CLD3 is production-ready and works well
- 107 languages cover 99% of use cases
- No JNI complexity
- Smaller binary size
- Faster development
#### Timeline
- **Option A:** 1-2 weeks (fastText JNI)
- **Option B:** 0 weeks (accept CLD3)
- **Priority:** P1 - HIGH (but CLD3 fallback acceptable)
- **Assigned to:** ML team
- **Due Date:** 2026-01-15 (optional for v2.0)
---
### 🟠 P1-3: ML Kit Document Scanner (5 TODOs)
**Category:** Missing Integration
**Module:** :smartrag3:ingestion
**Files:** 2 files
#### Problem Description
```kotlin
// DocumentScanner.kt:105
class DocumentScanner(
private val context: Context
) {
fun scanDocument(onResult: (Uri) -> Unit) {
// TODO: Implement ActivityResult API integration
// Currently not functional
}
// Should be:
// 1. Create GmsDocumentScannerOptions
// 2. Get DocumentScanner instance
// 3. Launch scanner
// 4. Handle result via ActivityResultContract
}
// IngestionWorker.kt:64, 66, 68, 196, 199, 202
class IngestionWorker : CoroutineWorker {
override suspend fun doWork(): Result {
// TODO: Launch ML Kit Document Scanner :64
// TODO: Wait for scan result :66
// TODO: Process scanned image :68
return Result.failure() // ❌ NOT IMPLEMENTED
}
private suspend fun processScannedImage(uri: Uri) {
// TODO: OCR processing :196
// TODO: Text extraction :199
// TODO: Add to RAG :202
}
}
```
#### Impact
🟠 **HIGH:**
- **Functionality:** Document scanning UI not functional
- **User Experience:** Users can't scan documents via camera
- **Data Ingestion:** OCR pipeline incomplete
- **SmartRAG:** Can't add scanned documents to knowledge base
#### Affected Features
- Document scanning (camera)
- OCR text extraction
- SmartRAG document ingestion
- Background processing (WorkManager)
#### Solution
```kotlin
// STEP 1: Add ML Kit dependency
// smartrag3/ingestion/build.gradle.kts
dependencies {
implementation("com.google.android.gms:play-services-mlkit-document-scanner:16.0.0-beta1")
}
// STEP 2: Create ActivityResultContract
class ScanDocumentContract : ActivityResultContract<Unit, Uri?>() {
override fun createIntent(context: Context, input: Unit): Intent {
val options = GmsDocumentScannerOptions.Builder()
.setGalleryImportAllowed(true)
.setPageLimit(10)
.setResultFormats(RESULT_FORMAT_JPEG, RESULT_FORMAT_PDF)
.setScannerMode(SCANNER_MODE_FULL)
.build()
val scanner = GmsDocumentScanning.getClient(options)
return scanner.getStartScanIntent(context)
}
override fun parseResult(resultCode: Int, intent: Intent?): Uri? {
if (resultCode == Activity.RESULT_OK && intent != null) {
val result = GmsDocumentScanningResult.fromActivityResultIntent(intent)
return result?.pdf?.uri ?: result?.pages?.get(0)?.imageUri
}
return null
}
}
// STEP 3: Use in Activity/Fragment
class RAGKnowledgeScreen : ComponentActivity() {
private val scanDocumentLauncher = registerForActivityResult(
ScanDocumentContract()
) { uri ->
if (uri != null) {
// Enqueue IngestionWorker
val workRequest = OneTimeWorkRequestBuilder<IngestionWorker>()
.setInputData(workDataOf("document_uri" to uri.toString()))
.build()
WorkManager.getInstance(this).enqueue(workRequest)
}
}
// Launch scanner
fun onScanButtonClick() {
scanDocumentLauncher.launch(Unit)
}
}
// STEP 4: Implement IngestionWorker
class IngestionWorker(
context: Context,
params: WorkerParameters
) : CoroutineWorker(context, params) {
override suspend fun doWork(): Result {
val uriString = inputData.getString("document_uri") ?: return Result.failure()
val uri = Uri.parse(uriString)
// 1. Extract text (OCR if needed)
val text = extractText(uri)
// 2. Add to RAG
val ragRepository = get<RAGRepository>()
ragRepository.addDocument(
Document(
title = "Scanned Document ${System.currentTimeMillis()}",
content = text,
source = "camera_scan",
timestamp = System.currentTimeMillis()
)
)
return Result.success()
}
private suspend fun extractText(uri: Uri): String {
// If PDF, use PDFBox
// If image, use ML Kit Text Recognition
val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
val image = InputImage.fromFilePath(applicationContext, uri)
val result = recognizer.process(image).await()
return result.text
}
}
```
#### Timeline
- **Fix Time:** 3-5 days
- Day 1: ML Kit integration
- Day 2: ActivityResult API
- Day 3: Worker implementation
- Day 4-5: Testing + bug fixes
- **Priority:** P1 - HIGH (key feature for SmartRAG)
- **Assigned to:** Android team
- **Due Date:** 2025-12-15
---
### 🟠 P1-4: Yakki Mail UI Missing (Complete UI Layer)
**Category:** Missing Feature
**Module:** :yakki-mail:ui (not started)
**Current Status:** Backend only (35% readiness)
#### Problem Description
**Implemented (Backend - 35%):**
- ✅ Email, Account, Folder models (ObjectBox)
- ✅ ImapClient structure (Jakarta Mail 2.0.1)
- ✅ SmtpClient structure
- ✅ MailCredentialManager (biometric auth)
- ✅ AccountKeystore (secure storage)
- ✅ Integration with smartrag3:security
**Missing (UI - 0%):**
- ❌ Email list screen (inbox, sent, drafts)
- ❌ Email compose screen
- ❌ Email detail screen (with HTML rendering)
- ❌ Account setup screen (OAuth2 flow)
- ❌ Settings screen
- ❌ Search functionality
- ❌ Push notifications (IMAP IDLE)
- ❌ Attachment handling
- ❌ Room cache layer
- ❌ ViewModels (MVI pattern)
#### Impact
🟠 **HIGH:**
- **Functionality:** Email client completely non-functional (backend only)
- **Market Fit:** Email scenario is key differentiator
- **SmartRAG Integration:** Can't index email messages
- **User Value:** Email translation is high-value feature
#### Affected Features
- Email reading/sending
- Email translation
- SmartRAG email indexing
- Persona linkage (email contacts)
#### Solution (Development Plan - Phases 2-4)
**Phase 2: UI Layer (4 weeks)**
```kotlin
// Week 1-2: Core screens
@Composable
fun EmailListScreen(
viewModel: EmailListViewModel,
onEmailClick: (Email) -> Unit
) {
val uiState by viewModel.uiState.collectAsState()
LazyColumn {
items(uiState.emails) { email ->
EmailCard(
email = email,
onClick = { onEmailClick(email) }
)
}
}
}
@Composable
fun EmailDetailScreen(
viewModel: EmailDetailViewModel
) {
val email by viewModel.currentEmail.collectAsState()
Column {
EmailHeader(email)
EmailBody(email.htmlBody) // WebView for HTML
AttachmentList(email.attachments)
Row {
Button(onClick = { viewModel.reply() }) { Text("Reply") }
Button(onClick = { viewModel.forward() }) { Text("Forward") }
Button(onClick = { viewModel.delete() }) { Text("Delete") }
}
}
}
@Composable
fun EmailComposeScreen(
    viewModel: ComposeViewModel
) {
    var to by remember { mutableStateOf("") }
    var subject by remember { mutableStateOf("") }
    var body by remember { mutableStateOf("") }
    Column {
        TextField(label = { Text("To") }, value = to, onValueChange = { to = it })
        TextField(label = { Text("Subject") }, value = subject, onValueChange = { subject = it })
        TextField(
            label = { Text("Message") },
            value = body,
            onValueChange = { body = it },
            modifier = Modifier.height(300.dp)
        )
        Row {
            Button(onClick = { viewModel.send() }) { Text("Send") }
            Button(onClick = { viewModel.saveDraft() }) { Text("Save Draft") }
        }
    }
}
// Week 3-4: Advanced features
@Composable
fun AccountSetupScreen(
    viewModel: AccountSetupViewModel
) {
    val uiState by viewModel.uiState.collectAsState()
    // OAuth2 flow for Gmail, Outlook
    // Manual IMAP/SMTP setup for others
    when (uiState.provider) {
        EmailProvider.GMAIL -> GoogleOAuthButton()
        EmailProvider.OUTLOOK -> MicrosoftOAuthButton()
        EmailProvider.CUSTOM -> ImapSmtpManualSetup()
    }
}
```
**Phase 3: Room Cache (2 weeks)**
```kotlin
// Week 5-6: Offline support
@Database(
entities = [
EmailEntity::class,
AccountEntity::class,
FolderEntity::class
],
version = 1
)
abstract class EmailDatabase : RoomDatabase() {
abstract fun emailDao(): EmailDao
abstract fun accountDao(): AccountDao
}
@Dao
interface EmailDao {
@Query("SELECT * FROM emails WHERE folderId = :folderId ORDER BY timestamp DESC")
fun getEmailsByFolder(folderId: Long): Flow<List<EmailEntity>>
@Insert(onConflict = OnConflictStrategy.REPLACE)
suspend fun insertEmails(emails: List<EmailEntity>)
}
// Sync strategy:
// 1. Fetch from Room cache (fast)
// 2. Sync with IMAP in background
// 3. Update cache
```
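The three-step sync strategy in the comment above reduces to a small repository shape. The sketch below uses in-memory stand-ins (no Room, no coroutines) so the flow is visible at a glance; class and member names are illustrative, not the project's API:

```kotlin
// Cache-first repository: serve cached data immediately, then refresh from remote.
class EmailCacheRepository(
    private val fetchRemote: (folderId: Long) -> List<String> // stands in for the IMAP fetch
) {
    private val cache = mutableMapOf<Long, List<String>>()

    // Step 1: fast path — whatever the cache currently holds.
    fun getCached(folderId: Long): List<String> = cache[folderId].orEmpty()

    // Steps 2 + 3: sync with the remote source and update the cache.
    fun sync(folderId: Long): List<String> {
        val fresh = fetchRemote(folderId)
        cache[folderId] = fresh
        return fresh
    }
}
```

In the real implementation the cached read would be a Room `Flow` and `sync` a background IMAP fetch, but the ordering — cache first, network second, cache updated last — is the same.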
**Phase 4: Push Notifications (2 weeks)**
```kotlin
// Week 7-8: IMAP IDLE for push notifications
class ImapIdleService : Service() {
override fun onStartCommand(intent: Intent?, flags: Int, startId: Int): Int {
CoroutineScope(Dispatchers.IO).launch {
val imapClient = get<ImapClient>()
// IMAP IDLE command
imapClient.idle { newMessages ->
// Show notification
showNewEmailNotification(newMessages)
// Sync with cache
syncNewMessages(newMessages)
}
}
return START_STICKY
}
}
```
#### Timeline
- **Total Fix Time:** 6-9 weeks
- Phase 2: UI Layer (4 weeks)
- Phase 3: Room Cache (2 weeks)
- Phase 4: Push Notifications (2 weeks)
- Buffer: 1 week (testing, bug fixes)
- **Priority:** P1 - HIGH (key scenario)
- **Blocker for:** v2.0 release
- **Assigned to:** UI team + Backend team
- **Due Date:** 2026-02-15
---
## Medium Priority (P2)
### 🟡 P2-1: Gemini SDK Function Calling Mocks (25 TODOs)
**Category:** Waiting for SDK Update
**Module:** :conductor
**Files:** 4 files
#### Problem Description
```kotlin
// UIConductorOrchestratorV2.kt
class UIConductorOrchestratorV2 {
// TODO: Use actual Gemini SDK FunctionCallPart when SDK 1.0.0+ released
data class MockFunctionCallPart(
val name: String,
val args: Map<String, Any>
)
suspend fun executeCommand(command: String): DomainResult<CommandResult> {
// TODO: Replace with actual SDK function calling
val mockResponse = geminiModel.generateContent(command)
// Mock parsing (should be SDK types)
val functionCall = parseMockFunctionCall(mockResponse.text)
// TODO: Delegate to UIActionExecutor :206, 294
return executeUIAction(functionCall)
}
}
// ToolDefinition.kt
data class ToolDefinition(
val name: String,
val description: String,
val parameters: List<Parameter>
) {
// TODO: Convert to actual SDK FunctionDeclaration when available :368
// TODO: Support typed arrays :373
// TODO: Support nested objects
fun toMockSdkType(): String {
// Temporary JSON representation
// Should be: FunctionDeclaration from Gemini SDK
return """
{
"name": "$name",
"description": "$description",
"parameters": ${parameters.toJson()}
}
"""
}
}
// ToolRegistry.kt
object ToolRegistry {
private val tools = mutableMapOf<String, ToolDefinition>()
// TODO: Use SDK FunctionRegistry when available
fun register(tool: ToolDefinition) {
tools[tool.name] = tool
}
// TODO: Convert to SDK types
fun getAllTools(): List<String> {
return tools.values.map { it.toMockSdkType() }
}
}
// TemporaryPipelineBuilder.kt:66
const val PIPELINE_SERVER_URL = "http://10.0.2.2:3000/pipeline"
// TODO: Replace with actual deployed server URL
```
#### Impact
🟡 **MEDIUM:**
- **Functionality:** Voice commands work via mock implementation
- **Type Safety:** Using strings instead of SDK types
- **Maintenance:** Mock code needs to be replaced when SDK updates
- **Future Risk:** Breaking changes when SDK 1.0.0+ releases
#### Current Status
✅ **Mock Implementation Working:**
- Voice commands functional
- Tool registry operational
- UI actions executing
- Acceptable for v1.0
⚠️ **Waiting for Gemini SDK 1.0.0+:**
- Function calling API not yet stable
- Current SDK: 0.9.0 (experimental)
- Expected SDK: 1.0.0+ (Q1 2026)
#### Solution
**Wait for SDK Update, then migrate:**
```kotlin
// AFTER Gemini SDK 1.0.0+ RELEASE
// STEP 1: Update dependency
// build.gradle.kts
dependencies {
implementation("com.google.ai.client.generativeai:generativeai:1.0.0") // NEW
}
// STEP 2: Replace mock types with SDK types
import com.google.ai.client.generativeai.type.FunctionDeclaration
import com.google.ai.client.generativeai.type.FunctionCallPart
import com.google.ai.client.generativeai.type.Schema
class UIConductorOrchestratorV2 {
suspend fun executeCommand(command: String): DomainResult<CommandResult> {
// ✅ Use actual SDK function calling
val response = geminiModel.generateContent(
content = content(command),
tools = listOf(
Tool(functions = toolRegistry.getAllFunctions()) // SDK type
)
)
// ✅ SDK provides typed function call
val functionCall = response.functionCalls.firstOrNull()
if (functionCall != null) {
return executeUIAction(functionCall) // ✅ Type-safe
}
return DomainResult.Success(CommandResult(response.text))
}
}
// STEP 3: Replace ToolDefinition with FunctionDeclaration
fun createSwitchLanguageTool(): FunctionDeclaration {
return FunctionDeclaration(
name = "switch_language",
description = "Switch the target language for translation",
parameters = Schema(
type = "object",
properties = mapOf(
"language" to Schema(
type = "string",
enum = listOf("english", "russian", "chinese", "spanish")
)
),
required = listOf("language")
)
)
}
// STEP 4: Remove all mock code
// Delete: MockFunctionCallPart, toMockSdkType(), parseMockFunctionCall()
```
#### Timeline
- **Fix Time:** 1 week (after SDK release)
- **Priority:** P2 - MEDIUM (mock works fine for now)
- **Blocker for:** Nothing (mock acceptable)
- **Assigned to:** Conductor team
- **Due Date:** Q1 2026 (when SDK 1.0.0+ releases)
#### Tracking Issue
Monitor Gemini SDK releases: https://github.com/google/generative-ai-android/releases
---
### 🟡 P2-2: LLMOrchestrator Partial Implementation (4 TODOs)
**Category:** Incomplete Feature
**Module:** :llmorchestrator
**File:** `llmorchestrator/src/main/kotlin/.../LLMOrchestrator.kt`
#### Problem Description
```kotlin
class LLMOrchestrator {
// TODO: Add SmartRAG integration :24
private val ragService: SmartRAGService? = null // ❌ Not integrated
suspend fun processRequest(
request: String,
context: ConversationContext
): DomainResult<Response> {
// TODO: Implement RAG search :99
val relevantContext = ragService?.search(request) // ❌ Not implemented
// TODO: Implement RAG search for multi-turn :157
val conversationHistory = ragService?.getHistory(context.conversationId) // ❌ Not implemented
// TODO: Implement translation worker :115
val translatedRequest = translateIfNeeded(request) // ❌ Not implemented
// Current: Direct LLM call (no RAG)
        return DomainResult.Success(Response(llmClient.chat(request)))
}
}
```
#### Impact
🟡 **MEDIUM:**
- **Functionality:** Basic LLM orchestration works
- **RAG Integration:** Not connected to SmartRAG v3
- **Translation:** No automatic translation
- **Context:** No conversation history
#### Current Status
✅ **Basic Orchestration Working:**
- Direct LLM calls functional
- Agent configuration working
- Acceptable for v1.0 (Translator scenario doesn't use this module)
❌ **Advanced Features Missing:**
- SmartRAG integration
- Translation worker
- Conversation history
- Multi-turn conversations
#### Solution
**Option A: Complete Implementation (2-3 weeks)**
```kotlin
class LLMOrchestrator(
private val llmClient: LLMClient,
private val ragRepository: RAGRepository, // ✅ SmartRAG v3
private val translationService: TranslationService
) {
suspend fun processRequest(
request: String,
context: ConversationContext
): DomainResult<Response> {
// 1. Translate request if needed
val englishRequest = if (context.language != Language.ENGLISH) {
when (val result = translationService.translateSafe(request, context.language, Language.ENGLISH)) {
is DomainResult.Success -> result.data
is DomainResult.Error -> request // Fallback to original
}
} else {
request
}
// 2. RAG search for relevant context
val ragResults = ragRepository.search(
query = englishRequest,
limit = 5
)
val relevantContext = ragResults.joinToString("\n") { it.content }
// 3. Get conversation history
val history = ragRepository.getConversationHistory(
conversationId = context.conversationId,
limit = 10
)
// 4. Build prompt with context
val enrichedPrompt = """
Context from your knowledge base:
$relevantContext
Conversation history:
${history.joinToString("\n")}
User question: $englishRequest
""".trimIndent()
// 5. LLM call
val response = llmClient.chat(enrichedPrompt)
// 6. Translate response back if needed
val finalResponse = if (context.language != Language.ENGLISH) {
when (val result = translationService.translateSafe(response, Language.ENGLISH, context.language)) {
is DomainResult.Success -> result.data
is DomainResult.Error -> response // Fallback
}
} else {
response
}
// 7. Save to conversation history
ragRepository.addConversationTurn(
conversationId = context.conversationId,
userMessage = request,
assistantMessage = finalResponse
)
return DomainResult.Success(Response(finalResponse))
}
}
```
**Option B: Deprecate Module (1 day)**
```kotlin
// If not needed for v1.0, deprecate and remove
// settings.gradle.kts
// include(":llmorchestrator") // DEPRECATED - not used in v1.0
```
#### Recommendation
**Deprecate for v1.0, implement for v2.0 if needed**
Reasons:
- Translator scenario doesn't use this module
- Chat with RAG can integrate directly with SmartRAG
- SimpleConductor handles voice commands
- Avoid complexity for v1.0
#### Timeline
- **Option A:** 2-3 weeks (complete implementation)
- **Option B:** 1 day (deprecate)
- **Priority:** P2 - MEDIUM (not needed for v1.0)
- **Assigned to:** Architecture team (decision needed)
- **Due Date:** 2025-12-07 (decision) or 2026-02-01 (implementation)
---
### 🟡 P2-3: Embeddings Disabled in DocumentProcessor (7 TODOs)
**Category:** Feature Disabled
**Module:** :smartrag3:language
**File:** `smartrag3/language/src/main/kotlin/.../DocumentProcessor.kt`
#### Problem Description
```kotlin
class DocumentProcessor(
// TODO: Re-enable embeddings :14
// private val embeddingProvider: EmbeddingGemmaProvider, // ❌ COMMENTED OUT
) {
suspend fun processDocument(text: String): ProcessedDocument {
// 1. Detect language
val language = languageDetector.detectLanguage(text)
// 2. Chunk text
val chunks = chunkText(text, language)
// 3. Generate embeddings
val chunksWithEmbeddings = chunks.map { chunk ->
ChunkWithEmbedding(
text = chunk,
// TODO: Re-enable after ONNX implementation :25, 37, 73, 135, 156
embedding = null // ❌ EMBEDDINGS DISABLED
// embedding = embeddingProvider.generateEmbeddings(chunk) // Should be
)
}
return ProcessedDocument(chunks = chunksWithEmbeddings)
}
}
```
#### Impact
🟡 **MEDIUM:**
- **Functionality:** Text processing works, but no semantic search
- **Vector Search:** Returns random/no results (null embeddings)
- **SmartRAG Quality:** Knowledge retrieval non-functional
- **Dependency:** Blocked by P0-4 (ONNX stub)
#### Current Status
✅ **Text Processing Working:**
- Language detection (CLD3)
- Text chunking
- Document parsing
❌ **Embeddings Disabled:**
- Waiting for ONNX integration
- Null embeddings stored
- Vector search not functional
#### Solution
**Re-enable after P0-4 (ONNX integration) complete:**
```kotlin
// AFTER P0-4 is fixed:
class DocumentProcessor(
private val embeddingProvider: EmbeddingGemmaProvider // ✅ UNCOMMENT
) {
suspend fun processDocument(text: String): ProcessedDocument {
val language = languageDetector.detectLanguage(text)
val chunks = chunkText(text, language)
val chunksWithEmbeddings = chunks.map { chunk ->
ChunkWithEmbedding(
text = chunk,
embedding = embeddingProvider.generateEmbeddings(chunk) // ✅ ENABLE
)
}
return ProcessedDocument(chunks = chunksWithEmbeddings)
}
}
// DI module
val smartRAGLanguageModule = module {
single { EmbeddingGemmaProvider(androidContext()) } // ✅ PROVIDE
single { DocumentProcessor(get(), get()) } // ✅ INJECT
}
```
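Once embeddings are populated, retrieval reduces to nearest-neighbour ranking by cosine similarity. A plain-Kotlin sketch (hypothetical names loosely mirroring `ChunkWithEmbedding`) that also shows why today's null embeddings yield empty search results:

```kotlin
import kotlin.math.sqrt

// Hypothetical chunk shape; a null embedding (today's state) simply
// cannot participate in ranking.
data class Chunk(val text: String, val embedding: FloatArray?)

fun cosine(a: FloatArray, b: FloatArray): Float {
    var dot = 0f; var na = 0f; var nb = 0f
    for (i in a.indices) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i] }
    return if (na == 0f || nb == 0f) 0f else dot / (sqrt(na) * sqrt(nb))
}

// Rank chunks against a query vector. Chunks with null embeddings are
// dropped, which is exactly why search is non-functional while P0-4 is open.
fun topK(query: FloatArray, chunks: List<Chunk>, k: Int): List<Chunk> =
    chunks.filter { it.embedding != null }
        .sortedByDescending { cosine(query, it.embedding!!) }
        .take(k)
```

With every embedding null, `topK` returns an empty list no matter what is stored — the failure mode described under Impact.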
#### Timeline
- **Fix Time:** 1 day (after P0-4)
- **Priority:** P2 - MEDIUM (blocked by P0-4)
- **Blocker for:** Semantic search, Chat with RAG
- **Assigned to:** SmartRAG team
- **Due Date:** 2025-12-22 (after P0-4)
---
## Low Priority (P3)
### 🟢 P3-1: Hardcoded Values (5 occurrences)
**Category:** Configuration
**Module:** :app
**Files:** 1 file (TranslatorViewModel.kt)
#### Problem Description
```kotlin
// TranslatorViewModel.kt
class TranslatorViewModel @Inject constructor(/* ... */) : ViewModel() {
// Hardcoded language pair :335-341
private val sourceLanguage = Language.RUSSIAN // ❌ HARDCODED
private val targetLanguage = Language.ENGLISH // ❌ HARDCODED
// Hardcoded TTS speed :80
private val ttsSpeed = 1.5f // ❌ HARDCODED
// Hardcoded quality threshold :208
private val qualityThreshold = 0.6f // ❌ HARDCODED
// Hardcoded retry parameters
private val maxRetries = 3
private val retryDelay = 1000L
}
```
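The retry constants are the simplest to extract first, since they typically feed a single loop. A minimal sketch — hypothetical helper, plain Kotlin with no coroutine dependency — of how `maxRetries` and `retryDelay` become parameters instead of hardcoded fields:

```kotlin
// Hypothetical retry helper: maxRetries and retryDelayMs become injectable
// configuration (e.g. from RemoteConfig) instead of fields on the ViewModel.
fun <T> retry(
    maxRetries: Int = 3,
    retryDelayMs: Long = 1000L,
    block: () -> T
): T {
    var lastError: Exception? = null
    repeat(maxRetries) { attempt ->
        try {
            return block() // non-local return: repeat is inline
        } catch (e: Exception) {
            lastError = e
            if (attempt < maxRetries - 1) Thread.sleep(retryDelayMs)
        }
    }
    throw lastError ?: IllegalStateException("maxRetries must be > 0")
}
```

In the real ViewModel this would likely be a `suspend` function using `delay()`, but the parameterisation is the same.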
#### Impact
🟢 **LOW:**
- **Functionality:** Everything works
- **User Experience:** Users can't change settings (yet)
- **Testing:** Hard to test different configurations
- **Flexibility:** Can't A/B test thresholds
#### Solution
**Move to RemoteConfig:**
```xml
<!-- STEP 1: Create RemoteConfig defaults -->
<!-- app/src/main/res/xml/remote_config_defaults.xml -->
<defaultsMap>
    <entry>
        <key>source_language</key>
        <value>ru</value>
    </entry>
    <entry>
        <key>target_language</key>
        <value>en</value>
    </entry>
    <entry>
        <key>tts_speed</key>
        <value>1.5</value>
    </entry>
    <entry>
        <key>quality_threshold</key>
        <value>0.6</value>
    </entry>
</defaultsMap>
```

```kotlin
// STEP 2: Inject RemoteConfigService
class TranslatorViewModel @Inject constructor(
private val remoteConfig: RemoteConfigService,
// ...
) : ViewModel() {
private val sourceLanguage = Language.fromCode(
remoteConfig.getString("source_language")
)
private val targetLanguage = Language.fromCode(
remoteConfig.getString("target_language")
)
private val ttsSpeed = remoteConfig.getFloat("tts_speed")
private val qualityThreshold = remoteConfig.getFloat("quality_threshold")
}
// STEP 3: Add UI for changing settings
@Composable
fun TranslatorSettingsScreen() {
var sourceLanguage by remember { mutableStateOf(Language.RUSSIAN) }
var targetLanguage by remember { mutableStateOf(Language.ENGLISH) }
var ttsSpeed by remember { mutableStateOf(1.5f) }
Column {
LanguageSelector(
label = "Source Language",
selectedLanguage = sourceLanguage,
onLanguageSelected = { sourceLanguage = it }
)
LanguageSelector(
label = "Target Language",
selectedLanguage = targetLanguage,
onLanguageSelected = { targetLanguage = it }
)
Slider(
value = ttsSpeed,
onValueChange = { ttsSpeed = it },
valueRange = 0.5f..2.0f,
steps = 15
)
}
}
```
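`Language.fromCode` used in STEP 2 is assumed rather than shown. A minimal version with a safe fallback, so a malformed RemoteConfig value cannot crash the ViewModel (enum values hypothetical):

```kotlin
// Hypothetical Language enum; fromCode falls back to ENGLISH rather than
// throwing on an unexpected RemoteConfig string.
enum class Language(val code: String) {
    RUSSIAN("ru"), ENGLISH("en"), CHINESE("zh"), SPANISH("es");

    companion object {
        fun fromCode(code: String): Language =
            values().firstOrNull { it.code == code.lowercase() } ?: ENGLISH
    }
}
```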
#### Timeline
- **Fix Time:** 2-3 days
- Day 1: RemoteConfig integration
- Day 2: Settings UI
- Day 3: Testing
- **Priority:** P3 - LOW (works fine for beta)
- **Nice to have for:** v1.0
- **Assigned to:** Settings team
- **Due Date:** 2026-01-15
---
### 🟢 P3-2: Settings Screen Placeholders (3 empty implementations)
**Category:** UI Incomplete
**Module:** :app
**File:** `app/src/main/java/com/yakkismart/ui/settings/SettingsScreen.kt`
#### Problem Description
```kotlin
@Composable
fun SettingsScreen() {
Column {
// Placeholder settings :95-117
SettingItem(
title = "Language",
onClick = { /* TODO: Implement */ } // ❌ EMPTY
)
SettingItem(
title = "Notifications",
onClick = { /* TODO: Implement */ } // ❌ EMPTY
)
SettingItem(
title = "Theme",
onClick = { /* TODO: Implement */ } // ❌ EMPTY
)
// More placeholders...
}
}
```
#### Impact
🟢 **LOW:**
- **Functionality:** Settings screen shows but doesn't do anything
- **User Experience:** Confusing (buttons don't work)
- **Polish:** Feels unfinished
#### Solution
```kotlin
// Implement functional settings
@Composable
fun SettingsScreen(
viewModel: SettingsViewModel
) {
val uiState by viewModel.uiState.collectAsState()
Column {
// Language setting
SettingItem(
title = "App Language",
subtitle = uiState.currentLanguage.displayName,
onClick = { viewModel.showLanguageDialog() } // ✅ FUNCTIONAL
)
// Notifications
SettingSwitchItem(
title = "Notifications",
checked = uiState.notificationsEnabled,
onCheckedChange = { viewModel.toggleNotifications(it) } // ✅ FUNCTIONAL
)
// Theme
SettingItem(
title = "Theme",
subtitle = uiState.theme.displayName,
onClick = { viewModel.showThemeDialog() } // ✅ FUNCTIONAL
)
// TTS Speed
SettingSliderItem(
title = "TTS Speed",
value = uiState.ttsSpeed,
valueRange = 0.5f..2.0f,
onValueChange = { viewModel.setTtsSpeed(it) } // ✅ FUNCTIONAL
)
// Quality Threshold
SettingSliderItem(
title = "Quality Threshold",
value = uiState.qualityThreshold,
valueRange = 0.0f..1.0f,
onValueChange = { viewModel.setQualityThreshold(it) } // ✅ FUNCTIONAL
)
// About
SettingItem(
title = "About",
onClick = { viewModel.navigateToAbout() } // ✅ FUNCTIONAL
)
}
}
class SettingsViewModel @Inject constructor(
private val settingsRepository: SettingsRepository
) : ViewModel() {
private val _uiState = MutableStateFlow(SettingsUiState())
val uiState: StateFlow<SettingsUiState> = _uiState.asStateFlow()
fun toggleNotifications(enabled: Boolean) {
viewModelScope.launch {
settingsRepository.setNotificationsEnabled(enabled)
_uiState.update { it.copy(notificationsEnabled = enabled) }
}
}
fun setTtsSpeed(speed: Float) {
viewModelScope.launch {
settingsRepository.setTtsSpeed(speed)
_uiState.update { it.copy(ttsSpeed = speed) }
}
}
// ... other methods
}
```
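The `SettingsRepository` the ViewModel depends on can start as a thin interface, keeping the SharedPreferences implementation swappable in unit tests. A minimal sketch with hypothetical method names and an in-memory fake (the production methods would likely be `suspend` and wrap SharedPreferences or DataStore):

```kotlin
// Hypothetical repository contract; the in-memory fake backs unit tests.
interface SettingsRepository {
    fun setNotificationsEnabled(enabled: Boolean)
    fun isNotificationsEnabled(): Boolean
    fun setTtsSpeed(speed: Float)
    fun getTtsSpeed(): Float
}

class InMemorySettingsRepository : SettingsRepository {
    private val store = mutableMapOf<String, Any>()
    override fun setNotificationsEnabled(enabled: Boolean) { store["notifications"] = enabled }
    override fun isNotificationsEnabled() = store["notifications"] as? Boolean ?: true
    override fun setTtsSpeed(speed: Float) { store["tts_speed"] = speed }
    override fun getTtsSpeed() = store["tts_speed"] as? Float ?: 1.5f
}
```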
#### Timeline
- **Fix Time:** 1 week
- Day 1-2: SettingsViewModel
- Day 3-4: SettingsRepository (SharedPreferences)
- Day 5: UI implementation
- Day 6-7: Testing
- **Priority:** P3 - LOW (not blocker)
- **Nice to have for:** v1.0
- **Assigned to:** UI team
- **Due Date:** 2026-01-15
---
### 🟢 P3-3: Missing Notification Icons (4 occurrences)
**Category:** UI Polish
**Module:** :smartrag3:ingestion
**File:** `smartrag3/ingestion/.../WorkerNotificationHelper.kt`
#### Problem Description
```kotlin
class WorkerNotificationHelper {
fun createUploadNotification(): Notification {
return NotificationCompat.Builder(context, CHANNEL_ID)
.setSmallIcon(R.drawable.ic_upload) // :115 ❌ MISSING ICON
.setContentTitle("Uploading document")
.build()
}
fun createProcessingNotification(): Notification {
return NotificationCompat.Builder(context, CHANNEL_ID)
.setSmallIcon(R.drawable.ic_processing) // :176 ❌ MISSING ICON
.setContentTitle("Processing document")
.build()
}
fun createSuccessNotification(): Notification {
return NotificationCompat.Builder(context, CHANNEL_ID)
.setSmallIcon(R.drawable.ic_success) // :205 ❌ MISSING ICON
.setContentTitle("Document added")
.build()
}
fun createErrorNotification(): Notification {
return NotificationCompat.Builder(context, CHANNEL_ID)
.setSmallIcon(R.drawable.ic_error) // :235 ❌ MISSING ICON
.setContentTitle("Upload failed")
.build()
}
}
```
#### Impact
🟢 **LOW:**
- **Functionality:** Notifications work (using placeholder icon)
- **User Experience:** Default Android icon shown (not ideal)
- **Polish:** Looks unfinished
#### Solution
```bash
# Create notification icons (Material Design guidelines)
# STEP 1: Create icons in Android Studio
# Right-click res → New → Image Asset → Notification Icons
# Create 4 icons:
# - ic_upload.xml (upload arrow)
# - ic_processing.xml (spinning gear)
# - ic_success.xml (checkmark)
# - ic_error.xml (error circle)
# STEP 2: Place in drawable/
app/src/main/res/
├── drawable/
│ ├── ic_upload.xml
│ ├── ic_processing.xml
│ ├── ic_success.xml
│ └── ic_error.xml
```
**Example icon (ic_upload.xml):**
```xml
<vector xmlns:android="http://schemas.android.com/apk/res/android"
android:width="24dp"
android:height="24dp"
android:viewportWidth="24"
android:viewportHeight="24"
android:tint="?attr/colorControlNormal">
<path
android:fillColor="@android:color/white"
android:pathData="M9,16h6v-6h4l-7,-7 -7,7h4zm-4,2h14v2H5z"/>
</vector>
```
#### Timeline
- **Fix Time:** 1 hour
- Create 4 icons in Android Studio
- Test on device
- **Priority:** P3 - LOW (cosmetic)
- **Nice to have for:** v1.0
- **Assigned to:** UI team
- **Due Date:** 2026-01-15
---
## Architectural Concerns
### 🟡 Hybrid DI (Hilt + Koin)
**Status:** ACCEPTED - Intentional Design Decision
**Severity:** MEDIUM (increased complexity)
#### Description
```kotlin
// App module: Hilt
@HiltAndroidApp
class YakkiApplication : Application()
// Library modules (SmartRAG): Koin
val smartRAGModule = module {
single { RAGRepository(get(), get()) }
}
```
#### Rationale
**Why Hybrid DI:**
- ✅ Library modules should not impose Hilt on consumers
- ✅ Koin is lightweight for libraries
- ✅ Hilt provides compile-time safety for :app
- ✅ Common pattern in Android ecosystem
**Trade-offs:**
- ⚠️ Two DI systems to maintain
- ⚠️ Increased complexity
- ⚠️ Learning curve for new developers
#### Recommendation
**ACCEPT - Document clearly**
- ✅ Maintain separation
- ✅ Document in architecture guide
- ✅ Provide clear examples
- ❌ Do NOT try to unify (more problems than benefits)
---
### 🟢 Type Safety Violations
**Status:** LOW - Minor Issues
**Severity:** LOW
#### Description
```kotlin
// Some hardcoded strings despite UiString system
SettingItem(
title = "Language", // ❌ Should be UiString.Settings.Language
onClick = { /* ... */ }
)
```
#### Recommendation
**Audit all UI text, migrate to UiString:**
```bash
# Find hardcoded strings
grep -r "Text(\"" app/src/main/java/com/yakkismart/ui/
```

```kotlin
// Replace with UiString
Text(localize(UiString.Settings.Language)) // ✅
```
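For readers unfamiliar with the pattern, a type-safe string hierarchy like `UiString` usually looks roughly like this — a sketch with hypothetical keys, not the project's actual definition:

```kotlin
// Hypothetical sketch of a type-safe UI string hierarchy; the real UiString
// may differ. Each leaf carries a resource key, so literals like "Language"
// become compile-checked references instead of raw strings.
sealed interface UiString {
    val key: String

    sealed interface Settings : UiString {
        object Language : Settings { override val key = "settings_language" }
        object Theme : Settings { override val key = "settings_theme" }
    }
}

// Stand-in for the real localize(); here it resolves from a plain map and
// falls back to the key itself when a translation is missing.
fun localize(s: UiString, table: Map<String, String>): String =
    table[s.key] ?: s.key
```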
---
### 🟢 Legacy Code Patterns
**Status:** LOW - Being Phased Out
**Severity:** LOW
#### Description
Old error handling (throw Exception) alongside new DomainError system.
#### Recommendation
Complete migration via P0-2 (Deprecated APIs).
---
## Technical Debt Repayment Plan
### Sprint 1: Critical Security (Week 1)
**Goal:** Fix P0-1 and P0-2 (security + deprecated APIs)
```
Task 1: Complete security hardening
├─ Duration: 1 day
├─ Owner: Security team
└─ Deliverable: BuildConfig migration complete
Task 2: Migrate deprecated APIs
├─ Duration: 2 days
├─ Owner: Dev team
└─ Deliverable: All usages migrated to Safe variants
Task 3: Device testing
├─ Duration: 3 days
├─ Owner: QA team
└─ Deliverable: Translator scenario validated on devices
Task 4: Deprecate smartrag-v2
├─ Duration: 1 day
├─ Owner: SmartRAG team
└─ Deliverable: Module removed from build
```
**Week 1 Outcome:** Ready for beta testing ✅
---
### Sprint 2-4: High Priority (Weeks 2-4)
**Goal:** ML Kit Scanner + Settings + Test Coverage
```
Task 1: ML Kit Scanner integration
├─ Duration: 3-5 days
├─ Owner: Android team
└─ Deliverable: Document scanning functional
Task 2: Settings screen implementation
├─ Duration: 5-7 days
├─ Owner: UI team
└─ Deliverable: Functional settings with persistence
Task 3: Hardcoded values → RemoteConfig
├─ Duration: 2-3 days
├─ Owner: Config team
└─ Deliverable: Dynamic configuration system
Task 4: Test coverage improvement
├─ Duration: 7 days
├─ Owner: QA team
└─ Deliverable: App coverage 40% → 60%
```
**Week 4 Outcome:** Production-ready core features ✅
---
### Sprint 5-7: ONNX Embeddings (Weeks 5-7)
**Goal:** Fix P0-4 (semantic search)
```
Task 1: ONNX Runtime integration
├─ Duration: 5 days
├─ Owner: ML team
└─ Deliverable: ONNX Runtime 1.19.2 functional
Task 2: Tokenizer implementation
├─ Duration: 5 days
├─ Owner: ML team
└─ Deliverable: Gemma SentencePiece tokenizer working
Task 3: Model optimization
├─ Duration: 5 days
├─ Owner: ML team
└─ Deliverable: Inference <100ms on mid-range devices
Task 4: Re-enable embeddings
├─ Duration: 1 day
├─ Owner: SmartRAG team
└─ Deliverable: DocumentProcessor generating embeddings
Task 5: Benchmarking + testing
├─ Duration: 5 days
├─ Owner: QA team
└─ Deliverable: Performance metrics validated
```
**Week 7 Outcome:** Semantic search functional ✅
---
### v2.0 Roadmap (Weeks 8-20)
**Major Features:**
```
Bluetooth LE Audio
├─ Duration: 3-4 weeks
├─ Priority: P1 (HIGH)
└─ Status: Optional for v1.0, experimental mark acceptable
Yakki Mail UI
├─ Duration: 6-9 weeks
├─ Priority: P1 (HIGH)
└─ Status: High value feature for v2.0
Gemini SDK migration
├─ Duration: 1 week
├─ Priority: P2 (MEDIUM)
└─ Status: Waiting for SDK 1.0.0+ release (Q1 2026)
LLMOrchestrator completion
├─ Duration: 2-3 weeks
├─ Priority: P2 (MEDIUM)
└─ Status: Optional, evaluate need for v2.0
New scenarios
├─ Duration: 12-20 weeks
├─ Priority: MEDIUM
└─ Status: Planned for v2.0+
```
---
## Metrics and KPIs
### Current State (2025-11-30)
```
Technical Debt Score: 14 issues
├─ P0 (CRITICAL): 4 ████░░░░░░ 29%
├─ P1 (HIGH): 4 ████░░░░░░ 29%
├─ P2 (MEDIUM): 3 ███░░░░░░░ 21%
└─ P3 (LOW): 3 ███░░░░░░░ 21%
Estimated Fix Time: 17-30 weeks total
MUST FIX (P0): 1 day + 2 days + 1 week + 2-3 weeks ≈ 4 weeks
SHOULD FIX (P1): 3-4 weeks + 1-2 weeks + 3-5 days + 6-9 weeks ≈ 11-16 weeks
NICE TO HAVE (P2+): 2-5 weeks
```
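These bars are maintained by hand, so counts, bar widths, and percentages drift apart easily. A small helper (hypothetical) that regenerates a dashboard row from the raw counts keeps them consistent:

```kotlin
import kotlin.math.roundToInt

// Render one dashboard row: a 10-cell bar plus a rounded percentage.
fun debtRow(label: String, count: Int, total: Int, cells: Int = 10): String {
    val pct = (100.0 * count / total).roundToInt()
    val filled = (cells.toDouble() * count / total).roundToInt()
    val bar = "█".repeat(filled) + "░".repeat(cells - filled)
    return "$label: $count $bar $pct%"
}
```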
### Target State (v1.0 - 2026-02-01)
```
Technical Debt Score: 7 issues (50% reduction)
├─ P0 (CRITICAL): 0 ░░░░░░░░░░ 0% ✅ ALL FIXED
├─ P1 (HIGH): 2 ██░░░░░░░░ 29% (Experimental marks)
├─ P2 (MEDIUM): 3 ███░░░░░░░ 43% (Waiting for SDK)
└─ P3 (LOW): 2 ██░░░░░░░░ 29% (Polish items)
Estimated Fix Time: 4-6 weeks remaining
Code Quality: Excellent ✅
Production Ready: YES ✅
```
### Target State (v2.0 - 2026-06-01)
```
Technical Debt Score: 0-2 issues (~90% reduction)
├─ P0 (CRITICAL): 0 ░░░░░░░░░░ 0% ✅
├─ P1 (HIGH): 0 ░░░░░░░░░░ 0% ✅
├─ P2 (MEDIUM): 0 ░░░░░░░░░░ 0% ✅
└─ P3 (LOW): 0-2 ░░░░░░░░░░ ~0% ✅
Code Quality: Pristine ✅
Test Coverage: 80%+ ✅
Documentation: 100% ✅
```
---
## Conclusion
**YAKKI SMART v2.2 has manageable technical debt** with clear priorities and a repayment plan.
### Key Takeaways
✅ **Strengths:**
- Only 1 critical security issue (5 min fix)
- Most issues are "missing features" not "broken code"
- Core functionality production-ready
- Clear priorities and timelines
⚠️ **Immediate Actions Required:**
1. **Week 1:** Complete security hardening (P0-1) - Immediate priority
2. **Week 1:** Migrate deprecated APIs (P0-2) - 2 days
3. **Week 1:** Deprecate smartrag-v2 (P0-3) - 1 week
4. **Weeks 2-4:** ONNX embeddings (P0-4) - 2-3 weeks
### Timeline Summary
- **Beta Release:** Week 1 (after fixing P0-1, P0-2)
- **v1.0 Release:** Week 7-8 (after fixing all P0)
- **v2.0 Release:** Week 20 (all features complete)
**Project ready for beta testing in 1 week** after fixing critical issues.
---
**Date:** 2025-11-30
**Version:** v2.2
**Methodology:** 32-module comprehensive audit
---