Stay up to date with the latest news on Android development! Content fetched directly from the subreddit, just for you. Powered by: @r_channels
[DEV] I built an Open-Source, Ad-Free Movie & TV Streaming App (OHOBox) to practice UI/UX and WebView implementations
https://redd.it/1rhzf4w
@reddit_androiddev
I'm looking for honest opinions
https://redd.it/1rhq4d6
@reddit_androiddev
Trackazon – Real-Time Live Map Tracking for Amazon Deliveries
Hi everyone 👋
I’ve just released a beta of my Android app that improves how Amazon deliveries are tracked — especially during the “out for delivery” phase.
The key difference from the official Amazon app/website:
Amazon usually shows only the last 10 stops before your delivery.
My app starts tracking as soon as the package is marked “out for delivery”, even if the courier is still 30–60 km away.
Why this matters:
The Amazon app estimates position based on the remaining stops.
When the driver changes route (which happens often), the stop-based system recalculates and tracking becomes inaccurate or jumps around.
With my app:
• Tracking starts from the moment it’s on the truck
• You can follow the full route, not just the last 10 stops
• Real-time map updates (when location data is available)
• More consistent tracking even when the route changes
• Smart local notifications (in transit, out for delivery, delivered)
• Manual and automatic refresh
• Multiple shipments in one screen
Privacy:
• No proprietary servers
• Tracking data handled locally on the device
• Uses only necessary services (Amazon + maps)
It’s currently available in the UK, Spain, Germany and France (multilingual beta).
I built it because I was frustrated with the official tracking losing accuracy once the driver changed stops.
I’d genuinely love feedback — especially from people who frequently track deliveries and notice the “last 10 stops” limitation.
If anyone wants to test it, I will share the Play Store beta link in the comments.
Thanks
https://redd.it/1rhs01s
@reddit_androiddev
I made a Mac app to control my Android emulators
https://redd.it/1rhplkv
@reddit_androiddev
Samsung Health data export breaks SAF (ACTION_OPEN_DOCUMENT)
Spent my morning debugging a really weird user report and thought I'd share this OS-level quirk, as it might save someone else a headache. I'd also love to hear how you guys handle this.
I recently released an Android dev tool (GiantJson Viewer+ with a Rust engine), and a user reported that they cannot see or open the JSON files exported by Samsung Health when using my in-app SAF file picker (standard `ACTION_OPEN_DOCUMENT` with `*/*`).
But if they opened the Samsung "My Files" app, the files were physically right there and could be opened from it.
It seems that **Samsung Health** uses standard file I/O when writing to the public Downloads folder, **but completely forgets to invoke MediaScannerConnection.scanFile()**.
Because the MediaStore database is never updated, SAF sees the folder as empty. The Samsung My Files app, however, has `MANAGE_EXTERNAL_STORAGE` and reads the disk directly :/
So far I haven't found a way to trigger a scan myself on a folder without the proper permissions; the only workaround I can offer users is to rename the parent folder, which triggers an OS rescan. That actually works fine, but telling users to point fingers at Samsung isn't a professional answer... This is my very first app, I'm a beginner dev, every single review matters, and it feels bad to possibly get a bad review because my hands are tied.
The only other thing I could do so far was file a ticket in Samsung Members explaining the issue, and hope they fix it.
* Has anyone else run into similar "ghost files" generated by other major apps?
* Is there any programmatic way to force a MediaStore scan on a specific public directory without having the exact file paths or `MANAGE_EXTERNAL_STORAGE`?
* How to communicate this to the users?
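For what it's worth, the standard path-based way to request an index update is `MediaScannerConnection.scanFile()`. Below is a hedged sketch; the export subfolder name is my assumption, and under scoped storage your app may not be able to enumerate another app's files at all, which is exactly the bind described above:

```kotlin
import android.content.Context
import android.media.MediaScannerConnection
import android.os.Environment
import java.io.File

// Hypothetical helper: walk a public directory we can still read
// (Downloads/SamsungHealth is an assumed export location, not a real
// documented path) and ask MediaStore to index every file found.
// scanFile() needs explicit paths, which is precisely the limitation:
// without read access to the directory there is nothing to enumerate.
fun rescanExportDir(context: Context, subDir: String = "SamsungHealth") {
    // Deprecated since API 29, but still compiles and illustrates the idea.
    val dir = File(
        Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DOWNLOADS),
        subDir
    )
    val paths = dir.listFiles()?.map { it.absolutePath } ?: return
    MediaScannerConnection.scanFile(
        context,
        paths.toTypedArray(),
        null // let the scanner infer MIME types
    ) { _, uri ->
        // A non-null uri means MediaStore now knows about the file.
    }
}
```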
https://redd.it/1rgwvl1
@reddit_androiddev
[Android Dev] Welletvy - A completely offline, on-device AI expense tracker (No data sent to servers)
Hey everyone,
I wanted to share an Android expense tracker called **Welletvy**. A lot of finance apps out there require you to create an account and store your sensitive financial data on their servers. This app is built to be strictly privacy-first and fully functional offline.
Here are the main features:
* **100% On-Device AI Scanner:** You can snap a picture of your receipt, and the AI extracts the data locally on your phone. No data ever leaves your device.
* **Swipe-to-Action Inbox:** It catches payment notifications and lets you swipe left/right (Tinder-style) to quickly add or ignore them. Makes tracking a lot less tedious.
* **Offline-First:** Works perfectly even if you're on a flight or without an internet connection.
* **Aesthetic UI & Dark Mode:** Clean pastel layout that also fully supports dark mode.
* **Subscription Tracker:** Easily monitor your fixed monthly recurring expenses.
https://preview.redd.it/tvotfqwcfamg1.png?width=3072&format=png&auto=webp&s=b008c2ddb1b8ab5ef015d50159590e411c119e8b
**Google Play Store Link:** [**https://play.google.com/store/apps/details?id=com.walletvy.app**](https://play.google.com/store/apps/details?id=com.walletvy.app)
I'm constantly looking to improve the app, so your feedback and feature requests would mean a lot to me.
https://redd.it/1rhbkhk
@reddit_androiddev
The state of this sub
A bit off topic..
I've been a programmer almost exactly as long as I've been a redditor - a colleague introduced me to both things at the same time! Thanks for the career and also ruining my brain?
I'm not sure how long this sub has been around, /r/android was the home for devs for a while before this took off, iirc.
Anyway, this community is one I lurk in, I tend to check it daily just in case something new and cool comes about, or there's a fight between /u/zhuinden and Google about whether anyone cares about process death. I've been here for the JW nuthugging, whatever the hell /r/mAndroiddev is, and I've seen people loudly argue clean architecture and best practices and all the other dumb shit we get caught up in.
I've also seen people release cool libraries, some nice indie apps, and genuinely help each other out. This place has sort of felt like home on reddit for me for maybe a decade.
But all this vibe coded slop and AI generated posts and comments is a serious existential threat. I guess this is the dead Internet theory? Every second post has all the hyperbole and trademark Claude or ChatGPT structure. Whole platforms are being vibe coded and marketed to us as if they've existed for years and have real users and solve real problems.
I'll be halfway through replying to a comment and I'm like 'oh wait I'm talking to a bot'. Bots are posting, reading and replying. I don't want to waste my energy on that. They don't want my advice or to have a conversation, they're trying to sell me something.
Now, I vibe code the shit out of everything just like the next person, so I think I have a pretty good eye for AI language, but I'm sure I get it wrong sometimes, and I'm also sure it's going to get harder to detect. But it kinda doesn't matter? If I've lost faith that I'm talking to real people, then I'm probably not going to engage.
So this kind of feels like the signal of the death of this subreddit to me, and that's sad!
I'm sure this is a huge problem across reddit and I'm sure the mods are doing what they can. But I think we're fucked 😔
https://redd.it/1rgj6he
@reddit_androiddev
Guide to Klipy or Giphy - Tenor gif api shutdown
Google is sunsetting the Tenor API on June 30 and new API sign-ups / new integrations were already cut off in January, so if your Android app still depends on Tenor for GIF search, this is probably the time to plan the replacement.
I spent some time looking at the two main options that seem most relevant, thought I'd share a guide here:
1) KLIPY (former Tenor team)
WhatsApp, Discord, Microsoft and other big players have announced that they're swapping Tenor for KLIPY. From what I saw, KLIPY is positioning itself as the closest migration path for existing Tenor integrations. If your app already uses Tenor-style search flows, this looks like the lower-effort option.
For devs to migrate (base URL swap): https://klipy.com/migrate
For creators to claim & migrate their content: https://forms.gle/Z6N2fZwRLdw9N8WaA
2) GIPHY (Shutterstock)
GIPHY is obviously the established option, but their own migration docs make it pretty clear this is not a pure drop-in replacement - endpoints, request params, and response handling all differ.
Tenor migration docs: https://developers.giphy.com/docs/api/tenor-migration/#overview
My takeaway:
If your goal is the fastest migration with the least code churn, KLIPY looks closer to a Tenor-style replacement - it's built by Tenor's founders.
If you are okay with a more involved migration and want to use GIPHY’s ecosystem, GIPHY is a solid option.
https://redd.it/1rgw5o0
@reddit_androiddev
JNI + llama.cpp on Android - what I wish I knew before starting
spent a few months integrating llama.cpp into an android app via JNI for on-device inference. sharing some things that weren't obvious:
1. don't try to build llama.cpp with the default NDK cmake setup. use the llama.cpp cmake directly and just wire it into your gradle build. saves hours of debugging
2. memory mapping behaves differently across OEMs. samsung and pixel handle mmap differently for large files (3GB+ model weights). test on both
3. android will aggressively kill your process during inference if you're in the background. use a foreground service with a notification, not just a coroutine
4. thermal throttling is real. after ~30s of sustained inference on Tensor G3 the clock drops and you lose about 30% throughput. batch your work if you can
5. the JNI string handling for streaming tokens back to kotlin is surprisingly expensive. batch tokens and send them in chunks instead of one at a time
running gemma 3 1B and qwen 2.5 3B quantized. works well enough for summarization and short generation tasks. anyone else doing on-device LLM stuff?
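Point 5 (batching tokens across the JNI boundary) can be sketched on the Kotlin side as a plain buffer that flushes in chunks; the class name and chunk size here are illustrative, not from the post:

```kotlin
// Illustrative sketch: instead of crossing the JNI boundary (and
// invoking a UI callback) once per token, buffer tokens and emit
// them in fixed-size chunks.
class TokenBatcher(
    private val chunkSize: Int = 16,
    private val onChunk: (String) -> Unit
) {
    private val buffer = StringBuilder()
    private var pending = 0

    fun onToken(token: String) {
        buffer.append(token)
        pending++
        if (pending >= chunkSize) flush()
    }

    // Call once at end-of-stream so the tail isn't lost.
    fun flush() {
        if (pending > 0) {
            onChunk(buffer.toString())
            buffer.setLength(0)
            pending = 0
        }
    }
}
```

The same idea applies on the native side: accumulate tokens in a `std::string` and call `NewStringUTF` once per chunk instead of once per token.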
https://redd.it/1rgsqbv
@reddit_androiddev
Should I go all in on Kotlin?
In my 4th semester, I was introduced to Java for the first time and I genuinely loved OOP. I ended up building an app in Java for both Android and desktop, and that’s when I realized I actually enjoy building software.
Being the nerd I am, I started digging into whether Java is enough to build real-world apps and land a dev job. That’s when I found out Kotlin is basically the go-to for Android now, so I switched and started learning it.
Fast forward: I’ve built a few apps with Kotlin. I understand a decent amount, but I’m definitely not an expert yet. Still learning, still breaking things, still enjoying the process.
What’s messing with my head is this:
I’ve used AI agents to implement features in my apps that I haven’t fully learned yet, and they work surprisingly well. Almost too well. It made me wonder—should I really spend years learning all this deeply if tools can already do a lot of the heavy lifting?
So I’m a bit confused about direction right now:
Should I double down on Kotlin and Android dev?
Does Kotlin/Android actually have a solid future career-wise?
Is it realistic to aim for a job with this path?
Or am I setting myself up to learn skills that’ll be half-automated by the time I’m job-ready?
I enjoy building apps a lot, and I like understanding how things work under the hood. I just don’t want to end up grinding for years on something that doesn’t have a future.
https://redd.it/1rbhrgz
@reddit_androiddev
F-Droid: Keep Android Open
https://f-droid.org/2026/02/20/twif.html
https://redd.it/1rb64ru
@reddit_androiddev
Is this a good social media idea?
Hello! I created a social media app called "SocialStudent" for students (from my school) to exchange ideas, grades, memes and advice. If you want, you can check it out:
https://social-student.flutterflow.app
But is this a good idea? Would people like it?
https://redd.it/1rb0u0q
@reddit_androiddev
I just open-sourced my Kotlin Multiplatform project — InstaSaver Pro!
https://redd.it/1rb0tnf
@reddit_androiddev
NexusControl Open-source Android homelab manager built with Compose + SSHJ (multi-tab SSH, SFTP, monitoring, Script automation)
Hi all,
I’ve been working on an open-source Android app called NexusControl — it’s a homelab command center built entirely with Kotlin + Compose.
Features include:
Multi-tab SSH terminal (SSHJ)
SFTP browser with inline editor
Dashboard tiles pulling stats over SSH
Docker container overview
REST API tiles (Home Assistant, Proxmox, Pi-hole, custom JSON)
Script library with templates
Background monitoring via WorkManager
Encrypted credentials using Android Keystore
No backend, no cloud, everything local.
Would appreciate any feedback on architecture or feature ideas.
GitHub:
https://github.com/iTroy0/NexusControl
https://redd.it/1raxd8o
@reddit_androiddev
I built an embedded NoSQL database in pure Kotlin (LSM-tree + vector search)
Hi everyone,
Over the past few months, I’ve been experimenting with building an embedded NoSQL database engine for Android from scratch in 100% Kotlin. It’s called KoreDB.
This started as a learning project. I wanted to deeply understand storage engines (LSM-trees, WAL, SSTables, Bloom filters, mmap, etc.) and explore what an Android-first database might look like if designed around modern devices and workloads.
Why I built it
I was curious about a few things:
How far can we push sequential writes on modern flash storage?
Can we reduce read/write contention using immutable segments?
What would a Kotlin-native API look like without DAOs or SQL?
Can we embed vector similarity search directly into the engine?
That led me to implement an LSM-tree-based engine.
High-Level Architecture
KoreDB uses:
Append-only Write-Ahead Log (WAL)
In-memory SkipList (MemTable)
Immutable SSTables on disk
Bloom filters for negative lookups
mmap (MappedByteBuffer) for reads
Writes are sequential.
Reads operate on stable immutable segments.
Bloom filters help avoid unnecessary disk checks.
For vector search:
Vectors stored in flat binary format
Cosine similarity computed directly on memory-mapped bytes
SIMD-friendly loops for better CPU utilization
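The "cosine similarity computed directly on memory-mapped bytes" idea can be sketched in plain Kotlin. Here a `ByteBuffer` stands in for the mmap'd region; the function name and flat float32 layout are my assumptions, not KoreDB's actual code:

```kotlin
import java.nio.ByteBuffer
import kotlin.math.sqrt

// Sketch: vectors stored back-to-back as float32 in a flat buffer
// (a MappedByteBuffer in the real mmap case). Similarity is computed
// directly against the buffer with absolute reads, so there is no
// copy into an intermediate FloatArray per comparison.
fun cosineSimilarity(buf: ByteBuffer, offsetA: Int, offsetB: Int, dim: Int): Double {
    var dot = 0.0
    var normA = 0.0
    var normB = 0.0
    for (i in 0 until dim) {
        val a = buf.getFloat(offsetA + i * 4).toDouble()
        val b = buf.getFloat(offsetB + i * 4).toDouble()
        dot += a * b
        normA += a * a
        normB += b * b
    }
    return dot / (sqrt(normA) * sqrt(normB))
}
```

A brute-force search is then just a loop over offsets `recordIndex * dim * 4`, keeping the top-k scores.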
Some early benchmarks
Device: Pixel 7
Dataset: 10,000 records
Vector dimension: 384
Averaged over multiple runs after JVM warm-up
Cold start (init + first read):
Room: \~15 ms
KoreDB: \~2 ms
Vector search (1,000 vectors):
Room (BLOB-based implementation): \~226 ms
KoreDB: \~113 ms
These are workload-specific and not exhaustive. I’d really appreciate feedback on improving the benchmark methodology.
This has been a huge learning experience for me, and I’d love input from people who’ve worked on storage engines or Android internals.
GitHub:
https://github.com/raipankaj/KoreDB
Thanks for reading!
https://redd.it/1ratbpj
@reddit_androiddev
Pagination
I'm wondering what you use for pagination in a list screen.
Do you use Paging 3, some custom logic, or some other library?
https://redd.it/1rhuvvw
@reddit_androiddev
Vulkan on MALI G57 MC2 ?
Hello,
New here. Has anyone created a Vulkan sample on a Mali GPU, particularly the G57 MC2? My project works on other Android devices but fails on Mali.
Are there any do’s and don’ts when working with Mali GPUs using Vulkan 1.3?
TIA.
https://redd.it/1rhru2f
@reddit_androiddev
SteamWidget
https://redd.it/1rhrfd3
@reddit_androiddev
Under the hood: Android 17’s lock-free MessageQueue
https://android-developers.googleblog.com/2026/02/under-hood-android-17s-lock-free.html?m=1
https://redd.it/1rgg04e
@reddit_androiddev
Monochrome icon doesn't show up
https://redd.it/1rh0ga3
@reddit_androiddev
Help with Jetpack Compose and Dependency Injection
I have been trying to learn these components, but it feels like either the material is out of date or it's too tough. I'd appreciate it if you could suggest some resources that are actually up to date.
https://redd.it/1rh7gjc
@reddit_androiddev
GPlayStore - Auto Windows OS conversion - Google Playstore pushing "Google Play Games" for Windows Desktop quite aggressively.
https://redd.it/1rh9fvc
@reddit_androiddev
Struggling to Understand MVVM & Clean Architecture in Jetpack Compose – Need Beginner-Friendly Resources
Hi everyone,
I’m planning to properly learn Jetpack Compose with MVVM, and next move to MVVM Clean Architecture. I’ve tried multiple times to understand these concepts, but somehow I’m not able to grasp them clearly in a simple way.
I’m comfortable with Java, Kotlin, and XML-based Android development, but when it comes to MVVM pattern, especially how ViewModel, Repository, UseCases, and data flow work together — I get confused.
I think I’m missing a clear mental model of how everything connects in a real project.
Can you please suggest:
Beginner-friendly YouTube channels
Blogs or documentation
Any course (free or paid)
GitHub sample projects
Or a step-by-step learning roadmap
I’m looking for resources that explain concepts in a very simple and practical way (preferably with real project structure).
Thanks in advance
https://redd.it/1rh3542
@reddit_androiddev
Is this a correct way to implement Figma design tokens (Token Studio) in Jetpack Compose? How do large teams do this?
Hi everyone 👋
I’m building an Android app using Jetpack Compose and Figma Token Studio, and I’d really like feedback on whether my current token-based color architecture is correct or if I’m over-engineering / missing best practices.
# What I’m trying to achieve
Follow Figma Token Studio naming exactly (e.g. `bg.primary`, `text.muted`, `icon.dark`)
Avoid using raw colors in UI (Pink500, Slate900, etc.)
Be able to change colors behind a token later without touching UI code
Make it scalable for future themes (dark, brand variations, etc.)
In Figma, when I hover a layer, I can see the token name (bg.primary, text.primary, etc.), and I want the same names in code.
# My current approach (summary)
# 1. Core colors (raw palette)
object AppColors {
    val White = Color(0xFFFFFFFF)
    val Slate900 = Color(0xFF0F172A)
    val Pink500 = Color(0xFFEC4899)
    ...
}
# 2. Semantic tokens (mirrors Figma tokens)
data class AppColorTokens(
    val bg: BgTokens,
    val surface: SurfaceTokens,
    val text: TextTokens,
    val icon: IconTokens,
    val brand: BrandTokens,
    val status: StatusTokens,
    val card: CardTokens,
)
Example:
data class BgTokens(
    val primary: Color,
    val secondary: Color,
    val tertiary: Color,
    val inverse: Color,
)
# 3. Light / Dark token mapping
val LightTokens = AppColorTokens(
    bg = BgTokens(
        primary = AppColors.White,
        secondary = AppColors.Pink50,
        tertiary = AppColors.Slate100,
        inverse = AppColors.Slate900
    ),
    ...
)
val DarkTokens = AppColorTokens(
    bg = BgTokens(
        primary = AppColors.Slate950,
        secondary = AppColors.Slate900,
        tertiary = AppColors.Slate800,
        inverse = AppColors.White
    ),
    ...
)
# 4. Provide tokens via CompositionLocal
val LocalAppTokens = staticCompositionLocalOf { LightTokens }
fun DailyDoTheme(
    darkTheme: Boolean,
    content: @Composable () -> Unit
) {
    CompositionLocalProvider(
        LocalAppTokens provides if (darkTheme) DarkTokens else LightTokens
    ) {
        MaterialTheme(content = content)
    }
}
# 5. Access tokens in UI (no raw colors)
object Tokens {
    val colors: AppColorTokens
        get() = LocalAppTokens.current
}
Usage:
Column(
    modifier = Modifier.background(Tokens.colors.bg.primary)
)
Text(
    text = "Home",
    color = Tokens.colors.text.primary
)
# My doubts / questions
1. Is this how large teams (Google, Airbnb, Spotify, etc.) actually do token-based theming?
2. Is wrapping LocalAppTokens.current inside a Tokens object a good idea?
3. Should tokens stay completely separate from MaterialTheme.colorScheme, or should I map tokens → Material colors?
4. Am I overdoing it for a medium-sized app?
5. Any pitfalls with this approach long-term?
# Repo
I’ve pushed the full implementation here:
👉 **https://github.com/ShreyasDamase/DailyDo**
I’d really appreciate honest feedback—happy to refactor if this isn’t idiomatic.
Thanks! 😀
https://redd.it/1rh0v56
@reddit_androiddev
Does this follow Material 3 Design?
https://redd.it/1rbb9tl
@reddit_androiddev
Public key cert pinning
I'm looking for ideas about best practices for pinning a public key certificate in a mobile app. The challenge: how do I renew my public key certificate without updating the app, to reduce the impact of downtime or expiration? Any advice? Thanks
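The usual mitigation is to pin the hash of the Subject Public Key Info (not the certificate itself) and always ship at least one backup pin for a spare key kept offline; then the server can rotate to the backup key without an app update. A hedged OkHttp sketch, where the hostname and hashes are placeholders, not real pins:

```kotlin
import okhttp3.CertificatePinner
import okhttp3.OkHttpClient

// Sketch: pin SPKI sha256 hashes. The second pin is for a backup key
// kept offline; rotating the server cert to that key later requires
// no app update. Hashes below are placeholders.
val pinner = CertificatePinner.Builder()
    .add(
        "api.example.com",
        "sha256/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=", // current key
        "sha256/BBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBBB="  // backup key
    )
    .build()

val client = OkHttpClient.Builder()
    .certificatePinner(pinner)
    .build()
```

A declarative alternative is Android's network security config, whose `<pin-set>` supports an `expiration` date so pinning fails open after expiry instead of bricking old clients.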
https://redd.it/1rb4ztz
@reddit_androiddev
Designing an on-device contextual intelligence engine for Android
About me: I am an AOSP engineer and I work extensively with Android internals. I switched to iOS because it's closed source; since AOSP is open source, it always bugs me to go check the source code.
One of the best things I like about iOS is Apple Intelligence, and I wonder why there is no equivalent for Android. I am aware of the app-side aspects, and I believe that with the correct permissions something similar is possible on Android as well.
But I want to ask for some opinions on what would be needed on the ML side.
https://redd.it/1rb1g73
@reddit_androiddev
First android app review time for a corporate account?
I uploaded my first app for review on Feb 15; it's now Feb 21 and it's still in review. Is this typical, or is there a problem? I have a corporate account, so I could bypass the tester requirement.
https://redd.it/1raz1gf
@reddit_androiddev
Building linen — a native meeting notes app with automatic task detection (Kotlin + Compose)
Hi r/Android,
I’m building linen, a native Android app that turns meetings into structured notes with action items already extracted.
The goal isn’t just transcription — it’s clarity after the meeting.
Tech stack:
Kotlin
Jetpack Compose (fully declarative UI)
Room for local storage
Supabase for sync
On-device + cloud processing depending on use case
Some things I’m intentionally focusing on:
• Clean, minimal Compose UI (no cluttered productivity dashboards)
• Fast startup time
• Structured summaries instead of raw transcript dumps
• Automatic task detection synced to calendar
I’m building this solo and recently rebuilt the entire frontend to match a calmer design system.
I’d love feedback from other Android devs:
Would you prefer on-device speech processing over cloud?
How important is offline meeting capture?
Would you trust an app that listens locally but doesn’t store raw audio?
If anyone’s interested, I can share UI screenshots or talk more about the architecture decisions.
Thanks 🙌
— Vaibhav
linen
https://redd.it/1rauu5z
@reddit_androiddev
Why is the Google Play Store taking up so much storage on my phone?
https://preview.redd.it/5wu472fqvukg1.jpg?width=1010&format=pjpg&auto=webp&s=13d6fa4fdc05ef563f4ad462fdd517155b54bf92
Hey guys, does anyone know why the Google Play Store is taking up so much space on phones? In my case, I have a Galaxy S22 Ultra and the Google app store is taking up more than 9GB on it, and on my S25 Ultra it's taking up more than 12GB. To me, this is irrational. I believe it must be some kind of error. I've already cleared the cache, data, and uninstalled updates, which makes the app store go back to taking up about 250MB, but within hours when I check again it's already taking up many gigabytes of my storage.
https://redd.it/1ras4a6
@reddit_androiddev