Case Study · Aug 2024 – Apr 2026
A gallery cleaning app, designed around cognitive friction — not storage.
Role
Product & UX Designer
Timeline
~23 working days · 20 months
Platform
Android · Flutter
Status
Submitted to Google Play
A gallery cleaning app that tried to make an unpleasant chore feel fast, organized, and — on a good day — satisfying. Product and UX decisions were mine; a developer handled implementation. Written as an honest account, not a highlight reel.
This didn’t start with “I want to build an app.” It started with a phone that had over 9,000 photos on it, and the sinking feeling every time the gallery opened.
The obvious framing was “I have a storage problem.” But after a few failed attempts at cleaning manually, a different picture emerged — the real obstacle wasn’t storage. It was cognitive friction.
01
Every photo required a keep/delete decision. At 9,000 decisions, the mental cost was paralyzing. After fifty, I’d quit.
02
Twenty minutes of cleaning, and the gallery looked the same. There was no feedback loop.
03
The OS organized photos by folder (Camera, Screenshots, WhatsApp). People don’t think in folders — they think in time: “that trip,” “stuff from 2019.”
04
Pick up the phone tomorrow and there’s no memory of what was already reviewed. Every session starts from scratch.
These weren’t features I wanted to build. They were friction points I wanted to eliminate. The features came later.
A Google Forms survey ran August 4–21, 2024. Twenty-three respondents — mostly 18–34, tech-literate, with 30% self-identifying as neurodivergent.
The survey was designed with one deliberate constraint: the swipe mechanic was hidden. Questions asked about challenges and desired features, never about solutions. The goal was to validate the problem — not confirm the answer.
| TOP CHALLENGE | % OF RESPONDENTS |
|---|---|
| Fear of losing important memories | 78% |
| Takes too much time | 73% |
| Difficulty deciding what to delete | 65% |
| Emotional attachment to photos | 60% |
| Not enough storage space | 60% |
This reframed everything. Before the survey, MediaMop was a storage utility. After it, the product was clearly an emotional support tool for a task people avoid. The top three pain points are all psychological — fear, time, and indecision. Storage was tied for fourth.
Key insight
91% asked for automatic organization into albums. Only 47% cared about progress tracking — but I kept it as a core feature anyway, because it solved the “no sense of progress” problem I’d experienced first-hand. Sometimes design intuition has to overrule survey data.
“Swipe option like tinder.”
An open-ended response. The swipe mechanic had been deliberately hidden from the survey — and somebody arrived at the same metaphor from the same problem. That’s convergent design thinking.
Built directly from survey patterns — not as filler, but as cognitive tools used to pressure-test every later decision.
Noah · 24 · Freelance designer · UK
“I want a gallery that’s as clean and organized as my design workspace.”
Struggles with decision fatigue and procrastination. Needs low-friction entry and a way to not decide right now.
Bianca · 34 · Marketing manager · USA
“I need a quick way to clear space without worrying about losing precious memories.”
Cleans during her commute. Needs quick sessions, fast startup, and work that survives app restarts.
Noah’s decision fatigue became the single most load-bearing insight in the project. It explains why the Skip button exists, why progress persistence is non-negotiable, and why the app shows visible completion states. Bianca’s commute scenario shaped the architecture: sessions must be interruptible, startup must be fast, and the core swipe loop must be immediate.
The core idea was a swipe-based decision interface — keep, delete, skip. Five reasons it made sense:
01
The problem wasn’t effort — it was cognitive load.
02
Motivation depends on perceived completion.
03
Structure must align with mental models.
04
Interaction design as emotional design.
05
A two-second delay feels like resistance.
On naming: MediaMop, not SwipeSweep
The brainstorm produced dozens of swipe-centric candidates — SwipeSweep, SwipeClean, SwipeAway. I picked the one name that didn’t mention swiping. “Swipe to Sweep” became the tagline instead. If the interaction ever changes, the name still works. A mop is a simple, effective cleaning tool — not a “storage optimization engine.” That simplicity was the brand promise.
The swipe screen was imagined whole before I opened Figma: thumbnails at the top, large preview in the middle, action buttons at the bottom.
What took longer was the colour semantics for the action buttons. After three failed passes, it settled into: Delete = error red, Keep = primary (brand colour), Undo/Skip = tertiary (neutral). The semantics were clear; translating them into something consistent took iteration.
Video support became a small UX test in itself. You can’t judge a video from a thumbnail — so I specified auto-play on open, tap-to-toggle controls, auto-hide after three seconds. Patterns already familiar from Instagram and TikTok. Nothing new to learn.
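The tap-to-toggle, auto-hide behaviour described above is simple enough to sketch. MediaMop itself is built in Flutter; this TypeScript sketch is purely illustrative (the class and member names are mine, not the app's), showing only the timer logic:

```typescript
// Illustrative sketch of tap-to-toggle controls with auto-hide.
// Names and structure are assumptions, not MediaMop's actual code.
class VideoControls {
  visible = true;
  private hideTimer: ReturnType<typeof setTimeout> | null = null;

  constructor(private autoHideMs = 3000) {
    // Controls start visible, then fade after the auto-hide window.
    this.scheduleHide();
  }

  onTap(): void {
    this.visible = !this.visible;
    // Re-showing the controls restarts the auto-hide countdown.
    if (this.visible) this.scheduleHide();
    else this.cancelHide();
  }

  private scheduleHide(): void {
    this.cancelHide();
    this.hideTimer = setTimeout(() => (this.visible = false), this.autoHideMs);
  }

  private cancelHide(): void {
    if (this.hideTimer) clearTimeout(this.hideTimer);
    this.hideTimer = null;
  }
}
```

The key design point is the restart-on-show: every time the controls reappear, the countdown resets, matching the pattern users already know from short-video apps.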
Three features came directly from Noah's decision-fatigue insight: the Skip button (a way to not decide right now), ID-based progress persistence (sessions survive restarts), and visible completion states (progress the user can actually see).
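Progress persistence hinges on one detail called out in the takeaways: decisions are keyed by a stable media ID, not by position in a list that can change between sessions. A minimal TypeScript sketch of that idea (names are mine; MediaMop's real implementation is in Dart):

```typescript
// Illustrative decision log keyed by stable media ID.
// Keying by list index would break the moment the gallery
// gains, loses, or reorders items between sessions.
type Decision = "keep" | "delete" | "skip";

class DecisionStore {
  private decisions = new Map<string, Decision>(); // mediaId -> decision

  record(mediaId: string, decision: Decision): void {
    this.decisions.set(mediaId, decision);
  }

  // Resuming a session: drop items already decided, regardless of
  // how the underlying item order changed since last launch.
  remaining(allIds: string[]): string[] {
    return allIds.filter((id) => !this.decisions.has(id));
  }

  // Serialize for on-device persistence (e.g. a JSON file).
  toJSON(): string {
    return JSON.stringify(Object.fromEntries(this.decisions));
  }
}
```

The trade-off is exactly the one the takeaways name: IDs cost slightly more storage than indices, but they make a saved session immune to gallery churn.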
Partway through development, a quiet realization: device folders don’t match how users think about their media. Nobody with 9,000 photos thinks “I need to clean the Camera folder.” They think “I need to clean up everything from last year.”
So I added Group By Month / Year / Type — virtual albums that don’t exist on the device. Users can slice their gallery by how they think, not how the OS stores files.
This was a product decision first, an architecture challenge second. The survey’s 91% preference for automatic organization had pointed here. I just hadn’t seen it yet.
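The virtual-album idea reduces to a grouping pass over item metadata, with no change to how files sit on disk. A hedged sketch of the month grouping (the item shape and function names are assumptions for illustration; the shipped app is Flutter/Dart):

```typescript
// Hypothetical media item shape — field names are illustrative,
// not MediaMop's actual model.
interface MediaItem {
  id: string;
  takenAt: Date;
  type: "photo" | "video";
}

// Group items into virtual "month" albums (e.g. "2019-07").
// Nothing on the device changes: the albums exist only in memory,
// slicing the gallery by how users think, not how the OS stores files.
function groupByMonth(items: MediaItem[]): Map<string, MediaItem[]> {
  const albums = new Map<string, MediaItem[]>();
  for (const item of items) {
    const month = String(item.takenAt.getMonth() + 1).padStart(2, "0");
    const key = `${item.takenAt.getFullYear()}-${month}`;
    const bucket = albums.get(key) ?? [];
    bucket.push(item);
    albums.set(key, bucket);
  }
  return albums;
}
```

Year and type groupings are the same pass with a different key function, which is why shipping all three as one "Group By" control was cheap once the first existed.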
The entire interaction flow was built against mock data before touching any device API. This was deliberate: validate the UX independent of platform complexity. When real-data problems emerged later, they were integration problems, not design problems. The interaction model was already proven.
I’d do this on every future project.
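The mock-first approach works because the swipe flow only ever talks to an abstraction, and the device-API implementation is swapped in later. A sketch of that seam, with hypothetical names (the real app is Flutter; this TypeScript is only to show the shape):

```typescript
// Hypothetical gallery abstraction. The swipe UI depends on this
// interface, never on platform media APIs directly.
interface GallerySource {
  loadItems(): Promise<string[]>; // returns media item IDs
}

// Mock source: lets the entire interaction flow be built and
// exercised with zero device integration.
class MockGallerySource implements GallerySource {
  async loadItems(): Promise<string[]> {
    return Array.from({ length: 50 }, (_, i) => `mock-${i}`);
  }
}

// A real source would wrap the platform media APIs behind the same
// interface, so swapping it in is an integration change, not a
// design change.
async function startSession(source: GallerySource): Promise<number> {
  const items = await source.loadItems();
  return items.length; // e.g. feed these into the swipe queue
}
```

When the real source later surfaced permission prompts and slow thumbnails, the interface above is what kept those problems quarantined on the integration side.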
The switch from mocks to actual gallery integration produced the densest day in the project — 20+ commits in a single session. Android 13+ granular media permissions, async thumbnail loading, file sizes not directly available from the media SDK. None of this had been visible in the mock layer.
One non-negotiable carried through: privacy-first. All media processing happens on-device. No cloud uploads, no external APIs.
After a seven-month pause (see §07), I returned and immediately tested on my own 9,000-item gallery. Opening the album list took 2–5 seconds. That broke the “start cleaning right away” promise from the very first design brief.
Root cause was architectural — the app was loading every media item for every album upfront, including synchronous file I/O on the main thread. The fix was a lightweight AlbumSummary model for the list view, with full items loaded on demand when a user enters an album. Later, a persistent JSON cache took the same principle further.
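The shape of that fix is a two-tier repository: cheap summaries for the list screen, full item loads deferred until an album is opened, with results cached. A TypeScript sketch under those assumptions (the type and class names are mine, not MediaMop's Dart code):

```typescript
// Lightweight model for the album list screen: metadata only,
// no per-item file I/O.
interface AlbumSummary {
  id: string;
  title: string;
  itemCount: number;
}

interface AlbumItems {
  albumId: string;
  itemIds: string[];
}

class AlbumRepository {
  private cache = new Map<string, AlbumItems>();

  constructor(
    private listSummaries: () => Promise<AlbumSummary[]>,
    private loadAlbum: (id: string) => Promise<AlbumItems>,
  ) {}

  // Album list screen: summaries only, so it opens fast.
  summaries(): Promise<AlbumSummary[]> {
    return this.listSummaries();
  }

  // Entering an album: load full items on demand, then cache,
  // so the loading state is paid once per album.
  async items(albumId: string): Promise<AlbumItems> {
    const cached = this.cache.get(albumId);
    if (cached) return cached;
    const loaded = await this.loadAlbum(albumId);
    this.cache.set(albumId, loaded);
    return loaded;
  }
}
```

A persistent JSON cache is the same principle extended across launches: the in-memory map above is simply warmed from disk at startup instead of starting empty.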
This wasn’t a backend optimization. It was a UX fix. The question wasn’t “how do we make this faster” — it was “which specific delay breaks the user’s trust, and how do we eliminate that one?”
Cut after building
Features sometimes have to be built before you can tell whether they earn their place. Every removal below was a small lesson in restraint.
The visual identity didn’t arrive through planning. It arrived through iteration across multiple sessions — and, critically, through living with an imperfect version long enough to feel what was wrong.
Ghibli Green
Forest & Mint
Cerulean Clarity
Gilded Midnight
Premium Flat
Fiery Ocean
I had to see each one in the real product to know it wasn’t right. Sketches and colour boards weren’t enough.
In an earlier phase, I explored three visual paradigms: glassmorphic (frosted glass), neumorphic (soft shadows), and premium flat (solid fills, subtle outlines).
Glassmorphism and neumorphism looked impressive in isolation. Across the full app, they muddied tap targets and broke visual hierarchy. The lesson was uncomfortable but useful: choosing aesthetics for portfolio appeal over user clarity is a trap.
The first Premium Flat pass put Mahogany Red in the primary slot — bold, fiery, distinctive. It felt right on day one. A week of living with it made something quieter clear: a dramatic red as the dominant surface colour made the app feel loud and warning-like, when cleaning should feel calm and considered. Red is the language of stop, and the whole app was asking users to go.
The fix wasn’t a new palette. It was a rebalance of the same three colours into different roles:
I saved the mood board as ocean fire.png and started calling the palette Fiery Ocean — an ocean of navy lit by warm fire and gilded light. Same ingredients as Phase 4, different recipe. That’s what shipped.
Between June 2025 and January 2026, the project sat untouched. My day job was 10–12 hour days, six days a week. When I came back, something had shifted — the time away had clarified my priorities. Ten days of focused work after the pause produced more than the first three days combined. Not every gap is wasted.
The album list fix introduced a brief loading state when entering an album. I accepted that trade-off because tapping an album already implies “loading its contents” — the delay maps to user expectations, so it reads as natural rather than broken.
After rapid feature development and a visual refactor, an entire day (14 commits) went to stabilization — virtual albums showing empty, doubled item counts, premature “Done” labels, sizes showing as 0B. The cost of fixing regressions late always exceeds the cost of testing incrementally.
Four days before the intended launch, Google rejected the app: MANAGE_EXTERNAL_STORAGE was “not a core feature.” The permission was a legacy leftover from month one, never cleaned up. The fix was one line; the lesson was about audit discipline.
● v1.0.0+3 · Submitted to Google Play
MediaMop v1.0.0+3 runs on my own 9,000-item gallery without the friction that started the project. Album lists open under 100 ms, app restart takes 400 ms, progress persists across sessions, and the swipe loop feels immediate.
What works
What could be better
There’s a strong temptation to keep building — more features, more refinement, more polish. But that would violate a principle that shaped the whole project:
Make it exist first. Then make it better.
This version is the “exist” milestone. Future iterations will focus on:
For now, focus is shifting to a new project, with continued iteration on MediaMop in parallel.
01
“Storage” was the surface. “Fear, time, and indecision” was the reality. Surveys that uncover the real problem are worth more than surveys that confirm your solution.
02
Validating the interaction on mocks, before device APIs, meant real-data problems were integration problems — not design problems.
03
A five-second load broke a design promise. The fix wasn’t “make it fast” — it was “identify the specific delay that erodes trust, and eliminate that one.”
04
Noah’s decision fatigue directly produced the skip button, ID-based persistence, and visible completion states. Bianca’s commute produced short interruptible sessions and fast startup. The features came from the personas, not from opinion.
05
The Stats screen’s storage breakdown made sense on paper, added clutter in practice, and got deleted. Simplification is a skill only reachable after complexity.
06
Glassmorphism and neumorphism looked great in isolation and hurt usability across the app. Portfolio aesthetics are not the same as user clarity.
07
Storing decisions by ID instead of index cost a little storage. Losing ten minutes of a user’s work costs their trust permanently. The asymmetry makes the choice obvious.
08
A seven-month pause I didn’t want turned into unintentional incubation. Time away from a project isn’t the same as time lost to it.