Christoph Fahlbusch
Native app design, strategy, and systems
AI Field Note · April 21, 2026
4 days, 23 signals, 0 Figma files
I recently used AI to go from research synthesis to a working iOS prototype in 4 days.
Not a Figma prototype, but a real SwiftUI prototype with actual data flows, animations, and state management.
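To give a sense of what "real" means here, the prototype was built on patterns roughly like the sketch below. This is a minimal, hypothetical example, not code from the actual project; the type names and sample strings are mine.

```swift
import SwiftUI

// Hypothetical model for one synthesized research signal driving a prototype screen.
// Names and sample data are illustrative, not from the real app.
struct Signal: Identifiable {
    let id = UUID()
    let title: String
    let source: String
}

// Simple observable store backing the prototype with live data instead of static mockups.
final class SignalStore: ObservableObject {
    @Published var signals: [Signal] = [
        Signal(title: "Setup takes too many steps", source: "internal study"),
        Signal(title: "No offline support", source: "developer forum")
    ]
}

struct SignalListView: View {
    @StateObject private var store = SignalStore()

    var body: some View {
        List(store.signals) { signal in
            VStack(alignment: .leading, spacing: 2) {
                Text(signal.title)
                Text(signal.source)
                    .font(.caption)
                    .foregroundStyle(.secondary)
            }
        }
        // Animate list changes so state updates read as motion, not jumps.
        .animation(.default, value: store.signals.count)
    }
}
```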
I work on native apps for iOS and Android. We had years of user research across different product areas, plus community feedback from developer forums and competitive signals from across the industry. A lot of it pointed in similar directions, but it was spread across too many documents, studies, and threads.
Getting all of that into one coherent picture, and then turning it into design decisions you can actually defend, would normally take weeks and involve several disciplines: research and product at least. For this specifically, engineering would also need to be heavily involved to build a SwiftUI prototype with real data and on-device algorithms.
I pointed Copilot at our internal research and told it to go broad. Not just the work tied to my immediate area, but anything across the company that touched the same problem space. I also had it pull in external signals: competitor patterns, community pain points, and broader UX research.
In ~30 minutes, I had structured synthesis docs across four major areas. Each one included ranked pain points, supporting data, user quotes, competitive benchmarks, and design implications.
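To make that structure concrete, here is roughly how I would encode one synthesis entry if it were data rather than prose. A hypothetical sketch; the field names are my own framing, not the actual doc format.

```swift
// Hypothetical shape of one entry in a synthesis doc. Field names are my framing of
// "ranked pain points, supporting data, user quotes, competitive benchmarks, and design implications".
struct SynthesisEntry {
    let painPoint: String
    let rank: Int                      // 1 = most severe / most frequently reported
    let supportingData: [String]       // study names, metrics, forum threads
    let userQuotes: [String]
    let competitiveBenchmarks: [String]
    let designImplication: String
}
```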
That alone was already super useful, but the more interesting part came after.
Copilot became a real design collaborator on the SwiftUI prototype in VS Code. Because it had all of that research in context, it could push back on weak ideas in a way that was actually useful. When I proposed something, it could point to which studies supported it, which ones contradicted it, and how others in the space were solving similar problems.
It felt a bit like having a very experienced design engineer with full context and instant recall across a huge amount of work.
To be clear, this did not replace design judgment.
The craft decisions were still mine. Information hierarchy, interaction patterns, motion, polish: all of that still needs taste, and AI mostly does not have taste yet. But it holds a lot of context and is usually much faster than a human, especially at going from zero to a working SwiftUI prototype with real data, all backed by that research.
And having Copilot present through the entire process, holding onto every relevant finding across product areas, made the work faster, sharper, and easier to defend.
The outcome was a working prototype across four major areas of the app, where each decision could be traced back to a documented pain point or external signal.
So when this goes out via TestFlight, the conversation changes a bit. We are not asking "how does this fit into your workflow?" in the abstract. We are asking: we heard you say this, and this is how we're responding to it. Did we get it right?
4 days, 23 signals (13 internal, 10 external), 0 Figma files. Absolutely wild times!