👩🏻‍💻 My Role
Lead Product & Interaction Designer
UX Strategy · Multimodal Interaction · AI Flows · Prototyping
Led vision, UX strategy, and execution for the AI-powered S Pen
Collaborated with a researcher, a computational engineer, and a product lead
Delivered prototypes that shaped the Galaxy AI roadmap (S24, Tab S9)
Created high-fidelity flows across sketch, gesture, and voice inputs
🗓️ Timeline
10+ months (2023–2024)
Concept through prototype delivery for Samsung SDIC and HQ
🔍 The Opportunity
From hardware tool to intelligent companion
The S Pen was widely used for handwriting and sketching—but it had untapped potential. With Samsung’s expanding ecosystem and the rise of generative AI, we saw an opportunity to evolve the pen into a context-aware productivity assistant—one that could anticipate user intent and support creativity across devices.
Key question:
How might we evolve the S Pen into a dynamic, multimodal assistant without overwhelming users?
🧠 Research & Insights
Users didn’t want more features—they wanted less friction, more flow.
We ran a robust, multi-phase research process:
🧪 220-person survey to gauge habits and frustrations
📓 10-person diary study to observe real-world S Pen workflows
🤝 22-person co-creation workshop to shape ideas collaboratively
✅ 380-person validation study to prioritize features
🔍 Key insights:
Switching devices broke momentum
Too many microtasks led to fatigue
AI felt disconnected from actual tasks
Users wanted assistive features—but only if they kept control

“I want the pen to do more—but I don’t want it to take over.”
💡 Design Vision
We imagined the S Pen not as a tool—but as a thinking companion.
A stylus that understands context, adapts across devices, and supports fluid productivity.
🎯 Core Design Pillars:
“Draw here. Speak there. Sync everywhere.”
| Pillar | Focus |
|---|---|
| Control | Fast sketch, gesture, and voice-powered interactions |
| Precision-Guided Creativity | AI-powered content enhancement, generation, and feedback |
| Omni-Sync | Cross-device fluidity: Phone ⇄ Tab ⇄ PC |
🧭 Moments That Matter
We identified four core productivity moments—and designed for them:
| Moment | Opportunity |
|---|---|
| Setting Up | Speed up formatting, layout, and prep work |
| Creating | Let users sketch, then AI suggests and enhances |
| Upskilling | Subtle prompts that teach while doing |
| Cross-Syncing | Let users move across devices without friction |
These moments grounded both our prototypes and prioritization.
✨ Design Principles
Our north star: AI should assist, not override.
Minimize Friction — Reduce micromanagement through smart defaults
Stay in Flow — Support continuity across modes and devices
Earn Trust — Make AI behavior transparent and adjustable
Preserve Agency — Every suggestion should be reviewable, never forced
🧪 Prototyping & Concepts
We designed and tested high-fidelity flows across multiple modalities: sketch, gesture, voice, and context.
📂 Dynamic S Panel
Your smartest tools, surfaced right when you need them.
Context-aware menu triggered by S Pen button
Adapts based on recent behavior (e.g. sketching, selecting text)
Floatable, flexible UI for fast access without interrupting work
Grows smarter with use—minimizing decision fatigue
ADAPTABLE TOOLS
A tailored menu, surfaced dynamically: built from your usage patterns and on-screen context, it learns and adapts over time for smarter suggestions.
INSTANT LOCATION
Position the tailored menu anywhere on screen instantly by pressing the S Pen button, without navigating the interface.
TIMELY ACTION
Act quickly through situational awareness: a primary, immediately actionable button comes first, with secondary options to refine your intent.
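To make the adaptation concrete for engineering partners, here is a minimal Kotlin sketch of the ranking idea behind the panel: filter actions by the current pen context, then order them by learned usage. All names are illustrative assumptions, not the production implementation.

```kotlin
// Minimal sketch (hypothetical names, not production Samsung code) of how the
// Dynamic S Panel could rank actions from recent behavior and on-screen context.

enum class PenContext { SKETCHING, TEXT_SELECTED, READING, IDLE }

data class PanelAction(val id: String, val label: String, val supports: Set<PenContext>)

class DynamicSPanel(private val actions: List<PanelAction>) {
    private val usageCounts = mutableMapOf<String, Int>()

    /** Called whenever the user completes an action, so the panel adapts over time. */
    fun recordUse(actionId: String) {
        usageCounts[actionId] = (usageCounts[actionId] ?: 0) + 1
    }

    /** On S Pen button press: filter by current context, then rank by learned usage. */
    fun actionsFor(context: PenContext, limit: Int = 4): List<PanelAction> =
        actions.filter { context in it.supports }
            .sortedByDescending { usageCounts[it.id] ?: 0 }
            .take(limit)
}

fun main() {
    val panel = DynamicSPanel(
        listOf(
            PanelAction("summarize", "Summarize", setOf(PenContext.TEXT_SELECTED, PenContext.READING)),
            PanelAction("sticker", "AI Sticker Doodle", setOf(PenContext.SKETCHING)),
            PanelAction("easy_edit", "Easy Edit", setOf(PenContext.SKETCHING, PenContext.TEXT_SELECTED)),
        )
    )
    panel.recordUse("easy_edit")
    println(panel.actionsFor(PenContext.SKETCHING).map { it.label })
    // [Easy Edit, AI Sticker Doodle]
}
```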
🗣️ Gesture + Voice Navigation
Hands-free productivity
Circle + speak: “Summarize this” → contextual action appears
Highlight + say: “Make it friendlier” → tone-shift suggestion
Works well for lean-back or multitasking modes
HANDS-FREE SMART CAPTURE
Scribe Sync combined handwriting and voice to streamline tasks. Users could jot down ‘May 22, Project Deadline,’ and the system would parse it and offer to add it to the calendar, with no need to stop sketching or browsing.
SKETCH IT, SAY IT, SEE IT
In Smart Studio, users could sketch and speak to bring ideas to life—like drawing a bird and saying ‘make it fly.’ It was a generative playground where voice and gesture enabled expressive, intent-driven creation.
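As a rough illustration of the “circle + speak” routing, the sketch below (hypothetical names, not the shipped system) shows how a circled selection and a transcribed utterance can resolve into a single contextual intent before any AI action runs.

```kotlin
// Minimal sketch (hypothetical API) of routing a "circle + speak" gesture:
// the circled content and the transcribed utterance are combined into one
// intent, which is what the contextual action UI then presents to the user.

data class Selection(val text: String)

sealed class PenIntent {
    data class Summarize(val selection: Selection) : PenIntent()
    data class ShiftTone(val selection: Selection, val tone: String) : PenIntent()
    data class Unknown(val utterance: String) : PenIntent()
}

fun resolveIntent(selection: Selection, utterance: String): PenIntent {
    val normalized = utterance.lowercase()
    return when {
        "summarize" in normalized -> PenIntent.Summarize(selection)
        "friendlier" in normalized -> PenIntent.ShiftTone(selection, tone = "friendly")
        else -> PenIntent.Unknown(utterance)
    }
}

fun main() {
    val circled = Selection("Quarterly results were below forecast across all regions.")
    println(resolveIntent(circled, "Summarize this"))      // resolves to Summarize(...)
    println(resolveIntent(circled, "Make it friendlier"))  // resolves to ShiftTone(..., friendly)
}
```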
✏️ AI Doodle + Easy Edit
Creative flow, enhanced by generative tools
Draw → AI turns doodles into visuals or stickers
Write → Suggests tone shifts or improved structure
Circle + scribble → AI autocompletes or refines content
AI STICKER DOODLE
From scribbles to stickers, instantly: Sketch or write in any language, and AI generates playful, contextual stickers and handles multilingual conversations automatically.
EASY EDIT
Turn sketches into polished content with AI: Circle, scribble, or sketch to generate images, rewrite headlines, or autocomplete paragraphs—right in the moment.
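The “suggest, never force” behavior behind Easy Edit can be sketched roughly as follows; the names and the stubbed generator are illustrative assumptions, not the real pipeline. A gesture produces a suggestion, and nothing replaces the user’s content until they preview and accept it.

```kotlin
// Minimal sketch (hypothetical names) of the suggest-then-accept pattern:
// a pen gesture yields a generative suggestion, but the original content is
// only replaced after the user previews and accepts it, so edits stay reversible.

enum class EditGesture { CIRCLE, SCRIBBLE, SKETCH }

data class Suggestion(val original: String, val proposed: String)

// Stand-in for the generative backend; a real system would call an on-device or cloud model.
fun propose(gesture: EditGesture, content: String): Suggestion = when (gesture) {
    EditGesture.CIRCLE -> Suggestion(content, "Autocompleted: $content ...")
    EditGesture.SCRIBBLE -> Suggestion(content, "Rewritten: ${content.trim()}")
    EditGesture.SKETCH -> Suggestion(content, "[generated image for \"$content\"]")
}

fun apply(suggestion: Suggestion, accepted: Boolean): String =
    if (accepted) suggestion.proposed else suggestion.original  // declining keeps the original

fun main() {
    val s = propose(EditGesture.SCRIBBLE, "q3 numbers look kinda bad ")
    println(apply(s, accepted = true))   // user previewed and accepted
    println(apply(s, accepted = false))  // declined: original content untouched
}
```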
🔄 Cross-Device Flows
Sketch anywhere. Continue anywhere.
AI preserves sketch state across Galaxy phone, Tab, and PC
Mid-stroke handoff enabled real-time continuity
No re-authentication or manual syncing
CROSS-DEVICE FLOWS
We wanted mobility to feel truly intelligent—so I worked closely with our computational engineers to prototype AI-powered handoff. Whether someone started on a foldable or tablet, they could pick up exactly where they left off—even mid-sketch. The AI maintained task state and session continuity, eliminating the need to log in again or manually sync files. It created a creative flow that followed the user—effortlessly.
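A simplified sketch of the handoff idea, using an in-memory stand-in for the sync service (the real flow ran on Samsung’s account and sync infrastructure): session state, including the in-progress stroke, travels with the signed-in user, so the next device resumes exactly where the last one stopped.

```kotlin
// Minimal sketch (hypothetical types; the sync service here is an in-memory stand-in)
// of mid-stroke continuity: serialize the session, keyed by the signed-in user,
// and restore it on whichever device picks up next.

data class Point(val x: Float, val y: Float)

data class SketchSession(
    val sessionId: String,
    val completedStrokes: List<List<Point>>,
    val inProgressStroke: List<Point>,   // mid-stroke state is part of the handoff
)

/** Stand-in for a cross-device session store keyed by the signed-in user. */
object SessionStore {
    private val sessions = mutableMapOf<String, SketchSession>()
    fun push(userId: String, session: SketchSession) { sessions[userId] = session }
    fun pull(userId: String): SketchSession? = sessions[userId]
}

fun main() {
    // On the phone: the user is mid-stroke when they pick up the tablet.
    SessionStore.push(
        userId = "user-123",
        session = SketchSession(
            sessionId = "slide-layout-01",
            completedStrokes = listOf(listOf(Point(0f, 0f), Point(10f, 4f))),
            inProgressStroke = listOf(Point(12f, 5f)),
        )
    )

    // On the tablet: same signed-in user, no re-authentication or manual sync.
    val resumed = SessionStore.pull("user-123")
    println("Resuming ${resumed?.sessionId} with ${resumed?.inProgressStroke?.size} in-flight point(s)")
}
```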
💫 Signature Experience Moments
We crafted 6 showcase moments to guide storytelling and testing:
| Scenario | Feature | Outcome |
|---|---|---|
| 🖊️ Sketching a slide layout | Creation Wizard | Instantly turns sketches into layout suggestions |
| 🎤 Speaking while circling text | Voice-to-Visual | AI suggests matching visuals and format upgrades |
| 👆 Drawing with gestures | Contextual Panel | Triggers smart options like summarize, rewrite, generate |
| ✍️ Handwriting a draft | Easy Edit | Cleans up, expands, or improves tone instantly |
| 📎 Highlighting tasks | Scribe Sync | Adds items to calendar or reminders with a tap and voice |
| 😄 Doodling a shape | AI Sticker Doodle | Transforms sketch into fun, expressive visuals |
🔐 Ethical AI & Accessibility
We designed every feature to build trust and respect diverse needs:
Transparent AI indicators with clear user control
All actions were reversible with preview and feedback
Sketch + voice inputs reduced reliance on tap targets
Onboarding relied on nudges, not tutorials
📈 Outcome & Real-World Impact
We didn’t just prototype—we shaped Samsung’s AI narrative.
✅ Product Impact
Unified gesture, sketch, and voice into one expressive interaction system
Built scalable UX for AI onboarding, contextual menus, and cross-device flows
Collaborated with engineering teams to ship and influence features seen in:
Galaxy S24
Galaxy Tab S9
Galaxy Watch 6
🚀 Features Shipped or Adopted
Dynamic S Panel
Easy Edit
AI Sticker Doodle
Voice Action + Prompt-to-Visual
Cross-device handoff
Note Assist, Sketch Assist (Galaxy AI 2024)
✨ “A pen that doesn’t just respond—but understands.”
📊 User Validation
From surveys and prioritization studies with over 600 users:

93% felt more productive using the AI-enhanced S Pen
80% said they’d use it more frequently
Top 5 most-used features:
Auto Sync
Easy Edit
AI Sticker Doodle
Prompt-to-Visual
Rewrite with Tone
Top 5 easiest to integrate:
Smart Menu
Sync Across Devices
AI Visual Generation
Tone Shift
Voice Action + Prompt
📦 SDIC Innovation in the Wild
Our work helped shape Samsung’s strategic narrative around AI-powered productivity—moving beyond hardware into contextual intelligence and intuitive creation.

From internal concept decks to Galaxy Unpacked releases, the work we led at SDIC proved that bold UX vision can drive roadmap and market relevance.
👀 Industry Influence
Even beyond Samsung, our work echoed across the industry:
Several S Pen features were later reflected in Apple Intelligence (June 2024), including:
Image Clean Up
Math Notes
Generative Emoji (Genmoji)

These overlaps weren’t coincidental—they validated our early explorations into expressive, assistive creation.
🔁 Reflection
This project reinforced that AI UX is about intentionality—not just intelligence.
Restraint is powerful – Not every AI moment needs automation
Prototypes drive clarity – Especially when communicating invisible systems
Flow is everything – Great UX protects momentum
If I could do it again, I’d push for earlier validation of AI gestures at scale. We had strong internal feedback, but broader user testing would’ve sharpened the interaction model across audiences and cultures.
————————