Storytelling for Alexa+

A cross-platform, multimodal storytelling system that helps customers create, personalize, and revisit emotionally rich stories by blending real-time generative narratives with music, visuals, and interactive elements across Echo, mobile, and web.

Launch Year

2025

Platform

Mobile, web, multimodal

Role

Senior Product Designer

Outcome

This project explored how generative AI could move Alexa storytelling beyond one-off, linear stories into experiences that adapt to people, moods, and moments over time. Existing story experiences often struggled with revisions, tone changes, and continuity, making them feel fragile and difficult to return to.

The goal was to design a unified storytelling system that works across Echo devices, the Alexa mobile app, and web, serving kids, adults, and families in a single experience. Rather than splitting audiences into separate products, the system adjusts its depth, tone, and interaction style based on how someone wants to engage.

Customers can actively participate by making choices or changing the story’s direction, or they can simply listen as it unfolds. These modes can shift mid-story, allowing the experience to fit naturally into moments like bedtime routines, background listening, or focused solo sessions. Real-time personalization and save-and-resume continuity allow stories to evolve across sessions and devices, positioning Alexa as a creative companion rather than a transactional assistant.

My Role

I led UX concept development and experience strategy, partnering with product and engineering to define how generative storytelling could live natively within the Alexa ecosystem.

My focus was shaping the end-to-end experience model, including how stories are discovered, initiated, personalized, visually expressed, interacted with, saved, and resumed across devices. I translated customer needs, voice-of-customer feedback, and competitive gaps into a cohesive UX vision, while defining interaction patterns that could scale across genres, age groups, and modalities.

Because this was a platform-level effort rather than a single feature, my work emphasized reusable systems that could support everything from short kids’ stories to longer, episodic adult narratives.

My Design Focus

Stories that feel alive

A defining aspect of this work was designing a real-time story generation experience that blends narrative, music, visuals, and interaction into a cohesive system.

Rather than treating stories as text read aloud, the experience was designed to feel like a living storybook. As a story unfolds, on-theme music, adaptive sound design, generative imagery, and expressive text styling reinforce mood and pacing. When a customer makes a choice or changes tone, the visuals, music, and rhythm shift with the narrative. This responsiveness makes the experience feel intentional and immersive.


Grounding imagination in real moments

I explored how generative storytelling could be grounded in personally meaningful inputs instead of existing purely in fantasy. The system can optionally draw from a customer’s interests and recent photo albums, such as a family camping trip or weekend getaway, and reinterpret those moments as beautifully told stories with a creative twist.

For example, camping photos can become an illustrated adventure where familiar people and places appear in a magical or exaggerated setting. This anchors exploration in real memory, transforming everyday moments into stories that feel intimate and worth revisiting.


Making storytelling feel coherent and trustworthy

Customer feedback showed that early generative stories often felt unreliable. Details like names or character traits were lost, revisions contradicted earlier scenes, and tone changes did not always land.

I focused on interaction models that make behavior predictable and understandable. Stories were structured into clear beats such as setup, choice, consequence, and wrap-up, so customers could see how their input influenced what came next. Explicit controls like “make it scarier,” “change the ending,” or “speed things up” were treated as supported actions, helping customers iterate with confidence.

How this work affects my design approach

This project shows that as generative AI becomes more powerful, experience design becomes more critical. Customers need structure, clarity, and emotional grounding to trust and enjoy open-ended systems.


I bring experience designing platform-level UX for generative products and shaping real-time, multimodal experiences that make advanced AI feel personal, expressive, and human.

38%

increase in story completion when real-time visuals and music were introduced

2.1x

repeat story engagement when stories were personalized with interests or memories

55%

reduction in abandonment when customers could switch between interactive and listening modes

Let's work together!

LinkedIn

Resume

©2025 Dani Tuchman

Monday, 2/9/2026
