CASE STUDY: "Emovere" — An Interactive Emotional Landscape (FMP)
Research & Analysis
Methodology
To ensure a user-centered and empathetic design process, mixed-method research was employed:
Qualitative Research:
Structured interviews with 4 hostel mates and friends.
15 detailed questions covering emotional experiences, coping mechanisms, nature’s emotional impact, and preferences for sensory inputs.
Quantitative Research:
A Google Form survey distributed to a wider demographic, mostly aged 18–24.
Focused on preferences for colours, sounds, and gestures linked to emotional states.
Key Findings
Emotional Barriers in Existing Tools:
Users struggled to express feelings verbally.
Found traditional mood tracking apps clinical, cold, or data-heavy.
Expressed desire for symbolic and poetic ways to explore emotions.
Sensory Preferences:
Colour Associations:
Blue → calm
Yellow → joy
Red → anger
Grey → sadness
Sound Associations:
Nature sounds (rain, wind, forest ambience) were preferred for calm and emotional release.
Gestural Inputs:
Users enjoyed drawing and tapping as emotional outputs.
No users reported discomfort with gesture-based input.
Empathy Mapping & User Personas
Empathy Map Highlights:
Think & Feel: “I want something easy to use after a stressful day.”
Say & Do: “Nature makes me feel calmer.”
Pain Points:
Digital fatigue and emotional overwhelm.
Dislike for apps that feel too “clinical.”
Gains:
Private, poetic emotional processing.
A calming experience, not just emotional data collection.
User Personas:
Created based on research insights, with differing goals:
A stressed student needing short emotional breaks.
A quiet creative seeking a safe space for self-reflection.
A busy professional looking for an alternative to therapy apps.
Competitive Analysis
Analyzed four emotional well-being apps: Reflectly, Moodnotes, Daylio, and Wysa.
Gap Identified: No app used multisensory, metaphorical, and nature-inspired interaction for emotional reflection.
Introduction & Context
In today’s fast-paced digital environment, users often find themselves overstimulated, emotionally fatigued, and lacking emotionally supportive online spaces. While much of the digital world prioritizes productivity, consumption, and performance, few platforms offer a calming, reflective experience where users can process emotions non-judgmentally and creatively.
Emovere addresses this gap. It is a sensory-based, nature-inspired digital experience that transforms user emotions into metaphorical visuals, such as growing trees, falling rain, or wilting flowers, through multimodal input: gesture, sound, and colour. Built with p5.js and rendered in pointillist and pixel-art styles, the app offers a meditative interaction with visual and audio feedback, helping users engage in emotional self-awareness and personal reflection.
By prioritizing calm, emotional inclusivity, and non-verbal interaction, Emovere creates a space where users can explore and express their feelings without pressure, diagnosis, or judgment. The project is not a clinical tool but a creative emotional mirror designed to provide solace, mindfulness, and introspection in a screen-saturated world.
Ideation & Experimentation
Conceptual Development
After thorough research, the ideation phase focused on translating emotional input into nature-inspired visuals using metaphor. The goal was to turn a fleeting feeling into a poetic representation that grows, moves, or fades—like real emotions.
Emotion-to-Metaphor Mapping:
Joy → a growing tree
Sadness → gentle rainfall
Anger → a wilting flower
Calm → flowing wind
These mappings were grounded in psychological associations, user responses, and personal symbolism drawn from the designer’s own emotional connections to nature.
Visual Exploration
Natural Inspiration:
The designer collected real leaves, flowers, and nature textures during walks in green areas.
Created a visual colour board using real-world palettes from these explorations.
Used sketches to abstract nature elements into digital forms.
Visual Styles:
Inspired by Pointillism (Seurat, Signac) to create layered, emotionally rich visuals.
Merged with pixel art to create a digital-nostalgic look, evoking softness, warmth, and familiarity.
User Journey & Interaction Flow
Mapped user journey from emotional confusion to reflection:
Landing Page – Gentle introduction to the concept.
Onboarding – Tooltips explaining sensory input.
Input Page – User selects colour, sound, gesture.
Emotion Recognition – System affirms emotion with visuals.
Forest Scene – Visual metaphor unfolds dynamically.
Reflection Loop – Users return to observe their emotional forest grow.
Prototyping
Low-Fidelity Sketches: Quick iterations of UI structure, emotion metaphors, and interactions.
High-Fidelity Prototypes: Created in Figma, tested with peers and mentors.
Feedback:
Users loved the calming palette and metaphoric visuals.
Adjustments made to button placement, tooltip clarity, and onboarding flow.
Testing & Refinement
Testing played a key role in refining Emovere and ensuring it provided a calming, intuitive, and emotionally engaging experience.
User Testing Process
Testing was conducted with:
Classmates
External participants
Lecturers and mentors
Participants interacted with the prototype and provided feedback on usability, interaction, visuals, emotional clarity, and overall mood.
Feedback Themes & Insights
Visual Feedback
Users praised:
Pixelated illustrations
3D trees and animated forest
Weather effects (rain, wind)
Pointillist-style backgrounds
These elements were retained and enhanced for better consistency and emotional tone.
Interaction Feedback
Touch gestures (tap, scroll, draw) were well-received for their meditative quality
Morphing text was reduced for readability
Sound selections were limited to natural tones to match the visual theme
Usability Feedback
Navigation clarity was an early issue
Solutions included:
Tooltips for interaction boxes
Highlighting selected elements (colour/sound/gesture)
Adding introductory popups to guide users on their first visit
Improving visibility of buttons with hover effects and contrast
Splitting content across balanced layout sections
Final Outcomes
The iterative testing process led to a more polished, emotionally resonant experience. Key improvements:
Improved interaction clarity
More accessible and guided navigation
Enhanced visual balance and responsiveness
Deeper emotional engagement through natural metaphors and multisensory elements
The result is a cohesive, meditative application that encourages users to explore and process emotions intuitively and creatively.
Development Process
The development of Emovere was an iterative and deeply creative process, combining front-end web technologies, interactive design, and experimental creative coding. The focus was on crafting an immersive and intuitive emotional journey, blending pixel art, sound, motion, and sensory feedback.
Tools & Technologies
The core development stack included:
Figma / Illustrator – for prototyping and visual asset creation
HTML / CSS / JS – to build and style each page, manage event behavior, and handle DOM interaction
p5.js – for canvas-based drawing, pixel effects, interactive animations, 3D elements, and gesture-based input
WEBGL (within p5.js) – to handle 3D rendering of trees and spatial interactions
Toastify.js – for styled alerts and warnings
Font Awesome – to simplify icon-based UI elements for sound and emotion inputs
Learning p5.js
My journey with p5.js began through self-led experimentation, starting with loading images and learning how to animate shapes and track interactions. The p5.js community’s tutorials and documentation played a crucial role, helping me understand features like preload(), draw(), WEBGL, and input events.
Key skills gained:
Managing preloaded assets (images, fonts, sound)
Implementing animations and effects like pointillism and pixelation
Handling gesture-based interactions (tap, swipe, draw, scroll)
Using arrays and variables to control dynamic visuals
Designing interactive elements like fireworks, flowers, clouds, and bees
Integrating multiple canvas sketches using instance mode
Mapping Emotion Inputs
The second page allowed users to express themselves through:
Sound selection
Colour choice
Gesture interaction (touch-based inputs)
Each input was mapped to emotional categories based on psychological research. Arrays were used to store selections, and a JS function determined the dominant emotion—or blend—by tallying user input. These results then triggered specific visuals and animations in subsequent screens.
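A minimal sketch of how such a tally could work. The emotion labels mirror the case study, but the function itself and its scoring scheme are illustrative assumptions, not the project's actual code:

```javascript
// Illustrative sketch: tally the user's multimodal selections and
// return the dominant emotion, or a blend when the top scores tie.
function dominantEmotion(selections) {
  // selections: one emotion label per input (colour, sound, gesture),
  // e.g. ["joy", "calm", "joy"]
  const counts = {};
  for (const emotion of selections) {
    counts[emotion] = (counts[emotion] || 0) + 1;
  }
  const max = Math.max(...Object.values(counts));
  const top = Object.keys(counts).filter((e) => counts[e] === max);
  // A single winner is the dominant emotion; ties become a blend.
  return top.length === 1 ? top[0] : { blend: top };
}

// Example: colour and gesture both mapped to joy, sound to calm.
console.log(dominantEmotion(["joy", "calm", "joy"])); // "joy"
```

The returned label (or blend) can then be used to pick which visual metaphor the next screen renders.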
Forest & Emotion Mapping
The final screens visualized emotions through nature metaphors:
Joy → Growing Trees
Sadness → Rain and falling drops
Calm → Wind with fluttering leaves
Anger → Wilting Flowers
Love → Flying bees and vines
Mixed Emotions → Pointillism with subtle transitions
Each element used pixel art combined with motion and layered interactions to create a living emotional landscape.
The WEBGL environment allowed me to render 3D trees that responded to joy count, while wind, rain, and flowers were dynamically animated to reflect emotional patterns over time.
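One way the joy count could drive tree growth is by mapping it onto a clamped height. The range values below are illustrative assumptions, and the commented p5.js usage is a sketch rather than the project's actual draw loop:

```javascript
// Hypothetical helper: convert an accumulated joy count into a tree
// height, clamped so the tree stops growing past a maximum.
function treeHeight(joyCount, minH = 40, maxH = 200, maxJoy = 10) {
  const t = Math.min(joyCount, maxJoy) / maxJoy; // normalise to 0..1
  return minH + t * (maxH - minH);
}

// In a p5.js WEBGL sketch this height could drive a simple trunk:
//
//   function draw() {
//     background(220);
//     const h = treeHeight(joyCount);
//     push();
//     translate(0, h / 2, 0);  // sit the trunk on the ground plane
//     cylinder(8, h);          // p5's built-in 3D cylinder
//     pop();
//   }
```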
Web Storage & Responsiveness
To maintain emotional history, I utilized the Web Storage API, enabling the app to keep track of users' past emotions and adapt visuals accordingly. For responsive design:
All element positions were calculated using percentages
The windowResized() method dynamically adjusted layouts
Each canvas was built to scale across devices, ensuring usability and accessibility
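A sketch of how a capped emotion history could be persisted with the Web Storage API. The storage key and entry cap are assumptions; the project's real scheme may differ:

```javascript
// Illustrative persistence of a capped emotion history.
const HISTORY_KEY = "emovere-history"; // hypothetical key
const MAX_ENTRIES = 50;                // hypothetical cap

// Pure helper: append an entry and drop the oldest beyond the cap.
function appendEmotion(history, emotion) {
  const next = [...history, emotion];
  return next.slice(Math.max(0, next.length - MAX_ENTRIES));
}

// Browser-only wrappers around the Web Storage API:
function loadHistory() {
  return JSON.parse(localStorage.getItem(HISTORY_KEY) || "[]");
}

function saveEmotion(emotion) {
  localStorage.setItem(
    HISTORY_KEY,
    JSON.stringify(appendEmotion(loadHistory(), emotion))
  );
}
```

On each visit the stored history can be read back to decide how grown the user's forest should appear.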
Additional Features
SVG morphing for landing visuals
Audio control via p5.sound
Guided Tour: Interactive popups were added to explain sections, triggered only for new users
Pixel Shuffle Effects: Hover animations applied to emotion icons for engagement
Reset Function: Added during testing to reset emotional data
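The reset might be sketched as below. The state shape, emotion names, and storage key are assumptions introduced for illustration:

```javascript
// Hypothetical reset: return the in-memory tallies to a neutral
// state and clear the session's history.
function resetEmotionData(state) {
  const cleared = { joy: 0, sadness: 0, calm: 0, anger: 0, love: 0 };
  // In the browser, the persisted history would also be removed:
  // localStorage.removeItem("emovere-history");
  return { ...state, counts: cleared, history: [] };
}
```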