L.I.B.R.A.

Layered Interpretation of Behavioral & Reflective Affect

Tools:

Figma, Blender, Veo Studio, OpenAI API
January 2026 – June 2026
Role:

Concept, Research, UX Strategy, Psychology, Ethical Framework, Interaction Design

An Ambient AI System for Emotional Balance in the Age of Negative Media

LIBRA is an AI-powered, system-level integration designed to live quietly within a user’s phone, helping them become aware of the emotional impact of the content they consume — without surveillance, judgment, or restriction.

Rather than blocking content or enforcing limits, LIBRA acts as a reflective layer—helping users notice patterns and recalibrate when emotional strain accumulates.

What was built

LIBRA is not an app you open.
It is an ambient, system-level presence.

LIBRA:

  • Runs entirely on-device

  • Analyzes patterns, not individual moments

  • Detects emotional tone across news, video, audio, and social content

  • Tracks duration and repetition of emotionally intense exposure

  • Never reads messages or records conversations

  • Never diagnoses mental health conditions

  • Never sends data off-device

Its role is not constant intervention, but awareness when emotional weight accumulates.
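The pattern-over-moments principle above can be illustrated with a minimal sketch. Everything here is a hypothetical assumption for illustration — the class names, the tone scoring, and the thresholds are not LIBRA's actual implementation:

```python
from dataclasses import dataclass
from collections import deque

# Hypothetical sketch: names, scores, and thresholds are illustrative
# assumptions, not the real LIBRA implementation.

@dataclass
class Exposure:
    tone: float      # -1.0 (very negative) .. 1.0 (very positive), from an on-device model
    minutes: float   # duration of the exposure

class ReflectiveLayer:
    """Aggregates exposure patterns in a rolling window; stores no content."""

    def __init__(self, window: int = 20, strain_threshold: float = 15.0):
        self.window = deque(maxlen=window)  # only recent pattern data is kept
        self.strain_threshold = strain_threshold

    def observe(self, e: Exposure) -> None:
        self.window.append(e)

    def accumulated_strain(self) -> float:
        # Weight negative tone by duration; positive content simply
        # contributes zero strain.
        return sum(max(0.0, -e.tone) * e.minutes for e in self.window)

    def should_prompt_reflection(self) -> bool:
        # Surface gentle awareness only when strain accumulates across
        # sessions; a single intense moment never triggers anything.
        return self.accumulated_strain() > self.strain_threshold

layer = ReflectiveLayer()
for tone, minutes in [(-0.8, 10), (-0.6, 12), (0.5, 30), (-0.9, 8)]:
    layer.observe(Exposure(tone, minutes))
print(layer.should_prompt_reflection())  # True: strain 22.4 exceeds 15.0
```

The design choice this sketch captures is the one the bullets name: no single data point triggers anything, and nothing but aggregate pattern statistics ever leaves the rolling window.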

The Problem

Modern digital platforms are optimized for engagement—not emotional health.

Users are:

  • Exposed to persistent negative news cycles

  • Doomscrolling late into the night

  • Absorbing emotionally intense content without pause

  • Rarely prompted to reflect on how that content affects them

Psychological research shows that prolonged exposure to negative media increases anxiety, emotional fatigue, and stress—especially when combined with repetition and lack of agency.

Most existing “digital wellbeing” tools measure time spent, while ignoring emotional impact.

The core question:
How might we design an AI system that helps people understand how media affects them emotionally—without becoming invasive, prescriptive, or paternalistic?
