Helping people with memory loss live independently, with dignity, and stay connected to their loved ones.
RealityHack 2026 Submission
We started with a simple question: What is memory, and how can technology enhance it?
Ray-Ban Meta glasses are like a second pair of eyes and ears—a friend who's always with you, experiencing life from your perspective. They capture what you see and hear throughout the day, turning fleeting moments into something that can be recalled later. For memory, this is transformative: the glasses become a living record of your day, ready to remind you of what just happened.
Meta Quest offers something different but equally powerful: the ability to relive experiences in a way that feels close to reality. That deep sense of immersion isn't just entertaining—it helps the brain consolidate memories more effectively. Reviewing the day's highlights in VR isn't passive watching; it's active re-experiencing.
Together, these devices form a complete memory loop: capture → recall → reinforce.
From there, we asked: Who would benefit most from enhanced memory?
The answer led us to people living with dementia and Alzheimer's—and the caregivers who love them.
For someone with dementia, every day can feel like walking through fog:
- "What was I just doing?" — Forgetting intentions within minutes leads to unfinished tasks and frustration.
- "Who is this person talking to me?" — Not recognizing a family member causes deep shame and social withdrawal.
- "Did I take my medication?" — Constant uncertainty creates anxiety and repetitive checking.
- "How do I get home?" — Wandering away from safe areas puts lives at risk and exhausts caregivers.
For caregivers, the burden is immense: constant supervision, sleepless nights, and the emotional weight of watching a loved one struggle.
We built a "memory scaffold" — a system that works quietly in the background to:
- Capture & Encode: Automatically turn the day's moments into simple, meaningful "memory cards."
- Recall & Guide: When the user is confused, provide gentle, step-by-step help to get back on track.
- Protect & Escalate: Detect risky situations (like leaving the house unexpectedly) and alert caregivers with context—not just an alarm.
The Ray-Ban Meta smart glasses are the always-present companion that sees and hears what the wearer experiences. They blend into everyday life like regular sunglasses while quietly providing support.
- Meta Wearables SDK (DAT): Uses `MWDATCamera` for H.264 video streaming and frame-by-frame capture from the head-mounted camera.
- Multi-Modal Query Pipeline: When the user asks a question (via long-press), the system captures:
  - A head-POV image (what the wearer is looking at)
  - Speech-to-text via Apple's `SFSpeechRecognizer`
  - GPS coordinates via CoreLocation
- WebSocket Transport: Sends JSON payloads to the backend:

```json
{
  "text": "Who is the person in front of me?",
  "imageURL": "https://storage.googleapis.com/query_123.jpg",
  "latitude": 42.3601,
  "longitude": -71.0942
}
```

- Voice UI (VUI): The response is read aloud via `AVSpeechSynthesizer` directly through the glasses' open-ear speakers. The caregiver companion app shows a live transcription HUD.
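The glasses client is Swift, but the payload contract is language-agnostic. A small Python sketch of assembling that query body (field names taken from the example payload; the transport and helper name are assumptions):

```python
import json

def build_query_payload(text: str, image_url: str,
                        latitude: float, longitude: float) -> str:
    """Assemble the JSON body sent over the query WebSocket.
    Keys mirror the example payload the glasses send."""
    return json.dumps({
        "text": text,
        "imageURL": image_url,
        "latitude": latitude,
        "longitude": longitude,
    })

payload = build_query_payload(
    "Who is the person in front of me?",
    "https://storage.googleapis.com/query_123.jpg",
    42.3601, -71.0942,
)
```

Shipping a pre-uploaded `imageURL` rather than raw image bytes keeps the WebSocket message small and lets the backend fetch the snapshot from Cloud Storage at its own pace.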
- 🕒 "What did I just do?" — 10-minute rewind with AI-summarized recap
- 👥 "Who is this?" — Gaze-triggered face recognition with discreet audio whisper
- 📍 "Where are my keys?" — Last-seen object tracking with photo + timestamp
- 🛡️ Safety Watch — Geofence monitoring with gentle escalation
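The Safety Watch geofence reduces to a distance check against a configured safe zone. A minimal sketch, assuming a circular zone (the home coordinates reuse the example payload; the 200 m radius is a hypothetical default):

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two GPS points."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def outside_safe_zone(lat: float, lon: float,
                      home: tuple[float, float] = (42.3601, -71.0942),
                      radius_m: float = 200.0) -> bool:
    """True when the wearer has left the configured safe zone."""
    return haversine_m(lat, lon, *home) > radius_m
```

When this flips to `True`, the "gentle escalation" step would attach context (last memory card, current location) to the caregiver alert rather than firing a bare alarm.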
The "Cloud Brain" that understands, remembers, and protects.
- Gemini 2.0 Flash: Multi-modal reasoning over images + text + location context
- Cloud Firestore: Persistent memory storage (people, places, events)
- Cloud Storage (GCS): Image upload pipeline for query snapshots
- Cloud TTS API: High-quality voice synthesis for audio responses
- Custom RAG Pipeline: Retrieval-Augmented Generation for personalized memory search
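To illustrate the RAG step: retrieve the stored memories most relevant to the question, then inject them into the model prompt as context. This sketch ranks by simple word overlap as a stand-in for the embedding similarity a production pipeline would use; all names and sample memories are illustrative:

```python
def retrieve_memories(query: str, memories: list[str], k: int = 2) -> list[str]:
    """Rank stored memory summaries by word overlap with the query
    (a toy stand-in for embedding-based retrieval)."""
    q = set(query.lower().split())
    scored = sorted(memories, key=lambda m: -len(q & set(m.lower().split())))
    return scored[:k]

def build_prompt(query: str, memories: list[str]) -> str:
    """Inject the retrieved memories into the model prompt as context."""
    context = "\n".join(f"- {m}" for m in retrieve_memories(query, memories))
    return f"Known memories:\n{context}\n\nQuestion: {query}"

mems = [
    "Maria visited this morning and brought groceries",
    "Keys were last seen on the hallway table at 2pm",
    "Took medication after breakfast",
]
print(build_prompt("where are my keys", mems))
```

Grounding the model in retrieved personal memories is what makes answers specific ("on the hallway table at 2pm") instead of generic, and it keeps the prompt short enough for low-latency voice responses.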
- `POST /upload/{capture_id}` — Upload images from the glasses
- `WSS /ws/query/{user_id}` — Real-time query/response WebSocket
- `GET /memories/{user_id}` — Retrieve stored memories for the caregiver dashboard
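A framework-agnostic sketch of the contract behind the upload and memories routes. In the real backend these would be HTTP handlers backed by Cloud Storage and Firestore; the in-memory dict and function names here are stand-ins:

```python
from datetime import datetime, timezone

MEMORY_DB: dict[str, list[dict]] = {}  # stand-in for Firestore

def upload_capture(capture_id: str, image: bytes) -> dict:
    """POST /upload/{capture_id}: store the snapshot, return its public URL
    (a real handler would write `image` to a GCS bucket first)."""
    return {"imageURL": f"https://storage.googleapis.com/{capture_id}.jpg"}

def record_memory(user_id: str, summary: str, image_url: str) -> None:
    """Persist one processed memory for later dashboard retrieval."""
    MEMORY_DB.setdefault(user_id, []).append({
        "summary": summary,
        "imageURL": image_url,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

def get_memories(user_id: str) -> list[dict]:
    """GET /memories/{user_id}: feed the caregiver dashboard timeline."""
    return MEMORY_DB.get(user_id, [])
```

Separating upload (fast, from the glasses) from memory recording (after AI processing) means the glasses never block on Gemini latency.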
At night, the Quest headset transforms captured moments into immersive cognitive therapy.
Research on immersive reminiscence therapy suggests that VR environments help the brain consolidate memories more effectively than flat-screen review. Instead of passively watching a highlight reel, the wearer is inside their memories—the deep sense of presence strengthens recall.
- Unity 2022.3: Built with Meta XR SDK for Quest 3
- ElevenLabs API: Natural voice narration for the memory journal
- Daily Highlights Sync: Curated moments from Ray-Ban capture are packaged into a VR session
- Interactive Q&A: Gamified prompts ("Who visited today?") with progressive visual hints
- 🎬 Immersive Replay: Relive highlights in panoramic/theater mode
- 🧠 Cognitive Reinforcement: Spaced retrieval questions strengthen long-term memory
- 👨‍👩‍👧 Shared Viewing: Remote family can join the virtual space for social connection
- ✅ Positive Reinforcement: "You remembered!" feedback loop
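The spaced-retrieval scheduling behind the Cognitive Reinforcement feature can be sketched in a few lines. The doubling rule and one-minute base interval are common in spaced-retrieval practice but are assumptions here, not the app's tuned values:

```python
from datetime import timedelta

def next_review_delay(streak: int, base_minutes: int = 1) -> timedelta:
    """Spaced retrieval: double the interval after each consecutive
    successful recall; a miss resets `streak` to 0."""
    return timedelta(minutes=base_minutes * (2 ** streak))

# streak 0 -> 1 min, streak 1 -> 2 min, streak 2 -> 4 min, ...
```

Expanding intervals on success keeps questions easy enough to answer correctly most of the time, which is exactly what feeds the "You remembered!" positive-reinforcement loop.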
Caregivers aren't just watching passively—they're configuring the system to work best for their loved one.
- 3D Home Map: Mark specific spots with custom reminders (e.g., "Kitchen: Turn off the stove")
- People Profiles: Upload photos and details for the "Who's Who" face database
- Scheduled Reminders: Time-based nudges synced to the glasses
- Location Boundaries: Define safe zones with exit alerts
- Memory Timeline: Review captured moments and AI interactions
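The reminder types above (location-tagged spots and scheduled nudges) can share one record shape. A minimal sketch of deciding which reminders to speak on the glasses; the records and field names are illustrative, not the actual data model:

```python
from datetime import datetime

# Hypothetical reminder records combining the dashboard's location-tagged
# and time-based reminders.
REMINDERS = [
    {"text": "Turn off the stove", "spot": "Kitchen", "time": None},
    {"text": "Take evening medication", "spot": None, "time": "20:00"},
]

def due_reminders(now: datetime, current_spot: str) -> list[str]:
    """Return reminder texts that should be spoken on the glasses now:
    either the wearer is at the tagged spot, or the scheduled time matches."""
    due = []
    for r in REMINDERS:
        if r["spot"] == current_spot:
            due.append(r["text"])
        elif r["time"] and now.strftime("%H:%M") == r["time"]:
            due.append(r["text"])
    return due
```

Evaluating reminders against the wearer's current spot (from the 3D home map) rather than on a fixed timer is what makes a nudge like "Kitchen: Turn off the stove" fire at the moment it's actionable.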
We want to give people with memory loss more good days—days where they feel capable, respected, and connected. And we want to give caregivers peace of mind, knowing that their loved one has a gentle, always-present support system by their side.
RealityHack 2026 — Team OurLife