Dayli AI

An AI-powered assistant connecting users with assistive technologies and community resources — built by a collaborative team at UNT.

Project Type Team Project
My Role UX Engineer (Technical Lead)
Course HCI 2 & Maker Lab
Duration 2026 – Present
In Progress

The Problem

People with disabilities and their caregivers often struggle to discover the right assistive technologies and community resources. Existing solutions are fragmented, hard to navigate, and not personalized to individual needs. There is no single platform that understands a user's context and proactively recommends relevant support.

The Team

Dayli AI is a collaborative effort across our UNT coursework. Each team member brings complementary skills to the project.

Dhivya Prabhakaran
UX Engineer (Technical Lead)

Team Members
UX Research & Design
As the UX Engineer and technical lead, I bridge the gap between design decisions and technical implementation — ensuring what we design is not only user-centered but also technically feasible and production-ready.

My Contributions

Technical Architecture & Development

I own the technical side of Dayli AI, making key architecture decisions and building the working prototype from the ground up.

UX Engineering

Beyond pure development, I contribute directly to the design process, bringing the perspective of someone who builds what we design.

What We're Building

Conversational AI

Natural language interactions that help users articulate their needs and receive personalized recommendations.
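Under the hood, a conversational flow like this comes down to turning the captured user context into a grounded model prompt. A minimal TypeScript sketch (the `UserContext` shape and `buildSystemPrompt` helper are illustrative, not the project's actual code):

```typescript
// Hypothetical shape of the context gathered during onboarding.
interface UserContext {
  role: "seeker" | "caregiver";
  disabilityType: string;
  dailyChallenges: string[];
  goals: string[];
}

// Assemble a system prompt so the LLM grounds its recommendations
// in the user's stated needs rather than generic advice.
function buildSystemPrompt(ctx: UserContext): string {
  const perspective =
    ctx.role === "seeker"
      ? "The user is describing their own needs."
      : "The user is a caregiver helping someone else.";
  return [
    "You are Dayli AI, an assistant that recommends assistive technologies and community resources.",
    perspective,
    `Disability context: ${ctx.disabilityType}.`,
    `Daily challenges: ${ctx.dailyChallenges.join("; ")}.`,
    `Goals: ${ctx.goals.join("; ")}.`,
    "Recommend specific, actionable resources and explain why each fits.",
  ].join("\n");
}
```

The prompt string would then be sent to the model (Claude, in this stack) alongside the user's message.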

Smart Onboarding

A guided flow that captures user context — disability type, daily challenges, goals — without feeling clinical.
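The captured context can be modeled as a partial state that drives which question comes next, keeping the flow guided rather than form-like. A sketch under assumed field names (`disabilityType`, `dailyChallenges`, and `goals` mirror the description above; the function itself is hypothetical):

```typescript
// Hypothetical onboarding state: fields fill in as the guided flow progresses.
interface OnboardingState {
  disabilityType?: string;
  dailyChallenges?: string[];
  goals?: string[];
}

// Return the next conversational prompt, or null once context is complete.
// Phrasing stays plain-language rather than clinical.
function nextOnboardingPrompt(state: OnboardingState): string | null {
  if (!state.disabilityType)
    return "Tell us a bit about yourself. What kind of support are you looking for?";
  if (!state.dailyChallenges || state.dailyChallenges.length === 0)
    return "What parts of daily life feel hardest right now?";
  if (!state.goals || state.goals.length === 0)
    return "What would you most like to be able to do?";
  return null; // context captured; hand off to the recommendation flow
}
```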

Role Toggle

Seamless switching between "I need help" and "I'm helping someone" modes, each with tailored UI and content.
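One way to model the toggle is a two-value role type keyed to tailored copy; the labels below are placeholders for illustration, not final product copy:

```typescript
type Role = "seeker" | "helper";

// Hypothetical copy map: each mode gets its own labels and content.
const copyByRole: Record<Role, { heading: string; cta: string }> = {
  seeker: { heading: "What do you need help with today?", cta: "Find support" },
  helper: { heading: "Who are you helping today?", cta: "Find resources for them" },
};

// Flip between the two modes.
function toggleRole(current: Role): Role {
  return current === "seeker" ? "helper" : "seeker";
}
```

Keeping the role a single typed value means every screen can select its copy and content from one source of truth.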

Resource Matching

Semantic search connecting users with relevant assistive technologies, community programs, and support groups.
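In the real stack this ranking would run inside Postgres via pgvector, but the underlying math is cosine similarity over embeddings. A pure-TypeScript sketch for illustration:

```typescript
// In production the ranking happens in the database via pgvector's
// distance operators; this version shows the math in plain TypeScript.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

interface Resource { name: string; embedding: number[]; }

// Rank resources by similarity to the embedded user query.
function matchResources(query: number[], resources: Resource[], topK = 3): Resource[] {
  return [...resources]
    .sort((x, y) => cosineSimilarity(query, y.embedding) - cosineSimilarity(query, x.embedding))
    .slice(0, topK);
}
```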

Design Process

1. Stakeholder Research

Conducting interviews with caregivers, disability advocates, and potential users to understand pain points and unmet needs.

2. Journey Mapping

Mapping the end-to-end experience of discovering and accessing assistive resources, identifying friction points and opportunities.

3. Wireframing & Prototyping

Creating low-fidelity and high-fidelity prototypes in Figma and testing interaction patterns for the AI conversational interface.

4. Technical Implementation

Building the working application with Next.js and Supabase — turning validated designs into production-ready code.
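As a sketch of what that implementation looks like, here is the validation logic a matching endpoint might perform, written framework-free so it stands alone; in the actual app this would live in a Next.js Route Handler backed by Supabase, and the request/response shapes here are assumptions:

```typescript
// Hypothetical request/response shapes for a resource-matching endpoint.
interface MatchRequest { query: string; role: "seeker" | "helper"; }
interface MatchResponse { ok: boolean; error?: string; results: string[]; }

// Validate the request body the way a POST route handler would before
// delegating to the matching service (stubbed out here).
function handleMatchRequest(body: unknown): MatchResponse {
  const b = body as Partial<MatchRequest>;
  if (typeof b?.query !== "string" || b.query.trim() === "") {
    return { ok: false, error: "query is required", results: [] };
  }
  if (b.role !== "seeker" && b.role !== "helper") {
    return { ok: false, error: "role must be 'seeker' or 'helper'", results: [] };
  }
  // In the real handler: embed the query, run a pgvector similarity
  // search in Supabase (with RLS enforcing per-user access), and
  // return the matched resources.
  return { ok: true, results: [] };
}
```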

- Frontend apps: Dayli AI, built on Next.js 14+ and deployed on Vercel, alongside the DLL community site (dailylivinglabs.com), with read + write access for uploading solutions.
- Database: Supabase (PostgreSQL + pgvector) with Auth and row-level security (RLS), organized into three groups:
  - Community solutions: solutions, solution_embeddings, community_ratings, moderation
  - External resources: external_resources, resource_embeddings, scrape_sources
  - User data: profiles, needs_assessments, challenge_categories, solution_paths
- AI agent layer (Next.js Route Handlers): needs assessment, solution matching, and pattern recognition.
- Ingestion pipelines: web scraping via Firecrawl and YouTube transcripts via Supadata, each chunked and embedded.
- LLM provider: Claude API (Sonnet).
- Trust tiers: community-submitted content (highest trust), web/YouTube scraped content (verified by AI), and user interaction data (private per user).

System architecture — data flow across community, AI, and infrastructure layers
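Both ingestion paths feed the same chunk-then-embed step before vectors land in pgvector. A simple overlapping word-window chunker illustrates the idea (the function name and parameters are illustrative, not the project's actual code):

```typescript
// Split source text (a scraped page or transcript) into overlapping
// word-window chunks, sized for embedding.
function chunkText(text: string, chunkSize = 200, overlap = 40): string[] {
  const words = text.split(/\s+/).filter((w) => w.length > 0);
  const chunks: string[] = [];
  for (let start = 0; start < words.length; start += chunkSize - overlap) {
    chunks.push(words.slice(start, start + chunkSize).join(" "));
    if (start + chunkSize >= words.length) break; // last window reached
  }
  return chunks;
}
```

The overlap keeps a sentence that straddles a chunk boundary retrievable from either side.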

Status & Next Steps

This project is actively in development as part of our HCI 2 and Maker Lab coursework at UNT. We are currently in the prototyping and development phase, with usability testing planned for the coming weeks.

Check back for the full case study with final designs, usability testing results, and a live prototype demo.