The Problem
People with disabilities and their caregivers often struggle to discover the right assistive technologies and community resources. Existing solutions are fragmented, hard to navigate, and not personalized to individual needs. There is no single platform that understands a user's context and proactively recommends relevant support.
The Team
Dayli AI is a collaborative effort across our UNT coursework. Each team member brings complementary skills to the project.
Dhivya Prabhakaran
UX Engineer — Technical Lead
Team Members
UX Research & Design
My Contributions
Technical Architecture & Development
I own the technical side of Dayli AI, making key architecture decisions and building the working prototype from the ground up:
- Full-stack development — Building the application with Next.js, implementing the complete frontend and backend infrastructure
- Supabase backend — Designing the database schema, authentication system, and API layer
- Semantic search — Implementing vector embeddings to intelligently match user needs with relevant assistive technologies and community resources
- AI integration — Building conversational AI flows that feel natural and guide users to the right resources
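The semantic-search piece can be sketched in a few lines. This is a minimal, illustrative example only: the names (`Resource`, `rankResources`) and the in-memory list are assumptions, and in the real app the embeddings would be precomputed and queried from the database rather than compared in application code.

```typescript
// Illustrative sketch: rank resources by cosine similarity between a
// user's query embedding and each resource's precomputed embedding.

interface Resource {
  name: string;
  embedding: number[]; // vector for the resource description (precomputed)
}

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return resources sorted from most to least similar to the query.
function rankResources(queryEmbedding: number[], resources: Resource[]): Resource[] {
  return [...resources].sort(
    (x, y) =>
      cosineSimilarity(queryEmbedding, y.embedding) -
      cosineSimilarity(queryEmbedding, x.embedding)
  );
}
```

In practice the same similarity ranking would run inside the database (for example via a vector index), so only the top matches cross the network.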
UX Engineering
Beyond development, I contribute directly to the design process, bringing the perspective of someone who builds what we design:
- User flows & journey maps — Creating detailed interaction models grounded in stakeholder research
- Wireframing & prototyping — Translating research findings into interactive Figma prototypes
- Accessibility-first implementation — Ensuring the app meets WCAG standards with large touch targets, high contrast, and screen reader support
- Design-to-code pipeline — Rapidly turning design concepts into working code, enabling fast iteration and validation
What We're Building
Conversational AI
Natural language interactions that help users articulate their needs and receive personalized recommendations.
Smart Onboarding
A guided flow that captures user context — disability type, daily challenges, goals — without feeling clinical.
Role Toggle
Seamless switching between "I need help" and "I'm helping someone" modes, each with tailored UI and content.
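One way to keep the two modes mutually exclusive is a discriminated union that drives the tailored content. This is a hypothetical sketch: the type and helper names (`Role`, `contentFor`) and the copy strings are placeholders, not the app's actual API.

```typescript
// Illustrative sketch of the role toggle: the union type makes it
// impossible to be in both modes at once, and the helper returns
// role-specific UI copy.

type Role = "seeker" | "caregiver"; // "I need help" vs. "I'm helping someone"

interface RoleContent {
  headline: string;
  searchPlaceholder: string;
}

function contentFor(role: Role): RoleContent {
  switch (role) {
    case "seeker":
      return {
        headline: "What do you need help with today?",
        searchPlaceholder: "Describe your challenge",
      };
    case "caregiver":
      return {
        headline: "Who are you supporting today?",
        searchPlaceholder: "Describe their needs",
      };
  }
}
```

Because the switch is exhaustive over `Role`, adding a third mode later would surface as a compile-time error everywhere content is selected.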
Resource Matching
Semantic search connecting users with relevant assistive technologies, community programs, and support groups.
Design Process
Stakeholder Research
Conducting interviews with caregivers, disability advocates, and potential users to understand pain points and unmet needs.
Journey Mapping
Mapping the end-to-end experience of discovering and accessing assistive resources, identifying friction points and opportunities.
Wireframing & Prototyping
Creating low-fi and high-fi prototypes in Figma, testing interaction patterns for the AI conversational interface.
Technical Implementation
Building the working application with Next.js and Supabase — turning validated designs into production-ready code.
System architecture — data flow across community, AI, and infrastructure layers
Status & Next Steps
This project is actively in development as part of our HCI 2 and Maker Lab coursework at UNT. We are currently in the prototyping and development phase, with usability testing planned for the coming weeks.
Check back for the full case study with final designs, usability testing results, and a live prototype demo.