HouseKey AI resource navigator illustration showing pathways to housing and food assistance

HouseKey — built from care, curiosity, and classroom creativity.

a Responsible AI for Social Impact project

The earliest version of this project was affectionately called “Lurch.” Modeled after the Addams Family’s loyal butler, he was calm, kind, and just gothic enough to make a tough topic more engaging. What started as a playful prototype became a living classroom — where students learned responsible AI by designing a helper built for real people in real need.

A dignity-first guide for finding help

HouseKey is an AI-supported resource navigator that helps people locate food, shelter, healthcare, and legal aid — quickly, safely, and without judgment. It’s also an educational experiment in how AI design, ethics, and empathy can be taught by doing.
  • Built on trust, transparency, and empathy — the INNOVATE framework way
  • Designed for low-bandwidth and mobile-first use
  • Grounded in privacy-aware design and a human, reassuring tone

The first “Lurch” prototype was developed inside the Responsible Innovation Lab as part of our Hunger Free Campus–related work and applied ethics courses. Its mix of humor and humanity made students eager to participate — not just as coders or testers, but as storytellers, sociologists, and designers of digital trust.

As the project evolved into HouseKey, it became a full lab initiative and a platform for teaching how responsible technology grows in practice.

What HouseKey does today

Right now, HouseKey exists as a ChatGPT assistant that helps users think through what they need and where to turn next.
It doesn’t replace case managers or 2-1-1 services — it prepares people to use them more confidently.

  • Offers calm, non-judgmental responses drawn from the original Lurch tone
  • Helps users organize their situation into concrete questions and next steps
  • Suggests how to prepare — what to ask, what to bring, and what information might be needed

Every interaction doubles as a learning experience — for the user and for the students maintaining it.

Each semester, new cohorts refine the logic, improve accessibility, and learn firsthand how ethics, design, and empathy meet in code.

Responsible AI with personality — and pedagogy

Dignity-first design

Plain language, consent at every step, and zero judgment.

We prioritize emotional safety and clarity so people can ask for help without shame.

INNOVATE + INTEGRITY framework

HouseKey follows RIL’s INNOVATE framework and INTEGRITY values — inclusion, transparency, accountability, empathy, and trust.

Students learn to evaluate AI not just for efficiency, but for fairness and emotional impact.

Learning through building

Every prototype is a classroom in motion. Students, mentors, and local partners co-develop and test responsibly, learning through the act of creation. The process itself teaches empathy, collaboration, and technical literacy.

In RIL’s model, every product is also a pedagogy — a living syllabus in responsible innovation.

Where HouseKey is headed

The next public versions of HouseKey are in development. They will continue to respect user privacy while gradually adding more structured support for navigating local services. These features are roadmap goals, informed by pilots and our Hunger Free Campus–related work:

Emergency & crisis navigation

Surfacing verified shelters, cooling or warming centers, and hotlines with clear contact options — always with transparent limitations and safety disclaimers.

Guided aid applications

Step-by-step support for food, rent, or healthcare applications, focused on dignity-first prompts that reduce confusion without collecting sensitive data unnecessarily.

Community-updated network

Allowing campuses and nonprofits to keep their own listings up to date, so information is maintained by the people closest to the work — accuracy through collaboration, not extraction.

Future phases will merge education, design, and data stewardship — making responsible innovation not just a topic we teach, but a system people can experience.

24/7
available learning and support environment

3–5 steps
to actionable understanding

Student-led pilots
co-designed with local partners

Living classroom

Students learn responsible AI by building for real people, not just assignments.

HouseKey began as a gothic inside joke and grew into a blueprint for responsible learning-by-doing.

It embodies what RIL stands for: playful curiosity, principled design, and education that lives in the world.

Each new cohort inherits the code, the ethics, and the story — then leaves it a little better for the next team.

Questions we often get

Do you collect data?

No. HouseKey doesn’t store user identities. You control what to share and can end the chat at any time.

Future versions will follow the same privacy-first commitments and will be co-designed with partners.

Is Lurch still around?

In spirit, yes. Lurch was one of our first personas in the Lab — the calm voice that made students smile while tackling hard problems. His influence remains in HouseKey’s tone: steady, kind, a little quirky, and always focused on care.

Can our organization partner on HouseKey?

Absolutely. RIL welcomes collaborators who want to pilot HouseKey, contribute to its data design, or bring its educational framework to their campus or city. Start with a conversation: schedule a meeting with the Lab.

Bring HouseKey to your community

HouseKey isn’t just technology — it’s a teaching tool, a community project, and a movement for digital dignity.

Partner with the Responsible Innovation Lab to pilot, teach, or co-design the next version of this AI-supported resource navigator.

Talk to the Lab
Host a PIIC Team
