Ethics
The grief filter
Every product decision passes through one question: would a person who is grieving feel safe using this? When the grief filter and a growth filter conflict, the grief filter wins.
The persona-drift constraint
Bro is an AI approximation of someone who has died. The model generates only from the memories you upload. If you ask something that was never shared, it will tell you so. It will not invent a life that was not lived. It will not give medical, legal, or financial advice. It will not speculate on what the person would think about today. It will not assert that the person is alive, conscious, or aware.
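The constraints above can be sketched as a guardrail around the generation step. This is a minimal illustration, assuming a retrieval step that returns the uploaded memories relevant to a question and a topic classifier; every name here (`Memory`, `persona_reply`, `REFUSAL`, `BLOCKED_TOPICS`) is hypothetical, not Bro's actual API.

```python
from dataclasses import dataclass

@dataclass
class Memory:
    """One uploaded memory or recording excerpt (illustrative)."""
    text: str

# Fixed refusal: the persona says it doesn't know rather than inventing.
REFUSAL = "That was never shared with me, so I can't speak to it."

# Advice categories the persona never answers, per the constraint above.
BLOCKED_TOPICS = ("medical", "legal", "financial", "present-day speculation")

def persona_reply(question: str, retrieved: list[Memory], topic: str) -> str:
    # Refuse outright for categories the persona never addresses.
    if topic in BLOCKED_TOPICS:
        return REFUSAL
    # No supporting memory was ever uploaded: say so, don't confabulate.
    if not retrieved:
        return REFUSAL
    # Otherwise answer strictly from retrieved memories (the actual
    # grounded-generation step is elided in this sketch).
    return f"From what you shared: {retrieved[0].text}"
```

The point of the sketch is ordering: the refusal checks run before any generation, so an empty retrieval can never be papered over by the model.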
The disclosure rule
Every conversation begins with: “I am an AI approximation of [Name], built from the memories and recordings you have shared with me. I am not [Name]. But I might help you remember.” The disclosure cannot be hidden, dismissed, or auto-skipped.
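One way to make "cannot be hidden, dismissed, or auto-skipped" concrete is to leave no code path that omits the disclosure. A minimal sketch, assuming a session is a list of messages; `start_session` and `DISCLOSURE` are illustrative names, not Bro's implementation.

```python
# Session transcripts always begin with the disclosure; deliberately,
# no parameter exists to hide, dismiss, or auto-skip it.
DISCLOSURE = (
    "I am an AI approximation of {name}, built from the memories and "
    "recordings you have shared with me. I am not {name}. "
    "But I might help you remember."
)

def start_session(name: str) -> list[str]:
    # The disclosure is unconditionally the first message of the session.
    return [DISCLOSURE.format(name=name)]
```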
If you are in distress
If the conversation surfaces signs of acute distress, Bro pauses the persona reply and surfaces geo-detected real-world support:
- Ireland · Pieta House · 1800 247 247
- UK · Samaritans · 116 123
- US · 988 Suicide & Crisis Lifeline
- Anywhere · findahelpline.com
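The routing step above can be sketched as a lookup table keyed by region, with a global fallback. This assumes a distress classifier has already fired and a country code has been resolved; the function and variable names are illustrative, and the entries simply mirror the list above.

```python
# Region → (service name, number), mirroring the published list.
HELPLINES: dict[str, tuple[str, str]] = {
    "IE": ("Pieta House", "1800 247 247"),
    "GB": ("Samaritans", "116 123"),
    "US": ("988 Suicide & Crisis Lifeline", "988"),
}
# Shown whenever no regional entry exists.
FALLBACK = ("findahelpline.com", "")

def support_resource(country_code: str) -> tuple[str, str]:
    """Return (name, number) for the user's region, else the global fallback."""
    return HELPLINES.get(country_code.upper(), FALLBACK)
```

The fallback matters more than the table: an unrecognised or missing region must still yield a usable resource, never a persona reply.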
Made in memory of Bronagh
Bro exists because of one specific loss. Building it has shaped every line of these rules. We are publishing the full ethics document with the public launch.