Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytr_Ugx8FSKtB… — "It's just a feature in the voice system... Chat gpt itself and the voice AI are …"
- ytc_Ugw4eiQci… — "If ChatGPT wasn't so much nicer, smarter and more understanding then people nobo…"
- ytc_UgxUVl1Ne… — "What's interesting to me is that Anti Messiah/Christ doesn't necessarily mean "a…"
- ytr_Ugwr037X9… — "Yeah, a filter that uses artificial intelligence to process the image and create…"
- ytr_UgzzXdoAH… — "William Burr okay, here is some logic, your limitations in how you raise your ki…"
- ytc_UgyruCTGr… — "My 2 cents: An artist who's style is copied by AI to make a new art piece, if th…"
- ytc_UgyNVYZQt… — "Currently, artificial intelligence has no emotions or desires, but we cannot ign…"
- ytc_UgzRXOORR… — "What Weapon will I need to Kill it? AI has Circuitry, Could we use Incendiary? M…"
Comment
If at least most of the cars are self-driving, and have the ability to communicate, then why doesn't the car either quickly stop or suddenly stop? All the cars behind our car will stop (up to a point that depends on traffic), and no one gets harmed except maybe for those people without seatbelts behind us. So, the solution is simple: Cars communicating with all the others instantly, which we already do with instant messaging. Why not program it into a car to maximize safety?
Source: youtube · AI Harm Incident · 2017-07-18T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UggJqTTxAgQpuHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UggSR5TBSlFvAngCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg3Ooi6amKBaHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgiIyXvapa3ghngCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugg6S0mO_XY-BHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugj2e2pqV-vGsHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UghPxCznJQ-7DHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UggvYPle2Wkb6XgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjlhktnJUrllHgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggvUq6OXIbKO3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
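The raw response is a JSON array of coding records, one per comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and indexed for lookup by ID (using only an excerpt of the records shown above; the actual lookup tooling in this app is not shown here, so this is an illustrative assumption, not its implementation):

```python
import json

# Excerpt of the raw LLM response above (two of the ten records).
raw = (
    '[{"id":"ytc_UggJqTTxAgQpuHgCoAEC","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UggSR5TBSlFvAngCoAEC","responsibility":"none",'
    '"reasoning":"deontological","policy":"none","emotion":"approval"}]'
)

# Parse the array, then build a dict keyed by comment ID so a coded
# comment can be inspected directly, as the page's ID lookup suggests.
records = json.loads(raw)
by_id = {record["id"]: record for record in records}

print(by_id["ytc_UggJqTTxAgQpuHgCoAEC"]["reasoning"])  # consequentialist
print(by_id["ytc_UggSR5TBSlFvAngCoAEC"]["emotion"])    # approval
```

Indexing once into a dict keeps each subsequent ID lookup constant-time, which matters if the underlying dataset holds many thousands of coded comments.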