Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Should we expect a planet that would f-u-c-k up a three-item grocery list to sol…" (ytc_UgxUd4kWd…)
- "Sounds similar to sampling music. Taking parts and recreating. Also an AI is usi…" (ytc_UgwajZC4p…)
- "Restrictions on energy usage first che c k to ai, , ai replaced with animal insi…" (ytc_UgxitDG39…)
- "I would like Mr. Hinton to elaborate on the comment, Mr. Musk has no moral compa…" (ytc_UgzNLzqbe…)
- "Everyone: upset because their art is being used to train AI without their consen…" (ytc_UgypdKgN4…)
- "I use ai, but only when I want to screw around and make some dumb shit. I know i…" (ytc_Ugzuq57FI…)
- "All fun and game until they come out with a male robot that opens jars.…" (ytc_Ugzp-CvY6…)
- "I love how he talks about how shit AI art is when his gender looks like it was m…" (ytc_UgzyoiDjx…)
Comment
"Robot laws do not actually exist. The programmers can change how it will react. It's up to policy makers also, not just the programmers."
youtube · AI Harm Incident · 2024-04-15T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
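A coded row like the one above can be sanity-checked before it is displayed. The following is a minimal sketch only: the full codebook is not shown on this page, so the allowed values below are limited to those observed in the raw responses and should be treated as an assumption, as should the `validate_coding` helper itself.

```python
# Allowed values per dimension, inferred from the raw LLM responses on this
# page only -- this is an assumption, not the project's actual codebook.
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "distributed", "unclear", "user"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "fear", "approval", "outrage"},
}

def validate_coding(row: dict) -> list:
    """Return a list of problems with a coded row (empty list = looks valid)."""
    problems = []
    for dim, allowed in OBSERVED_VALUES.items():
        if dim not in row:
            problems.append(f"missing dimension: {dim}")
        elif row[dim] not in allowed:
            problems.append(f"unobserved value for {dim}: {row[dim]!r}")
    return problems

# The coding shown in the table above passes the check.
print(validate_coding({"responsibility": "distributed",
                       "reasoning": "deontological",
                       "policy": "regulate",
                       "emotion": "indifference"}))  # -> []
```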
Raw LLM Response
[
{"id":"ytr_UgzyoQHfkvKymBmesal4AaABAg.A-sXnd6yXDiA2Eam8enQJJ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzyoQHfkvKymBmesal4AaABAg.A-sXnd6yXDiA2EbmVWX5QB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgzyoQHfkvKymBmesal4AaABAg.A-sXnd6yXDiA2F4IIrYzpR","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxO_Ujk5rSvOjWRKBB4AaABAg.9qpyK0rFpPmA2EdMqzaPey","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytr_Ugxr6E9-mHJqZTtbMkB4AaABAg.9n3_k2CWmk5A2EdhXW4VlG","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwCHoJvEcws5sbtNMJ4AaABAg.9gAd7y-h4HM9usT1V03sv4","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugxhzhse5PJvVG9QIF54AaABAg.9eKDGjvyiy7A2Eet4FGYqd","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugw4jM93_9cAtGe9wgN4AaABAg.9Wp06dt0zPM9ckPtZde_5N","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyfZZIYhGkJqOVkj3p4AaABAg.9NJbV2HYicl9UBo94mRsqU","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugxx9whhaDWkfEJOjy14AaABAg.9GhF4K2osdN9KjCloO1jaE","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
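The look-up-by-comment-ID view presumably works by parsing a raw batch response like the one above and indexing it by ID. A minimal sketch, assuming only the JSON shape shown; `index_codings` is a hypothetical helper, and the two records are excerpted from the response above:

```python
import json

# Two records excerpted from the raw batch response shown above; the real
# response contains one object per comment in the batch.
raw_response = """
[
  {"id": "ytr_UgxO_Ujk5rSvOjWRKBB4AaABAg.9qpyK0rFpPmA2EdMqzaPey",
   "responsibility": "distributed", "reasoning": "deontological",
   "policy": "regulate", "emotion": "indifference"},
  {"id": "ytr_UgyfZZIYhGkJqOVkj3p4AaABAg.9NJbV2HYicl9UBo94mRsqU",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index the coding rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)
coding = codings["ytr_UgxO_Ujk5rSvOjWRKBB4AaABAg.9qpyK0rFpPmA2EdMqzaPey"]
print(coding["responsibility"])  # -> distributed
```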