Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record by comment ID.
Random samples

- `ytc_UgyDKyjAf…`: "The most difficult thing was to explain what life is in machine terms, like I wa…"
- `ytc_Ugz3bt_86…`: "Blows my mind wondering what will be here 100, 200, 500 years down the line. M…"
- `ytr_Ugy519z97…`: "Because the fraudster Electric Jesus Elon, deceived them with his false claims t…"
- `ytc_UgwIyKFiv…`: "I’ve been in IT for many years and know one thing…automation kills jobs. AI is …"
- `ytc_UghwhUYMz…`: "do whatever will minimize bodily harm period. 2 motorcycles on ether side? th…"
- `ytr_UgzTNyZky…`: "To clarify, I know that AIs are currently not sentient. I am simply wondering if…"
- `ytc_UgxTDbcZ3…`: "@27:00 Altman is coming across a bit political and walking around questions and …"
- `ytc_UgzCdwe6C…`: "Step 1: Create a super-powerful benevolent/neutral AI before a super-powerful m…"
Comment

> Openai don't know actual sensitive stuff. But they know enough to tell some stuff about me but its stuff i just dont care if people know. Telling them secrets doesnt seem like a smart idea anyways. And also im pretty sure you can turn off them being allowed to use your data for training, i don't cus i think it's dope

youtube · AI Moral Status · 2024-09-01T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
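Each coded record carries four dimensions (Responsibility, Reasoning, Policy, Emotion) plus a comment ID. A minimal validation sketch is below; the value sets are inferred only from the responses visible on this page and are likely not the full codebook, and `validate` is an illustrative helper, not part of the pipeline:

```python
# Allowed values per coding dimension, inferred from the raw responses shown
# on this page (assumed, not necessarily exhaustive).
CODEBOOK = {
    "responsibility": {"company", "none", "ai_itself", "user", "distributed"},
    "reasoning": {"consequentialist", "mixed"},
    "policy": {"liability", "none", "industry_self", "regulate"},
    "emotion": {"fear", "indifference", "approval", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with a coded record (empty if it passes)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The record from the Coding Result table above.
record = {"id": "ytc_UgwRET7uRrb7lelFmyd4AaABAg", "responsibility": "company",
          "reasoning": "consequentialist", "policy": "liability",
          "emotion": "approval"}
print(validate(record))  # []
```

A record with a value outside the inferred sets (or a missing `id`) comes back with one message per problem, which makes batch checks of a full response straightforward.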
Raw LLM Response
```json
[
  {"id":"ytc_UgxiJrCc1vGbQcnaBad4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw-9M5wjVEQuLe-QB54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwSEa_cSUa4dfeU_hx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugxb9fcKmMxJ1ZFQPMZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwY2Qn8jYnKF78GqRB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzSSXqUmwnuwWWqK7V4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxbuSmEY7lgWrRrWdF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwRET7uRrb7lelFmyd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugy46Wf0e71bUX0wpaJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwKp3PgIfyQbbr_PnR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
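Since the raw model response is a JSON array of coded records, looking one up by comment ID reduces to parsing the array and indexing on the `id` field. A minimal sketch (the field names match the response above; the variable names and the two-record sample are illustrative):

```python
import json

# Raw model output: a JSON array of coded records, shortened here to the
# first and eighth records from the full response above.
raw_response = """
[
  {"id": "ytc_UgxiJrCc1vGbQcnaBad4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwRET7uRrb7lelFmyd4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"}
]
"""

# Parse once, then index the records by comment ID for constant-time lookup.
records = json.loads(raw_response)
by_id = {record["id"]: record for record in records}

coding = by_id["ytc_UgwRET7uRrb7lelFmyd4AaABAg"]
print(coding["emotion"])  # approval
```

This is the same lookup the page's "look up by comment ID" view performs conceptually: one parse of the raw response, then a dictionary keyed on `id`.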