Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Root of Art and Humanity: The most fundamental argument you make against AI prod…
ytc_Ugzwbpecp…
Unfortunately, the issue is, does America make AI first or do we let China make …
ytc_UgzmrBmv-…
Currently in the South East, UK in areas of Kent there has been a mass outage of…
ytc_UgxFxasyz…
Funny thing- finding where conciousness is is both awesome and quite teriffying…
ytc_Ugyk0KmeH…
I've tried creating some AI art from an app. What I discovered was that not ever…
ytc_UgzG0NrVv…
Ai not wanting to be turned off is same as person not wanting to die?…
ytc_Ugxu4a5yP…
I understand your concern! The portrayal of AI in movies like Terminator can def…
ytr_UgzfRXEMA…
One good reason the human race should not want AI, it’s only going to make Elon …
ytc_UgwaaN4au…
Comment
13:42 I don't understand how having a robot in the seat is better than a fully automated vehicle ... What a human can do with his meaty limb, any computer can do better without the laggy physical interface, so the whole argument is moot... Have an AI directly act upon the vehicle it'll be many times more efficient than having to interface that with a robot in a seat........
youtube
AI Governance
2024-01-01T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx2ccdPuEoLltkGN1p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwMCSkxd308zPwmLol4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMtAzeWxmO6cJoMXp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyG1pLmYun2zPPlA694AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwWkez55RHCbe72jJF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwnIhrAFgCorCicaaJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxb_0HCh6NMOs6K9oB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw8VnyvyxPhnqaQvgl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzD3XnQODovdBaHpRF4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxoqO8r96LKlBF54ZR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
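A response like the one above can be consumed programmatically. Below is a minimal sketch, assuming the batch response is a JSON array of records keyed by `id` as shown; the function name `index_by_id` and the two-record sample are illustrative, not part of any real pipeline API. Indexing the records by comment ID supports the "Look up by comment ID" workflow described at the top of the page.

```python
import json

# Illustrative two-record excerpt in the same shape as the raw LLM
# response above (real responses carry the full batch of comment IDs).
RAW_RESPONSE = """
[
  {"id": "ytc_Ugx2ccdPuEoLltkGN1p4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwMCSkxd308zPwmLol4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a batch coding response and key each record by its comment ID.

    `index_by_id` is a hypothetical helper: it assumes the response is
    valid JSON (a list of objects, each with an "id" field).
    """
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(RAW_RESPONSE)
# Look up the coded dimensions for one comment ID.
print(codes["ytc_Ugx2ccdPuEoLltkGN1p4AaABAg"]["emotion"])  # fear
```

In practice a model may wrap the array in markdown fences or emit trailing text, so a production parser would want to strip such wrappers and validate each record's dimension values before indexing.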