Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- I was speechless in the first moment and then u realize its not some meme from a… (ytc_UgyVukf4f…)
- 10:57 I totally agree with you, though. I feel like people don't understand ho… (ytc_UgwUcypM5…)
- Fast-moving car can't just "quickly stop", self-driving cars still abide the s… (ytr_UgwrxIeuE…)
- What exactly do WE value? Do we value anything anymore? We deserve to become the… (ytc_UgwWCqxQX…)
- I mean I've seen a lot of AI videos exactly like this but the one on Brett Coope… (ytc_UgwUpxTgZ…)
- Will we have full employment and good standard of living if to outlaw AI and gen… (ytc_UgwgOzTWU…)
- It's funny to me.... Sure, AI is not ready to replace lawyers.... YET. But in th… (ytc_Ugxf8DOO_…)
- Imagine...robots like this are designed as weapons, just like in the terminator … (ytc_UgyGL0tm7…)
Comment
Isn't it common consensus amongst scientists that the feelings that make us humans *alive* are caused by chemicals in the brain? How would the AI even *be* self aware without a complex proccesing device like our brain?
Surely we'd give an AI a brain to actually be able to be alive if we wanted one to be so; and then that solves the question.
youtube · AI Moral Status · 2023-08-21T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgySZ6aLxO7ZpreByjx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhpURSR2IJEDSpv494AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyWgr1V5d5bs3tppft4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwhkKosGzX3vt7JSYR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6l9iUT3XAEriWuNF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzK6rJckH_Tb0w0wqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzQ2GXzis34278cFMZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzPTds8zGVirYTk1hx4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxotep83lhNTvUs1cF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKJBMcpWtO68-y1qV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
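The "Look up by comment ID" view above implies parsing the raw LLM response and indexing it by comment ID. A minimal sketch of one way to do that, assuming the allowed values per dimension are exactly those observed in the response above (the schema itself is not stated in the source and is an assumption here):

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred only from the
# values visible in the sample response; the real codebook may differ.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "approval", "resignation"},
}

# One row from the raw response shown above, used as example input.
raw = ('[{"id":"ytc_UgySZ6aLxO7ZpreByjx4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')

def index_codings(raw_json: str) -> dict:
    """Parse a raw LLM response and index rows by comment ID,
    dropping any row whose values fall outside the schema."""
    coded = {}
    for row in json.loads(raw_json):
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[row["id"]] = row
    return coded

codings = index_codings(raw)
print(codings["ytc_UgySZ6aLxO7ZpreByjx4AaABAg"]["emotion"])  # indifference
```

Validating against a fixed value set before indexing is one simple way to catch malformed or off-schema LLM output instead of storing it silently.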