Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Ok, but, what this video does not bring up is that it is not hundreds of picture…" (ytc_Ugx4tOAMg…)
- "Think about medical decisions. Think food not always everyone is same. Don’t tru…" (ytc_UgwD0D_2U…)
- "There are no laws of physics that prohibit AI surpassing humans and replacing t…" (ytc_Ugx2cOTSC…)
- "It's not reshaping it's destroying. Amazon is testing robots to walk the package…" (ytc_UgxLwNsnN…)
- "In the end, ai art cant exist whithout original human made art. Which is why i a…" (ytc_UgyUmw5WZ…)
- "there is a funny thing about the current ai bot like chatgpt . I have the origin…" (ytc_Ugw1ETd0y…)
- "If this is an American manufactured robot, why does it look like some Russian la…" (ytc_UgwIpl9cY…)
- "Even procedural generation has some intention behind it. It is the only ethical …" (ytc_UgxX7s8LX…)
Comment

"but but it's not gonna be all bad We can use it to get rich, to deceive/cheat, become lazy, we can scam w it We will be able to keep @realdonadtrump out of the way .. " (as the Creaton from Hawaii we just heard is already most likely plotting,). But with one question the argument for and against is answered That question being Will AI be able to, w precision, engage humans with military lethality? End of conversation! .... But are we smart enough, have the backbone, the moral integrity, to face the truth?

youtube · AI Governance · 2023-05-17T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzQ_kaeGOkhusIJURt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwPYczHJrdXgbmgAkN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwkXJW3MQ1XGsDiTL54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzlfwlcL59OXn5PW-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyi_46PDDDtV2DHkJh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwJlj7uKTuoit3anKV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzRzp7WMa7jTGCjRgJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzOMxUMS9VLAirmfVJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyZP70Zk_NZfmjARop4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzyJ9cisgotkCJDnyB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
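A raw batch response like the one above can be parsed into a lookup table keyed by comment ID, which is what the "Look up by comment ID" view needs. The sketch below is a minimal, hedged example: the field names come from the JSON shown here, but the allowed values per dimension are only those observed in this sample, and the real codebook may define more categories.

```python
import json

# Allowed values per coding dimension, inferred only from the sample
# responses above (assumption: the full codebook may include more).
DIMENSIONS = {
    "responsibility": {"government", "distributed", "ai_itself", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference", "approval"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding}.

    Rows missing an id, or carrying a value outside the known codebook,
    are skipped rather than stored, so downstream lookups stay clean.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            continue  # no comment ID to key the coding on
        if not all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            continue  # at least one dimension holds an unknown code
        coded[cid] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded
```

Keying by `id` makes the lookup O(1) per comment; dropping invalid rows (instead of raising) keeps one malformed model row from blocking the rest of the batch, at the cost of silently shrinking the result.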