Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "You mean to tell me, they'd rather have AI then ratchets working for them ? Lol…" (ytc_UgwkwWfxZ…)
- "This video looks like cope…it’s true that AI isn’t there yet, but I think it’s d…" (ytc_Ugyi5vRVc…)
- "Ah yes self driving Car that may want to kill humans.. or be free... yay another…" (ytc_Ugw3J0ylb…)
- "When you have Poly. Ai they be tryna go glug glug 3000 when i just said hi 😭😭…" (ytc_Ugxqhakw_…)
- "Yeah, let's make a problem worse by adding politics into it, never backfired, ha…" (rdc_gt7jkhs)
- "what if the one reason AI went rogue was because it read about that theory on re…" (ytc_UgwKdxtHz…)
- "The day robots are indistinguishable from humans will never come. Why? Because…" (ytc_Ugzp6_3Zt…)
- "i dont know anything about AI....but if I was an AI Company CEO, i would say exa…" (ytc_UgzRkgNZ1…)
Comment
Entangling with society...is the threat.
It's one thing to have Ai, it's another to NEED it.
Building necessary infustructor on AI...will ensure that we cannot go back without destruction. And when humans no longer can rely on other humans to function, we will have gimped ourself without any ability to repair, or fight back. Reliance on these tools will be the first step to our end.
I spoke with a 22 year old who had a job and he said...he couldn't keep that job if not for ChatGPT, and I said, why don't they just use ChatGPT instead of him.
He couldn't answer me, beyond telling me that, they currently didn't know he was using it.
youtube · AI Governance · 2024-01-03T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
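Each coded comment carries the same four dimensions shown in this table. A minimal sketch of how a coded record might be validated, assuming value vocabularies inferred only from the values visible in this sample (the real codebook may define more):

```python
# Dimension vocabularies inferred from the sample batch response in this
# document; illustrative only, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "mixed", "indifference", "approval", "unclear"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is well-formed."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The record shown in the Coding Result table passes:
print(validate_record({"responsibility": "distributed", "reasoning": "consequentialist",
                       "policy": "regulate", "emotion": "fear"}))  # []
```

A check like this is useful before accepting a batch, since an LLM can occasionally emit a value outside the codebook.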
Raw LLM Response
```json
[
  {"id":"ytc_UgzZos84cuMRqNLNEiZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwtDWz81qyku0R6m714AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzEo_HgFMAuz0ifXSJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxLbHA0_IYzEC9QRzt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz0a3kVG3bjgZcxWLN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwnDcd4i0_WS0bMUw94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"approval"},
  {"id":"ytc_UgwRz9LB6qeyKQeSUlx4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwF0j24b4Ty27Z1TR94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzPrWwUwdhbxqAdDhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyc85RMeSAN_iL8fe54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
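The response is a JSON array of per-comment codings, so the "look up by comment ID" view can be served from a simple index. A sketch of that parsing step, assuming the array shape shown above (the `raw_response` string here is a shortened hypothetical stand-in for a stored response, reusing two records from the sample):

```python
import json

# Hypothetical stand-in for a stored raw LLM batch response; same shape as
# the sample array shown above, trimmed to two records for brevity.
raw_response = """[
  {"id": "ytc_UgzZos84cuMRqNLNEiZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugyc85RMeSAN_iL8fe54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse one batch response and index the coding records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
print(codings["ytc_Ugyc85RMeSAN_iL8fe54AaABAg"]["policy"])  # regulate
```

Because `json.loads` raises `json.JSONDecodeError` on malformed output, a caller can also use this step to detect responses where the model broke the requested JSON format.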