Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Am I the only person rooting for the robots? Mankind is mostly been a failure, …" (ytc_Ugx042Ne8…)
- "First time I hear from an accident with Tesla Autopilot in which Autopilot isn't…" (ytc_UgwOp6Jmz…)
- "I have almost the same conversation with ChatGpt and for sure it can manifest a …" (ytc_Ugyg8xWFh…)
- "good ole bernie. . all talk no action. we got wiped out by the illegals you le…" (ytc_Ugws2gMqL…)
- "As long as human don't give AI the ability to act on it's own we'll be ok. AI sh…" (ytc_Ugww3Eady…)
- "Developers probably. Engineers I don't think so, at least in a near future. I th…" (ytr_Ugwoyt3tP…)
- "I think this video really misunderstands AI and machine learning. No one is prog…" (ytc_Ugx3sLKS_…)
- "Cameras alone is a stupid way to try to make a vehicle autonomous. Firstly you n…" (ytc_UgzEbEX-n…)
Comment
Societal values aren't even the same in this country! Chris Callison-Burch which societal values is AI learning and incorporating those of the far-right Christians and White Nationalists??? Those of the message of Charlie Kirk and Turning Point USA, or perhaps the antisemite groups are it's choice. Maybe it is the values of hate towards Trans people and the LGBTQ community Or perhaps the corruption of the Trump administration and Donald Trump's Truth Social posts. Because those are not values I want to see incorporated. Who, or maybe I should ask, how is AI deciding what the values are?
Source: youtube | Category: AI Governance | Posted: 2026-03-22T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgybGKLCO5fTzKn_VId4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxrE2eaHplyp9UPx2x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz7GnpdXDzWPHLClod4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyNzHHzYEvyLTW7tn14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzQjbMTM6S6EvKj5Ft4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwfNB4Xbq--ZMo5XM14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxP0UJk5jge0c1RmWJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzAqg-MaMQs-yG010V4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyC1K6FXM6nB5ZYdOV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzb905UJzrMTZIA8IR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
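The raw response is a JSON array with one object per comment, keyed by comment ID. A minimal sketch of how such output could be parsed and validated is below; note that the allowed values per dimension are only those observed on this page, not a documented schema, and `parse_codes` is a hypothetical helper, not part of any tool shown here.

```python
import json

# Allowed values per dimension, inferred from the examples on this page
# (an assumption, not an authoritative codebook).
ALLOWED = {
    "responsibility": {"company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, skipping rows
    that are missing an ID or use a value outside the observed sets."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # no comment ID to index by
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Rows with unexpected values are dropped rather than coerced, so a malformed model response surfaces as a missing comment ID during lookup instead of as a silently wrong code.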