Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Lots of talk about the big bad investors. What about national competition? What if the Germans developed the atomic bomb while we sat on our hands debating the morality of Atomic Bombs. America must pursue better AI because rival powers won't stop if we do. Good luck lobbying in Washington or London for Beijing to stop their AI development.

Source: youtube · AI Governance · 2025-12-20T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw_iZq0FXmG6PyQ_ml4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwtgI0kkcvTr_4K82l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz4la0-Yv0op9phIjR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx7b2PjYhUfcBLEFPN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyxIyBAl91hgs2Tf0V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugztn_JnoOFfK65wg1J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx5SptYS4YBUFov57R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw0uy1ffuduENuK3BV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy8IAezR4QnW7fH-2F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwob6q2l5rOA6xtRAN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
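Each coding-result row shown above corresponds to one object in a JSON array like this, keyed by comment ID. A minimal sketch of how such a raw response might be parsed and looked up by ID (the helper name and the two sample entries here are illustrative, not part of the pipeline itself):

```python
import json

# Illustrative raw LLM response: a JSON array of coded comments,
# each carrying the four coding dimensions plus the comment ID.
raw_response = """
[
  {"id": "ytc_UgwtgI0kkcvTr_4K82l4AaABAg",
   "responsibility": "government", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz4la0-Yv0op9phIjR4AaABAg",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]
"""

def code_lookup(raw: str, comment_id: str):
    """Return the coding dict for comment_id, or None if absent."""
    codes = json.loads(raw)
    return next((c for c in codes if c["id"] == comment_id), None)

result = code_lookup(raw_response, "ytc_UgwtgI0kkcvTr_4K82l4AaABAg")
print(result["policy"])  # regulate
```

Because the IDs are unique, a dict keyed by `id` (`{c["id"]: c for c in codes}`) would give O(1) lookups if many queries are expected.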