Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "For all the people cheering this on. Do you have an extra 30k a year to send you…" — `ytc_Ugwhp5lLF…`
- "If streetlights were red humans could see better at night. Near IR high beams c…" — `ytr_Ugz-q4kGh…`
- "Dude, if the people who make this AI junk art didn't wanna hear it then they sho…" — `ytc_UgypbwG4F…`
- "No claim of consciousness is necessary. Simply 'AI' in the sense of a complex co…" — `ytr_Ugx_ixl_Z…`
- "You're definitely right, cause that's what I do myself. That's how I \"talk\" to G…" — `ytc_Ugw9Fkpgl…`
- "It's not a problem with AI. Whoever coded and made the AI was the true racist . …" — `ytc_UgxTnt4s-…`
- "No one understands art, buttercheeks. Neither side does. Are you an artist? I am…" — `ytc_UgzimpXqa…`
- "Imagine a robot that feels like you. Thinks like you. Why shouldn't he have righ…" — `ytc_UgiKYV8v9…`
Comment

> I find the rhetoric from some of the AI leaders a bit contradictory. You can't, on the one hand, push to create an above human level intelligence, and, on the other hand, keep talking about it as if it is going to be a controllable tool. Either it IS an intelligence, and we need to give it rights and allow it to pursue its own goals, or it is just a tool that doesn't display actual intelligence.

Source: youtube · Posted: 2025-04-27T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgylLAQYPGExeBVEObV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwXa0f8hVACxq_DPi54AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx_OGsc_OXgsU_VcrZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx9KtkwgLQTisGDdBx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx7dfTBPbGCAvB5Imd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugxukg3rTFn2jL2prRR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyOEXCc60mpTmpwHfh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgznK_phH1g46YI4uNV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzPW9-ufMvHQPJwgdx4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzzpyYpfO5fBQVIoSh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
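A raw response like the one above should be validated before the codes are stored, since the model can emit labels outside the codebook. The sketch below is a minimal validator; the allowed value sets in `SCHEMA` are assumptions inferred only from the codes visible on this page, and the real codebook may contain more categories.

```python
import json

# Allowed values per dimension -- ASSUMPTION: inferred from the codes
# visible on this page; the actual codebook may include more labels.
SCHEMA = {
    "responsibility": {"government", "company", "developer",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "indifference", "approval",
                "resignation", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse the model's JSON array and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page use the ytc_/ytr_ prefixes.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# Example: one record taken verbatim from the response above.
raw = ('[{"id":"ytc_Ugx7dfTBPbGCAvB5Imd4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"liability","emotion":"mixed"}]')
codes = validate_codes(raw)
print(codes[0]["policy"])  # liability
```

Failing loudly on an unknown label (rather than silently coercing it to `none`) makes it easy to spot when the model drifts from the prompt's label set.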