Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Another channel asked AI to tell a hypothetical story of how it takes over the world and it explained the same process of takeover. Then he asked "How do you stop that AI trying to take over the world?" AI answered with "Well, the only way is to design another AI to stop it." In other words, we would have to fight fire with fire, which are both *destructive* elements...

youtube · AI Governance · 2023-07-07T14:1… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxgOO0o8rYcCbHE_1l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3FlxL_yyTpKA266J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzPoAx2MV81q9FH48J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyAVLCjC2FSPiLDcwZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyevAa5KtDnj8OU4b94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzyQiQqu6vPKYgtwAd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwUKbDSKtuqlgq2yrR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxTAGqXWdHBOrV1WlV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzVaAT2SgG4rJyoMxZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy_BDnYNmQLiduU12l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
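The raw response is a JSON array with one object per coded comment, keyed by the `id` field. A minimal sketch of how such a batch could be parsed, sanity-checked, and looked up by comment ID — note the label sets in `VALID` are inferred from this single batch and are an assumption, not the project's full codebook:

```python
import json

# A raw batch response in the format shown above (one object per coded comment).
RAW = """[
  {"id": "ytc_UgzPoAx2MV81q9FH48J4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]"""

# Assumed label sets, inferred from the sample batch; the real codebook may differ.
VALID = {
    "responsibility": {"ai_itself", "developer", "company", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response, check each dimension's label, and index rows by comment ID."""
    codings = {}
    for row in json.loads(raw):
        for dim, allowed in VALID.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim} label {row[dim]!r}")
        codings[row["id"]] = row
    return codings

codings = index_codings(RAW)
print(codings["ytc_UgzPoAx2MV81q9FH48J4AaABAg"]["emotion"])  # fear
```

Validating against a fixed label set at parse time catches model drift (a misspelled or invented label) before the coding lands in the results table.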