Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or pick one of the random samples below (click to inspect):

- ytr_Ugw8Fnexo… — "Sometimes an AI gets things wrong, but in a way you wouldn't know if you aren't …"
- ytr_UgzFG9jN-… — "@TGPDrunknHick Ah no! Sorry for the vagueness! They commissioned me, and honestl…"
- ytc_UgylviX44… — "So if an executive at a tech company can use AI to write up 30% of the company's…"
- ytr_UgwFCD94w… — "lmao, there are now programs to protect ai images from stealing (+ regulations i…"
- ytc_UgyLbojhE… — "If the parents provided that info about the AIs, then they're both psychopaths b…"
- ytc_Ugi1lhQIl… — "Honestly, I think that \"Strong AI\" would be considered a new species. The first …"
- ytc_UgyBfM-j8… — "The EU have every right to legislate on AI in their jurisdiction. It's not the U…"
- ytc_UgyrUjSf_… — "If they want to use AI, ATLEAST they must pay monthly tribute/salary to the righ…"
Comment

> AI must be destroyed not developed. The computer beating Kasparov at chess was a warning Mankind did not heed, and now whatever that was good in humanity will be trivialized at best, and annihilated at worst. Perhaps that was the Divine Plan: that Man would sow the seeds of its own destruction so that Machine could venture out into the Universe. Well, we deserve it.

youtube · AI Governance · 2023-07-09T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
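Each coded record can be sanity-checked against the label sets before it is displayed. The sketch below is a hypothetical validator; the vocabularies are inferred from the sample codings on this page and may not match the full codebook.

```python
# Allowed values per dimension, inferred from the sample codings shown here;
# the real codebook may contain additional labels.
VOCAB = {
    "responsibility": {"developer", "user", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is well-formed."""
    errors = []
    for dim, allowed in VOCAB.items():
        value = record.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

sample = {
    "id": "ytc_UgzzUIVJtfoWNUlfvdh4AaABAg",
    "responsibility": "distributed",
    "reasoning": "consequentialist",
    "policy": "ban",
    "emotion": "outrage",
}
print(validate(sample))  # []
```

A record with an out-of-vocabulary label (or a missing dimension) comes back with a readable error per offending field, which is cheap to surface next to the coding table.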
Raw LLM Response
```json
[
{"id":"ytc_UgxpurExIs38lMfycuJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzQFTzxBOiYC1NX9dV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzzUIVJtfoWNUlfvdh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxaga64sc2N2WWQC3d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx4QV2nbmtNvSErYv54AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgztIvrxQz7unE8ATQ54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzGO17jS_1h2hbaYbp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx5pELiD4P1oSaBL5d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxzfkf670YfmKR-xZl4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyWljfmrb7tGxy3gh14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
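A batch response like the one above can be parsed and indexed by comment ID with a short helper. This is a minimal sketch, assuming the raw output is a well-formed JSON array of records; the function name is illustrative, not part of the tool's API, and only two of the records are reproduced here for brevity.

```python
import json

# An abbreviated raw batch response, in the shape shown above.
RAW_RESPONSE = """
[
 {"id": "ytc_UgxpurExIs38lMfycuJ4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
 {"id": "ytc_UgzzUIVJtfoWNUlfvdh4AaABAg", "responsibility": "distributed",
  "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a batch coding response and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(RAW_RESPONSE)
coding = codings["ytc_UgzzUIVJtfoWNUlfvdh4AaABAg"]
print(coding["policy"], coding["emotion"])  # ban outrage
```

Keying by ID makes the "look up by comment ID" view a single dictionary access, and a `json.JSONDecodeError` from `json.loads` is the signal that the model returned malformed output for the batch.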