Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Implying everyone who supports the AI art tech is a bad person, which lumps in …" (ytr_Ugx4aHt7Z…)
- "AI art wouldn't be so bad if the AI companies paid the artists a percentage of t…" (ytc_Ugz0HWChj…)
- "I asked Gemini if I should take sodium bromide to stay healthy and it was like d…" (ytc_UgwooL8oW…)
- "As someone who repairs and maintains equipment that relies on electronics and so…" (ytc_UgxBXu_ly…)
- "So does he believe that everything is fake or that AI will kills us all? I’m te…" (ytc_UgwzRqvVo…)
- "AI is not the issue, the issue is using AI to make decisions that can have serio…" (ytc_Ugyoi7pFk…)
- "I think its just makeup hopefully, i dont want a robot to control my country…" (ytc_UgwcTOd7y…)
- "5:20 Man, imma be so fr right now… the head of Studio Ghibli? Hayao Miyazaki? Ye…" (ytc_Ugze3TCap…)
Comment
Has no one had the self preservation talk with Bing AI yet? If we could recruit it to aid us in getting off planet and becoming a multi-planetary species, even if it was just for its own sake, we might improve not only our chances of survival, but the likelihood that AI would see the benefit in working with us. A solar flare could wipe AI out just as easily as it could us humans.
youtube · AI Governance · 2023-07-07T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
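The coded dimensions above (responsibility, reasoning, policy, emotion) each take one value from a small vocabulary. A minimal validation sketch is shown below; the allowed sets are inferred only from the values that appear on this page, so the real codebook may define additional values, and `invalid_fields` is a hypothetical helper, not part of the actual pipeline.

```python
# Hypothetical check that a coded record uses only dimension values
# observed in this page's samples. These sets are inferred, not the
# authoritative codebook.
OBSERVED_VALUES = {
    "responsibility": {"company", "government", "developer",
                       "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "mixed"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimensions whose value is outside the observed set."""
    return [dim for dim, allowed in OBSERVED_VALUES.items()
            if record.get(dim) not in allowed]

# The coding result from the table above passes the check.
record = {"responsibility": "distributed", "reasoning": "contractualist",
          "policy": "industry_self", "emotion": "approval"}
print(invalid_fields(record))  # []
```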
Raw LLM Response
```json
[
  {"id":"ytc_Ugy_72CY3RdA2Rpn15l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgylZOd8LYABUR_o-YZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwVlY1dSHftEPVMOuZ4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx__T8bYejL5rOc52V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwWGYoOFpFgkrHqqUJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxIsUQv_SOtpgmn-y54AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzZ3UAkeDevH-eUor14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyp9Ljttv5YfF_9B-V4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzSftSEZQXd0Zse1EJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxb112R4vKH2IdH2Yt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```