Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.

- If AI can wipe out the working class, why are you invested in it then Bernie? As… (ytc_UgxG2zaqx…)
- Have you seen the AI-generated plot lines, on YouTube and Player FM? What messa… (ytc_UgxgbXorI…)
- To be honest ai tech ceo need to stop promoting so many things ai can do. Is tru… (ytc_UgwMm7bBX…)
- AI can’t become conscious because it just regurgitates whatever humans put into … (ytc_UgxbdPwRP…)
- Well, it either "works out OK" are we have anarchy in the streets, right? Becau… (ytr_UgzcORwrP…)
- Sir, is it legal if I create another chatbot with ChatGPT and make it mine inven… (ytc_Ugy5fzBJj…)
- The specification for an AI to create a complex app could be just as complex as … (ytc_Ugx3H68VI…)
- That’s an interesting thought! While robots like Sophia may not wear clothes in … (ytr_UgzK1ZrMa…)
Comment
It seems to be an unpopular opinion judging by the top few comments but we should shift the focus on stopping killer A.I. production to creating numerous fail safes. Once we discover something, we will always pursue it. Either with the public's approval, or in secret. Laws put in place to discourage killing of civilians and terms for surrender without killing innocent bystanders may be more productive than trying to stop the inevitable. This way we may buy ourselves hope for less destructive wars and more time to learn how to "pull the plug" on these killer A.I.
Platform: youtube
Posted: 2020-02-03T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzL0pUMLLwL1ct1UcV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzPgKgzmi6ht-zyIm54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwVoJoWz1X0ALjyI3N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyBb7bGF9NefEGD_K14AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz-19ekQVhFElTInsN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyMARCQefsn0MDvXTp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzR5VxVqXBDei8wyyx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzh3M4DPh9TlNuB2Ox4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxUvujdDSASDkqD3lh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyM85k0NGfHG-MmcSd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
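
The raw response is a JSON array of per-comment codes. A minimal sketch of how a lookup by comment ID might parse and validate one such response (the `lookup_code` helper is hypothetical; the field names and the inlined record come from the response shown above):

```python
import json

# A one-record excerpt of the raw model output shown above (hypothetical inline copy).
raw_response = """[
 {"id":"ytc_UgzPgKgzmi6ht-zyIm54AaABAg","responsibility":"distributed",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# The four coding dimensions used on this page.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_code(raw: str, comment_id: str) -> dict:
    """Parse the raw LLM output and return the coded record for one comment ID."""
    records = json.loads(raw)
    by_id = {r["id"]: r for r in records}
    record = by_id[comment_id]
    # Reject records that are missing any coding dimension.
    missing = [d for d in DIMENSIONS if d not in record]
    if missing:
        raise ValueError(f"record {comment_id} missing dimensions: {missing}")
    return record

code = lookup_code(raw_response, "ytc_UgzPgKgzmi6ht-zyIm54AaABAg")
print(code["policy"])  # -> regulate
```

The dimension values in this record match the Coding Result table above for the displayed comment.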