Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "The section on chatbot hallucinations... I am aware of this concept, it is when …" (ytc_UgzESSGPj…)
- "i am totaly against self driving car cause i am one of the survivor of Megantic …" (ytc_UgwXJIKkE…)
- "There may be a concern that it impacts critical thinking skills but for a studen…" (ytc_UgwFRlsN1…)
- "Why does sam move in such an npc-like, ai way? Like he's a living npc i swear…" (ytc_UgySTKiI7…)
- "Mfs be like: ai gives the chance to us people who have no talent to express ours…" (ytc_UgzAvcdoS…)
- "Is that what OpenAI does with our data? You may have said some of the quiet stuf…" (rdc_m9h6f11)
- "At my current company every feature get's written faster, and either rolled back…" (ytc_UgxDdhFVY…)
- "do you think that all (or most) cars will be self driving in the distant future?…" (ytc_UgwvSzGti…)
Comment
Watching people with 0 knowledge of AI create videos like this is painful.
If an AI is trained on human data, it replicates human behaviour. Telling it 'im shutting u down' and it behaves like if u told a human 'im killing you'.
I get that u make videos in order to get views, like news agencies try to get views with 'big-impactful' stories - but still, all I hear is humans being scared bc they know how bad human behaviour actually is, and being scared of the AI replicating the worst kind of human behaviour. That's it. That's the depth of this problem.
Doesnt make for a good vid tho is it.
youtube · AI Moral Status · 2025-12-24T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
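Each coding assigns one value per dimension. A minimal sketch of how such a record might be validated, using only the category values visible on this page (the full codebook may define more; the `Coding` class and value sets here are illustrative, not the tool's actual schema):

```python
from dataclasses import dataclass

# Allowed values inferred from the codings shown on this page (assumption:
# the real codebook may include additional categories).
RESPONSIBILITY = {"user", "developer", "government", "ai_itself", "none"}
REASONING = {"consequentialist", "deontological", "virtue", "unclear"}
POLICY = {"none", "regulate", "liability", "ban"}
EMOTION = {"fear", "outrage", "indifference", "mixed", "resignation"}

@dataclass
class Coding:
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # A record is valid only if every dimension uses a known value.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)

# The coding shown in the table above:
c = Coding("user", "consequentialist", "none", "indifference")
print(c.is_valid())  # True
```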
Raw LLM Response
[
{"id":"ytc_Ugwm8_h2p9LNnmCDEj14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzI2bVlKYQMP1ZW_194AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy-zG3UoqRsCY7H6mJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx-KJsXL8HqlOyNJmh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyvrCnHEb3ujysE_Eh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzg3zUs-rnpzwByyzZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxoOQ6HcazP_ip9cO54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzXCTNRR8HPXt0DE7t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy8W8tQQl8R7qCthJV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwPIrgUrwNvkP2-5EZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
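The "look up by comment ID" view above can be reproduced directly from a raw response like this one: parse the JSON array and filter by `id`. A minimal sketch, assuming the model output is the well-formed array shown (the `lookup_coding` helper is hypothetical, not part of the tool):

```python
import json

# A raw LLM response in the format shown above (one entry, for brevity).
raw_response = """
[
  {"id": "ytc_Ugx-KJsXL8HqlOyNJmh4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def lookup_coding(raw, comment_id):
    """Parse a raw model response and return the coding dict for one comment ID,
    or None if that ID was not coded in this response."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_Ugx-KJsXL8HqlOyNJmh4AaABAg")
print(coding["emotion"])  # indifference
```

In practice the raw output may not parse (truncation, stray prose around the array), so a production lookup would wrap `json.loads` in error handling rather than assume clean JSON.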