Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "AI is a probability engine. Using it to generate precise results is a mistake. I…" (ytc_UgyU7_97t…)
- "I can't believe some teachers are allowing this (yes some are allowing it) even …" (ytc_UgxWlYX4x…)
- "AI really concerns me, and people don't seem to be taking it seriously. I'm ver…" (ytc_Ugzq3LJ9N…)
- "As a composer, i have been disgusted by the amount of people starting to make ai…" (ytc_UgzbmLN6t…)
- "Im more afraid of rogue dictators like Drumpf, Netanyoohoo, Xi and Putain then o…" (ytc_UgzJLhEPl…)
- "After gemini 3 pro I don't know what to say 😂 Talking about 5 years how about n…" (ytc_UgxItFJ2R…)
- "0:33 what, did I hear that right, AI won't replace the 'COBBLER'?! The trade…" (ytc_UgxbLpMuX…)
- "I'm not worried my truck derates every single day I can't stand DEF systems 😤 it…" (ytc_UgxuSrS6q…)
Comment
I came here to make a waifu joke.
That's it. It's a meta one.
Now for reals; please do not put unnecesary features on AI, because that'll fucking be the downfall of this ethic dillema. If you make an AI to actually emulate a person, then it should have most if not all the functions a person has AND its rights. Then don't use it for work much, because you rather design something that actually fulfills its job better.
youtube · AI Moral Status · 2017-02-23T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgjfULFTlSzujngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UghUrPdZeuQDTHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgilbiiByK6t2XgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgjBaN9K40vRL3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggBdfltZ6aVe3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugg3qEenr7bb5HgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UghhwKIFwPOHQngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiQC479EflFHXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgiAdQ2Jyk8Mb3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Uggn55xDcnDCPHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
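A raw response like the one above can be checked and indexed for the ID lookup described at the top of this page. The sketch below is illustrative, not the tool's actual implementation: `parse_batch` and `ALLOWED` are hypothetical names, and the allowed category sets are inferred only from the values visible in this batch (the real codebook may define more).

```python
import json

# Allowed values per coding dimension. NOTE: inferred from the values seen in
# this one batch; the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"fear", "indifference", "outrage", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, rejecting any record with an unknown value."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id
```

With the batch indexed this way, "look up by comment ID" is a plain dict access, e.g. `parse_batch(raw)["ytc_UggBdfltZ6aVe3gCoAEC"]["policy"]` for the fifth record above.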