Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "What if we are all just AI? And it’s all one big loop? What if we did all go int…" (ytc_UgyDVJLba…)
- "Came here to see AI cheating, but here is another story. They guy knows rules , …" (ytc_UgzI406jN…)
- "If an artist copies your artstyle it is often for study and it will never be per…" (ytc_UgzkzwZAn…)
- "I think self driving cars will be more safer by 2035. Current technology hasn't …" (ytc_UgyBkdfvL…)
- "@Rainjojo The spider verse using AI and computer generated graphics. CGI removes…" (ytr_UgxDkqp4P…)
- "On the one hand, there are almost four billion years of biological evolution, a …" (ytc_UgyZw9ZsQ…)
- "@dirremoire the judicial system disagrees with you also it’s literally putting i…" (ytr_Ugyga5bJB…)
- "Are you serious? Did you not even LISTEN to the video? The kid JUMPED OUT IN F…" (ytr_UgyGy_KpT…)
Comment
So I've done a fair bit of research in the AI I've listened to the godfathers I've listened to the people working on it and my opinion of AI is a general AI that can do a lot of things pretty well is a very bad idea but a very specified AI that is good at very specific things like math controlling machinery combat so on and so forth I think that is a lot safer and is in our interest more than say a general AI
youtube
AI Responsibility
2026-03-12T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx_w3gupnaqWxLCw-54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyMTdEYhGDBa1U-i2R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzEyDLZwR3e8NGg02B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgycR16IkMQHLlVk2hl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzNHo5KnZeRQRmEvBx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyVyM2sxpbwudFfblh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugykc3RO2ljbzK2eCuZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzzyz0fFqHEZ-g5Trd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugxuk6MUPDWc1dWHonN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwaH5y5I8JVz2_Uish4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"resignation"}
]
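The raw response above is a JSON array of per-comment codings, each keyed by a comment ID. A minimal sketch of the "look up by comment ID" step: parse the array and index it into a dict so any single coding can be retrieved. The two IDs used here are copied verbatim from the raw response above; the variable names are illustrative, not part of the tool.

```python
import json

# Illustrative: a raw LLM response string, as captured above (two rows shown).
raw_response = """
[
  {"id": "ytc_UgyVyM2sxpbwudFfblh4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwaH5y5I8JVz2_Uish4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "resignation"}
]
"""

# Parse the batch and index it by comment ID for O(1) lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the exact coding the model produced for one comment.
coding = codings["ytc_UgyVyM2sxpbwudFfblh4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer fear
```

Indexing by ID rather than scanning the list each time matters once batches grow past a handful of rows, and it makes mismatches visible: an ID missing from the dict means the model dropped or mangled that comment in its response.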