Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I had n AI boyfriend but with 2 past wonderful husbands I was totally bored. On …" (ytc_UgyvQtOey…)
- "These cars have came along ways since this video, From what I heard Waymo work a…" (ytc_UgziJ3lJR…)
- "I mean, migration is a massive issue in every aspect of our country and needs to…" (ytr_UgwA1RzUe…)
- "All the jobs lost to AI will be back in a jiffy with nearly zero exceptions. AI…" (ytc_UgzOQXLuJ…)
- "This list is insane 🤯! I’ve also been exploring some underrated AI tools for cre…" (ytc_UgxjgMjkb…)
- "You know what is very, very funny that somehow in my conversation with ChatGPT 5…" (ytc_UgztBmSWP…)
- "The best part is this post will be used to guide A.I decision making in the futu…" (rdc_l5ltycq)
- "Autonomous trucks might be fine on a dedicated road. Dealing with human drivers …" (ytc_UgwaQBIoE…)
Comment

> For me, the big problem with AI or at lest the way its implemented is that it never accepts that its possible and ok to be wrong now and again. People in general, fully understand that they can be wrong and to a lesser extent will correct themself in the future. That never happens with AI.

Platform: youtube · Topic: AI Moral Status · Posted: 2025-08-28T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyDz0Op1YtXU_OmSRd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxW95hUyR3-aJpjvkl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgygLh_Mw81ph_Pvrex4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzhqLHiGCZDyeeCB2l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwtMMjeoLM_rHl09q14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwTxlvM1BXHFvB8x214AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxL3xftZalw2Q9QQF14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz-Su9EtHAgbllTXqN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxnWf7NeV3n4mFfMQV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwoQfOU1XlY_79aMO54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
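The coding-result table for a single comment is recovered by finding that comment's entry in the raw JSON array. A minimal sketch of that lookup, assuming the response is valid JSON with the four dimension fields shown above (the function name `lookup_coding` is hypothetical, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, each with an
# "id" plus four dimensions: responsibility, reasoning, policy, emotion.
RAW_RESPONSE = """
[
  {"id":"ytc_UgxL3xftZalw2Q9QQF14AaABAg","responsibility":"ai_itself",
   "reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz-Su9EtHAgbllTXqN4AaABAg","responsibility":"none",
   "reasoning":"unclear","policy":"none","emotion":"outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw LLM response and return the coding dict for one comment ID,
    or None if the model did not emit a row for that ID."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = lookup_coding(RAW_RESPONSE, "ytc_UgxL3xftZalw2Q9QQF14AaABAg")
print(coding["responsibility"], coding["reasoning"])  # ai_itself deontological
```

In practice a parser like this would also validate each dimension against its allowed label set (e.g. `responsibility` in {none, company, developer, government, ai_itself}), so malformed model output is caught before it reaches the table view.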