Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I hope and pray we do not hurt a new life form out of fear. Individual humans u…" (`ytc_Ugwr-QhZl…`)
- "AI could be used to inspire, instead of threaten. Use what we have to our advanta…" (`ytc_UgyrAXSrn…`)
- "Hat-Kid oh it is, there absolutely is a difference, and the training and model…" (`ytr_UgzUzOP0S…`)
- "but if, ai can do normal job isn't that mean we have much more free time also wh…" (`ytc_Ugz6g5tj9…`)
- "Through the years I figured out that China got nothing yet when telling, similar…" (`ytc_UgyWReDjM…`)
- "Every other video is about ai taking over and us humans loosing our jobs. Bullsh…" (`ytc_UgwXeGknl…`)
- "It’s quite literally copy right. The ai gets its generations from a database of …" (`ytc_Ugy4lFqNn…`)
- "I think it’s fine for people to use ai art just for fun just to see stuff for fu…" (`ytc_Ugw27tD6O…`)
Comment

> But then AI doesn’t need life. They need electricity and that’s it. So, why should they care about life. They just cover the whole world with solar panels and just be. But what the purpose? What are they going to do? Connect with other lifeless planets which look like the Mars?

youtube · Cross-Cultural · 2025-11-16T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz0yCImuESpQP4x6Bd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw0LAZksbgEzOtbTFl4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyuM3iWucntHYBW94d4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugzn7Skx_5AVIwrgXXN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxrJsGh0tdnYE4Xxb94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxVd1QYCLuIx5V2sO94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxqLRasuGHpovpgZ6N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyJpcBZoFwbVk9q2z54AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzdFlqWte21A8J13Ut4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzzeyMnkCBScBi5gLR4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
```
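A raw response in this shape can be looked up by comment ID with a few lines of standard-library Python. The sketch below is illustrative, not the tool's actual implementation: the `index_by_id` helper is a hypothetical name, and the two-row sample mirrors the JSON schema above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`).

```python
import json

# Hypothetical two-row sample in the same shape as the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgxrJsGh0tdnYE4Xxb94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzdFlqWte21A8J13Ut4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and map each comment ID to its coded dimensions."""
    rows = json.loads(response_text)
    # Drop the "id" key from each row so the value holds only the coding dimensions.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_UgxrJsGh0tdnYE4Xxb94AaABAg"]["emotion"])  # → fear
```

In a real pipeline the parse step would also want to guard against malformed model output (e.g. a `json.JSONDecodeError` or rows missing the `id` field), since LLMs do not always return strictly valid JSON.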