Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The teachers today don't care most of them so A.I might do a better job…
ytc_UgyhzVNfM…
THIS is the worst thing to happen to Humanity. Demand legislation to destroy de…
ytc_Ugz8bZUoO…
Tldr when the ai learns enough empathy to make it clear in proveable ways.
Cuz p…
ytc_UgxTYNHVK…
[The A.I. Dilemma](https://www.youtube.com/watch?v=xoVJKj8lcNQ) by The Center fo…
rdc_jiicebe
This exactly.
If an ai can make your art or better art then you than you weren'…
ytr_UgzNInKH7…
I'm not sure of the legalities either my friend. But I'd imagine there's some re…
ytr_UgydFMEEs…
If there's any chance AI could eventually gain nuclear launch codes I would unpl…
ytc_UgyL2j28m…
So many things wrong with this
If you use ai, you're not an artist, you are an …
ytc_Ugwj3An3E…
Comment
The billionaires are not anti democractic they have the right like anyone else to pursue their values and interest there is nothing wrong with them opposing a political point of view they do not like when you work for someone else if you oppose their policy you can quit they have no obligation to to agree with you ai regulation is one of the most complex issues I have ever seen it must be approached slowly and carefully with the knowledge that you will get it wrong many times so you need to be able to change approach in mid stream Alex seems to be too arrogant to be involved in the issue
youtube
AI Responsibility
2026-04-21T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
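A coded result like the one above can be sanity-checked against the value sets that appear in this batch. A minimal sketch — note that the allowed-value lists below are inferred from the sample records shown on this page, not from an official codebook, and the function name is illustrative:

```python
# Dimension values observed in this batch of coded records.
# NOTE: inferred from the sample output shown here, not an official codebook.
OBSERVED_VALUES = {
    "responsibility": {"none", "distributed", "company", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "industry_self", "regulate"},
    "emotion": {"outrage", "indifference", "fear", "approval"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    for dim, allowed in OBSERVED_VALUES.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The coding result from the table above passes cleanly.
coded = {
    "responsibility": "none",
    "reasoning": "deontological",
    "policy": "industry_self",
    "emotion": "approval",
}
print(validate_record(coded))  # []
```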
Raw LLM Response
[
{"id":"ytc_UgyNADIdU3CCPvZsyUR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwq5b6ip3SrUzxqGLV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzmpequDmnr6Wivqwx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwQ2kCCj7542G0O8fZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwGE0pB20yfj7KK4r54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzjv8jwuFF8ILNO1Rt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxQzxUlBN_U-KcXUlR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgysVrOmoqoms3G0UzR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx-aphWNapnS9F2dYN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz4eJHbEUtELkFYMVd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"}
]