Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- That was one of my favorite plot points in The Expanse. Same thing happened esse… (ytc_Ugz0u1G_S…)
- I had an interesting conversation with a companies AI the other day, when we had… (ytc_UgwN1dktt…)
- I did a search on Economy Media and its a ghost. I did find via Linkedin a Nomi… (ytc_UgwLGm4_s…)
- I wanna take a picture of my butt and watch AI make it talk. You think you can h… (ytc_UgxgnWPEt…)
- I've never once seen this news from any major outlet. You guys celebrate this st… (ytc_UgxWGi5JT…)
- the governments not going to stop AI development since its an arms race against … (rdc_je6n4mr)
- 0:25 Tim Heidecker is watching that robot and consider if it deserves a photo an… (ytc_Ugyg0egku…)
- The Automatons in Helldivers 2 are exactly what's gonna happen if we keep on thi… (ytc_Ugw5jN-ox…)
Comment
Today I have received a link to this video (of 2 months ago). What is often missing, the governance. How much cost for me and for the planet to run all those models and technologies associated, to do the same that without it. For example, automated cars: I am able to drive, what is the extra cost for me and for the planet, to have automated driver for my car. Is not in the analysis. Then how u could decide ? With only direct costs ? I do not think so. My 1cent is that the reason some people is betting against AI still. Because the risk of not be able to afford AI in the big scale.
youtube
AI Responsibility
2025-12-15T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgzJD4677wXn6ZZa2BJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_813MxAtv1gyK4u94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzNAd02qBx7Noc0mrF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwXUSXxGlVLzkXcoG54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz3CmBvEbqmY9qZ6D54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwtnqL2wcNYfTPSUgJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyGTmI9WYL0ou-ANXp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzLdppqLlP8mQaAQyN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwYZrYtNmu4CTLbu6F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFo5pY00-f8IVodaJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
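The coding table above is populated from the JSON array the model returns: one object per comment, keyed by comment ID, with one value per dimension. A minimal sketch of parsing and sanity-checking such a response follows; the allowed vocabularies are inferred from the sample output shown here and are assumptions, not a documented schema.

```python
import json

# Dimension vocabularies inferred from the raw response above (assumption,
# not an official schema; extend these sets if the codebook is larger).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "fear", "approval", "mixed", "outrage", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse the model's JSON array and index the codes by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim} value {row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# One row copied verbatim from the raw response above.
raw_sample = (
    '[{"id":"ytc_UgwXUSXxGlVLzkXcoG54AaABAg","responsibility":"distributed",'
    '"reasoning":"consequentialist","policy":"liability","emotion":"mixed"}]'
)
coded = parse_coding_response(raw_sample)
print(coded["ytc_UgwXUSXxGlVLzkXcoG54AaABAg"]["policy"])  # liability
```

Validating against a fixed vocabulary like this catches the common failure mode where the model invents an off-codebook label, which would otherwise surface later as a silent gap in the dimension table.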