Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its comment ID.
Random samples (click a sample to inspect it):
- `ytc_Ugysf0_BG…`: “AI will help us with climate change…” jajajjaajjajajajjajajajajaja the accent. …
- `ytc_Ugx6NeNAn…`: I just saw trump's AI video about Gaza. If this is the way the USA wants to go w…
- `ytc_Ugz-zWzo0…`: AI is being used as a psy-op cover. It will have nothing to do with deciding up…
- `rdc_dy8od7m`: That protest was authorized by the government. About a thousand people participa…
- `ytc_UgxTdCj2P…`: So my take is that we merge with the machines, and time and space become somethi…
- `ytc_UgybxP5ye…`: It will be down to the 'good' AI, versus 'bad' AI, and the people/companies/$$ b…
- `ytr_Ugw7lejwL…`: This used to be true in the days of AI Will Smith eating spaghetti, but it's got…
- `ytc_Ugx5r-gyJ…`: I’m going to stab ai images with pencil <3 that is so stupid though like I can’t…
Comment
This is a fascinating conversation, that has really got me thinking. If we could determine and agree on moral boundaries of AI use, that would be a start, but that only works with honest people. The disaster happens with greed and power-obsession. There are plenty of powerful people who would clearly want to take advantage of the worst scenario. I fear for my heirs’ future more than my own.
youtube · AI Governance · 2025-09-05T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx6gGG7FzPhOAlXoK54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBRXPJ8LUMzuym8MJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyiJoxDWDUT03Yfuo14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwvwvXPzBCNK2No5uZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzYVYrd6IzUbrdsaDJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy-ZkoADoJRVCBxf9h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz3gmEyCQ6_dGsxHJ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgygpZ1ETacGeO0Q75N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"sadness"},
{"id":"ytc_UgzOYM-l3ccsmedjnh54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgybUXgbCiC3ZssaPKh4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
```
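The raw response above is a JSON array, one object per comment, coding four dimensions (responsibility, reasoning, policy, emotion). The comment-ID lookup described at the top of this page can be sketched as parsing that array and keying each coding by its `id`. This is a minimal sketch, not the tool's actual implementation; the `raw_response` string below is abridged to two entries from the array above.

```python
import json

# Abridged raw LLM response (two entries from the full array above).
raw_response = """
[
 {"id":"ytc_UgwvwvXPzBCNK2No5uZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgybUXgbCiC3ZssaPKh4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coding dict by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_comment_id(raw_response)
# Look up the coding that matches the Coding Result table above.
print(codings["ytc_UgybUXgbCiC3ZssaPKh4AaABAg"]["policy"])  # prints: regulate
```

The last entry matches the Coding Result table shown for the displayed comment (responsibility: user, reasoning: contractualist, policy: regulate, emotion: fear).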