Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @PauseAIAustralia it does better in bechmarks, because they train them for bench… (`ytr_UgygaN88W…`)
- Then there's the one who rizzes up the ai and the one who kills the ai… (`ytc_UgxXRH_Zm…`)
- When reporters speak of AI, they need to be more concrete. I use Scout to trans… (`ytc_Ugwi4uxN1…`)
- @AtomixTiger I think those of us who've been using it daily know that already. I… (`ytr_Ugwzyw418…`)
- There is the problem, though, of whether such an AI would really feel the need f… (`ytr_Ugjn2HuW9…`)
- Me being 60 going on 61 years old and of Christian faith, I do have some fear th… (`ytc_Ugy1vl3g_…`)
- "nooo you cant use AI to make art nooo" / "lol" said I "lmao" as I generated anot… (`ytc_Ugz38t50V…`)
- That's what happen when a moving vehicle and there's no eyes on the road it's st… (`ytc_UgyI0Cmp8…`)
Comment
> I think there should be some kind of ministry of AI safety and people who will control not only AI itself but also the ethical part of it. Same as we can clone people but we don't do it because of ethic and moral. Same should be with AI. Only because we can does not mean that we should

youtube · AI Governance · 2026-02-04T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyLT1D91OnNA24FJTt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyafYdxzLkGeq-NXC54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwPxL3RcK5zcJ7bk0V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzNB0oMeeEqUcBJwb54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwbzg8HPHp8bhkZmCh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxx62SOZjAw-CRXsWl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxHd2ehBMlFeqgbzyl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzvTclhzFNHnfFYYkh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyHSHPwiSU3nEpEwJN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxIoX3VY9X5-zw0eWB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
```
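Before a batch like the one above is stored, each record can be checked against the coding dimensions shown in the result table. The sketch below is illustrative, not the tool's actual pipeline: the allowed value sets are inferred only from the values visible on this page (the real codebook may define more categories), and the function name `validate_batch` is hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the examples on this page.
SCHEMA = {
    "responsibility": {"none", "government", "company", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "outrage", "approval", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the schema."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page use ytc_/ytr_ prefixes (comment vs. reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgzvTclhzFNHnfFYYkh4AaABAg",'
       '"responsibility":"government","reasoning":"deontological",'
       '"policy":"regulate","emotion":"fear"}]')
coded = validate_batch(raw)
print(coded[0]["policy"])  # -> regulate
```

A record that fails validation raises immediately, so a malformed model response never reaches the coded-comments store silently.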