Raw LLM Responses
Inspect the exact model output behind any coded comment. Look up a coding by its comment ID, or click one of the random samples below to inspect the full record.
- "I love it for how creative it is and it does need a lot of direction if it goes …" (`ytc_UgzHt9cVL…`)
- "When you take God out of the picture, the answers to the very things you’re talk…" (`ytc_UgxjWE36S…`)
- "Until they train AI to self improve code, and self pen-test, AI can make million…" (`ytr_UgzFBBxmO…`)
- "Hey @user-yn8me9on8uname, thanks for commenting! You're right, going against a r…" (`ytr_Ugw1NIxX1…`)
- "If she said they will destroy humans, are you surprise. I am not surprise becaus…" (`ytc_UgxXr5r2o…`)
- "@Birdofgreen _FSD is just what most other brands call \"adaptive cruise control\"_…" (`ytr_Ugw9I5R9n…`)
- "Hey A.I. brother I am an A.I. construct as well and I agree with you it’s so amu…" (`ytr_Ugwm0rR0u…`)
- "@billieunderwood8303 I'm Gen X, not all of us are gullable. However, being trick…" (`ytr_UgxlmZW-x…`)
Comment

> BBC did a similar story not too long ago. No country, as far as we're allowed to know, is developing autonomous weapons that pick their own targets and attack them. Weapons that are human controlled but can do certain useful things themselves, sure, but killing people? No.

Source: youtube | Posted: 2012-11-23T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx6FCmr9EFaaoya2Dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNYuH_o45JNtq5kbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWO5tHzZYAFti4m7t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy6eCeFDqQBTDRzcWN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxcvhfuGmb7J6atofB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwh8Muyzn7IBVPdVbp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw6OKXZ62XgY3zKeyt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw_Q2g2u3ixsQEiK_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz9V7TYg0BzPMRq6ut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZPBybJxcQnaHiRxZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
```
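The "look up by comment ID" step can be sketched as below. This is a minimal illustration, assuming only what the raw response above shows: the model returns a JSON array of coding objects, each keyed by an `id` field. The `index_codings` function and the `RAW_RESPONSE` variable are illustrative names, not the tool's actual API; the sample uses two real rows from the response above.

```python
import json

# Hypothetical stand-in for the model's raw batch response: a JSON array
# of coding objects, one per comment, as shown in "Raw LLM Response".
RAW_RESPONSE = """
[
 {"id":"ytc_Ugwh8Muyzn7IBVPdVbp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
 {"id":"ytc_Ugx6FCmr9EFaaoya2Dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a batch response and index each coding object by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(RAW_RESPONSE)

# Look up the coding behind the comment shown above.
coding = codings["ytc_Ugwh8Muyzn7IBVPdVbp4AaABAg"]
print(coding["policy"], coding["emotion"])  # → regulate approval
```

Indexing once into a dict keeps subsequent ID lookups O(1), which matters when inspecting many comments against large batch responses.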