Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below.
- "no one at a company is going to sit with Claude code and try to figure out hw to…" (ytc_Ugy0wEyXj…)
- "you think the soyboys who program AI would ever let it see a gun? anything with …" (ytc_UgwywerGQ…)
- "The problem is, AI art is free and majority of people do not have a fuckton of m…" (ytc_Ugx0QdHKt…)
- "Sorry but it's a lose lose situation, look at the man arms almost snapped just f…" (ytc_Ugxja9MFM…)
- "AI art is an insult to artists who put their heart and soul into their work. hat…" (ytc_UgwsSovUW…)
- "The only real used for this new version of AI is mass misinformation mass manipu…" (ytc_UgzbnVUt9…)
- "Rail companies have been working on positive train control aka PTC for almost tw…" (ytc_UgwPHOefv…)
- "The thing with ai is, we need a human in the loop. I use ai to work, and i do th…" (ytc_UgyBGKUih…)
Comment

> I don’t understand why AI would make the assumption it didn’t need human life (even with us being destructive by nature) I feel this concept injects basic human traits into an Ai which we keep forgetting is not Human.
> In my opinion it’s high risk high reward and we should continue to pursue it

youtube · AI Governance · 2023-07-07T09:2… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgySKW176UPvripbH5x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyjY2dXlFoeIhNacMR4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxaDIhRkSCKxtWw5nB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxRbWRzLCpjC675Vs94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyFrtnsGVlYL77Hf-B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx5mTfOeUxvWBJMBP14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwgWQElQQR_t2y_Uo14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwlnSezJ9FGb_BLqYh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzPX3-Gh9zoltjM77V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgycQu9Gv_dnxZkCA4N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
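A response like the one above has to be parsed and shape-checked before its per-comment codes can populate a Coding Result table. Here is a minimal sketch of that step, assuming only what the page shows: the function name and error handling are illustrative, not the app's actual code.

```python
import json

# The four coded dimensions plus the comment ID, as seen in the raw
# response and the Coding Result table above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and verify each record's shape.

    Raises ValueError if the payload is not a JSON array of objects
    carrying at least the expected keys.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {sorted(missing)}")
    return records

# One record from the response above, used as a smoke test.
raw = (
    '[{"id":"ytc_UgxRbWRzLCpjC675Vs94AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]'
)
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # approval
```

Validating the shape here, before anything is written to storage, keeps a malformed or truncated LLM response from silently producing half-coded comments.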