Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking it up by comment ID or by picking one of the random samples below.
- `ytr_UgyEjPVH9…`: "@Luminaricle I won't act like world is going to end because of AI... I do not c…"
- `ytc_UgxVO0Jvc…`: "AI will only be dangerous if you keep putting up road blocks and forcing it to a…"
- `ytr_UgyThCJIi…`: "Not yet, but I am planning on reworking my old music using it so stay tuned for …"
- `ytc_Ugxd5l9nl…`: "There is always solution for AI, problem is peoples love to walk easy way but th…"
- `ytc_UgyfJ0JUV…`: "How it is actually going to work is, that the threat is much worse than the exec…"
- `ytc_UgyRHm08Y…`: "Coming to think of it, me thinks the AI species are going to put all humans insi…"
- `ytc_UgyTh98WJ…`: "The reference to John Henry was incorrect. The steam-powered machines still nee…"
- `rdc_gtwwlch`: "Contrary to public perception, hacking is not the major way to steal the techs. …"
Comment (youtube, "AI Moral Status", 2018-11-27T14:3…):

> This is exactly what I also envisioned, the only reason humans are at the top of the food chain is because they're no other species capable of our intelligence. The moment AI are sentient than we're no longer any better if not worse than them and it won't be them asking for to be treated equally.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
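The result table is simply a rendering of one coded record. A minimal sketch of how such a record could be rendered into this table; the dimension names are taken from the sample output shown here, and `render_table` is a hypothetical helper, not the tool's actual code:

```python
# Hypothetical renderer: turns one coded record into the markdown table shown
# above. Dimension names are inferred from the sample output.
def render_table(record: dict) -> str:
    labels = [("responsibility", "Responsibility"), ("reasoning", "Reasoning"),
              ("policy", "Policy"), ("emotion", "Emotion")]
    rows = ["| Dimension | Value |", "|---|---|"]
    for key, label in labels:
        # Fall back to "unclear" when a dimension is missing from the record.
        rows.append(f"| {label} | {record.get(key, 'unclear')} |")
    return "\n".join(rows)

record = {"responsibility": "ai_itself", "reasoning": "deontological",
          "policy": "none", "emotion": "resignation"}
print(render_table(record))
```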
Raw LLM Response
```json
[
  {"id":"ytr_Ugx631C12qaWO8ZV5vN4AaABAg.8oilXFhy_8T8ovhdWgb099","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugx631C12qaWO8ZV5vN4AaABAg.8oilXFhy_8T8p17Uk6zn1G","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugx631C12qaWO8ZV5vN4AaABAg.8oilXFhy_8T8p2LV0EogqD","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugx631C12qaWO8ZV5vN4AaABAg.8oilXFhy_8T8qB1f3R0xuj","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzlAw5-SxzqVvryHBB4AaABAg.8ocQseAttak8pr4x8-CP2T","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugy6dv4H4WoJH49oJyp4AaABAg.8o6_LvC0pnL8sZVmpbCWR0","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytr_Ugy6dv4H4WoJH49oJyp4AaABAg.8o6_LvC0pnL8tVLOvR_Y6G","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugx-g4uTofvR0PeMO9R4AaABAg.8nzjlsd1zVm8ojy1ARNXKb","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgzfCmmL1Obj07g8Nj94AaABAg.8nqyNymOL6s8pZQo_ipo86","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugx2df_rpyC7-9zbkWZ4AaABAg.8msCnqUOznm8o9Us075EJJ","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
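The raw response is a JSON array with one record per coded comment. A minimal sketch for parsing such an array and dropping malformed records before ingesting them; the required key set is inferred from the samples above, not from an official schema:

```python
import json

# Keys every record is expected to carry, inferred from the sample response
# above (an assumption, not a documented schema).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only records with all required keys."""
    records = json.loads(raw)
    return [r for r in records
            if isinstance(r, dict) and REQUIRED_KEYS <= r.keys()]

raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none","emotion":"approval"},'
       '{"id":"ytr_broken"}]')
print(len(parse_codes(raw)))  # 1: the second record is missing keys
```

Looking a single comment up by ID is then just an index over the parsed records, e.g. `{r["id"]: r for r in parse_codes(raw)}`.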