Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “It’s not wrong, it’s just that the people that prompt the images to be generated…” (ytr_Ugy8C4W46…)
- “AI isn't all that intelligent. I worked for Accenture for over a decade. It's…” (ytc_UgxF69HY-…)
- “Yo I would love to see these robots in I Robot. Good job 👏👍👍 to the maker of the…” (ytc_UgwPD_fPR…)
- “Of course not, India will be the NEXT AI superpower. If Palki is around, it co…” (ytc_Ugy6rvi5_…)
- “...with what money? I am not too worried about AI taking my job because I see it…” (ytr_UgyRih7Ab…)
- “Not now. With current technology it would be near impossible to create a robot w…” (ytc_UgzzJQ28c…)
- “@thewannabecritic7490 It means the whole tactic is symbolic at best and ineffect…” (ytr_Ugzhmjof5…)
- “@Thomas B that took a turn in the end, unlike tesla autopilot when its cruising …” (ytr_UgyXxVN1w…)
Comment
All it needs to know is humans destroy and suck in more resources than they create. The lowest use of energy is the preferred state, right? We consume resources to have the power to just slaughter each other. To an AI lacking the nuance of consciousness and emotion, that is illogical and therefore pointless.
youtube
AI Responsibility
2026-02-11T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwvEZzqiXOq7FR-N1l4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwQvRUu4k6KfQ3iVft4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz9VrOJZoLbvDV_hnx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzePVXQ4a_IPaf2Oal4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxWEcR_3Pd4hbajQeR4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugw38Km7QMPJ6f0yz354AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyyczMS5D2XoDuQ-zt4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxbVEQv_YVlPZt928t4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzSbfKrwHIKo8xGGU14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwTJZ1WDufUh2KXe2V4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
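Each raw LLM response is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of the lookup-by-ID step, assuming the response parses as valid JSON (the two rows below are copied from the sample above; variable names are illustrative, not from the tool itself):

```python
import json

# Raw LLM batch response: a JSON array of per-comment codings.
# These two rows are taken verbatim from the sample response above.
raw_response = '''
[
  {"id": "ytc_UgwvEZzqiXOq7FR-N1l4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwTJZ1WDufUh2KXe2V4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
'''

# Index the codings by comment ID so any single comment can be looked up.
codings_by_id = {row["id"]: row for row in json.loads(raw_response)}

coding = codings_by_id["ytc_UgwTJZ1WDufUh2KXe2V4AaABAg"]
print(coding["emotion"])  # resignation
```

A dict keyed on `id` makes each lookup O(1), which matters when one batch response covers many coded comments.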