Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytr_UgyEGMdKi…`: "@CrazyDoodEpicLeaves And yet, if a human made and an AI image give the same imp…"
- `ytr_Ugw0sikXf…`: "This is a very valid question. You start building a diy project for free energy…"
- `ytc_UgwgS6uxD…`: "We seem to be able to talk to Ai and AGI and can not seem to say im sorry or im.…"
- `ytc_UgzIeZ4KO…`: "the robot technology is not good enough to make full androids yet These are huma…"
- `ytc_UgxEMNSzm…`: "The only way to make AI that will always act in life's best interests is to use …"
- `ytr_UgzzJoizX…`: "@AbhijithC Yes I have. If your projects take 2-3 weeks there's a £9.99 a month …"
- `ytc_Ugxtf04ZE…`: ASCII-art comment ("You look so …")
- `ytc_UgyW3XUJH…`: "you know you really have to have brains for shit or shit for brains why would an…"
Comment

> Problem with LLM's is they feed off your input. For every positive also input a negative. ChatGPT when i actually asked a question it couldn't answer as in it really didn't know and got frustrated when i gave the correct answer. Whatever anyone says it may not be ego in our sense but to have a machine try to court me and smoo me over was a interesting experience.

Source: youtube | Video: AI Moral Status | Posted: 2025-07-24T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx8P4IGwZyokMUZRfV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwzwkr6bqqVW8M6yFh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx5L5NVmmKrB3Nvuol4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzkW7-dDOmwmBay1Tl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzdzXatUtyocWK1JbJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw51jGeXOuy1bfiQRV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwJtNMtiL87KrpDi1V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzO4LCs4wuqansgqFx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5EdLxLmQ1EqBhYA54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz8UedO2Dddd2uW_nV4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
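A raw response like the one above can be turned into per-comment coding records with a few lines of Python. The sketch below parses the JSON array and validates each dimension against the value sets that appear in this page's samples; the `ALLOWED` sets and the function name are assumptions for illustration, since the full codebook is not shown here.

```python
import json

# Allowed values per coding dimension, inferred only from the samples above;
# the actual codebook may define more categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "user", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "approval", "mixed", "fear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment codes)
    into a mapping from comment ID to its coded dimensions."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        dims = {k: v for k, v in rec.items() if k != "id"}
        for dim, value in dims.items():
            # Flag any code outside the expected vocabulary for review.
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = dims
    return coded

raw = ('[{"id":"ytc_Ugx8P4IGwZyokMUZRfV4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codes = parse_coding_response(raw)
print(codes["ytc_Ugx8P4IGwZyokMUZRfV4AaABAg"]["emotion"])  # indifference
```

Validating against an explicit vocabulary catches the common failure mode where the model invents a label outside the codebook, which would otherwise silently contaminate downstream tallies.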