Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- ytr_UgzX8-rTM…: "Really won't matter. Cameras are just one smaller piece of a much larger puzzle …"
- ytc_Ugy7ILXvG…: "Visual Artists: Get NOTHING. Musical Artists: They can monetize your video or ev…"
- ytc_Ugx1zr5Fx…: "Poor robot, so naive to think humans ever needed machines to manipulate each oth…"
- ytc_Ugw3M5-Z2…: "......Hence the reason for using complex prompts, because AI does need and is de…"
- ytc_UgzYc1yQA…: "i cant wait for it, i feel the whole world of jobs is bullshit, they just sit an…"
- ytr_UgxtvfnW0…: "Thank you for your comment with the pineapple emoji! If you have any questions o…"
- ytc_UgwhtEqgX…: "AI combined with humanoid robots like Boston Dynamics Atlas will easily replace …"
- ytc_UgwZH4Otb…: "So if I say "Yes" if people ask me if I'm an AI, I'm not sentient? That... sound…"
Comment
AI (LLMs) has now been shown that they dont know what they compute. They lie about how they produced a result.. (including chain of thought). So, what Sir Penrose brings to life is that AI can not initiate the idea, which requires understanding the thing or transcending the use of the thing…. Regardless of how good the tool is.. it is not the creator.
youtube · AI Moral Status · 2025-05-13T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz5k4Hct7vr3EQuoKd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyct773hIidIs57AiN4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzzfqyrY3LLHLbBcIV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw_aGdijan5GOkRznR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugzpw2jUHzp1wFz4n9V4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1oiTsyHIOZwRdZO14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzKnxSXLU8StZXthb14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyj2GeXL1vjfCSD6VB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzY6hTRnr9VV65Y0fN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxjGf1Cc3t3Gu6OKzN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
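The raw response is a JSON array with one object per coded comment, carrying the four dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be indexed for lookup by comment ID — `parse_batch` is a hypothetical helper for illustration, not part of the actual pipeline, and the two-record `RAW` string is abbreviated from the batch above:

```python
import json

# Abbreviated batch: two records copied from the raw response above.
RAW = """[
{"id":"ytc_Ugz5k4Hct7vr3EQuoKd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzzfqyrY3LLHLbBcIV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# The four coded dimensions, per the "Coding Result" table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_batch(raw: str) -> dict:
    """Index a raw batch response by comment ID, keeping only known dimensions."""
    records = json.loads(raw)
    return {r["id"]: {d: r.get(d) for d in DIMENSIONS} for r in records}

codes = parse_batch(RAW)
print(codes["ytc_Ugz5k4Hct7vr3EQuoKd4AaABAg"]["responsibility"])  # → ai_itself
```

Indexing by ID first makes the "inspect any coded comment" lookup a single dictionary access, and dropping unknown keys guards against the model emitting extra fields.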