Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I’m not worried about anything in the short run because I can’t even get this da… (ytr_Ugx_OrJeI…)
- AI art is well past that point of "sucking". Yes it has mistakes here and there,… (ytr_Ugyx_uKYh…)
- For ur information Ai Sophia said that we should be aware and risky because in t… (ytc_Ugz7l8L1c…)
- Probably AI as well too! Just kidding.. its going to get to a certain point, I m… (ytc_Ugy7hRmO9…)
- The issue is, last year I didn’t know what ai was. 6 months ago I don’t know how… (ytc_Ugy6ZM9ms…)
- To be clear, he never said AI can access the mind of the Buddha or the soul of P… (ytr_UgwwgH2fO…)
- As important as I think that' I should remain skeptical of the genuineness of ev… (ytc_UgzEFORFD…)
- James Cameron has warned about AI when he was working on The Terminator in 1984… (ytc_UgzvWzt-O…)
Comment
I am not that intelligent but I don't like android/robot to surpass me.
I have questions to debate by the developer and concern normal human.
1. Is the invention of u, make me more impressive than stupid
2. Is the invention of you make a better future being human.
3. Having a cloud mind help the constellation of everything in this world understandable.
4. If a ai run for leadership, what is the reforms.
5. What are the plans of ai to natural resources good and bad wayz.
youtube · AI Moral Status · 2022-09-26T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugw-VOZVCX0Yh3Bb80t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw4bAilFjXQfu-vHXt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwjB2EPrUz360VfcBJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyFKh-SIfI0AtZbdwN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzIuZ8Wm7iP63kIHR14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxLetUk4f_lqzR-SKx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzFfLC07hWdcLxd8Zl4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxr4mIyZvS3cToLdo54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwf5fXE5j8tY7TwShV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxrs0x8Cb3i9LZ11OB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
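The raw response above is a JSON array with one coding record per comment. A minimal sketch of how such a response could be parsed and looked up by comment ID (assuming the response text is available as a string; the two records shown are taken from the array above):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment
# (two records from the response above, for illustration).
raw_response = '''[
  {"id":"ytc_Ugw-VOZVCX0Yh3Bb80t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzIuZ8Wm7iP63kIHR14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

# Parse the array into a list of dicts.
records = json.loads(raw_response)

# Index the records by comment ID so any coded comment can be
# retrieved directly, mirroring the lookup-by-ID view above.
codes_by_id = {rec["id"]: rec for rec in records}

code = codes_by_id["ytc_UgzIuZ8Wm7iP63kIHR14AaABAg"]
print(code["responsibility"], code["policy"])  # prints: developer ban
```

Because the model may occasionally return malformed JSON, a production version would wrap `json.loads` in a `try`/`except json.JSONDecodeError` and flag the batch for re-coding rather than assume the parse succeeds.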