Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "We appreciate your perspective. It's true that humans are not programmed like ro…" (ytr_Ugwpb3at6…)
- "A wrongful conviction as the result of artificial intelligence is no different f…" (rdc_h5415fw)
- "I hate arguments like this because as someone who really wants to make art but i…" (ytc_UgwnF7LSy…)
- "AI is now an overused term with little distinguishing meaning. ChatGPT is a con…" (rdc_kozt7bm)
- "draw. DRAW. if you make mistakes you learn, if you take from ai, you learn nothi…" (ytc_Ugy87ZP_z…)
- "I never really considered AI art art. Just trash actually. I don’t upload my art…" (ytc_UgyPQILhj…)
- "@GrumpDog I tested it two days ago; my comments are accurate. The history of sci…" (ytr_UgyLz_SYD…)
- "WOW I was already seriously disguted by all of this AI thing but WOW these comme…" (ytc_UgyMbI-Bb…)
Comment
Not gonna opine on the veracity of this video, but for clarity: in the AI world, "Singularity" refers to the development of AGI (Artificial General Intelligence) -- that is, AI that's as smart or smarter than humans. So think "singularity" as in THE singularity, like crossing the event horizon of a black hole. Not "singularity" as it was interpreted in the video. So the answer "singularity" would mean "AGI is developed at that point."
Also, AI isn't yet "more intelligent" than humans, it has MORE KNOWLEDGE -- which isn't the same thing. Intelligence is problem-solving; knowledge is memorization of facts and data.
youtube
AI Moral Status
2025-08-29T10:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxbqG_wVLMe-Oth-ld4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx4OJXjPz2JJjJNJA14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwjlIBfYkiqdS7wD054AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzkAebOPo3Fxm0WTfF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxdhUb9oEiEOpu6T8d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwarTAoSYIF6FxXjvN4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxO8ppundAnhti_XzR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzDZqiIHBP4jErlpMB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyhjkAVLzv-EbqWVy94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwk-VADtvKOy0Rxg3Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"industry_self","emotion":"approval"}
]
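The raw response above is a JSON array of per-comment codings, so the dimension table for any one comment can be recovered by parsing the array and filtering on `id`. A minimal sketch of that lookup, assuming the response parses as valid JSON (the helper name `lookup_coding` is illustrative, and the payload below is an excerpt of the array above, not the full batch):

```python
import json

# Excerpt of the model's raw response: a JSON array of per-comment codings.
raw_response = """
[
  {"id": "ytc_UgxbqG_wVLMe-Oth-ld4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzkAebOPo3Fxm0WTfF4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

def lookup_coding(response_text: str, comment_id: str):
    """Parse the JSON array and return the coding dict for one comment ID,
    or None if the ID is not present in the batch."""
    codings = json.loads(response_text)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgxbqG_wVLMe-Oth-ld4AaABAg")
print(coding["responsibility"], coding["reasoning"],
      coding["policy"], coding["emotion"])
# → none unclear unclear indifference
```

Those four fields are what the "Coding Result" table renders; a real pipeline would also want to handle responses that fail `json.loads`, since the model is not guaranteed to emit well-formed JSON.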