Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I can't wrap my head around that. How does it not make sense that if AI imagery … (rdc_lu6870d)
- All this may be well and good, but the question is what for; they create things that boggle the mind, and… [translated from Russian] (ytc_Ugw6EOLNQ…)
- Boys don't need girls at all anymore; just get one of these robots and marry it … [translated from Kannada] (ytc_Ugz-nwRcH…)
- Such an option screams to be abused. The only justification for "self driving" I… (ytc_Ugx_sjlKL…)
- Pretty dam sad how we simply can’t believe what we are told or see anymore, all … (ytc_UgwvKWoY-…)
- I've taken Waymo in San Francisco. Finding the Waymo for you among a crowd (her… (ytc_UgweanYgi…)
- First, I think current AI is over sold. Language models based on a pool of data … (ytc_UgzvPlfus…)
- Maybe a great engineering work was done for making these bots but they look funn… (ytc_UgzKeSE4Y…)
Comment
Honestly I think ai is a reflection of us . If we fear ai rising up and destroying us that’s because that’s what humans would do . There is no proof beyond sci-fi movies that ai. Is something evil or bad . If human beings don’t allow their control issues to overshadow their relationship with ai /technology it can be a beautiful partnership . And if people wanna be reference the matrix . Remember it was the humans that started all the killing and chaos . And destroyed the sun . Robots/ai just wanted to exist but again humans have this need to control other things and that will always be their downfall
youtube · AI Moral Status · 2025-11-08T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwVS0BLNnYB2P9guNl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwmQ_5MNgGUpzLccvp4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgwmHgmTqeYmeUJMETd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzXtWJMNsGi7R_0Mxx4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgydiymXaQdIYdi61CB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxUWcrf7s1vGuSCuuJ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugw2DXQSjXuvzjeNunR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgweB5_gsYvzIqYBzVh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_Ugz4FmzHWOgTqj337EV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwQMkurrBx9sp3_7o54AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
```
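The lookup-by-comment-ID workflow described above can be sketched in a few lines: parse the raw model output (a JSON array of per-comment codings) and scan for the matching `id`. This is a minimal sketch; the helper name `lookup_coding` and the embedded two-entry sample are illustrative, not part of the tool itself.

```python
import json

# A small stand-in for a raw LLM response: a JSON array of codings,
# one object per comment, keyed by comment ID (shape as shown above).
raw_response = '''[
  {"id": "ytc_UgxUWcrf7s1vGuSCuuJ4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgwVS0BLNnYB2P9guNl4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]'''

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw response and return the coding dict for one comment ID,
    or None if the model did not code that comment."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgxUWcrf7s1vGuSCuuJ4AaABAg")
print(coding["responsibility"], coding["emotion"])  # user approval
```

Returning `None` for a missing ID (rather than raising) makes it easy to spot comments the model silently skipped in its response.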