Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Why didn't the son talk to his parents? Why didn't he feel like he could talk to…" (ytc_Ugx9r3zfU…)
- "i've said this once i'll say it again, deep fake will be the downfall of our soc…" (ytc_UgzhIHsHJ…)
- "Almost all of these discussions never mentioned physically skilled labour/trades…" (ytc_UgztnfK-X…)
- "The major truck crashes in Colorado are from poorly train foreigners. Are these …" (ytc_Ugw9Jr-t9…)
- "This may just be my arrogance on the subject but I do not see the issue with the…" (rdc_exguk63)
- "Making an AI that avoids.hitting cars with the carts would be absurdly difficult…" (ytr_Ugy4SrfM5…)
- "If you're investing in A.I. in 2025 (or the Internet in 1998), you're funding th…" (ytc_UgyeYq5r3…)
- "People wearing make up and part CGI making them look robotic I can tell this was…" (ytc_UgzpNKEUB…)
Comment

> What exactly is certainty about safety? Why did require a 10-day suspension after much work. Is reaching certainty a process that cannot be automated?

| Field | Value |
|---|---|
| Source | reddit |
| Topic | AI Moral Status |
| Posted | 1773343816.0 (Unix epoch seconds) |
| Likes | ♥ 1 |
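The post time is stored as raw Unix epoch seconds. A minimal conversion with the standard library (assuming the value is UTC epoch seconds, as is conventional for this kind of export) renders it readable:

```python
from datetime import datetime, timezone

# Epoch-seconds value shown in the comment metadata above.
ts = 1773343816.0
posted = datetime.fromtimestamp(ts, tz=timezone.utc)
print(posted.isoformat())  # an ISO-8601 datetime in 2026
```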
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_o9yxhay","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"rdc_oa3f13j","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"rdc_obkdvvb","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"rdc_o9vyf9z","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"rdc_o9vrtfg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
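Each entry in the raw response codes one comment along the four dimensions shown in the result table. As a sketch of the "look up by comment ID" workflow (the field names come from the JSON above; the allowed value sets are an assumption inferred from the visible examples, not the full codebook), a parse-and-validate helper might look like:

```python
import json

# Allowed values per dimension, inferred from the visible samples.
# Assumption: the real codebook may contain additional categories.
SCHEMA = {
    "responsibility": {"company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed"},
}

RAW = '''[
  {"id":"rdc_o9yxhay","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"rdc_oa3f13j","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]'''

def lookup(raw: str, comment_id: str) -> dict:
    """Parse a raw LLM response and return the coding for one comment ID."""
    entries = {e["id"]: e for e in json.loads(raw)}
    entry = entries[comment_id]
    # Reject any dimension value outside the (assumed) codebook.
    for dim, allowed in SCHEMA.items():
        if entry[dim] not in allowed:
            raise ValueError(f"{comment_id}: unexpected {dim} value {entry[dim]!r}")
    return entry

print(lookup(RAW, "rdc_oa3f13j")["emotion"])  # mixed
```

Indexing the list into a dict keyed by ID makes repeated lookups cheap, and validating on read catches malformed model output before it reaches the results table.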