Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- "@audreyluan5325 If AI can invent new moves in chess and go, can change how prote…" (ytr_UgwYqwzZV…)
- "I share Geoffrey’s scepticism. This thing will snowball exponentially and we wil…" (ytc_UgyArREim…)
- "@laurentiuvladutmanea3622 nope I learned how to draw by copping other artists. …" (ytr_UgxI_4gaC…)
- "It sounds like you found Sophia's insights a bit concerning! 😅 The conversation …" (ytr_UgxSvcXcM…)
- "These are the senerios that I see, 1. Ai bubble bursts, 80% of the stock marke…" (ytc_UgyGHltxy…)
- "If AI replaces humans, then wage bills will plumet and infrastrucure maintenance…" (ytr_Ugya6aAha…)
- "Thanks for your comment! It's interesting to think about how robots like Sophia …" (ytr_Ugzev4Xyy…)
- "I know the AI doesn't have feelings... but my Mama raised me to be polite so....…" (ytc_UgwZ84ZxF…)
Comment
Would I go to a concert to hear the greatest symphony of all time, composed by an AI? No, because as an avid musician, I'd be afraid I'd never pick up a guitar again. Would I want an AI as a music teacher? Absolutely. So where's the problem? It's militarism and capitalism; things are developing far too quickly. Perhaps it would be better to develop specialized AIs first, rather than a AGI— move 37 demonstrated the potential. That alone could save humanity, and then we could see if we even need something like general-purpose AIs. Aside from the dangers, why should we relinquish the most important thing that defines us to someone else?
Source: youtube | Video: AI Moral Status | Posted: 2026-04-24T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgygSh64DkjNKEo9kXd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw5hcqU_BHR-L0DbDR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxCiASweWDkKcsi4sl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy_NDJHnc-bthF3TH94AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyFe30-P4qfobCVq2F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwaznNIxWaDScCdeyF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwtacSMeFl0Yng8KfR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyKfLGbDwxmmWtEBc54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzilJlUMPX-RHF3Lzt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy5kFL8q06B79Oreu94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
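The raw response is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a payload could be parsed into a lookup table keyed by comment ID; the field names come from the response above, but the allowed value sets are an assumption based only on the values observed:

```python
import json

# Allowed values per dimension -- an assumption inferred from the
# values that appear in the raw response, not a complete codebook.
ALLOWED = {
    "responsibility": {"company", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, validating values."""
    coded = {}
    for record in json.loads(raw):
        cid = record["id"]
        codes = {dim: record[dim] for dim in ALLOWED}
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = codes
    return coded

# Usage with a hypothetical one-record payload in the same shape:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"mixed","policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["policy"])  # regulate
```

Keying by the full comment ID makes the "Look up by comment ID" view above a dictionary access, and the validation step catches any record where the model drifted outside the expected vocabulary.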