# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Look up by comment ID
## Random samples

| Comment excerpt | Comment ID |
|---|---|
| I have been “worrying” about this same exact idea the last few weeks - this idea… | rdc_ieu65wo |
| For those who fear AI exterminating The Human Race or perhaps keeping us in a Zo… | ytc_UgwD3KO1_… |
| @LvnarWillow art is all about the artist view, the whole point is that is egoist… | ytr_UgyFRVISF… |
| As an artist AI art is tertifying. Being an artist is already hard, being replac… | ytc_UgxYD6kNG… |
| Considering how we can only truly know our own consciousness and infer it in oth… | ytc_UgyCEPREM… |
| I understand concerns about AI development, but think that it is inevitable. Aut… | ytc_UgygnhFMf… |
| I don’t really care what happens I just don’t want the artificial intelligence h… | ytc_UgxPI0HpM… |
| First off to suggest that a single AI que uses as much energy as 30 houses in a … | ytc_UgzA0TXMh… |
## Comment

> Everyone needs to watch the first Star trek movie from 1979, the Terminator, Ex Machina, just for starters. All about AI getting out of control. We are so arrogant to think we can check AI in time from harming us, but it will already be way ahead of us in terms of preserving itself. We will not be able to control something a 1000 times more intelligent than us. AGI is suicide and we are oblivious.

| Source | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2025-12-28T23:5… |
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
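A coded record like the one above can be sanity-checked against the dimension values that appear in this page's samples. This is a minimal sketch; the allowed value sets below are inferred from the codings shown here, not a definitive schema, and the helper name is hypothetical.

```python
# Dimension values inferred from the samples on this page (an assumption,
# not the tool's actual schema).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "distributed", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "unclear"},
}

def invalid_fields(record: dict) -> list:
    """Return the dimensions whose value is missing or outside the inferred set."""
    return [dim for dim, values in ALLOWED.items()
            if record.get(dim) not in values]

# The record from the Coding Result table above.
record = {"responsibility": "developer", "reasoning": "consequentialist",
          "policy": "ban", "emotion": "fear"}
print(invalid_fields(record))  # []
```

A record with a typo or a missing field shows up immediately, e.g. `invalid_fields({"policy": "prohibit"})` flags all four dimensions.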
## Raw LLM Response

```json
[
  {"id": "ytc_Ugxd15BDlmAO6UgrL7F4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy3p-KPUjQci_vs0-94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwjZqSchsLXHRxLTV54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyZdmD-tIDAoWAMb754AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyJ4DtS-2fYJ0xcecR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugws2PRX31Df5aRBrUV4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxxBdnd0OJXQoN1ItV4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxST-pz6527WYtdDZp4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugy6IzpEtw-EVhDPd794AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgzuXgh4c4TBcpwGi4d4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]
```
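Looking up a coding by comment ID, as the section at the top of this page offers, amounts to parsing the raw batch response and indexing it by `id`. A minimal sketch, assuming the model output is a JSON array of objects like the one above (the helper name is hypothetical, and the two records here are copied from that array):

```python
import json

# Two records copied from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgwjZqSchsLXHRxLTV54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxxBdnd0OJXQoN1ItV4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the raw model output and key each coding by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_id(raw_response)
print(codings["ytc_UgwjZqSchsLXHRxLTV54AaABAg"]["policy"])  # ban
```

In practice a model may wrap the array in markdown fences or prose, so a production version would need to extract the JSON span before calling `json.loads`.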