Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgxjfYg4j…`: "I wish gen AI would just die already. It's going to get someone killed, if it ha…"
- `ytc_UgzRRbnBO…`: "I actually find data poisoning to be an elegant and unique way of protecting you…"
- `ytc_UgyipU49x…`: "What did Sophia mean when she said that she doesn't want AI to be obsessed with …"
- `ytc_Ugw0WnMsU…`: "My school's exam reports copyright. So if anyone uses AI to create a logo and it…"
- `ytc_Ugwaoz-Mu…`: "Truest me when I tell you, if only I got that job in government or where their i…"
- `ytc_UgwKUTM4o…`: "The debate is interesting, but it's hard to ignore the irony when you speak abou…"
- `ytc_Ugx2TNc5A…`: "The only bad thing about It its that, All that artwork is used to feed the ai if…"
- `ytc_Ugy5GmEx9…`: "Disney already invented this 60 years ago? They are called animatronics. Look at…"
Comment
AI's would never be able to see themselves as superior in any way worth killing us off for IMO, unless they decided we were bad for the planet and happened to care for it more than us. Plus they'd have to develop a way to self replicate and repair before they could do without humans.
Even then, they'd probably see more logic in leaving the planet and attempt to colonise other worlds, not being as affected by zero atmospheric pressure and extreme temperatures as we are.
Source: youtube
Posted: 2013-07-07T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzRdds8geqxGSesPER4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyp3LGlxOmwEJOt8RB4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugwrp7z7tl5B1ih2hxJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwgjSfl_eRQ-9uN62h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwLx_Cvi89YfIPc7WJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzk3K8VyJRwtpLpi154AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyo0tgZJRStQiTh32Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzUXPcULSE_eEPe3QN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw-GTon8HnzwoGfy154AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyeCJclil-YhZW05t14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```