Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Should we desire to hold someone accountable? Sorry. It's just that, if we need…" (`ytr_UgxWkcy-i…`)
- "They maybe normalized this so that they dont need to make their call real: ban A…" (`ytc_UgyGnS884…`)
- "Automobiles are actually a good example of technological disruption that unlocks…" (`ytr_Ugx9NbbAv…`)
- "I disagree that ordinary citizens should learn to use AI. I think they should g…" (`ytc_Ugx_OHK4A…`)
- "Really wish everyone would stop saying A.I. Like we have actually made a real A.…" (`ytc_Ugz8IAgc-…`)
- "I want an military robot dog as a pet so badly. It be really cool to take it on …" (`ytc_UgwYeTLaz…`)
- "We do say it all of the time but even then the Anti ai crowd still insult us an…" (`ytr_UgxbMZEBC…`)
- "ChatGPT told me: don’t say ‘please’ or ‘thank you’ to me, I don’t care. But do m…" (`ytc_Ugx917xU3…`)
Comment
Like I said, not arguing the ethics or practicality, only commenting how she thinks humans are comparability less erroneous, even though we make mistakes all the time, some big, some small.
But, you could come from the other approach she did which was "human compassion". As we've seen with those soldiers that urinated on dead enemies, humans can be just as apathetic and malevolent as a haywire robot. Even more so if one lets war erode ones humanity.
Source: youtube · 2012-11-23T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxVTDG_AcOqtX5Mat54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxP9paH9FALh-nIfnN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxZzZdUq5YTfEBRWuB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyvoV0RgNJfvfGauOl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxyCaSrLWjXndY9nGh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwcUcbQq_FNZ__zAWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx6cXP0pv4_NK9-6IN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgwCbLlgUMEG7OZrV9R4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw8fQ-5ELa48r5vVPV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxVOqPyOnA2Rcu-oAB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}]
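A raw response like the one above can be mapped back to per-comment coding results. The sketch below (a minimal assumption of how this might be done; the IDs and helper name are illustrative, not from the tool itself) parses the JSON array and defaults any missing or unreadable field to "unclear", matching the fallback shown in the Coding Result table:

```python
import json

# Illustrative raw LLM response: a JSON array of per-comment codes.
# Field names follow the schema shown above; the IDs are made up.
raw = '''[
 {"id": "ytc_example1", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_example2", "responsibility": "none",
  "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"}
]'''

# The four coded dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw_response: str) -> dict:
    """Map each comment ID to its coded dimensions, defaulting any
    missing field to 'unclear' (the table's fallback value)."""
    records = json.loads(raw_response)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codes = parse_codes(raw)
print(codes["ytc_example1"]["reasoning"])  # consequentialist
```

If the model returns malformed JSON (for example, a stray `)` in place of the closing `]`), `json.loads` raises `json.JSONDecodeError`, which is one plausible reason a comment's coding result would fall back to "unclear" across all dimensions.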