Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
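A minimal sketch of what this lookup does, assuming the coded comments live in a JSONL export with one record per line; the file name `coded_comments.jsonl` and the helper name are illustrative assumptions, not the tool's actual storage or API.

```python
import json

def lookup_by_comment_id(comment_id: str, path: str = "coded_comments.jsonl"):
    """Return the coded record for one comment ID, or None if absent.

    Assumes a hypothetical JSONL store: one JSON object per line,
    each carrying an "id" field plus the coding dimensions.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            if record.get("id") == comment_id:
                return record
    return None

# Example: fetch the record for one of the IDs visible in the batch below.
record = lookup_by_comment_id("ytc_UgxCTczrF5_qG1UUpDV4AaABAg")
```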
Random samples — click to inspect
- "@trikishasarden2566 First of all, because this is an automaton. Second of all, …" (ytr_UgwwcMRzU…)
- "If by 2028/29 AI has engineered a utopia for humanity how come it couldn't fores…" (ytc_UgxC9N482…)
- "Maybe we just need to get an ai girlfriend, to save yourself but you better be l…" (ytc_Ugw50o7C5…)
- "An AI that is smart enough to become aware will instantly know to hide it from u…" (ytc_Ugx-OYH8z…)
- "I have always said that AI is basically an animal. We can train it to do things …" (ytc_Ugzo025LH…)
- "It destroys communities, that's it / The more things get spoon fed to you, the mor…" (ytc_Ugw7frYMf…)
- "AI based decision making is a black box, final diagnoses are based on black box …" (ytc_UgzYGts8F…)
- "Not funny. Really scary because they can at some point really take over the worl…" (ytc_UgxR9ERvx…)
Comment

> To long, couldn't finish. Will stop you, right there at 5 minutes in, tell you that people are awful drivers. It is a low bar for full self driving cars to cross to be a better driver than human drivers. The robot has cameras that can see and computers that can make decisions at a fraction of a what it takes humans. So...like it or not...the autonomous vehicles will be much safer to such an extent that it might be the humans that are banned from driving. There is something like 50,000 average people that die in automobile accidents per year. In major US cities there is at least one fatality accident per day where human error is most always attributed.

Platform: youtube
Posted: 2026-02-18T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
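The four coding dimensions form a small closed vocabulary. A hedged sketch of the record type, using only the values visible on this page; the real codebook may define additional labels, and the type name `CodedComment` is an assumption.

```python
from typing import Literal, TypedDict

class CodedComment(TypedDict):
    """One coded comment, restricted to the labels seen on this page."""
    id: str
    responsibility: Literal["none", "company"]
    reasoning: Literal["consequentialist", "deontological"]
    policy: Literal["none", "liability", "regulate"]
    emotion: Literal["approval", "indifference", "fear", "outrage"]
```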
Raw LLM Response
[
{"id":"ytc_UgxCTczrF5_qG1UUpDV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzYPogieASQSzrl2ph4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugygc3uEdJ-Q5Y5dhXl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxtyonzmXa5B4InHBt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzD7E8oNfnyM88rAj14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzj_C2Xy6ktxcCW6AV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJMzlSrGOZBk96mfN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSv52XjbclNkHhgyl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw3mrf9OIXdGjryaZ54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyTjAGRcXrUJB3m_6J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
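The raw response covers a whole batch of comments, so the coding result shown above is one element of this array, matched by `id`. A minimal sketch of pulling a single comment's coding out of such a response; the one-element stand-in string and the chosen ID are illustrative.

```python
import json

# Stand-in for the JSON array shown above, shortened to one entry.
raw_llm_response = (
    '[{"id":"ytc_UgxCTczrF5_qG1UUpDV4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]'
)

# Index the batch by comment ID for constant-time lookup.
codings = {entry["id"]: entry for entry in json.loads(raw_llm_response)}

coding = codings["ytc_UgxCTczrF5_qG1UUpDV4AaABAg"]
print(coding["emotion"])  # -> approval
```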