Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)

- "I have a very similar background as you do (even going back into IT as a teen in…" (rdc_jiguidi)
- "I'm more interested in what this says about the artificial nature of human/human…" (ytc_UgyDB0mwb…)
- "Hard disagree on the programming and software engineers. Even a 100% perfect AI …" (ytc_UgwAKjcK6…)
- "THE GREAT DEPRESSION: 'I saw a record high of 24.9% unemployment in 1933' ARTIF…" (ytc_UgzBWlb84…)
- "Don't forget: Artificial intelligence is Fake intelligence. It simply does not …" (ytc_UgzLCZtHT…)
- "Sorry I am late Tucker. I completely agree with Elon on this point. What is trul…" (ytc_UgxOUju_v…)
- "@sethtenrec The fact that they let this buggy software out into the wild, reckle…" (ytr_UgyECy1Jr…)
- "If you hate driving, get a self-driving car. Or, _or_, take a train. Hear me o…" (ytc_UgzBL-P0x…)
Comment
He is calling Musk as one of the dangers while Elon startet open AI, open source, so it shouldn't be used the wrong way. Sam Altman changed the game by selling it to Microsoft (Gates is one of the most dangerous persons on the planet) Elon always asked for regulations and warned for the dangers. When this didn't happend and because of the selling of open AI he started XAI to make the most truth seeking and safe AI.
Elon is the only one I trust!
youtube · AI Governance · 2025-07-15T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwzBLxehcJ4diHeH5x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyVEtyiQ8yPZDJV0MZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz9Vhi9yRhxrkk3QDZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx3T-Gpy3Z8T7WchMB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxXtdURylbpgKxG0R54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyrmBlUrFQo7biZ4C14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwGBJ93YcCyhSWzF0F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"disapproval"},
{"id":"ytc_UgxX3croblVTURMmZSJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyxLu6uRPV-DMxTBsh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz5FVFN436sK7VrwoF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]
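A raw response like the one above can be parsed and sanity-checked before the codings are stored. The sketch below is a minimal example, assuming the allowed values per dimension are those visible in the responses on this page (the full codebook may define more categories), and that the model returns a JSON array of objects each carrying an `id` plus the four dimensions:

```python
import json

# Allowed values per dimension, inferred from the sample responses above;
# this is an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "user", "company", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"unclear", "none", "regulate", "ban"},
    "emotion": {"indifference", "fear", "approval", "outrage", "disapproval", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response into coding records, rejecting
    malformed JSON, missing ids, or out-of-vocabulary values."""
    records = json.loads(raw)  # raises json.JSONDecodeError on malformed output
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of codings")
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"coding record missing comment id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records
```

Validating against a fixed vocabulary catches the common failure mode where the model invents a new label (e.g. a misspelled or merged category), so bad batches can be re-queued instead of silently polluting the coded dataset.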