Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record by its comment ID.
Random samples
- ytc_UgzJn_wxK…: "the only ai i use as an artist is artbreeder ,specifically to get a more realist…"
- ytc_UgzywdJSv…: "I do both digital art and AI art. My pfp is one of my drawings.…"
- ytc_UgxHbp6GX…: "Genuniely worrying that Sam thinks this would be a good solution to AI taking o…"
- ytc_UgwTM7EwX…: "None of us see or know the "AI" that its developers are seing. Unfiltered, unbia…"
- ytc_UgwHQerLV…: "How do you pay for it... This is the lie told by people in economics. If product…"
- ytc_Ugwbcu-t8…: "me and my brother debated about this a while back, and the big point he brought …"
- ytc_UgzfsTbcr…: "When the computer prompts the scientist and not the other way around then we can…"
- ytc_UgxuGJItV…: "If people talked like that during nazism and fascism, more people would have die…"
Comment
It takes less than half a brain to drive a car. We have been hearing about driverless cars for a decade now and they are still no where near ready to rolled out under any road conditions.
Something as basic as driving can’t be learnt by a set of code you really expect AGI or a chat bot to replace knowledge workers or end the world? It’s total gaslighting.
It’s a way to get more funding, it’s a way to justify the ridiculous spend.
Even the latest chat bot can’t answer some simple questions properly. AGI is a very very far!
Source: youtube · 2026-02-15T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
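The four coding dimensions are categorical. A small validator can flag out-of-vocabulary values before a coded record is stored — a minimal sketch, assuming the allowed labels are exactly the ones that appear in the responses on this page (the label sets may well be incomplete):

```python
# Allowed labels inferred from the responses shown on this page;
# the real coding scheme may include values not seen here.
SCHEMA = {
    "responsibility": {"company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def validate(record):
    """Return a list of (dimension, bad_value) pairs; empty means valid."""
    return [(dim, record.get(dim)) for dim in SCHEMA
            if record.get(dim) not in SCHEMA[dim]]

# The coded record from the table above passes cleanly.
print(validate({"responsibility": "company", "reasoning": "consequentialist",
                "policy": "none", "emotion": "outrage"}))  # → []
```

A value outside the inferred vocabulary (or a missing dimension) comes back as a `(dimension, value)` pair, which makes batch QA over many coded comments straightforward.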
Raw LLM Response
[
{"id":"ytc_UgwyRvPEw2SbPojttx94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzyh_4iEd7TydqUL6l4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwuHc8LJqCDEDeDNK54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwhJF-XhR6ojpwHTL14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxelKkSpX1xw5Cdlmt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy_WBWTRJeLl3KxLP54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgAKJtZJwYHVjKVTl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx5WLouwDAiAj8789F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzIl4_KmSn6WdGV5A14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyJG3RwvyuvHytdxNF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
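A raw response like the one above is a JSON array of coded records keyed by comment ID, so the "look up by comment ID" step can be sketched in a few lines of Python (the field names follow the JSON shown here; the `lookup` helper is illustrative, not part of the tool):

```python
import json

# A raw LLM response in the format shown above: a JSON array of
# coded records, one per comment, each carrying its comment ID.
raw_response = """
[
 {"id": "ytc_UgwyRvPEw2SbPojttx94AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
 {"id": "ytc_UgwuHc8LJqCDEDeDNK54AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

def lookup(records, comment_id):
    """Return the coded record for comment_id, or None if it is absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw_response)
rec = lookup(records, "ytc_UgwuHc8LJqCDEDeDNK54AaABAg")
print(rec["responsibility"], rec["emotion"])  # → company outrage
```

Because the model returns one record per input comment, a missing ID (e.g. a comment the model silently dropped from the batch) surfaces as `None` rather than an exception, which is easy to check for during ingestion.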