Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- “Every time we remember something, our brains actually recreate that memory from …” (ytc_Ugy5qjYA1…)
- “· IN US and Europe by US Wi-Fi – Wireless networking (2.4 / 5 / 6 GHz).…” (ytc_UgzdrD9UG…)
- “I think AI bots will become more like the droids from Star Wars. And if a bunch …” (ytc_UgyWqTn4j…)
- “We need to put the brakes on AI development NOW! It’s already getting out of con…” (ytc_UgyNvQFFg…)
- “@jaywhoisit4863 ... Exactly... without electricity computers are nothing more th…” (ytr_UgyE5B-3t…)
- “AI is inevitable. The key is to adapt society, the economy, and laws to integrat…” (ytc_UgwK5kZK-…)
- “Trump and the Republican's Big Beautiful Bill has provisions that make it so AI …” (ytc_Ugyh_2wHX…)
- “I find it fascinating how many people are ready to claim absolute certainty abou…” (ytc_UgzC5pNSc…)
Comment
There is a lot of holes in these arguments. Two points: on a relative scale, most humans are doing interpolation too and AI have a much bigger pool. If you work enough, you know humans make mistakes, they don’t learn from their problems, they push back feedbacks and etc. the effectiveness has to be measure comparatively. Second is more of scale, there r tasks that if u need to do reading comprehension, you need to hire lots of ppl, train them, walk their work and etc. much more efficient with AI and even with hallucination, it could be a more cost effective method. It is not like human are way better. Humans sometimes are worse
Source: youtube · Posted: 2026-01-24T17:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzlEX1w3yvGnqlbSit4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzsYn-baN-vNCzxjnt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyBfZJT-UmJKWBx9LB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy_v7KHAhkY6dTBN3Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwnZy7whAUA_tuatlF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_U2D2QMa3l3cN1Jt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzObe7Q9Tlpnzl9rm54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzQ-p3ZCUWJ_4BcOoZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyMHRovwGoEmGmiYCJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxISdJQfke3IZvow2R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
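The raw response above is a JSON array with one object per comment ID, each carrying the four coding dimensions from the table. A minimal sketch of parsing and validating such a payload before storing it (the allowed values below are inferred only from the labels visible in this dump, not from a published codebook, and the sample ID is hypothetical):

```python
import json

# Allowed values per dimension, inferred from labels observed in this dump.
# The actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"outrage", "indifference", "mixed", "fear", "approval"},
}

def parse_codings(raw: str) -> dict[str, dict[str, str]]:
    """Parse the LLM's JSON array and index the codings by comment ID.

    Raises ValueError on malformed entries so that bad model output is
    caught before it reaches the coding database.
    """
    codings = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"entry missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={row.get(dim)!r}")
        codings[cid] = {dim: row[dim] for dim in ALLOWED}
    return codings

# Hypothetical single-entry payload in the same shape as the response above.
raw = ('[{"id":"ytc_example","responsibility":"none",'
      '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
print(parse_codings(raw)["ytc_example"]["emotion"])  # mixed
```

Strict validation like this is worth the few extra lines: when a model drifts and emits an unlisted label, failing loudly at ingest time is cheaper than discovering inconsistent codes during analysis.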