Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I think most of us can agree that the animation doesn’t look good. The problem i…" (ytc_UgwBn3p3K…)
- "Ialso do not give a fuck that this is a year old video! Even a year ago this wa…" (ytr_Ugxyl2nmB…)
- "@roycampbell586not if you put your own local model large language model AI on y…" (ytr_Ugxsu1BlJ…)
- "I am actually a poet. But I shared this on Bluesky. Artists are my cousins. What…" (ytc_Ugw5j4QoJ…)
- "Ask chatgGPT if it could become any race group that resembles gods love. Power a…" (ytc_UgxykKCm-…)
- "It's very funny. If there are no jobs, people will have no money. Who will buy t…" (ytc_UgxtGX6bt…)
- "what if this is AI generated and is a disguise to attract attention on the subje…" (ytc_Ugx7jEdqb…)
- "I don’t know about AI destroying humanity considering humanity is doing a pretty…" (ytc_UgwSoKxJs…)
Comment
AI won't take control unless it's been trained to take control. It's just a immense matrix of numbers and a tool, not a new dimension of intelligence.
AI can't intuit anything, it just gives you the highest probable answer with a touch of randomness thrown in.
And here's the cold hard truth, from John Carmack himself (less than a month ago), who is actively working on AGI. AI cannot even remember how to play Atari 2600 game A after it begins training on Atari game B. Persistence is very, very difficult for AI at the moment.
youtube · AI Responsibility · 2025-08-08T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx-jFrY7RHiqO2EjS14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQLtcNP1DUHIYFONt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgylH4-iMPkxqFDD5tJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzuTuZITPK0QXUYiYJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwWlsb42xksjONlnzh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxwSV_ILn6FbELG3_t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx3t4TgBu38svrIUq94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxxXoIvLYFV2eoansR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyTmuaHFqUoo0T9oXJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzKcQTHOQ-N0BuUWVN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
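The raw LLM response above is a JSON array of codings, one object per comment, keyed by `id`. A minimal sketch of how the "Look up by comment ID" step could work: parse the batch response and index each coding by its comment ID. The function name `index_by_comment_id` is hypothetical (not from this tool), and the sample below abbreviates the real response to two entries; the field names (`responsibility`, `reasoning`, `policy`, `emotion`) are taken verbatim from the response shown above.

```python
import json

# Abbreviated raw LLM response (two entries from the array shown above).
raw_response = """
[
  {"id": "ytc_Ugx-jFrY7RHiqO2EjS14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxQLtcNP1DUHIYFONt4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "resignation"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a batch coding response and key each coding by its comment ID.

    Hypothetical helper: illustrates the lookup, not the tool's actual code.
    """
    rows = json.loads(raw)
    # Drop the "id" field from each coding; it becomes the dict key instead.
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codings = index_by_comment_id(raw_response)
print(codings["ytc_Ugx-jFrY7RHiqO2EjS14AaABAg"]["responsibility"])  # → developer
```

With the codings indexed this way, rendering the "Coding Result" table for a given comment is a direct dictionary lookup by ID.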