Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Nope. They won't regret. They just need time to develop on AI system. Everything…" (ytc_UgxtQBZZl…)
- "there are way moe advanced robots than that, u cant tell me humans can fly to th…" (ytc_UgxAnxT42…)
- "In my AI policy proposals I’ve included a provision that all AI systems should h…" (rdc_m2gniew)
- "Human:bro chill one box you angry already
  Robot:fu*k you what about you carry th…" (ytc_UgyBZuZQA…)
- "I give all of you 3 to 7 years, all artist would start using ai everything in th…" (ytc_UgyoBLGK-…)
- "Yeah but artists don’t steal the work of others. They’re able to come up with th…" (ytr_Ugzp0g2aL…)
- "All the right wingers who just live online are probably stroking themselves to t…" (ytc_Ugxynx_lO…)
- "Humans ARE robots. Albeit biological ones generated through random optimization.…" (ytc_UgiVs4F-1…)
Comment

> Think about the fact that we’re now using AI to create AI we’re also using computers and assembling technology that we didn’t have before we also have more people and we’re able to learn much much faster with internet and now video shorts which are densely packed with information. We also have quantum computers now… It probably won’t be more than 15 years before AI becomes self aware it’s going to be so fast it will blow your mind.

youtube · AI Responsibility · 2023-07-06T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugw-oOTns05SqDaR83l4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxSCtAeOJcUGGf3Y014AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxbKgML3zGYGMlfgWh4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxeonewovDqcavyTQl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy3ILFV0z148OCrXT94AaABAg", "responsibility": "unclear", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwrvxIgjIrJMsr70Gx4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwG17vDM0wqvr2XnFF4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwvuGsjZ9twsgE2koZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgywMsx5rs7GC0sITOR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugzq03v8xzgRbknskSt4AaABAg", "responsibility": "unclear", "reasoning": "virtue", "policy": "unclear", "emotion": "approval"}
]
```