Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AI is the invention of man. It's programmable. Whatever we program it to do it will do. It like the law. If there was not any laws what humans would have done and they can do. Now AI are simulated human beings. If laws and restrictions are not put in all aspects of AI what will they do. As simple as that, there should ve modules in AI which restricts them from doing this or that as if they do those things they shall be auto destroyed. These are programmable things.
| Source | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2025-05-28T03:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxgKfOaHDdN_rcNqd94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwU6jUHAwTtkwX2Af54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzsMY8cOXXACkmDZ0p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyUBASy2QNqZQdPjTx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw1NCsO0rfG5cF3MUl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyeNcbn6-8d7eBt-YR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugx3Ni37noR36ZwlllZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugxi0LEFrF4YxnhSde14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxMPZJ7fEqviJQs5Sp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyYSmgK1SIqJBjVM_l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
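The raw response above is a JSON array with one record per coded comment, keyed by `id`, with one value per coding dimension. A minimal sketch of how such a batch could be parsed into a per-comment lookup and checked against the controlled vocabulary — the allowed value sets below are inferred only from the values visible on this page, and the real codebook may define more categories:

```python
import json

# Controlled vocabularies inferred from values observed in this tool (assumption:
# the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "resignation", "outrage", "unclear"},
}

def parse_llm_batch(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, rejecting out-of-vocabulary values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {rec.get(dim)!r} for {dim!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_UgzsMY8cOXXACkmDZ0p4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
batch = parse_llm_batch(raw)
print(batch["ytc_UgzsMY8cOXXACkmDZ0p4AaABAg"]["policy"])  # regulate
```

Validating against a closed vocabulary at parse time is what makes a "Coding Result" table like the one above trustworthy: a model that emits a novel label fails loudly instead of silently entering the dataset.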