Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

| Comment preview | ID |
|---|---|
| AI could replace all CEO's but never the technical working class. Most CEO typol… | ytc_UgzgJ8Wfl… |
| people really need to be careful with most of these 'AI' (glorified LLMs). if yo… | ytr_UgynNqzch… |
| Agree... the math doesn't add up.. she's not a good engineer, though she's a ver… | ytr_Ugwc2XUUv… |
| Look into stoicism. All suffering is a product of the mind. You can only focus o… | rdc_jfsqtbd |
| There is nothing wrong with AI art, people just want to have an issue with anyth… | ytc_UgycoaMSE… |
| A lot of issues with arguing for AI in creative fields always dumbs down to "Wel… | ytc_Ugwph_3TM… |
| Let’s talk about the big budget Hollywood subsidization. What portion of that … | ytc_UgxNbGwfu… |
| @Speaker-Beater You're just hating AI for the sake of it. so, why even bother ,… | ytr_UgyDJT-Mk… |
Comment
You still don't get it. You are assuming that AI/AGI will be an intelligence that is an equal to you — something that you can reason with, in the same fashion that you reason with other humans (I might point out, honestly/realistically, we can't even get fellow humans on the same page -- without pointing guns at each other). AI/AGI will become much more intelligent than us, will have it's own wants, needs, agenda, etc. You will have no more control over it, then you have over another human being... another super intelligent human... why would it share the world's resources with humanity? Are we sharing the worlds resources with any less intelligent species on this planet? NO... we don't care about any other species, yet we expect that this new super intelligent "species" to care about our needs/wants... especially when it won't need us in the long run.
youtube · AI Governance · 2025-10-15T13:1… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzVMycQ_q4C0IHmFSF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyS9hVoezf_CTiXDp94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyFJcVKYVYUlE8lRMJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwri8NiUTaG35DUDIB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz_HaMGkkONKFXgdfd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw4fegkhpEZ3ufwAPJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwwrc0koYUHVT_Zv414AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxmY-SpaVPD3MiBGQR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgySnjkGV4TD_4SA1RV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyE8PWRjmF_Gt9BXTV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
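Because the raw response is plain JSON, looking up a comment's coding by ID reduces to parsing and indexing the array. A minimal sketch in Python, assuming the allowed label sets are exactly those observed in this response (the real codebook may define more values; the function and variable names here are illustrative, not part of the tool):

```python
import json

# A few rows of the raw LLM response shown above, as the model returned them.
raw_response = """
[
 {"id":"ytc_UgzVMycQ_q4C0IHmFSF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgyFJcVKYVYUlE8lRMJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugz_HaMGkkONKFXgdfd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
"""

# Label sets observed in this particular response; an assumption,
# the full coding scheme may include more values per dimension.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"resignation", "fear", "outrage", "indifference"},
}

def index_codings(raw: str) -> dict:
    """Parse one raw LLM response and index rows by comment ID,
    silently dropping any row whose labels fall outside ALLOWED."""
    out = {}
    for row in json.loads(raw):
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            out[row["id"]] = row
    return out

codings = index_codings(raw_response)
print(codings["ytc_UgyFJcVKYVYUlE8lRMJ4AaABAg"]["emotion"])  # prints "fear"
```

Validating against a fixed label set before indexing is useful here because LLM coders occasionally emit out-of-vocabulary labels; dropping (or flagging) those rows keeps the downstream tallies clean.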