Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or inspect one of the random samples below.
- `ytc_Ugx2kpal7…`: "If by AI you mean Authentic Indian, or ALL INDIA. Then it makes sense and it nee…"
- `ytc_Ugzr50vDu…`: "Ngl it shouldn't be called ai art, it should be called ai images since it's not …"
- `ytc_UgwUGZeJ-…`: "Ai is just a narrative to lay off employees. They can't tell the shareholders th…"
- `ytc_Ugz_pcFzL…`: "how many companies out there are actually paying attention to what consumers wan…"
- `ytr_Ugzjs8Elz…`: "@JonAbrams-xt4tq \"Yes, but even so, the vehicle should break, slow down, swerve,…"
- `ytc_Ugxha6bWn…`: "why TF would i want to be inspired by a \"robot/ alogrithm/ bot\"??? why TF would …"
- `ytc_UgxoKy_Hi…`: "😂😂 you ningas are funny ,if a robot moves like that, we would be doomed 🤣…"
- `rdc_oi1ugt0`: "Tokenmax. Use more advanced models for your basic questions to run up your token…"
Comment
After AI is implemented in a widespread way, the problems will arise when something needs taken off of AI, and there is no one who knows how to disengage one part without causing disruptions in other parts. There will be undesired functionality that will need worked around because pulling the plug on any one part will collapse the system with no avenue to reinstate human control over the things AI was managing in the past, mainly because we were making a hash of it before AI made it all work marginally more profitably. If then AI is given the task of repairing the aftermath of trying to disengage it, Part of the fix will be for AI to insure future human interference is prevented without exception. Humanity will be owned by the machines at this point.
Source: youtube · Topic: AI Governance · Posted: 2024-08-03T18:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxgxRkmPs5TfqcY-Nl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzwY6eAYXKsmOtt9dB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx6CYmG91CY9MdWwal4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyvcNmJUlP-qJkD_HZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw0K-vlBafcAjy1AJZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwhdSvCH3_xY0GvgmV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxvFyqpBkNonCAgzI94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzc4mLueuE2eGcWZ5R4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxJ4epHcWK4_p6msjV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxKQ46PBFvuH3VEknp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]
```
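Because the raw response is a JSON array of records keyed by comment ID, looking up the coding for any one comment reduces to parsing the array and indexing it by `id`. A minimal sketch in Python, assuming responses follow the shape shown above (the `index_codings` helper is illustrative, not part of the tool, and the payload here is truncated to two records):

```python
import json

# Illustrative two-record payload in the same shape as the raw response above.
raw_response = '''
[
  {"id": "ytc_UgxvFyqpBkNonCAgzI94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxKQ46PBFvuH3VEknp4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"}
]
'''

def index_codings(response_text):
    """Parse a batch coding response and index the records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_codings(raw_response)
coding = codings["ytc_UgxvFyqpBkNonCAgzI94AaABAg"]
print(coding["emotion"])  # fear
```

A dict keyed by ID makes the per-comment lookup O(1), which matters when one batch response covers many coded comments.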