Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "With every company doing this, where exactly do these companies think their prof…" (`ytc_UgxqQxba6…`)
- "That’s a very good point. Do you think LLMs can really get to that level of inte…" (`ytr_UgwD17JLs…`)
- "If trucks stop killing people then I'm all for AI drivers. America does not have…" (`ytc_UgxIPQP3i…`)
- "Remember artists, not every artpiece needs to have a human hand and backstory to…" (`ytc_UgygcdLoC…`)
- "I would prolly go to jail, not because i literally broke the AI but for violent …" (`ytc_UgwRlJZ5H…`)
- "I've said it before and ill say it again. There is no such thing as an AI Artist…" (`ytc_UgyM8840y…`)
- "Without third party verification, there is no reason we should be trusting Faceb…" (`rdc_hj2ewze`)
- "I think the key insight you might be looking for is the orthogonality thesis. Ba…" (`ytr_Ugzi4YgNY…`)
Comment

> Is it really possible to stop any maniacs trying to do great evil and severe damage through the use any high tech (AI included) ? Until one day somebody could effectively and precisely control any maniacs and gangsters (whether human or otherwise) from committing evil, the likelihood is that all shall eventually be doomed, even if slowly and painfully. So the key is to shape and steer human thinking to lead towards good instead of evil, collectively, not keep producing potentially lethal tools while consistently ignoring what anybody would use them for.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2026-03-11T02:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwXa9a7d8-whjS4hGF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzG5DuRj9ommFuXxA14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwdcBHoHpgOTbXJZAh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxd3AsmvxSTtk976E54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw7usZPzgsnVlVenoh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy7V3HDJc2fieWcwpJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzTawOf9Y_hqnX6A3V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxlGoxIFx1XSqQldYx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxpQeYvVGQZmps25UN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxSOPGd-lAhPn0pThZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
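The coding result shown above is recovered by parsing this JSON array and looking up the record whose `id` matches the comment. A minimal sketch of that lookup step, assuming the model reliably returns a JSON array with these field names (the variable names are illustrative; the response here is truncated to two of the records above):

```python
import json

# Raw model output, exactly as returned (truncated to two records for brevity).
raw_response = """
[
  {"id": "ytc_UgwXa9a7d8-whjS4hGF4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwdcBHoHpgOTbXJZAh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

# Parse the array, then index it by comment ID for O(1) lookup.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up the coding for one comment ID.
coding = by_id["ytc_UgwdcBHoHpgOTbXJZAh4AaABAg"]
print(coding["reasoning"], coding["emotion"])  # consequentialist fear
```

Note that the `Coded at` timestamp in the result table is not part of the model output; it would be attached by the pipeline when the response is stored.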