Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record directly by comment ID.
Random samples — click to inspect

- "As long as AI will get rid of dictators like Trump, Putin and Kim, i am okay wit…" (ytc_UgxCPXgLS…)
- "Software development is provably a hard problem, and can thus not be automated. …" (ytc_UgzuaW6lX…)
- "Bro is an ai who believes that there is a magic blue blue though technically hum…" (ytc_UgxQuU8Jt…)
- "@JulianaLimeMoon people nowadays act like people did when the steam engine was …" (ytr_UgwTPe-gy…)
- "I also sucked at drawing and felt too demotivated to truly try to improve, but n…" (ytc_UgzTrk9Ih…)
- "Trouble with dealing with A-I is it would take total honesty and responsibility …" (ytc_UgxVSC8Bq…)
- "Robot with guns , am a afraid another decade human will no longer in control an…" (ytc_UgwpfqzSq…)
- "Question do you believe all "woke" people against the trajectory of AI , just do…" (ytr_UgzKjlDna…)
Comment
SuperAI waking up one morning and plotting genocide? Really, that’s the best we come up with (no offense, just a rhetorical jab)? The odds of that are slim i.m.o. The current academic debate spans far richer scenarios: AI as a bumbling bureaucrat drowning us in optimization errors, or as a cold power-seeker bulldozing us with the indifference we show ants. The most immediate risk, though, is humans weaponizing AI for surveillance, manipulation, and profit. In the end, the biggest threat in the equation isn’t the machine at all, it’s still the human factor. Nevertheless I was very entertained by this conversation, thumbs up keep going 👏
Platform: youtube · Topic: AI Governance · Posted: 2025-09-05T20:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzC3xDDiVTS1teWTpp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzZAovwcJ-o_Qr9-jd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_Mf3Ke7o7fjuYZad4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyE49vvBljl92hFfXF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw59ynf1zrOUBZPMDd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwjrM2TzgLPg6X_7al4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy7mbgBTGwdTKVC4qN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRH5BKtJIgc2-xPOJ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwohdaZyLjU-vm1a8Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzf6Ik9RvnJHsvc21Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
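
A raw LLM response like the one above is a JSON array of per-comment codings, which the tool presumably parses into lookup-by-ID records like the "Coding Result" table. Below is a minimal sketch of that step: parse the array, validate each record against the label sets actually seen in this output (the real codebook may allow more values; `SCHEMA` here is an assumption), and index the batch by comment ID.

```python
import json

# Allowed labels per dimension, inferred only from the values visible in
# this page's output -- the project's actual codebook may differ (assumption).
SCHEMA = {
    "responsibility": {"none", "distributed", "ai_itself", "government"},
    "reasoning": {"mixed", "unclear", "deontological", "consequentialist", "contractualist"},
    "policy": {"none", "liability"},
    "emotion": {"fear", "indifference", "mixed", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into a
    dict keyed by comment ID, skipping records with unknown labels."""
    records = {}
    for item in json.loads(raw):
        cid = item.get("id", "")
        valid = all(item.get(dim) in labels for dim, labels in SCHEMA.items())
        if cid and valid:
            records[cid] = item
    return records

# Usage: look up one coded comment by its ID.
raw = ('[{"id":"ytc_UgyE49vvBljl92hFfXF4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]')
batch = parse_batch(raw)
print(batch["ytc_UgyE49vvBljl92hFfXF4AaABAg"]["emotion"])  # mixed
```

Indexing by ID is what makes the "look up by comment ID" search above an O(1) operation; invalid records are dropped rather than repaired, so a malformed model response surfaces as a missing ID instead of a silently wrong coding.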