Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Narrow AI needs guardrails from the Tech0ligarchs...Global (super intelligent) A…" — ytc_Ugwtqa4hf…
- "I'm not against automation. Truck driving is dangerous and it takes a health tol…" — ytc_UgwHdMENc…
- "Ai \"art\" is like hiring somone to cut up a bunch of newspapers across decades to…" — ytc_Ugws6p2r_…
- "I'm still not sure if we all lose our jobs because of AI who will buy the produc…" — rdc_oi2f67x
- "Before an AGI super-intelligence could take over the world, there is a good chan…" — ytc_UgyOsbB9V…
- "Ngl I don't realize what is the problem, personally I don't see AI art in the sa…" — ytc_UgywXIr-F…
- "As a college student currently, AI is the most powerful tool in terms of helping…" — ytc_UgxPGMC3j…
- "What an incredibly eye-opening conversation on a subject which I admit I knew ve…" — ytc_Ugw_ukmBt…
Comment
I don’t like how Steven refers to the few greedy billionaires deciding the fate of the human race as “us”. The few people risking our extinction don’t represent humanity or the human race. Literally everyone is against unregulated ai besides those greedy fucks that have never cared about any life. The majority of us want to live in harmony the planet.
youtube · AI Governance · 2025-12-05T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyUxRpcW3m4Oa8MQOt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxNgHvetyJmP3wNpPp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyMDC8m8jDdWEVovjR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz4lg1qUiS4XAVXo-t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRG72du5S7mL9FC2B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyQVBtuL3R9eNXG3yt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyq8p95FKR0z5RJl5d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxnrwHG1jZaYPrmbth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzdTgHylS6wkMQUOsd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyK9fPBALTcFds3HAR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
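The "look up by comment ID" step above amounts to parsing the raw model output and indexing the coded rows by their `id` field. A minimal sketch in Python (the function name `index_by_comment_id` is hypothetical, and it assumes the response is a valid JSON array shaped like the one shown; the two sample rows are copied from the response above):

```python
import json

# Raw model output: a JSON array of per-comment codes (two rows from above).
raw_response = """
[
  {"id": "ytc_UgyUxRpcW3m4Oa8MQOt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxNgHvetyJmP3wNpPp4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgxNgHvetyJmP3wNpPp4AaABAg"]["emotion"])  # outrage
```

In practice the same dictionary would back both the ID lookup box and the random-sample inspector, so a click resolves to a single key access.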