Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
What I’m about to say will be controversial but it’s my opinion. I believe if yo…
ytc_UgxA2DsGF…
On AI's future and our safety.
Great interview mister Tucker! I have befriende…
ytc_Ugwy9zQLh…
50 years ago, they warned us that within 100 years, automation would result in o…
ytc_Ugx71p4t5…
The true danger of AI is that it will never be sentient, however powerful it may…
ytc_UgyVS60K4…
But then they're hoping we can daycare it and/or being able to best teach them. …
ytc_Ugwbm4il_…
This is SAd to see . In Philadelphia you see this everywhere in 1 certain part .…
ytr_UgzTUmND6…
Oh and this week I was HORRIFIED to learn 1) There are several A.I.s across USA …
ytc_UgyKvlG5m…
On good land 5 acres is doable. Doubt you'd get that in Siberia, I'm not expert …
rdc_d2xa10s
Comment
10:59 The intentionality of Social media is all wrong. That is what Joscha misses here. It is a platform where rewards are based on follows and likes. The amount of "clicks" boosts your signal while suppressing others. If you tune algorithms to train on platforms such as this, you are generating an intentionality in systems that inherently rewards coercion, manipulation, and race to the top dynamics... Its more of the same in terms of hierarchies absent of fair distributions of collective human thought. Many people simply opt-out of interacting with social media altogether. Is their view less since they choose not to participate?
youtube
AI Governance
2024-12-25T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwN7eC2eFCvuISxmJZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx9-IAqCcLRRmnNUNF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzEyaYYGz5Pg1uSGDx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxG_7Xk_CwHCj-jYhd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMchnkeWy7cWNpfit4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw2D7nZo-bbaribvnZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx78KeYMJxxnksiIxh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw215QmMaZlGBRjHGp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzUjwGHK4_ortttaPl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyWos-0Hxq1uDEcDQl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
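A raw response like the one above can be parsed and validated before it is stored against comment IDs. The sketch below is a minimal illustration, not the tool's actual implementation; the allowed value sets for each dimension are assumptions inferred from the values visible in this page's table and JSON.

```python
import json

# Allowed values per coding dimension (assumed from the examples shown here).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the assumed code book, so bad model output fails loudly.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage: look up one coded comment by its ID (hypothetical short ID).
raw = ('[{"id":"ytc_Example1","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
coded = parse_coding_response(raw)
print(coded["ytc_Example1"]["emotion"])  # → outrage
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each coded record is one dictionary hit rather than a scan of the full response.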