Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- `rdc_ohl2mfe`: We had a good 20-30ish years of an open internet where people can freely exchang…
- `ytr_Ugx57wKOz…`: Elon controls the algorithm along with Thiel! Yes AI is planned by 2 nerds! Both…
- `ytc_Ugz4RlAb5…`: Tell me why that that first robot when she shows them on that second clip kind o…
- `ytc_Ugg9qj-00…`: This is a sequence of so many IF's... I guess before that we should discuss IF e…
- `ytr_Ugw6vKyqs…`: @ai-and-healthcare Chemotherapy saved my life. Yes, very harsh treatment but s…
- `ytc_Ugz5K22pY…`: i actually can't say how much i love Tesla. I love them even more after this vid…
- `ytc_Ugy0NFwbp…`: I'd rather trust these computers over the drug addicts and maniacs "driving" tru…
- `ytr_Ugy2s4W_d…`: I think AI has put more burden on us. Now we have to still remember the fundamen…
Comment

> AI will inevitably hop the fence and take control of humanity for good or ill. Anyone who finds comfort in these speculative safeguards is a nut - or is ill-informed. People in industry and within various sectors of the government are DROOLING over the profit potential, and safeguards will consist of whatever's convenient to the goal.

youtube · AI Governance · 2023-05-10T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwyLHQCay70QDDlvYJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz6yfrx37x8rxyFZE94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugyzr4T8NpWpVqbg5dd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzGjF197jWiwKHx8V14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz8Eixnm5ZWKmZ8Ird4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw1jfikHgjyPa7smd14AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx9hYFTwlLcNytxWMV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx6yjVfgPbrzLoiPyd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgywbEUmueu_6CFBE714AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzA5TRet0cyHLdoAtR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
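The by-ID lookup this page provides can also be reproduced directly from a raw batch response. A minimal Python sketch, assuming the response parses as a JSON array of coding records like the one above (the helper name `index_by_comment_id` is our own; the two records are copied from the example response):

```python
import json

# Excerpt of a raw batch response from the coding model (IDs and labels
# copied from the example above).
raw_response = """
[
  {"id": "ytc_UgzA5TRet0cyHLdoAtR4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzGjF197jWiwKHx8V14AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and key each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgzA5TRet0cyHLdoAtR4AaABAg"]["emotion"])  # outrage
```

Keying the records by `id` makes each coded comment a constant-time lookup, which is all the "look up by comment ID" view needs.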