Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> I’m not really worried about losing my job. If 90% of humanity lose their job, governments will have to radically change how society works. Everybody always assumes a superintelligence will be malign, but we have no certainty of that. It could be benign. Why is everybody so certain that it will want to kill us all. Maybe it’ll enjoy taking care of us 🤷🏼♀️ And I could easily find another way than work to spend my time. I have no way to influence what happens with AI so I chose not to worry about it. But I also don’t plan my life more than 5 years in advance nowadays because everything is changing so quickly.

youtube · AI Governance · 2025-09-06T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxeJUYHlnMvIsNut4N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwovvLx0diOJ4oBzkV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgytcYUlWTYs4XSPylZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxIT86gumpvTmEfoBR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugx12VeK1B4RDJb63X94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxQr19l5uaiBQ09sMR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgysYwYMrYdO3AlI3El4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyxnOBafrGTBpG3ieV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugya9Zn8VR2EzUuICFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzW1MfKoOdl3vmwEBZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
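A response like the one above has to be parsed and checked before its codings can be trusted. The following is a minimal sketch of such a validation step, not the pipeline's actual implementation: the `ALLOWED` sets contain only the category values visible in this page's table and JSON, so the real codebook may define more, and the function name `validate_batch` is hypothetical.

```python
import json

# Category values observed in the coding table and raw response above;
# the full codebook may include additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"indifference", "mixed", "fear", "approval", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Each row must be an object with a comment ID plus one
        # recognized value for every coding dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Example: one valid row and one with an out-of-codebook value.
good = '[{"id":"ytc_x","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}]'
bad = '[{"id":"ytc_y","responsibility":"martians","reasoning":"unclear","policy":"none","emotion":"mixed"}]'
print(len(validate_batch(good)), len(validate_batch(bad)))  # 1 0
```

Dropping malformed rows rather than raising keeps one bad coding from discarding the rest of the batch; the dropped IDs could then be re-queued for recoding.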