Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugw6rMWnY…`: "Ai bros are truly the definition of someone who acts Like there ALL that. When i…"
- `ytc_Ugxikqk31…`: "This is not a new phenomenon... in fact we can see the direct results in many ot…"
- `ytc_UgwOSdY1l…`: "Let's introduce AI to the public they said.......... just opened another disgust…"
- `ytr_Ugyr1bC0_…`: "Maybe they’re striking while they still an upper hand on AI (a 3 year contract w…"
- `ytc_Ugw__MalJ…`: "LOL, AI will be highly politicized by both by informal government presessure and…"
- `ytr_Ugi75uTYK…`: "Nicholas Perez - check out this blog post that we published today that goes into…"
- `ytc_Ugwr3dFfF…`: "I'm not afraid of the AI we can create. I'm afraid of the AI our AI might cr…"
- `ytc_UgwcMhA2v…`: "It is true, I have been going through a lot of interactions with AI. Whenever, …"
Comment
Its truly ironic, that he and many others built AI and now they explain it can decide it wont want us around, and now after it is all built /Designed and already learning on its own - then he goes on a pod cast to express the likely harm AI will bring - how intelligent is he truly?
Obviously there has been bad human actors since beginning of time, why not originally design the intellect to not work for those "bad actors" - duh..
Also, reasoning while learning, was clearly known from the beginning where is the jail bars or sand box with only human allowance to let it out?
OMG
Source: youtube · Topic: AI Governance · Posted: 2025-06-16T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgybwkUlLjNpqGHwCDN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyK86HVCnfEp2YpsRB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwRFOZQ8KEQpf1QQAV4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwW7Yd339WGmxUzp5Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw9KbxUxbs29lCHsw14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWYEf4uCGRc2uDYGR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwLAGWBiGFois9PJLN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzvVe01CWiXNQ3S-ed4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgynndGtTRlHcJfrBp94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyWl78BZDKiSIboABl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
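The raw response above is a JSON array with one object per coded comment, keyed by `id` and carrying the same four dimensions shown in the Coding Result table. A minimal sketch of parsing such a response and looking up one comment's coding by ID (the field names come from the response above; the `lookup_coding` helper and the two-row sample payload are illustrative, not part of the tool):

```python
import json

# Illustrative raw LLM response: a JSON array of per-comment codings,
# shaped like the full ten-row response shown above.
raw_response = """
[
  {"id": "ytc_UgynndGtTRlHcJfrBp94AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyWl78BZDKiSIboABl4AaABAg",
   "responsibility": "user", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the raw model output and return the coding dict for one
    comment ID, or None if that ID is not in the batch."""
    codings = {row["id"]: row for row in json.loads(raw)}
    return codings.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgynndGtTRlHcJfrBp94AaABAg")
print(coding["policy"])  # -> regulate
```

Indexing the parsed array into a dict keyed by `id` makes repeated lookups O(1), which matters when a batch response covers many comments and each inspection click resolves a single ID.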