Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugz_l2So5…: "I don’t know about this after watching AI Will Smith movie and I grew up in Term…"
- ytr_UgixNOeeO…: "But what about sentient's that are nothing like humans? For example if someone w…"
- ytc_UgyxVcTCQ…: "I’ll never understand the argument of artists just ‘being born with the talent.’…"
- ytc_UgxdHnd_h…: "I'm curious, can we find original art, download the picture, Nightshade it, then…"
- ytc_UgziFJHRa…: "@2:30 making the phrasing of the prompt more confusing for AI will also making i…"
- ytc_UgyBUErsc…: "What ever good AI data centers and AI in general bring, on a whole, its a net n…"
- ytc_UgzIP5i-X…: "Elon Musk always says that AI will destroy everything, and maybe Elon ashnee…"
- ytc_Ugz8808dT…: "people overpreparing so much for the supposed inevitable ai sentience reckoning …"
Comment

> 5:20 Give me a break. Ford knew it was murdering people in the 1970s with their exploding gas tanks. What did they do? Made a business decision it was more profitable to pay off the victims families than do a recall, and that's what they did. Mr. "my instincts would be holy smokes, stop AI right now" only demonstrates he would never be a leader in business since he would immediately go out of business. Cold, hard, fact...but not surprising.

youtube · AI Governance · 2025-12-29T23:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
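Each dimension in the table above takes a value from a small closed vocabulary. A minimal validation sketch for a coded row, assuming the value sets inferred from the samples on this page (they may be incomplete, and `validate` is an illustrative helper, not part of the tool):

```python
# Value sets inferred from the codings shown on this page; assumed, possibly incomplete.
VOCAB = {
    "responsibility": {"none", "company", "user", "ai_itself", "government"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "regulate", "industry_self", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "mixed"},
}

def validate(row):
    """Return the dimension names whose values fall outside the assumed vocabulary."""
    return [dim for dim, allowed in VOCAB.items() if row.get(dim) not in allowed]

# The coding result from the table above passes; an unknown value is flagged.
row = {"responsibility": "company", "reasoning": "consequentialist",
       "policy": "liability", "emotion": "outrage"}
print(validate(row))  # []
```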
Raw LLM Response
```json
[
{"id":"ytc_UgyXm4efbBk73hGT7n54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy6PTy1RBWu90o-8vR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxadzK2r98YjzSO9B14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_UgwtD3oPgOSH9WGib254AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxs885NWZjbSqPMyX14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwaPyaoF_mQ5-PPjOV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzDSSTwn2XgksPDpOF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgywwyhsH-3kX_mcmQZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxAn23ILksdBGVvW454AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwY_m7GTxjFxrMoNld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
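The raw response is a JSON array of per-comment codings, so "look up by comment ID" amounts to parsing the array and indexing it by `id`. A minimal sketch using two rows from the response above (the helper name `index_by_id` is illustrative, not part of the tool):

```python
import json

# Two rows copied from the raw LLM response above.
RAW_RESPONSE = """[
{"id":"ytc_UgwaPyaoF_mQ5-PPjOV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzDSSTwn2XgksPDpOF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]"""

def index_by_id(raw):
    """Parse a raw batch response and map each comment ID to its coding."""
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in json.loads(raw)}

codings = index_by_id(RAW_RESPONSE)
print(codings["ytc_UgwaPyaoF_mQ5-PPjOV4AaABAg"]["policy"])  # liability
```

In production the parse step would also need to handle malformed model output (e.g. wrap `json.loads` in a `try`/`except json.JSONDecodeError`), since the response comes straight from the LLM.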