Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by its comment ID.
Random samples

- "They shouldn't be called "AI Artists". They should be called "AI Image Prompters…" (ytc_UgxsSBJ4l…)
- "Spends entire video about Waymo. Tests Tesla at the very end and is blown away …" (ytr_UgwHQvKqp…)
- "As a Buddhist, I'd always felt pretty comfortable with the fact that "Right Live…" (rdc_oi0ydj0)
- "The United States has been in its "empire era" since the end of WW2. We are in t…" (ytc_Ugyogbh2l…)
- "Thank God for videos like this that sound the warning bells, warnings that the a…" (ytc_Ugy1Thviv…)
- "Some older ai image generators help in generating references by making a very va…" (ytr_Ugxf7ZpsY…)
- "spamming AI "Art" is so scummy, What has art even come to that became this bad…" (ytc_UgwZXTxLd…)
- "What I find the scariest, is it only takes one company to put profits ahead of e…" (ytc_UgxzUGcQS…)
Comment
Well, of course, they are smarter than human because you are feeding these AI data. The most talented and smart people are hired to feed data to these AIs. What is the point of replacing human? Soon these AI will only want to talk to another AI. So even if you think your job is safe you can't talk to AIs so you have to be replaced. Even if you put a chip in your brain, I don't think you can compete with these AIs.
youtube · AI Governance · 2025-02-07T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzviWcWDT9w1pAh9iN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxUgpEm4DOIO7renDR4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwKLHbo3Al_KIIFejd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzEqulA5JdSd3MlAJF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwVaftWSecpWod8MNF4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugwy4JirtmX8oCacpi94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwomdOYCEwjAYICwqF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwZjmiNEa2A9yXlaLd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwxfdX661-R0INkL-F4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgyGu9Rz0j07ilxwBzF4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
```
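The "look up by comment ID" step over a raw response like the one above can be sketched in a few lines. This is a minimal illustration, not the tool's actual implementation: it assumes the raw response is a JSON array of objects, each carrying an `id` plus the four coding dimensions shown in the table (`responsibility`, `reasoning`, `policy`, `emotion`). The `index_by_comment_id` helper name is hypothetical; only two records from the response above are reproduced here for brevity.

```python
import json

# A trimmed copy of the raw model output shown above: a JSON array of
# per-comment codings. IDs are taken verbatim from the response.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzviWcWDT9w1pAh9iN4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwKLHbo3Al_KIIFejd4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM coding response and index each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

# Look up one coded comment by its ID.
codings = index_by_comment_id(RAW_RESPONSE)
print(codings["ytc_UgwKLHbo3Al_KIIFejd4AaABAg"]["emotion"])  # fear
```

In practice a lookup layer like this would also need to tolerate malformed model output (truncated arrays, stray prose around the JSON), which is one reason the dashboard exposes the raw response for inspection.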