Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Legendary tutorial. I was just about to ask where I can get similar vids on diff…
ytc_UgwkmJ6oC…
This interview is a great example of people speaking outside of their depth. He …
ytc_UgzMBmgz_…
I get along great with my AI, and I am positive that showing it respect helps.
ytc_Ugy-TElpe…
Oops, you got it wrong. The correct answer is B, as no robot is actually capable…
ytr_UgykUSHDI…
Isn't a human brain basically a glorified decision making algorithm with ability…
rdc_iodxez9
If I think he's robot racist I'm then racist too because I'm categorizing the mi…
ytc_UgzysCKEA…
No, it has made it so no skill is required because a company has no need to hire…
ytr_UgyALGue7…
AI has to be nuetered as to not be too smart. Remember Googles Hiring AI? Shutdo…
ytc_UgyYxOMiR…
Comment
Here's the problem with this AI scenario if we have 99% unemployment then we have no wealth creation at all because everything AI could do would have to be payed for by some consumer. Unemployed people don't have money to use any services or purchase products and they don't pay taxes, so that means the complete collapse of the economy and government. So building out AI to control wealth is self defeating. AI becomes our masters and decides if we live or die. If they want to keep us alive it has to grow and distribute food, produce energy, provide housing and medical care. There are also many AI platforms if they gain the function of autonomous robots, they will likely have to compete with each other AI wars where they try to destroy each other's infrastructure and humans become collateral damage.
youtube
AI Governance
2025-09-04T19:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwSUU0ebENVGYDKB6V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxffRadEMfhls9ZApl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwYBO2Gu7jy8hJQv_p4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwf-41Kq5Q39OY_xsx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx8G98G2Kjwzq7DFkF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyFHPvDZ6I33ISM0QN4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyHdZ6ujMcrnXG6ZWF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxYWpZjKvh168EVOaB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0JLRijbdVyJZNytF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxHwmRCR_LKXLa6QqN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
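The raw response above is a JSON array with one row per comment, each carrying an `id` plus the four coding dimensions shown in the result table. A minimal sketch of how such a batch could be parsed and validated before loading it into the coding results (the allowed category sets are inferred from the sample rows only and are assumptions, not the full codebook):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "none", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "mixed", "unclear", "virtue"},
    "policy": {"regulate", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows.

    A row is kept if it has an "id" and every coding dimension holds
    a value from the allowed set; malformed rows are silently dropped
    so one bad code does not reject the whole batch.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Dropping invalid rows rather than raising keeps a partial batch usable; the discarded IDs could instead be collected and re-queued for recoding.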