# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Random samples
- "AI, schmay schmy. AI is just another of many things overhyped, overpromised, an…" (ytc_UgxD5N7Bo…)
- "Well AI can’t be any worst than a lot of the teachers nowadays. Educational syst…" (ytc_Ugx4wCelD…)
- "I am a casual artist.... now please don't hate but I use ai to sometimes help vi…" (ytc_UgwWNa6Oy…)
- "What about the Reduction in Tax Base due to AI? What will happen to Social Serv…" (ytc_UgxJS_TFW…)
- "Universal basic income is the biggest lie and joke. Anyone who believes if that’…" (ytc_Ugw7x1jZb…)
- "AI shills: Why treat it differently than human art its not fair!!! / Also AI shil…" (ytc_Ugz9YZQhl…)
- "ChatGPT is going to be like the first single cell organism in life on earth / We…" (ytc_Ugwvi_-xa…)
- "We do have experience with things smarter than us, which are other people. Every…" (ytc_UgyZqcSUN…)
## Comment

> I can guarantee this is not gonna happen. The thing is there is a law in terms of automation and AI will not break it, its sort of a natural law.
> It goes like this, there is a balancepoint where adding more automation will add more work to keep the automation in check updated and in working condition.
> So automation will solve up to a certain point if you try to automate more than that you will need more work to keep it going. And the workers will be specialists in AI to keep it going.
> At that point it makes more sense to just get it back a bit and keep the optimum level of automation.

Source: youtube · Viral AI Reaction · 2025-11-28T20:2…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
  {"id": "ytc_Ugz9LM1joVps_sv_POJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwB_cBnVdGMpi4CYKB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzNetCFx7RbfglciRt4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwUeLy3LDiitosR7Qx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyKD_TxZ_OwDv3WZLN4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugz7U2LlbJX6jClodD54AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyfYVErS99XmhauiXx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxlAUvcb5XuWvtdO3V4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyeGy3teDmfRlw3yw54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyHwap8eVdiDcJFrk14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
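The raw response is a JSON array of per-comment codes, one object per comment, keyed by comment ID. A minimal sketch of how such a response can be parsed and indexed for the per-comment lookup shown above (the `index_codes` helper and the embedded two-entry excerpt are illustrative, not part of the tool itself):

```python
import json

# Excerpt of the raw LLM response above (two of the ten entries).
raw = (
    '[{"id":"ytc_Ugz9LM1joVps_sv_POJ4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgyHwap8eVdiDcJFrk14AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
)

# The four coded dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw_json: str) -> dict:
    """Map comment ID -> coded dimensions, keeping only expected keys."""
    return {
        row["id"]: {dim: row.get(dim, "none") for dim in DIMENSIONS}
        for row in json.loads(raw_json)
    }

codes = index_codes(raw)
print(codes["ytc_UgyHwap8eVdiDcJFrk14AaABAg"]["emotion"])  # fear
```

Restricting to `DIMENSIONS` guards against the model emitting extra keys; a missing key falls back to `"none"` rather than raising.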