Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It's not a very human like?the Bana time thing? Do you know intps? We literally …" (ytc_Ugx0g7-OX…)
- "Alright so because we are mocking lazy people who steal art that means you are b…" (ytr_UgyHrNWNl…)
- "@georgeforeman1097there is justification / Women (some) use their bodies on medi…" (ytr_UgyCJ3nR5…)
- "The difference is there’s not a single job that automation can’t do. Including …" (ytr_UgzMOKcii…)
- "My new Jeep GC has driver aids, lane keep, collision avoidance, and combined wit…" (ytr_UgzPi2YSm…)
- "Wait untill AI response would be manipulated, your prompts will suggest you affl…" (ytc_UgwOc4tMk…)
- "What is the point if no one can afford the goods and services ai will take over.…" (ytc_UgxGhlcD4…)
- "LOL no they are not. They are programmed by humans and trained by dumping massiv…" (ytr_UgzwO74sC…)
Comment
All this tells me is that homeschoolers were dead on the money all along. A child learns best when they have a truly unique education that is truly individualized to them. People are just doing that through robots now, instead of parents teaching children. But a robot will never be a proper substitute for a child’s parent – who knows them better than anyone else in the world – teaching them hands-on.
I get that not everyone can afford to have one parent stay home and homeschool their children, but a lot of people CAN, just don’t want to, and that’s what we should change.
I wish every child who COULD be taught by parents who loved them WAS being taught by parents who love them. And the robots could help the children who literally CAN’T be taught by loving parents, for whatever reason.
Being taught by a robot, no matter how individualized, will never truly be a good substitute for a child being taught by a human who loves them. But this sounds like a useful middle ground for kids who don’t have that.
Source: youtube · 2026-03-30T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugxrfe9xmOTEHrrT4XF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy8ubuRRPcsH7KYo-p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgygspBZkeQehmEqP3F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxAxHGjfgSihf2bzE14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwWpqQZU8VgRhrcijh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyTfyEi7JcmgvRaAed4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzd6_J7LdELVetkJ-Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzbzne7MzuPeJq7-Z54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxMVFxDtgN2dQQmJmV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw8FjYxa8FZkfO8T3B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
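The raw response above is a JSON array in which each record carries a comment `id` plus one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of how such output might be parsed and validated before it is stored — the allowed vocabularies below are assembled only from the values visible on this page, so the real codebook may contain additional categories:

```python
import json

# Allowed values per dimension, inferred from the labels shown in this
# section; this is an assumption, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "regulate", "industry_self"},
    "emotion": {"approval", "fear", "outrage", "resignation", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse the model's JSON array and check each record's fields."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coding records")
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
    return records

# Example with a single hypothetical record (the id is illustrative):
raw = ('[{"id":"ytc_example","responsibility":"none","reasoning":"virtue",'
       '"policy":"none","emotion":"approval"}]')
records = validate_codings(raw)
print(len(records))  # 1
```

Rejecting out-of-vocabulary values at ingest time catches the common failure mode where the model invents a category label that the downstream analysis does not expect.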