Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- The defense of AI is so weird. Like, if I was given a microwave that could make … (ytc_Ugw2GPreK…)
- Yea there has to be some balance and respect with these things. Doing AI is cool… (ytc_Ugxlps3UV…)
- That's an interesting perspective! The dialogue between AI and humans, like the … (ytr_Ugzwm6ozZ…)
- That's a fascinating perspective! The name Sophia indeed has deep roots, often a… (ytr_Ugz8-136t…)
- If you cant come to terms with something emotionally, that means you shouldnt do… (ytc_Ugw0UDLD2…)
- @rickmartin9420 Did I really suggest otherwise? Or do you just jump to conclusi… (ytr_UgzdfM_B7…)
- AI is a tool like crayons or brushes, but assigning the argument of ‘talent’ to … (ytc_UgxZD9b7_…)
- One of the ways is they have robocalls that just say “Hello?” Don’t ever say a… (ytc_UgwcrorEG…)
Comment

> Counter point: AI is making people dumber. Debugging requires training your brain like a muscle. If you offload that and outsource 90% of your coding and bug-fixing to an AI model, you will gradually lose that cognitive fitness to diagnose a difficult problem, or recognise an important edge case whilst you're programming.
> So yes, you can drive instead of running, but if you don't exercise anymore you'll get out of shape

Source: youtube · AI Jobs · 2026-03-11T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
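A coding result like the one above can be sanity-checked against the codebook before it is stored. A minimal validation sketch: the allowed value sets below are only those observed in this page's sample data, not the full codebook, which may define more values.

```python
# Minimal validator for one coded comment.
# NOTE: these value sets are an assumption, reconstructed from the
# sample output on this page; the real codebook may allow more values.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "industry_self", "liability"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation"},
}

def validate(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding is valid."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = coding.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unexpected {dim} value: {value!r}")
    return problems

# The coding shown in the table above passes cleanly.
result = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "fear"}
print(validate(result))  # []
```

Running the validator over every row of a raw response is a cheap way to catch the occasional off-schema label an LLM coder emits.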
Raw LLM Response
```json
[
  {"id":"ytc_UgwNsc2j87xj_d6K9sp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxaPOl4MKwV1_L1cmB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzgml00kvFFaYkij-R4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwfBwlXj8E6cRIVHPF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugya5c2gsguTds5XTa54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzm7i1eNUUH_UCmy794AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzrodSy5xUDiVgtajl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwvHT5uye2tdNn7zyh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy24qDFGUO1e-lY_td4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyan-UTjFf7BTpRkCd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
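A batch response like the one above can be parsed and indexed by comment ID, which is all a "look up by comment ID" feature needs. A minimal sketch, assuming the response is a well-formed JSON array (the two rows below are copied from the sample response; the indexing approach is an illustration, not necessarily how the tool is implemented):

```python
import json

# Two rows excerpted from the raw LLM response above.
raw = """[
  {"id":"ytc_UgwNsc2j87xj_d6K9sp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzgml00kvFFaYkij-R4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]"""

# Index codings by comment ID so each lookup is a single dict access.
by_id = {row["id"]: row for row in json.loads(raw)}

coding = by_id["ytc_UgwNsc2j87xj_d6K9sp4AaABAg"]
print(coding["emotion"])  # fear
```

In practice the raw string would come from the stored model output for a batch; a malformed or truncated response would surface here as a `json.JSONDecodeError`, which is worth catching and logging alongside the batch ID.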