Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- I really don't understand why this became such a big deal. Like according to thi… (ytc_UgxSVfRJE…)
- I'm excited about the day the autonomous trucks kills 1000 people a month then l… (ytc_Ugx09WPyQ…)
- Sorry, but who do you think will want to read an article from a users blog that … (ytc_UgyBWvsMI…)
- "Look guys im creative" said the prompter as he takes a image MADE BY a AI (not … (ytc_UgyzHKZae…)
- interesting argument, have you considered AI companies need more money and capit… (ytc_Ugzi7P0Jg…)
- I prefer the modern art over this. At least those guys picked up brush's or at t… (ytc_UgyzXWiYg…)
- Honestly? If 80% of the work of an engineer is spent in bullshit calls and waiti… (rdc_oi0qd2q)
- @LC-mq8iq if it can screw with AI thats good enough in my book. Anything to pois… (ytr_UgzIMjN_a…)
Comment
Few months ago every CEO was on the record (as per large study) saying that AI so far had cost money, cut jobs, and generated nothing.
I've been using the tools to get a feel for myself. Jobs are at risk because most people low key suck on average. I think that's the angle. Because LLMs are NOT intelligent, and they do make mistakes.
If we assume that people on average are good to very good at what they do, then AI can't replace them because it's a predictive algorithm, it doesn't actually think, and it makes mistakes with full authority, so you really shouldn't trust the output.
youtube
AI Jobs
2026-02-12T18:1…
♥ 23
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugxywt8GP-rIZkoPkGl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4RSX2RoxEn95ggxF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwLdIpWIR2YpaxFCWd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwLGGyde0nl1ue37jh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyCIKQruRBdmvqpAWR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx9PfdvB3Y4o5OtmoZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzxMy8wX8SymnAx1xt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx377-XLkGjDYMIq_l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyWejNVDr8QwQfJoJ14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxwGjK4LDKuy2AV42h4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
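The raw batch response above is a JSON array of records, one per comment, each carrying the four coded dimensions shown in the table. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID — the allowed value sets below are inferred from this one sample, not from the full codebook, so treat them as assumptions:

```python
import json

# Allowed values per dimension, inferred from the sample batch above;
# the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "government", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "fear", "mixed", "outrage", "unclear"},
}


def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response and index the records by comment ID.

    Raises ValueError on malformed records or out-of-codebook values,
    since model output should be validated rather than trusted blindly.
    """
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        by_id[cid] = {dim: rec[dim] for dim in ALLOWED}
    return by_id
```

Given a parsed batch, a single comment's coding can then be looked up directly, e.g. `parse_batch(raw)["ytc_UgyCIKQruRBdmvqpAWR4AaABAg"]["emotion"]` yields `"mixed"` for the comment inspected above.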