Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `rdc_m6zektz`: Not only does Uber *not* have self driving cars but the other day I ordered one …
- `ytc_UgxInHOFO…`: Book launch clearly.... kind of obvious and also c**tish really to be blowing th…
- `ytc_UgwM7SO5i…`: LLMS certainly wont do it. But its the idiots that keep playing around with ”wha…
- `ytc_UgzZ0ZB_z…`: I'm a senior in hs and I want to study graphic design or architecture, but AI is…
- `ytc_UgxbaKv2F…`: Why people keep letting this guys spew their doomsday scifi bullshit? LLM AI is …
- `ytc_UgyIXUkH9…`: You don't need to tax the AI's. As AI and automation accelerate, we're facing a …
- `ytc_UgyIR9O15…`: I think this just goes to show how A.I. used as an inspiration can actually be a…
- `ytc_UgxbQJrmT…`: I never been against AI and its development, but I still completely agree with e…
Comment
My economics professor and I had a little back and forth about AI some time ago.
I argued that AI will take jobs en masse one way or another at some point in our future, and as such, an economic crisis will come into effect.
She argued that it's nonsense; in her view, the job market will adapt, under the guise of maintenance of the robots doing the work. "Someone will have to fix em you know"
To which I argued with an example.
A factory of a thousand people will lose their jobs to robots. Then, only about ten to fifty will be required to maintain these robots. Which is a net loss of 950 people. That's 95% of the workforce gone and jobless from that factory.
(real numbers I got while being an intern for a headlight manufacturing company from an analyst)
She argued it's nonsense and promptly shut the whole convo down.
Source: youtube · Category: AI Harm Incident · Posted: 2026-04-21T18:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzWSN8lhksuIbJ9d7l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxmOr_H-ycsdVdcGkJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzvtGGJeoQRrOS6VBx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyQNHW4iIOUR0LBAeh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyIM7_Gcj5qFsnfFwZ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx8iWZaug4BcbTUplZ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxBUP4AAlBmdjsSYqF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx3m1FsWzDcu07sXgV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgymV8ahnOjnqc0Ah1p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz97hh9QMQM0WRwv_J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
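A raw response like the one above can be looked up by comment ID once it is parsed and indexed. The sketch below is a minimal example of that, assuming the category sets are exactly the values observed in this sample (the full codebook may define more options); the payload here is a two-row excerpt for brevity.

```python
import json

# Allowed values per dimension, inferred from this sample only (assumption:
# the real codebook may include categories not seen here).
DIMENSIONS = {
    "responsibility": {"none", "user", "company", "government"},
    "reasoning": {"unclear", "virtue", "consequentialist", "contractualist", "deontological"},
    "policy": {"unclear", "regulate", "liability", "ban", "none"},
    "emotion": {"indifference", "fear", "outrage", "approval", "resignation"},
}

# Two-row excerpt of the raw LLM response shown above.
raw = """[
  {"id":"ytc_UgzWSN8lhksuIbJ9d7l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgymV8ahnOjnqc0Ah1p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]"""

def index_codings(payload: str) -> dict:
    """Parse a raw response and index codings by comment ID,
    rejecting any value outside the known category sets."""
    by_id = {}
    for row in json.loads(payload):
        for dim, allowed in DIMENSIONS.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row[dim]!r}")
        by_id[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgymV8ahnOjnqc0Ah1p4AaABAg"]["policy"])  # → ban
```

Validating at parse time catches a model that drifts off the codebook (e.g. inventing a new `emotion` label) before the value silently enters the coded dataset.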