Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
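The lookup described above can be sketched as indexing coded records by comment ID. This is a minimal illustration, not the app's actual implementation; the record shapes mirror the coding dimensions shown below, and the IDs here are hypothetical placeholders:

```python
# Hypothetical coded records, shaped like the coding results shown in this view.
# The IDs are made up for illustration; real IDs look like "ytc_Ugio-4_UV...".
codes = [
    {"id": "ytc_AAA111", "responsibility": "none", "emotion": "outrage"},
    {"id": "ytc_BBB222", "responsibility": "ai_itself", "emotion": "fear"},
]

# Build an index so any coded comment can be looked up by its ID.
by_id = {rec["id"]: rec for rec in codes}

print(by_id["ytc_BBB222"]["emotion"])  # fear
```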
Random samples — click to inspect

- "It's clearly the fault of whoever loaded the truck. I have to ask: what the fuck…" (ytc_Ugio-4_UV…)
- "@nyccontrabass3489 Yes, no doubt. AI will get better at an alarming speed until …" (ytr_UgymMtv1Z…)
- "And you say all this using a digital tablet (monitor also? aka Cintiq or Chinese…" (ytc_Ugyzm0N5Q…)
- "50 does not come in between instead it could be 49 hence proved ai wrong…" (ytc_UgxiX1wLJ…)
- "If you gunna watch me and sell data of my activities to China at least make the …" (ytc_UgyXJnFCT…)
- "The thing is, AI can never generate exactly what a person wants. So thus, there’…" (ytc_Ugyh9KmwC…)
- "The computers wont kill us it will be the people who get the info because ai is …" (ytc_Ugy3MmsXn…)
- "It’s a program called Nightshade! I believe it’s only available on PC atm, it ad…" (ytr_UgxQE5IDK…)
Comment
This is still bad news. If chatgbt is doing simple programming tasks. That means no entry level programming positions. As someone who considered the coding career, this is not good because I can not do highlevel stuff without doing simple level stuff. It is basically another situation of I can’t do the complicated stuff without getting experience with simple stuff and if automation is eliminating the simple stuff, then that means there is no place for new programmers. Basically programming is good for Software vets, but bad for noobs. No wonder Bill Gates said people who study programming nowadays are idiots. When a billionaire tells the truth, you know the future of tech industry is bleak 😮
youtube · AI Jobs · 2024-02-03T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugx31T1AH2I4LIO_msp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzggzYGKWa_Kv956M94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxgfs5wJuRLwohOjgB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyRbn8LkIWc5hvzHxR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxrFAdiroVnY1_b-xV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyzdQKwYUvnbo9fIqp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw9AZnflDYuxDrD4qZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5jMNqe8NBvh9QcyF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgykoQXupThpQRileV54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxOrKvQKslG9ZGwFMt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"mixed"}
]
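The raw response above is a JSON array in which each record codes one comment along the four dimensions in the result table. A minimal sketch of parsing and validating such output follows; the allowed value sets are inferred only from the codes visible in this view, so the real codebook may define more categories:

```python
import json

# Allowed values per coding dimension.
# Assumption: inferred from the values visible in this view, not a documented schema.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "industry_self"},
    "emotion": {"indifference", "fear", "resignation", "approval", "outrage", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it is a dict with an "id" field and every
    coding dimension holds one of the allowed values.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with a hypothetical ID, shaped like the records above.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(validate_codes(raw))
```

Validating against a closed vocabulary like this catches the common failure mode of LLM coders drifting outside the codebook (e.g. inventing a new emotion label), so malformed records can be flagged for re-coding rather than silently stored.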