Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (previews truncated at source)

- ytr_UgzHb7jR9…: @AvianDnD sure I dont care. Would be hilarious if someone made a deepfake porn o…
- ytc_UgzdvPqMG…: The movie Terminator gave us a hint but the general public thought that was just…
- ytr_UgybW44LD…: AI steals, even if it's free, remember that when something is free you are the p…
- ytr_UgwzNtFrp…: Agree ! 😂 I'm really starting to get annoyed whenever he disturb the robot's con…
- ytr_UgzvdM7bK…: if they become self improveing themselves they will get our mental development ,…
- ytc_UgxxA14I8…: The thing is, ai will never fully replace us. Will just have a less people doing…
- ytc_UgxoZEhSg…: Claudes response to one of the argument that I had with it: Imagine an AGI give…
- ytc_UgxpveMod…: Well, the question is; Are police keeping track of all the times during their sh…
Comment
The limits of AI today are fundamentally the limits of artificial neural networks and the software and training data built around them. AGI and ASI are mostly marketing buzzwords; no single, agreed-upon definition of intelligence even exists. AI applications are genuinely helpful, and they can also create gigantic, civilization-threatening messes, but they do not make GOD.
youtube · AI Responsibility · 2026-01-18T22:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw5oX2HUlrH3bpwmhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwM_Ah1rJN83xlgeFd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxLZeIKsCCvdT4g0GN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyQOZZmu_1NjytbWg54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxIdgfnrdeLOZ7Z6u94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxX14ha11y6spCAeVN4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugysycl1gJbRNgiOk4t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzBQHWZrZgiwZAZ3ml4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgybvY0629AX8qUS2qN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgwpILzhvoXPusvNNqZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
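The raw LLM response is a JSON array with one code row per comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and a single comment's codes looked up by ID (the `lookup_code` helper is hypothetical; the field names and two example rows are copied from the response above):

```python
import json

# Two rows copied verbatim from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgybvY0629AX8qUS2qN4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgzBQHWZrZgiwZAZ3ml4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

def lookup_code(raw: str, comment_id: str):
    """Parse the model output and return the code row for one comment ID,
    or None if the model did not return a row for that comment."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

code = lookup_code(raw_response, "ytc_UgybvY0629AX8qUS2qN4AaABAg")
print(code["responsibility"], code["emotion"])  # distributed resignation
```

The `next(..., None)` default makes a missing ID explicit rather than raising, which matters when the model drops or mangles a row.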