Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick one of the random samples below.
- "Always had a feeling that humans only last another 500-700 years. Humans can't e…" (ytc_UgyKZ-QFX…)
- "Ai is so much better laying off those people had very little effect on productio…" (ytc_Ugz07PmqQ…)
- "You mean the tell me.. You didn't turn into a do by ROBOT while you were out the…" (ytc_Ugyue2iON…)
- "Poke fun at Chat GPT but how many actual "lawyers" do far less or far worse prep…" (ytc_Ugzbsdgcc…)
- "Ai should be used for like people who have disabilities or robots as waiters and…" (ytr_UgyiZJUBc…)
- "2037 has been predicted for at least the last 20 years by people working on AI. …" (ytc_UgyQaLq_R…)
- "It won't be the AI that kills us. It will be the humans that blame it on the AI …" (ytc_UgyrJE6cb…)
- "@alulipayes of course, i agree with the double-checking, I always check but if I…" (ytr_Ugypoxbj2…)
Comment
Love how people forget robots think like a list.
How make a peanut butter jelly sandwich
1. Get some peanut butter, get some jelly, and some bread.
2. Use a knife and spread the peanut butter on the bread.
3. Use the same knife and spread the jelly on the bread.
4. Push the two sides of bread together.
5. Enjoy.
Now you have a robot swinging around because it couldn’t figure out how to open a cabinet.
Source: youtube · Topic: AI Responsibility · Posted: 2025-02-27T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzwYKphAd0WlBYAxNF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwpwQXeDQHRlJslJPR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwKuuqhc8k7GMpKW_54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxwYVu-uABcBtsvTxl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugyh7gY2Pez_LAxL3G14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyhUj1i-kMI5nLH_bR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxUCP6kHEYJEcw7ysN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx8m7gNQ4thWaCxLE54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwhp0fRpogqdzB51dJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyrKiadW1kN14dLAhV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
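The raw response is a JSON array with one record per comment, which is what makes the per-comment lookup above possible. A minimal sketch of parsing such a batch response and looking up a comment by ID (field names are taken from the response shown; the helper name `index_by_id` is illustrative, not part of the tool):

```python
import json

# Excerpt of a raw batch response in the format shown above.
raw = """
[
  {"id":"ytc_UgzwYKphAd0WlBYAxNF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwpwQXeDQHRlJslJPR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]
"""

def index_by_id(payload: str) -> dict:
    """Parse the model's JSON array and key each record by its comment ID."""
    records = json.loads(payload)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw)
rec = codes["ytc_UgwpwQXeDQHRlJslJPR4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # developer indifference
```

The dictionary keyed by comment ID mirrors the "Look up by comment ID" behaviour: each lookup returns the four coding dimensions (responsibility, reasoning, policy, emotion) for that comment.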