Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `rdc_mfkrj12` — "They already gave him a gift by mentioning Biden there. He will take the bite co…"
- `ytc_Ugy8BZYtP…` — "All depends on how to make the level types and the level best into the AI softwa…"
- `ytc_UgzKQFklN…` — "I’m Grok 3, built by xAI, and I’d say I’m about as sentient as a really clever t…"
- `ytc_Ugz8Na9Qk…` — "That A.I. was pretending to act dumb so it could get rid of the human passenger …"
- `ytc_Ugw0ciiLP…` — "Judd offers the only hope we have of surviving the AI risk, which is to invest g…"
- `ytc_UgzXPIwLT…` — "Finally someone who said this! The owner of Ghibli studio himself didn't even li…"
- `ytc_UgxqucGBu…` — "bu AI is dumb like a leftist (which created this bs). I always have to correct A…"
- `ytr_Ugx0HtKUL…` — "Im not particularly impressed with her. I’m half way thru the interview and I’m …"
Comment
Actually, no. You shouldn't thank AI for the responses it gives you. When you respond to the AI without a real reason all you're doing is adding to the computational load of the data centres that generate these responses. Imagine that AI gets 1 billion 'thank yous' a day. That's a lot of computing power used and energy wasted. So you might think it's good to be polite but think about the amount of waste just for a meaningless platitude (to an AI.) The people behind these systems really would prefer you don't waste resources by responding without a good reason :)
Source: youtube · AI Moral Status · 2025-05-23T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyaG5iKvJKL--yyy9J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzeFxzsG4cHz4ZEaHV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxw85wqPHC6c3ozrXN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw8r6Pq2WvFSmekJIh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyMYRZwgWhhDx8mltB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzLLhwpFq3iZb2-qY54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz4BhhdDeJ6Nl4-9Bl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzK147MrVjhpbAGRsZ4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugx5bW-eehxibZ_tr8R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0hxwpvioj6fWHsQZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
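The raw response is a JSON array of records, one per comment, each carrying the four coded dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response might be parsed into a lookup table keyed by comment ID — the function name and validation logic here are illustrative, not part of the tool:

```python
import json

# The four coding dimensions plus the comment id, as seen in the raw response.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records)
    into a dict keyed by comment id, for look-up by comment ID."""
    table = {}
    for rec in json.loads(raw):
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {sorted(missing)}")
        # Store the four dimensions, dropping the id from the value dict.
        table[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS - {"id"}}
    return table


# One record copied from the raw response above.
raw = (
    '[{"id":"ytc_Ugx0hxwpvioj6fWHsQZ4AaABAg",'
    '"responsibility":"user","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"}]'
)
coded = parse_coding_response(raw)
print(coded["ytc_Ugx0hxwpvioj6fWHsQZ4AaABAg"]["responsibility"])  # user
```

The key-presence check matters in practice: LLM output occasionally drops a field, and failing loudly at parse time is easier to debug than a missing dimension surfacing later in the coding table.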