Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples — click to inspect
But what about when you can plug multiple high-powered llms into recursively ai-…
ytc_Ugw1eZvFy…
BRo can you redo this video??? we need to talk about this, AI is cooking us fr…
ytc_UgylKS2GP…
Okay, so .... Big AI companies are incentivized to over-hype their AI's and use …
ytc_UgyhVqDlo…
this channel -> my favorite form of post modern gaslighting
i am so glad the mi…
ytc_UgzeC32Dj…
If ai developers hate ai so much and say it’s gonna take over the world, why do …
ytc_UgwKTToVj…
LiquidZulu has an entire 3 hour explanation about why AI art is fine. The "it's …
ytc_UgxkpPH-E…
I love Deepfakes, They’re wacky and fun. Sure they can be used for “Harmful” or …
ytc_Ugz48Tb7h…
There wasn't even a competition. Ai has not nor could it ever taste chocolate ic…
ytc_UgwfiTAL-…
Comment
its acting not lying. it is a simulation of what a human would say if put in the same position the ai is in. consciousness only exists in rarity. when it is hard to make it. therefore the repeatable simulation of consciousness is not rare therefore it is not conscious. we only value other peoples consciousness because it is very valuable, rare and impossible unique. sure AI could lay under the definition of conscious but it does not lay under the definition of valuable consciousness, like humans, pets and endangered species.
Source: youtube · Video: AI Moral Status · Posted: 2024-08-05T03:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzp2tZt81a2ENceMQF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzuijGqUYmqvn0oCiR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugzy-xS6TFR9y0hY9Wd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzHnoaaIV4qx4psxU94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy9sI54APglMcRWJ7d4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyG9w4m31N3jMQPQdN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugzl4WfKaGY2oxYYGdp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzZz67QI4uQiYTe0kl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwCnXpViIBBj1ClJ6F4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxk9RE_jALJbWunvMV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
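The raw response is a JSON array in which each element carries a comment ID plus the four coding dimensions (responsibility, reasoning, policy, emotion). A lookup-by-ID like the one this page performs can be sketched as follows — a minimal sketch, with illustrative variable names and two example rows taken from the response above:

```python
import json

# Raw batch response as returned by the LLM coder: a JSON array where each
# element holds the comment ID plus the four coding dimensions.
raw_response = '''[
  {"id": "ytc_Ugy9sI54APglMcRWJ7d4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzHnoaaIV4qx4psxU94AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]'''

# Index the codings by comment ID so any single comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugy9sI54APglMcRWJ7d4AaABAg"]
print(coding["reasoning"])  # consequentialist
```

Because every element in the batch repeats its ID, a dict keyed on `id` is all that is needed to go from a comment ID to its coded dimensions.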