Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "LLMs mirror one's unconscious drives depending on the prompts given. You are lit…" (ytc_UgyuRBc-B…)
- "I think the narrator of this video is AI (or an AI generated image of him if he …" (ytc_Ugx0yeFrY…)
- "which isn't the issue, people buying ai instead of hiring artists, the plagiaris…" (ytr_UgzmCHTMu…)
- "I programmed neural networks back in the 80s. I know that ChatGPT isn't conscio…" (ytc_Ugy7bPY3u…)
- "i know this is beside the point but as some who has read \"i, robot\" it IS about …" (ytr_Ugwei_7KP…)
- "I think AI is going to cause a whole generation of cyber criminals. AI code is n…" (ytc_UgyeqgQCa…)
- "16:37 I'm worried because how does it go from 'does anyone have plans for the we…" (ytc_UgyPa4tCW…)
- "Around the 5 min mark.... Labor replaced by Machines / Intelligence replaced by A…" (ytc_UgyfG4WmB…)
Comment
Oh I was wondering why it gets so many questions so that I can't get to ask it something. Now I understand. It's those people who tend to ask it meaningless question to make YouTube money out of thin air. Making it impossible for other people to ask ChatGPT sensible questions like recommend recipe for a lunch.
I know nowadays we don't use inefficient light bulbs for those purposes, we've got transistors, but it's still some electricity that could've been delivered to a microwave, fridge, or a light in a school, where they prevent those "big brains", as you'd call yourself, from maturing.
youtube · AI Moral Status · 2023-12-20T16:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugy3pCOit3SS-dCDS5F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy7omsBQzh3eZV1pvV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzSJ94jhS09fmkVygh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxpFgrLFgRd6TToNFJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxtCF3YR_NdrPhQ4vR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxzdzqQ0462Fj9WF1F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxjt5vPTHd490qGFHt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugys7AX0KUnA55QUCo54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx3XtB2Rg7g9GVlWdd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxnjRX6IQ8mX2-lXMx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
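The raw response above is a JSON array with one object per comment, keyed by `id`, so the "look up by comment ID" operation reduces to parsing the batch and indexing into it. A minimal sketch follows; the function name and the inlined two-row batch are illustrative (the rows are copied from the response above), not the tool's actual implementation.

```python
import json

# A raw LLM batch response in the same shape as the one shown above
# (trimmed to two rows for brevity; IDs are taken from that batch).
raw_response = """
[
  {"id": "ytc_UgxtCF3YR_NdrPhQ4vR4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxnjRX6IQ8mX2-lXMx4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "resignation"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment ID,
    or None if that ID is not in the batch."""
    codings = {row["id"]: row for row in json.loads(raw)}
    return codings.get(comment_id)

coding = lookup_coding(raw_response, "ytc_UgxtCF3YR_NdrPhQ4vR4AaABAg")
print(coding["responsibility"], coding["policy"])  # user regulate
```

Building the `id -> row` dict once and reusing it would be preferable when resolving many lookups against the same batch; the per-call parse here just keeps the sketch self-contained.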