Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Everything can be automated. The problem comes when they start to replace human.…" (ytc_UgykxKo4O…)
- "I havent watched the video yet but I do remember seeing your video about the con…" (ytc_UgzNenvkK…)
- "Eh. It only can manipulate you if you let it, I fucking hate the ai bots when it…" (ytr_UgwpGuTmP…)
- "This was clickbait imo. He asked the program to do something and then seemed sur…" (ytc_UgyutrYA1…)
- "China has used AI to plan it's invasion of Taiwan in the late winter. AI has sai…" (ytc_UgzliEQHN…)
- "Anyone else notice how much work is being done with our money to build AI which …" (ytc_UgwkSMit_…)
- "Ok now lets take a look at these 2 processes side by side. Whats the difference?…" (ytc_UgyuOIalX…)
- "LETS MAKE ONE THING CLEAR; I you use Gmail or Amazon services (many websites are…" (ytc_UgwQb3Djn…)
Comment
> I have to ask a legitimate question and not to be seeming as though I am making fun of anything, but is there a piece of plexi glass between the backseat and the front seat which does not allow the man to get up into the driver seat and start controlling the car? Just asking because the day is get in a driverless car will be never and doubly so if I’m caged into the rear area.

Source: youtube | Incident type: AI Harm Incident | 2025-02-05T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgyvwGRi2qMmVYNROkt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzWv1mo30nnOMxaBTF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzm3dW1aevrzjYv6qV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzDynpJJ51wjBZpnS14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzcFtd0fSMcDDpDRzN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzfIeeZ4ab_EmJAfFh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyfAvm7tVclEyVAx6p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyqyeC33_idYPktNZB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyzJyncP2J_mvUPYVF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxcPrvvvh3Y8v-GXCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
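The raw response above is a JSON array, one object per coded comment, with the same four dimensions shown in the coding-result table. A minimal sketch of how such a response could be parsed and tallied (the two sample records are abridged from the response above; the parsing code itself is illustrative, not the tool's implementation):

```python
import json
from collections import Counter

# Abridged sample of the raw LLM response (two records from above).
raw = """[
  {"id": "ytc_UgyvwGRi2qMmVYNROkt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugzm3dW1aevrzjYv6qV4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

records = json.loads(raw)

# Tally each coding dimension across the parsed records.
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(dim, Counter(r[dim] for r in records))
```

A validation pass along the same lines (checking each record carries all four dimension keys and a non-empty `id`) is a natural next step before writing the codes back to storage.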