Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "although I am always surprised with these video, as everytime I use AI, it sucks…" (ytc_UgwQRPPEP…)
- "Dear Sun in the sky please send a solar flear and fry ALL THE HIGH TECHNOLOGY AN…" (ytc_UgwHttJyt…)
- "There is a lot of people with out work there is a lot of people wanting to c…" (ytc_Ugx7uETXR…)
- "@itcouldbelupus2842 is it lazy to not want to do 1000 hours unenjoyable work for…" (ytr_UgwMvECGA…)
- "Softbank is a public company and as such, the CEO will say whatever is necessary…" (rdc_n4d28a4)
- "A person I used to play dungeons and dragons with told me they paid hundreds of …" (ytc_UgzP3xVA7…)
- "Remember its the operator of the car that is responsible not the car if you are …" (ytc_UgzfaddCT…)
- "ai can't work divergently from what it was fed from. that's why when you ask cha…" (ytc_UgwkY-iCi…)
Comment

> AI regulating laws are just what we need though to make sure AI can be a helpful tool while not being (legally) misused for all kinds of suspicious activities.
> I can’t understand why we have to let things run rampant for many years to come before finally deciding "it’s done enough damage, let’s regulate it somewhat"

Source: youtube | Topic: AI Governance | Posted: 2025-07-06T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwlha6Rz2aIsW_5pcF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyHuAlYx54crUsClDN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugzn67vra5tY3oqv_0x4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwBFYiceuTKOdJeEU54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugxc9KRyYSh0ee0ikoR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgylzhoXoJ88CdPVXmR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgxusP4YyM-R4rYOsbN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwpMVnvtUf8vSKWu5F4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwRleFrs6IbfifAIj94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxvaorQzRA1WivbED14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
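The raw response above is a JSON array of per-comment codes keyed by comment ID. A minimal sketch of how such a batch could be parsed and validated before the codes are stored; the allowed values per dimension are inferred only from the responses shown here, so the real codebook may define more categories:

```python
import json

# Allowed values per coding dimension, inferred from the responses shown above;
# the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"company", "user", "government", "distributed", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "fear", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the codes by comment ID,
    rejecting any row with an unknown dimension or an out-of-vocabulary value."""
    coded = {}
    for row in json.loads(raw):
        cid = row.pop("id")
        for dim, value in row.items():
            if dim not in ALLOWED or value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        coded[cid] = row
    return coded

# Hypothetical one-row batch for illustration.
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
print(parse_batch(raw)["ytc_example"]["policy"])  # prints "regulate"
```

Validating against a closed vocabulary like this catches the common failure mode where the model invents a label outside the codebook, instead of letting it silently enter the dataset.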