Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- dude I saw the thumbnail one day scrolling and then forgot about it until today … (`ytc_UgxB00cRj…`)
- Man cant get women so he has ai making them for him he'll be devastated if its g… (`ytr_UgzRMhKIt…`)
- The difference is that most studio musicians aside from symphonic members are no… (`rdc_jtcnlaw`)
- Nah bro obstruction of justice for political outcry is a battle you aren't going… (`ytc_UgwqnJzp6…`)
- So, in zeptoseconds how long do people think it will take from the moment an AG… (`ytc_Ugy3H3HjH…`)
- Hit my enterprise token cap today, been suffering through GPT 4.1 and it is just… (`rdc_ohxzdmc`)
- @Mrguy353Let me repeat Natu because clearly you don't understand: "these peopl… (`ytr_UgxaKX7FL…`)
- Remember, Trumps inauguration…everyone behind him was Silicon Valley and AI…thes… (`ytc_Ugwyy2-0D…`)
Comment

> I don't even see the purpose of self driving taxies if u still need a driver to watch or make corrections, for that, just leave the human to drive. It's gotta b pretty boring to babysit a car for 8 hrs and expect the driver to do nothing else but watch the road. It's almost a set up on the driver to get board and distracted! Makes no sense. But I do get the technology has to start somewhere and needs to be in situations and used and monitored to work out the kinks and expose the pros and cons not already foreseen.

youtube · AI Harm Incident · 2018-04-07T20:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzZSnnfj59UOcWUNUd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzMUqIxyPPq82ZPwH14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgznW18G3AMIE_uCDFB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwtM4pOivcujKaP98B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgxXlsYlR39y7k7YXLl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz0LPFwCBLHB8OSH4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxKr-UyshtI6A1hDOd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKQZ3qrwjF4MfFJGl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzBsJ6NOsaDQOsdEPx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw6xrdTx1uwcPo-yZ14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
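Each raw response is a JSON array with one object per comment, keyed by comment ID, with one label per coding dimension. A minimal parsing-and-validation sketch is below; the allowed label sets are assumptions inferred from the values visible on this page, not the pipeline's definitive codebook:

```python
import json

# Allowed labels per coding dimension. ASSUMPTION: inferred from the
# values shown in the coding-result table and raw response above;
# the real codebook may include labels not seen here.
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "resignation", "indifference", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw batch response and index valid rows by comment ID.

    Rows missing an ID or carrying an unknown label are dropped, so a
    malformed model output degrades to a partial result rather than a crash.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if cid is None:
            continue  # skip rows without a comment ID
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical two-row response: one valid row, one with an invalid label.
raw = (
    '[{"id":"ytc_x","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"resignation"},'
    '{"id":"ytc_y","responsibility":"aliens","reasoning":"mixed",'
    '"policy":"none","emotion":"outrage"}]'
)
codes = parse_codes(raw)
print(codes)  # only ytc_x survives validation
```

Keying the result by comment ID is what makes the "look up by comment ID" view cheap: fetching the coded record for one comment is a single dictionary lookup.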