Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "The silence of some of the industry "giants" is baffling to me. Sure they're scr…" (ytc_UgyBV37F5…)
- "There will be no need for household plumbers as we won't have any running water …" (ytc_UgyOOoYaJ…)
- "Listen to this guy go on about how worrying what could happen in the future. But…" (ytc_Ugz1pysx_…)
- "Thats so stupid. If i want Language Model AI to be a tools in my work i can copy…" (ytr_UgwRgBILM…)
- "AI is a house of cards. Saagar: get out of your bubble. The hype is all a lie. L…" (ytc_Ugz3Mnik2…)
- "AI is being written by young leftists, the same ones who are writing the blather…" (ytc_UgyGJlWJb…)
- "Why are you using 4.5 for coding? It’s specifically *not* optimized for coding. …" (rdc_mru1j8p)
- "it's understandable to go "this guy is also human and has emotions" but none of …" (ytc_Ugwe9uXwv…)
Comment

| Field | Value |
|---|---|
| Timestamp | 11:19 |
| Source | youtube |
| Posted | 2025-10-19T13:3… |

This segment right here, Is honestly, without any irony, absolutely terrifying:
Google
Apple
Amazon
Meta
All pledging to train our kids....all companies, who have shown to try and become a monopoly. All companies who have shown to invasively harvest their users' personal data, to sell to advertisers, train AI with, and to study/manipulate user behaviors, to keep user hooked, and spending as much money as possible.
And governments are just letting them, even worse, subsidising these already filthy rich behemoths of tech-companies!? Already giant tech companies, who are quite literaly telling us, that ressistance against AI is futile. Almost like the Borg people from Star Trek now.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgzXyHqP-iLZBk45r1x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyd6gBZA3WywYHJOf14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyuChSrMMniJB5X8R94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz_A0aZvQtFHrq5-uJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzgf8Sl3lb5kBYhXyp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxBSqv2UbBBNommtrR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxegJRwaupGXSZN-wh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwYDvw_quBmhVUNEtl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzArhSlxeL2AzTMzod4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzcWmytvxOFXaRzbm14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
```
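A raw response like the one above is a JSON array of per-comment code objects, so looking up a comment's codes by ID reduces to parsing the array and indexing on the `id` field. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself; the parsing code below is a minimal illustrative sketch, not the pipeline's actual implementation, and uses only two of the entries as sample data.

```python
import json

# Two entries copied from the raw response above; in practice the full
# model output string would be parsed instead.
raw = '''[
  {"id": "ytc_UgzXyHqP-iLZBk45r1x4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxBSqv2UbBBNommtrR4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

# Index the array by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's codes by ID, mirroring the "Coding Result" table.
row = codes["ytc_UgxBSqv2UbBBNommtrR4AaABAg"]
print(row["policy"])  # regulate
```

If the model emits malformed JSON (for example, the stray `)` that originally terminated the response above), `json.loads` raises `json.JSONDecodeError`, which is presumably why a comment can code as "unclear" across all dimensions.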