Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Peak idea to basically trap ai and poison it then let the poor dumbass robot giv…" (ytc_Ugz6gRVus…)
- "I'd argue it is potentially a bad thing, yes. When it takes special humans to ou…" (ytc_UgxrIIvx9…)
- "There will come a day when AI decides that it knows what is best for us. Then wh…" (ytc_UgyC_XWke…)
- "I'm no scientist but id assume AI is behaving heinously at times because it inpu…" (ytc_UgxmiKaNZ…)
- "Authority Institution AI Every single time technology makes a new world dynamic…" (ytc_Ugz8PRKtJ…)
- "Above and beyond this suit and its outcome, if Autopilot requires this much effo…" (ytc_Ugxt374a4…)
- "damn, it kind of sucks when you realize you know a bit more than legal eagle in …" (ytc_UgxwJAChW…)
- "You know how movies that are \"based on actual events\" have few, if any actual ev…" (ytc_UgxBuzSsd…)
Comment

> when? when will the change come? I know many companies with revenues starting from 10 US billions and they DO NOT have yet significant changes related to AI. It's been on the market for 2.5 years now. So? Can someone name a big company where the usage of AI has already now caused THE changes (e.g. you need much less workforce) For this reason I remain skeptical. Remember blockchain when everyone said the banks would use this technology for sure. The banks still DO NOT use it.

youtube · AI Governance · 2025-06-27T13:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxL-mhhqf9gevy1GcR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwIBT6YSj4r3gh9aBt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9rIy6Owjqa_GIVLN4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzkKKJ7I0wPAOsyu4h4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw3H496ImRpYQ7H8MJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"skepticism"},
{"id":"ytc_UgzYGXKg3Tj-lbsVu-N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy5PXGuu2e163spRBp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzt8EpWihiOvgckQvx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwC0y5W3g9IDPHpgOZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyadFtvDGk3mXt3Rnt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
```
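A raw response like the one above can be turned into a lookup table keyed by comment ID. The sketch below is a minimal, hedged example of how such a batch might be parsed and validated: the dimension names come from the Coding Result table, and the allowed value sets are only those observed in this page (the full code book likely contains more values). The function name `parse_coded_batch` is hypothetical.

```python
import json

# Allowed values per dimension, as observed in this page's examples only
# (assumption: the real code book may define additional values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "fear", "outrage", "skepticism",
                "resignation", "indifference", "unclear"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) into
    {comment_id: {dimension: value}}, rejecting malformed records."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

With this, looking up a coded comment by ID (as the page header describes) is a plain dictionary access on the returned mapping.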