Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "This makes no sense. How is this a win? What possible reason would there be to h…" (ytc_UgwoWuoFx…)
- "I came across this video through an AI search, and I really appreciate how it he…" (ytc_UgzMqrAGB…)
- "Grok uses the same AI that Israel uses to bomb Palestinian civilians, hospitals,…" (ytc_UgwkZEJJ7…)
- "I do scuplting, so I am safe from AI for now. But leave it to tech bros to screw…" (ytc_Ugz8tztnl…)
- "A.I. "Art" is like putting everything anyone ever found to be delicious into a p…" (ytc_UgyF-dOUG…)
- "Way the robot leaned into triggering a punch for a counter. Chefsss kiss. Fake i…" (ytc_Ugygj1uIQ…)
- "AI is dumbing down humanity not to mention that it will be used for nefarious pu…" (ytc_Ugxi1X0KT…)
- "AI is so dumb. It will devastate the consumer market. So all these “moguls” won’…" (ytc_UgzGVbec7…)
Comment
Not true. They only ground planes when they think the problem is systemic. Planes are rarely grounded after one crash, usually it takes at least two very similar ones.
And this has nothing to do with AI research anyway.
AI companies accept risk because there's no other choice. They are in a race, if anyone slows down while others don't, they gave up the potential reward without reducing the risks. The only way is for everyone to slow down at the same time and strongly cooperate on safety. But that's practically impossible when the price at the end of the race is not just a little profit, but enormous profit and unprecedented power.
What you can do is to win the race by such margin that you have time to solve the alignment problem before others catch up. That was Elon's plan with OpenAI, and now with xAI.
youtube
AI Governance
2025-08-30T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugx-qzznYwo1reEFsad4AaABAg.AMA8-Pk0B_PAMSIEx5jYF7","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMAN33w51l2","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMCC2_IYVmC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMCR6jqsD9G","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugyalir_2NorClRhmXx4AaABAg.AMA2yQD-zr-AMCWnuuATiY","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_Ugz5jNs56uezLOZZwbV4AaABAg.AMA1IuTRIZAAMJNat_ztFI","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgzOrZwZQyjwQTMXmBh4AaABAg.AMA1-19-Qz_AMGbw0V_JDN","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugwim9XQC9rU_cnMzhN4AaABAg.AM9y2mqUfvFAMB9yJeDbRx","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyO6Ytj4-Ipljm9bO54AaABAg.AM9q5Q9W9e7AMSKFocRrSv","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyO6Ytj4-Ipljm9bO54AaABAg.AM9q5Q9W9e7AN3QkA82btd","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
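The raw response above is a JSON array of per-comment codes, one object per comment ID, with the same four dimensions shown in the Coding Result table. A minimal sketch of parsing and validating such a response (the allowed values below are only those observed in the samples on this page; the full codebook may define more, and `parse_codes` is a hypothetical helper name):

```python
import json

# Allowed values per dimension, inferred from the samples above
# (assumption: the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "approval", "outrage", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting unknown values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with a single made-up record in the same shape as the response above:
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
codes = parse_codes(raw)
print(codes["ytr_example"]["emotion"])  # indifference
```

Keying the result by comment ID mirrors the "Look up by comment ID" view above: once parsed, any coded comment can be fetched directly from `codes` by its `ytr_…` identifier.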