Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI doom mongers in 2022 'you have no idea what's coming' / AI doom mongers in 2023…" (ytc_UgwFDNe-H…)
- "This is a horrible argument against A.I. There’s valid arguments against AI, but…" (ytc_UgyQzOQfn…)
- "23:27 ... My question is, do LLMs even have a concept of bad? Like if the indi…" (ytc_UgwwM6Nqs…)
- "Imagine if AI were just extra dimensional beings that decided to interface with …" (ytc_UgwRZPU4i…)
- "Facial recognition is made against blacks. How many other race have been arreste…" (ytc_Ugzlan-T7…)
- "I think this AI art thing will move artists to do more live, traditional and tan…" (ytc_UgwX7Hnww…)
- "I've heard out there that rhere is more to AI than just typing a prompt, and tha…" (ytr_Ugznk1Qp0…)
- "The argument that art is a waste of time when AI can do it faster is so dumb / A…" (ytc_UgzMBnk1u…)
Comment
Thanks DOAC. Podcast of the year. We have 400 horsepower in our car. Ai will have 8b human power in their one plumber humanoid pretty soon. It is high time to take a stand to stop the nonsense unethical superintelligence innovation journey. May be we will not be able to stop AGI, yet it is worth to give it our best shot as an ethical human being. Let's make ai ethics compulsory together. Ethics first, development second, as simple as that. 🙏.
youtube · AI Governance · 2026-01-15T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxbaKv2FZHU3r47Yl54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx03xeSvcA3tvxDe8J4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBbDqCY00NXAxqudB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwfvN3wHsusMdQeGdp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxdHR0_SE7w3YZEJSx4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxi-3QgEzZoSvEW31Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx-2M0mI6w0u7VClVJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwpneseKbAEZrNpk994AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgxyMi9mke_UR6Bjk7h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgzXSe3W28WT2nQEVcZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"}]
```
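The lookup-by-comment-ID step above can be sketched as follows. This is a minimal, hedged example, not the pipeline's actual code: it assumes the raw LLM response is a JSON array of records shaped like the one shown (one object per comment, with `id` plus the four coding dimensions), inlined here as a string for self-containment.

```python
import json

# Two records in the shape of the raw LLM response above
# (comment ID plus the four coding dimensions).
raw_response = """
[{"id": "ytc_UgxdHR0_SE7w3YZEJSx4AaABAg",
  "responsibility": "distributed", "reasoning": "deontological",
  "policy": "regulate", "emotion": "outrage"},
 {"id": "ytc_UgzXSe3W28WT2nQEVcZ4AaABAg",
  "responsibility": "developer", "reasoning": "mixed",
  "policy": "none", "emotion": "outrage"}]
"""

def lookup(coded, comment_id):
    """Return the coded record for one comment ID, or None if absent."""
    by_id = {rec["id"]: rec for rec in coded}
    return by_id.get(comment_id)

coded = json.loads(raw_response)
rec = lookup(coded, "ytc_UgxdHR0_SE7w3YZEJSx4AaABAg")
if rec:
    # Render the record as Dimension/Value rows, as in the table above,
    # e.g. "| Responsibility | distributed |".
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        print(f"| {dim.capitalize()} | {rec[dim]} |")
```

Building the `id`-keyed dict makes repeated lookups O(1) after one pass over the batch, which matters when inspecting many comments against a large coded file.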