Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- “Humans will always be of some form of use to computers. Humans function complete…” (ytc_Ugyf9Cght…)
- “Someone should make this conversation with every AI, to see if the pattern is th…” (ytc_Ugw8jKqIf…)
- “I can see where being an “AI auditor” will be valuable and needed to make sure A…” (ytc_Ugyq2Ec9U…)
- “And people wondered why I chose to homeschool my kids. This alone demonstrates h…” (ytc_UgzvtDxlh…)
- “There is a huge fucking difference between using a digital medium to make art, a…” (ytc_UgxdjvFvl…)
- “How can I be her fault if it's a self-driving car and it was self-driving should…” (ytc_Ugx0eqhc9…)
- “@vladanikolic889 Exactly this. People keep blaming AI like it’s some mind eating…” (ytr_Ugyr7-NL3…)
- “I’m sick of being forced into existential positions by Meta, Microsoft and Big T…” (ytc_UgxxFbUhA…)
Comment
So the flaw in being a well-meaning question asker is you have no idea when the person is deluded. Current AI is crap and in a bubble, its just that some of the people in the field are too myopic, or concerned with their money to admit it. Guaranteed very little will change in the next 2 years, and many of those IT grads that were laid off will be rehired, once the dumbos in charge realise that AI is not what the superficial CEO's believe. Of course, it will get better, but not before some major changes, and so all of this guys prophecies are bollocks, as time will prove. In fact a brief review of many events in IT that supposed prophecies forecast (e.g. the Y2K crisis) were overblown bollocks. These types of simplistic ideas, its all screwed over in 2 years are laughable if you look at history
Platform: youtube · Topic: AI Governance · Posted: 2025-10-02T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugw4sIhEE8DCgoncufh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxPs8kfgIh7KfjmHBR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw-3btCZmLREZ_uwAp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgylLnLhEK-pFD1RnBd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgywTtMzBWExrk91QLR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxdgopUVSWyXraFPYF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgzLpzbWow_IF6N9jbx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy0aN2A5JyJH-Sbu_V4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwdg4nTZxMXH2_Rzhh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzZuUaiX_a8ycKncpR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
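To illustrate how a raw LLM response like the array above maps back to a per-comment Coding Result, here is a minimal Python sketch. The helper name `index_codings` and the validation sets are assumptions for illustration, not part of the tool; the allowed values are only those that appear in the samples on this page and are likely not exhaustive.

```python
import json

# Allowed values per coding dimension, inferred from the samples on this
# page (an assumption -- the real codebook may include more values).
ALLOWED = {
    "responsibility": {"none", "company", "distributed", "ai_itself"},
    "reasoning": {"mixed", "consequentialist", "unclear", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"resignation", "fear", "indifference", "approval", "outrage", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded rows) into a
    dict keyed by comment ID, validating each dimension's value."""
    out = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim} value {row.get(dim)!r}")
        out[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return out

# Look up one coded comment by ID (hypothetical ID for illustration).
raw = '[{"id":"ytc_X","responsibility":"company","reasoning":"deontological",' \
      '"policy":"industry_self","emotion":"outrage"}]'
coded = index_codings(raw)
print(coded["ytc_X"]["emotion"])  # outrage
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: one parse of the raw response, then constant-time lookups.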