Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "This car is driven by a computer. Sometimes, computers don't (always) work the w…" (ytc_UgyIkLTOe…)
- "AI loved the asteroid analogy in this. I have believed from the start AI is sent…" (ytc_UgxgXztJp…)
- "Great interview. I can’t help but ask why someone so smart is still a materialis…" (ytc_UgwCaGMZS…)
- "Ai Art is something great, for making all kinds of artworks still the individual…" (ytc_UgweIayge…)
- "Ai for both. First is weird broken mouth movements, and the second is obvious lo…" (ytc_Ugye6IVVp…)
- "I hate the term "poser" but, that's what AI morons are, they are just posers, th…" (ytc_UgzCak50V…)
- "14:20 I knew something was up when he stopped including the "no AI" thing in his…" (ytc_UgyOU3lKD…)
- "so, let me get this straight (please correct me if i'm wrong)...the AI facial re…" (ytc_UgyaDspV6…)
Comment
Humans are to expensive and need to sleep. It is only natural the market wants to get rid of them. They already did it at Wall Street. Voice recognition and facial analysis are always advancing. AI learning programs can do extremely complicated things now days. I bet all cafe/fast food will get replaced by automatons. Hell, IBM's Watson is a great example companies trying to replace humans.
youtube · AI Jobs · 2016-12-27T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UghgcFaTWevmEXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgiWeIUN8L3ttXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugi1_GvFa4AunXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggTPnwXvcMUtXgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggihVqE_kUp1ngCoAEC","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjRWr86xeA9N3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggHi6aThWs9CXgCoAEC","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgjDFUPhV6KdyngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggtyRTpvkVrM3gCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ughj9sxA69eOAXgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]
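The raw response above is a JSON array of coded comments, each with an `id` plus the four dimensions shown in the Coding Result table. A minimal sketch of parsing such a response into a lookup table keyed by comment ID might look like the following; the allowed values are inferred from the samples on this page, and the real codebook may include others:

```python
import json

# Allowed values inferred from the sample responses above (assumption:
# the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"company", "user", "government", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "resignation", "approval", "indifference", "fear"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, validating each dimension."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-element response for illustration:
raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"resignation"}]')
coded = parse_llm_response(raw)
print(coded["ytc_x"]["emotion"])  # resignation
```

Keying the parsed rows by `id` supports the "look up by comment ID" workflow directly, and the validation step catches any off-codebook value the model might emit before it reaches the database.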