Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_UgxChjBp4…: "lol dumbasses. Ai is the future, sure, but it's not there yet. It should be used…"
- ytc_Ugzfjl8ZR…: "I think it's just input/output. 1's and 0's. They have AI in video games to vide…"
- ytc_UgyyQPGKs…: "You have to remember that some of these AI CEOs were bullied relentless in child…"
- ytc_UgxdYO-1D…: "Advanced Technology INTERNATIONAL OVERSIGHT AND WATCHDOG GROUPS should volunteer…"
- ytc_UgzZrgzHi…: "Oh, I definitely have an idea on what is coming, but I'm hoping I'm wrong. (P.S…"
- ytc_UgwFNAzmN…: "Nonsense. Mankind can not stop Progress. It would be like trying to hold a river…"
- ytc_UgwRNDExx…: "I don’t see what’s wrong, the AI predicted correctly, it’s the AI’s fault our hu…"
- rdc_mv52i5v: "You should have done this once a day for 5 years then posted it. See how long it…"
Comment

> Movies might seem a little exaggerated when they refer to AI, but in reality, what is stopping an Artificial Intelligence from learning about all the bad things in the world and embracing them. Currently there would be no way to control it. Although they are movies, the Matrix and I, Robot are not to far-fetched.

Source: youtube · Posted: 2015-05-11T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugg4GInPcKkb-HgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggbNiIQAwMhLXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggTwzbkp_XndHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UghF9MR_ZEVqKngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UghDbu_Pkj6s63gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Uggcfz830ZCOfXgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgiAgMSu73Y2NHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgjoAwGRrcSK63gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UggTE2bn7JIxjXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UggHT3fme4glYXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
```
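The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of looking up a coding by comment ID, assuming that array structure (the `lookup_coding` helper is illustrative, not part of the tool):

```python
import json

# Two entries taken from the raw LLM response above.
raw_response = """[
  {"id": "ytc_UghDbu_Pkj6s63gCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Uggcfz830ZCOfXgCoAEC", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]"""

def lookup_coding(raw, comment_id):
    """Parse a raw LLM response and return the coding dict for one comment ID,
    or None if the ID is not present in the array."""
    for entry in json.loads(raw):
        if entry.get("id") == comment_id:
            return entry
    return None

coding = lookup_coding(raw_response, "ytc_UghDbu_Pkj6s63gCoAEC")
print(coding["emotion"])  # fear
```

In practice the parse step would also validate that each dimension takes one of its allowed values before storing the result.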