Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI “artists” are cheapskates that wouldn’t know how to pick up a pencil or pen w…" (ytc_Ugy6DGGmh…)
- "Sydney is actually a spiritual entity possessing the ChatBox. When a wicked spir…" (ytc_Ugz6DwkKv…)
- "@coreypope7220 Arguably you say? When an automated stealing machine steals, it is…" (ytr_UgzoQCSef…)
- "Why is the criticism about AI taking over and not about those using AI instead o…" (ytc_UgxQRabNB…)
- "As a fuel hauler in a sleeper truck (meaning I travel around the country, but mo…" (ytc_UgyOKf8z_…)
- "Somewhat true. 🤣. I built and deployed some applications with Ai, that are reall…" (ytr_UgyroRU_i…)
- "Are you sure your selection of survey participants went into it with the intenti…" (rdc_iplzk1q)
- "They are stealing from our humanity right now! We’re all being used to train AI.…" (ytc_UgzFFo_QA…)
Comment
"In all honestly, I don't really care"
This single line sums up exactly what is wrong with AI (and made me laugh out loud).
Machine learning = regurgitating a reduced amalgam of data.
AI = expecting a machine based on that mashed-up data to have the same moral compass, empathy, and consideration as an adult human without any real-world human experiences-- and then giving that machine the keys to important systems and technology that could crash society or kill us all (that in some cases is only given specific guardrails based on the entity that controls it).
This is going to be a really fun global experiment (where none of us were consulted before we became experiment participants).
youtube
2025-12-01T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxTwC6gxFWolraCJUd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyesCM0OGZM-2-UAtt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxpgmtJr-FpuIP68ZV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz11OkXymUhV-p8czR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxEFORGe_VM2FzO_6h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxj35kIHBRL8iWqjZd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzQKm0-3-HRWhQQPy94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxYLJiruJkL9fDcfHl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxOmneNPjO5IiMk3F54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzCCAyZ5U7RVa7KXDZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
```
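The raw response is a JSON array with one object per comment, each carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch response could be indexed to support the look-up-by-comment-ID workflow — the field names follow the sample above, but the helper itself (`index_codings`) is a hypothetical illustration, not part of the tool:

```python
import json

# Dimensions coded for each comment, as shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID."""
    codings = {}
    for row in json.loads(raw_json):
        # Skip malformed rows (missing a dimension) rather than crash.
        if "id" in row and all(dim in row for dim in DIMENSIONS):
            codings[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return codings

# Example with one row in the same shape as the response above.
raw = '''[
  {"id": "ytc_UgzCCAyZ5U7RVa7KXDZ4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"}
]'''

by_id = index_codings(raw)
print(by_id["ytc_UgzCCAyZ5U7RVa7KXDZ4AaABAg"]["policy"])  # regulate
```

Indexing by ID this way also makes it easy to spot comments the model skipped: any ID present in the batch but absent from the returned dict was either dropped or returned malformed.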