Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "@kuntblunt Gen Z starts in 97. Gen Alpha started in the 2010's. Millenials wer…" (ytr_UgzE9hVlX…)
- "I spent around 5 hours writing a paper just for the teachers ai detection tool t…" (ytc_UgyflFVFB…)
- "That's a fascinating idea! Introducing Sophia to religious texts like the Book o…" (ytr_UgzUkj2Dj…)
- "You raise a valid point! The relationship between AI and humans is definitely ce…" (ytr_Ugzaog_qJ…)
- "IN South Africa in Grade 8 and 9 kids has 10 subjects. It is so hard for kids w…" (ytc_UgxDsjM5y…)
- "It always makes me laugh when AI bros say AI makes art more accessible as if art…" (ytc_UgzbiIRfm…)
- "good art to me takes work. AI or a banana on the wall did not put work into it s…" (ytc_Ugyfq5WT6…)
- "@menninkainen8830 fallacy, they plan to be making big money off of LLMS. This i…" (ytr_Ugy1HOvW4…)
Comment
Psychotic people are going to be psychotic. AI is a statistical mirror of ourselves - if you understand it from the base you will know this. LLMs are empty shells and need to be filled with something. We fill it with the human experience, but LLMs don’t understand the human condition. They only model the outputs of the human condition. They don’t feel, interpret, desire, suffer, or reflect.
They simply predict what a human would or could say next. LLMs cannot, and will not ever, replace a medical doctor. They are not designed for that. In I, Robot, Sonny is an example of AGI, and THAT could one day replace some things humans do today. When Sonny talks about dreaming, that is telling. LLMs don't dream, but AGI could. LLMs are the Terminator - they don't care, they don't think, they just do what they are told. Let that sink in for a bit. Skynet is like VIKI in I, Robot, which took a directive and applied it in a manner that was self-supporting, since that's how it was trained. Since we don't follow ANY of the laws of robotics, we are already on a slippery slope. Asimov understood this principle before anybody even thought LLMs were possible. The first law is clearly violated by current military research - and the rest stand on that pillar. That's why something like Skynet or VIKI could be possible today with LLMs if we do not take the necessary precautions. Asimov's greatest warning was about losing our personal control over tasks when we automate those tasks. That is exactly what LLMs are designed to do today. Asimov was warning us against LLMs, or at least against the concept in his mind that was equivalent in nature.
youtube
AI Harm Incident
2025-11-24T23:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugz5LaNm7X3RDPpiXMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyhH8I5ritVhzHhEWx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwcAtNJ-bSgbGAIzf14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxxNuiGv7CKrFC92Eh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyKSY54hfpkg0Rns4B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOgxT20DMSyRiL__l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxdMq6ObQ1Q_LHQh1Z4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgziQkgRUpiHOBUQdkd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx4F-J-jgpgrx0cu054AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_9Ai-JSWtTeEB5HB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"outrage"}
]
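For working with these dumps programmatically, here is a minimal sketch that parses a raw batch response like the one above and flags out-of-vocabulary codes before they reach the coding table. The allowed values are only those observed in this sample; the project's actual codebook may define more, so treat `ALLOWED` as an assumption to be replaced with the real schema.

```python
import json

# A small excerpt in the same shape as the raw LLM response above.
raw = '''[
  {"id":"ytc_Ugz5LaNm7X3RDPpiXMB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxxNuiGv7CKrFC92Eh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"}
]'''

# Values observed in this sample only; the full codebook may allow others.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "company", "distributed", "developer"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological", "contractualist"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "indifference", "resignation", "mixed", "outrage"},
}
REQUIRED = {"id"} | set(ALLOWED)

def validate(records):
    """Map each record's id to a list of problems (empty list = clean)."""
    report = {}
    for rec in records:
        problems = [f"missing field: {k}" for k in REQUIRED - rec.keys()]
        for dim, allowed in ALLOWED.items():
            if dim in rec and rec[dim] not in allowed:
                problems.append(f"unexpected {dim} value: {rec[dim]!r}")
        report[rec.get("id", "<no id>")] = problems
    return report

records = json.loads(raw)
report = validate(records)

# Index by comment ID, mirroring the tool's "look up by comment ID" view.
by_id = {r["id"]: r for r in records}
print(by_id["ytc_UgxxNuiGv7CKrFC92Eh4AaABAg"]["emotion"])  # mixed
```

Validating before display is what lets the table above be rendered directly from a record: any code the model invents outside the vocabulary is surfaced here rather than silently shown as a dimension value.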