Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.

Random samples:
- "Wait, your non-narrator voice, at the end sounded like you were using an AI voic…" (ytc_Ugy27GWyd…)
- "Ok but they could just use ChatGPT on another device to write the essay and then…" (ytc_Ugx5sNsZo…)
- "ya I'm that one person that talks to ai and stuff and I have a app that is calle…" (ytc_Ugy2GnBZH…)
- "Humans are like the bulls following each other right over the cliff. We know we …" (ytc_UgzdceQhN…)
- "What a joke AI now is just LLM there is no intelligence in them now at all.…" (ytc_UgyQoqHtO…)
- "What gave him a voice wasn't AI, it was just a voice synthesizer. He had to lear…" (ytr_UgxULsf1I…)
- "Waiting to hear from Howard Lutnick correcting the AI experts on how Americans w…" (ytc_Ugxw6UD8Z…)
- "I had to bail on Claude last year because of the limits, it was driving me insan…" (rdc_o82tkn6)
Comment

> Actually we already started to follow what robots say. In the future this will be a common thing in our grand children's life. Either we stop relying on technology that will harm our future generations and create an uncertainty of our grand children's life or we stop now.

youtube · AI Harm Incident · 2024-07-25T06:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwJHVU8uxAvFAXQETZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQ3UlpEFlcWWi-EfF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxiRtjX2UvSYNpobC54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgydaWRHIIk_Zg27fup4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw1332dtdO2GWOiJZh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugydhm26BKYJRyuhNDh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxzcYZ-eWAEyggPbet4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKDNaRefLQ81hrjrx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwZOMhGMXH73y0xEmN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw_cP5gpgmIAt3NSOB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
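The raw response is a JSON array with one coding record per comment, each carrying the four dimensions shown in the table above. A minimal sketch of how such a response might be parsed and validated before the codes are stored (the category vocabularies below are inferred only from the values visible on this page, not from the project's actual codebook):

```python
import json

# Assumed vocabularies, inferred from the visible responses; the real
# coding scheme may define additional categories.
CODEBOOK = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a record without a comment ID cannot be joined back
        # every dimension must be present and drawn from its vocabulary
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(rec)
    return valid

# Hypothetical usage with a one-record response:
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"fear"}]')
print(parse_coding_response(raw))
```

Dropping malformed records instead of raising keeps a single bad line from discarding an entire batch; the rejected IDs can then be queued for re-coding.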