Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I have come across more than one job offer looking for deepfake specialists spec… (ytc_Ugx9H_BET…)
- The people that are over nurses need to be more compassionate they seem to forge… (ytc_Ugw84spLm…)
- The artists complaining about AI have no problem with automation when it is taki… (ytc_UgxMOXsHo…)
- Current AIs are programmable, and the inevitability of AI evolving into sentient… (ytc_UgwaO-a1p…)
- @OakenTomegtfo of course your angry that you jumped to the comment section. you… (ytr_UgyH5Ph2K…)
- If robots had rights, would that stop babies from being killed in wars, criminal… (ytr_UghEUEN58…)
- The driver was not paying attention to the road tesla autopilot is in developmen… (ytc_UgyY6XCkC…)
- Okay but why did you ask chatgpt when people have also been saying this since th… (ytr_Ugy8xkzc-…)
Comment
> My take on this is like the creator of Jurassic Park! He spent so.much focus on if he could that he never considered if he should! Why couldn't AI replace its own creators? Where does that leave mankind, Including these tech "geniuses"? Hacking could wreak havoc on the world they, thru their conceit, believe they control! They never consider when the machines/AI bots are smart enough won't they just eliminate the masters too? Again, when creating Frankenstein the doctor never thought it would come for him! Could AI create its own hackers without the masters knowing? Scary stuff! Pandoras box is deadly when opened! The cocky ones never believe that until it bites them!
youtube · Cross-Cultural · 2026-02-10T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw4b6FMf3DEiC_OyTd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzr9-g1_biZYcuZBs94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzrw0sj4pJgs9dDxg94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxHct03OKzgTiD1lpt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzfbQ6YBnxYW9mhD894AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwzTPCANFN4GLYG8nt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwsimShUQw2tTomwPJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyXOr_Yreohy3K82W54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzjUtRv7CB-sZRu1_F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyYg9zUL3DNZWXcx0N4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
```
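A raw batch like the one above can be parsed and sanity-checked before the codes are written back to the database. The sketch below is a minimal example, assuming the allowed values per dimension are the ones visible in the samples on this page (the real codebook may define more categories, and the function names here are hypothetical, not part of the tool).

```python
import json

# Allowed codes per dimension, inferred from the examples shown on this
# page (assumption: the actual codebook may include additional values).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it has a non-empty "id" and every dimension
    holds a value from the inferred codebook above.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id"):
            continue  # skip records the model emitted without a comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example with one valid and one out-of-codebook record:
raw = json.dumps([
    {"id": "ytc_example1", "responsibility": "developer",
     "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
    {"id": "ytc_example2", "responsibility": "robot",  # not in codebook
     "reasoning": "mixed", "policy": "none", "emotion": "fear"},
])
print(len(validate_batch(raw)))  # 1
```

Dropping malformed records rather than raising keeps a single bad row from failing the whole batch; the rejected IDs could instead be queued for re-coding.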