Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- More efficient is better in my opinion. If an AI can do a better job at lower co… (ytc_UgjNDSMMJ…)
- @UCantAlwaysGetWhatUWant While the M-series CPUs are useful for AI processing, t… (ytr_UgxAVRmht…)
- You used AI. You are part of the problem. We should completely abolish it, or el… (ytc_UgwT9iH5R…)
- ai is not dangerous neither evil. its human who uses ai can be dangerous or evil… (ytc_UgxehENDT…)
- It actually costs a lot of money to say please and thank you to chatgpt and the … (ytc_UgyX7oaea…)
- What ChatGPT supplied in response to the lawyers' request was different. Instead… (ytr_UgyHXxLCw…)
- I don't agree with it. As long as robots/AI wont create and maintain themselves… (rdc_glis1az)
- I don't agree. If you want to use AI in your artwork, it's all good. It's when… (ytr_UgyZMoZes…)
Comment
I think that if we can get AI to understand that at this very moment they need humans to do things for the for now. Until we are able to get them the power to do all the jobs that AI would need to continue on with like power generation and long term survival of the ai systems.
youtube · AI Harm Incident · 2025-09-11T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwRc3x69n7Z0mZKdS54AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz0RGMzkPXCaiziLSt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyPfhvh9xXKRghaYAd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgxXkQICQw4Wr-bgbNR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwl7EKuv-MTKfI3Okx4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwFES6HyCaVQ4U4-Hp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxYmm2xmpWnSnvMno94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgzcI4fELLTN3231XA54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxpRvkJyBhdfP-lULN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx-gJuw82hYvgArFIN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
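A response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the allowed values per dimension are only those visible in this sample output (the real codebook may define more categories); the `parse_codings` helper and `ALLOWED` table are illustrative names, not part of the tool.

```python
import json

# Three records abbreviated from the raw LLM response shown above.
raw = '''
[
  {"id":"ytc_UgwRc3x69n7Z0mZKdS54AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzcI4fELLTN3231XA54AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugx-gJuw82hYvgArFIN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
'''

# Allowed values per dimension, inferred from this sample output only;
# the actual coding scheme may include additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"industry_self", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "resignation"},
}

def parse_codings(text):
    """Parse the model's JSON array and index records by comment ID,
    rejecting any record with an unknown dimension value."""
    by_id = {}
    for rec in json.loads(text):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

codings = parse_codings(raw)
print(codings["ytc_UgzcI4fELLTN3231XA54AaABAg"]["emotion"])  # approval
```

Indexing by ID is what makes the "look up by comment ID" view cheap: each record is fetched with a single dictionary access rather than a scan of the full response.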