Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Are you contradicting yourself? Some humans are making AGI/ASI that will quite …" (ytr_Ugy1c50OJ…)
- "Yeah that's weird. Until AI can somehow respect HIPAA I'm not doing that, and ho…" (ytc_UgypQddPX…)
- "I can't wait, working on fresh air with your hands keeps your mind healthy, I wo…" (ytc_Ugz4qkgU6…)
- "This why doctors constantly study. So, I am not surprised because ChatGPT sits …" (ytc_UgxpQvDzy…)
- "Not sure if this is just too overblown I just changed my car injectors not sure …" (ytc_UgzojmEL3…)
- "The main problem with self driving cars isn't even technical or the fact they ar…" (ytr_UgznuBOtk…)
- "Ai isnt taking over anyones job. The whole idea of “ai is taking over!” Is not t…" (ytc_Ugz5s3-9v…)
- "can the auto pilot's emergency braking as shown in the video can only engage its…" (ytc_UgyQgFB3F…)
Comment (youtube, 2014-10-08T19:0…)

> it ethical as long as it made for just being your buddy if it for work purposes the would a.i replace us most of us will have no jobs other then those who heir but it will bit them back in the but because who going to buy there product then money cycle will be broken the force that kind of been driving us for that pass 100 decades with that broken we have a few big problems what going to drive us forward how are we going to get food and shelter. but if it for being your bro It could raise socail interaciton but lower it with human but then again Most people would probely have and if someone dose not who says they can not hang out with the that is also hanging out with his robot bro. just
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgjpM9su4PUgOXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ughl5qc5S__IyXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgiEfPymBkOFwngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UghGUSpj2mi6B3gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugj89ulpyU0Cn3gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiTmXK1IfcrL3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjbtOR2O7rKYngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugjx6F4Lrk3qFXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UghaqHhw_KUningCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgidYWIgHmWVzXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
```
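A raw response like the one above can be checked before it is written to the coding store. The sketch below is a minimal validator: the allowed values per dimension are an assumption inferred only from the table and the ten rows shown here (the full codebook may define more categories), and `validate_batch` is a hypothetical helper name, not part of any existing pipeline.

```python
import json

# Allowed values per dimension, inferred from the coding table and the
# raw responses shown above -- an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability", "ban", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values fall
    inside the inferred vocabulary and whose id looks like a YouTube
    comment (ytc_) or reply (ytr_) id."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        values_ok = all(row.get(dim) in vals for dim, vals in ALLOWED.items())
        id_ok = str(row.get("id", "")).startswith(("ytc_", "ytr_"))
        if values_ok and id_ok:
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_UgjpM9su4PUgOXgCoAEC","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
print(len(validate_batch(raw)))  # 1
```

Rejecting rather than coercing out-of-vocabulary values keeps a bad model output visible in the raw-response view instead of silently entering the coded results.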