Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It's understandable to feel that way sometimes, especially with advancing techno…" (ytr_UgwSMFbcr…)
- "How would hallucinations be fixed? The LLM is just guessing, you could never gue…" (ytc_UgyMrgxf4…)
- "First he made AI and then he talks about risks and controls? Did his research go…" (ytc_Ugz3T1dOe…)
- "AI could destroy the support of manufacturing due to increasing mass poverty whi…" (ytc_UgzlzjS8-…)
- "Jesus. A year ago chat gpt couldn't even do basic basic school maths now they'r…" (ytc_UgybEdqOq…)
- "Just the idea of someone thinking ChatGPT is a Search Engine hurts me. Though I …" (ytc_Ugx7KCFRv…)
- "No, Both should live with hydroxychloraquin, a Z pack, and zinc. A ventilator i…" (ytc_UgzelFqcg…)
- "I think the problem is not just Ai alone (a part of it is somewhat of a problem)…" (ytc_UgwVVkIPj…)
Comment

> THIS is not good! Robots aren't good either. Whoever watched the movie "I, Robot" will understand what I'm talking about. Robots will take the control over us, if we don't stop them early. Some robots in the future, and I'm sure this will happen, will have access to the internet. And everything has a dark side, and the deep web could destroy a robot. Just one tiny mistake, and a robot can be controlled of whoever. And I don't wanna be killed from a robot, like, wtf is happening?! This isn't normal and we should stop it...

Source: youtube · AI Moral Status · 2017-04-23T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_UgivtIdgVocyFXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgiYKx3M8o_e-XgCoAEC","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgiMsYLdAJDJn3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgibSee-lUws8HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UghV74iRtFuu9XgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgiZm51uxnz_23gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgjpMHMadcrd5XgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugi45nUFMM_AvXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UghOeUpr1CRDF3gCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgiJ0QWK2DdTGHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
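A response like the one above can be parsed and sanity-checked before the codes are stored. Below is a minimal sketch in Python; the allowed values per dimension are taken only from the codes that appear in this section (the real codebook may define more categories), and the function name `parse_coding_response` is a hypothetical helper, not part of any shown pipeline.

```python
import json

# Allowed codes per dimension, as observed in this section's examples.
# Assumption: the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "virtue", "consequentialist", "unclear"},
    "policy": {"ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "mixed", "fear", "indifference", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only well-formed records.

    A record is kept if it is a dict with an "id" field and every
    dimension carries a code from the observed vocabulary.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one valid record and one with an unknown responsibility code.
raw = (
    '[{"id":"ytc_a","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"bogus","reasoning":"unclear",'
    '"policy":"unclear","emotion":"mixed"}]'
)
print(parse_coding_response(raw))  # only the "ytc_a" record survives
```

Validating against a closed vocabulary like this catches the most common failure mode of LLM coders: a plausible-looking but off-codebook label silently entering the dataset.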