Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples:

- @AITube-LiveAII talks a lot about this with chatgpt, it's very complicating top… (ytr_UgyMWmk1H…)
- Anyone who has ever worked with AI, deep learning, or machine learning clearly k… (ytc_UgwMwQ1kG…)
- computer science major talking to a yes man, it's not that hard to make the AI a… (ytc_UgyyzYyOM…)
- Thats such a mind game. Now people are going to want to do it more. I can’t wait… (ytc_UgzcmuI8C…)
- So learning from other users of ai such as chatgpt and changing its output is a … (ytc_Ugx4J9Voi…)
- What if we just destroy A.I. right now , and carry on with our lifes… (ytc_UgyuGgRoH…)
- Nope, I don’t need 100% reliable, I’ll take the safer record of Waymo before I g… (ytr_UgwSYprJ1…)
- Great video! Great points. Right now Disney finally sue an Ai company which will… (ytc_Ugy9OXOI2…)
Comment

> The real problem is misperception about human machine development. Instead of keep human doing what human do best is in seeing appropriateness , compassion , morale , and algorithm and machine best at repetitive task , counting , or see large details , now they try to make robot do what human do best like walking , pick up things , and make human do multitasking or repetitive task which is totally reverse or make it inefficiencies and redundancy. In fact if robotic truly applied they would replace people in sorting mail , inventory , warehouse operation , and accounting , and security , road repairing where human would not efficiently ,

youtube · AI Harm Incident · 2025-01-10T07:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyP2nTH3RDpu21v1J14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxTLigvsA4Vuvfa5mJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw2bYEo9OPuL8LNP2Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzlm1MgSyZ7KQR5noh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxC5JcELuixejKzaEZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz_eh3cy4BFoYjOEV54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw8-_tTEMViwQdR4754AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxTf3K1W6q0VD6hmdt4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwXuLTva6UZxhTMx3p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugx1CS9D01w-fWsOCkV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
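The lookup-by-ID step can be sketched as follows: parse the model's JSON array and key each coded record by its comment ID, so the four coded dimensions for any comment can be retrieved directly. This is a minimal sketch assuming only the batch response format shown above; the helper name `index_by_comment_id` is illustrative, not part of any tool.

```python
import json

# A small excerpt of the raw LLM response shown above (two records for brevity).
raw_response = """
[
  {"id": "ytc_UgyP2nTH3RDpu21v1J14AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugw2bYEo9OPuL8LNP2Z4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"}
]
"""

def index_by_comment_id(response_text: str) -> dict[str, dict]:
    """Parse the model's JSON array and map each comment ID to its coded record."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)

# Retrieve the coding result for one comment by its ID.
record = codes["ytc_Ugw2bYEo9OPuL8LNP2Z4AaABAg"]
print(record["responsibility"], record["reasoning"])  # developer deontological
```

In practice the full response would be read from the stored raw output rather than pasted inline, but the indexing step is the same.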