Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "If AI does not expand margins by replacing the costs of human employment (produc…" (ytc_UgxZwyGE_…)
- "AI can be used in the medical industry to help stroke victims speak. But that me…" (ytc_Ugypwo1Mn…)
- "This robot seems nice, but don't be fooled robots have no real feelings a.k.a em…" (ytc_UgyG2gsSd…)
- "The internet will just be AI talking to one another on how they can find a humoi…" (ytc_Ugx5sV5aF…)
- "Yup a huge raise in respect, morality, ethics, humanity. The most important rais…" (ytr_UgzcWUICc…)
- "Jobs to survive AI? Jobs that make connections with people. Skilled jobs like we…" (ytc_UgxUzltst…)
- "By the time we figure out that there is a problem with AI it will probably be to…" (ytc_UgwiPJq3r…)
- "Very pretty, but i dont have the skill so chatgpt is the way to goo…" (ytc_UgxMGnXRk…)
Comment (youtube · AI Harm Incident · 2023-08-10T03:0… · ♥ 1)

> As a Tesla fan, I have watched hundreds of FSD videos; believe me, it's not perfect, and it will take another 5 years to perfect. It needs a lot of real-world AI to operate seamlessly. The software, and especially the hardware, needs an upgrade; Hardware 4 will be more powerful, but Tesla needs to care more about people's safety. I don't want to hear about the numerous saves that Tesla's FSD made: if it saved people 1 time, it also tried to take people's lives 10 times, so it's not actually safe. NHTSA should take more action on this technology. I'm optimistic that Tesla will find a way, but right now it needs to stop.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw3BnczAc2CQYwwpiV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy1f_fO1aChn0PFCrN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyK9IXSU9hlKuAGZCZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxytBM0Yi-Yg2zQIwt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzI29JHIusPpn_bV0V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzC9rE-BIijRe6QHbh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyyOn5G--DuOPLxSAN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwJT2YUAYKQW3lPF_h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz7tAEqiZvbHnHdUnN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzmSUeWf7lYvbBGM_x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
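A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, not the pipeline's actual code; the allowed vocabularies are inferred from the values visible in this sample and the real codebook may include additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (an assumption -- the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"company", "user", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch-coding response, keeping only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip malformed entries rather than fail the whole batch
        # keep the row only if every dimension holds an allowed value
        if all(row.get(dim) in vocab for dim, vocab in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(parse_coding_response(raw))
```

Skipping malformed rows instead of raising keeps one bad completion from discarding the rest of the batch; the dropped IDs can be re-queued for recoding.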