Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Thank you for your kind words! Sophia appreciates your support. If you have any …" (ytr_UgzQ5eyYH…)
- "TO be fair, i like to think the AI is the mind of an average consumer. A real p…" (ytc_UgxBbBvem…)
- "😈somehow it is good because in robot syster there are no ego no anger and no tho…" (ytc_UgzavgI-u…)
- "@non-AntiAI Prove what? I'm a software engineer. AI has no idea what it's talkin…" (ytr_UgwQTgGZr…)
- "we only need to wait until ai gonna steal our informations and send it to its co…" (ytc_Ugyx0jZOM…)
- "I think if AI was going to take over, it would have to WANT to, independently of…" (ytc_UgxYGPjNv…)
- "Control Ai will fail. Capitalism, techgnosis, teleology, and chains of suspicion…" (ytc_UgwUQT32-…)
- "They are actually forcing humans to accept AI as a part of society, when they re…" (ytc_UgzDA3ETL…)
Comment
> As a Tesla fan and investor, and Elon critic, I believe it would be good to have a "training" app that is needed, akin to workplace/government contractor training course. Perhaps it could only be required if the car detects improper supervision. Videos, lists of rules/expectations/responsibilities, etc. Use in-cabin camera and computer vision/facial recognition to verify identify of supervisor, matched with the same during training.
>
> Idunno...

Source: youtube
Incident: AI Harm Incident
Posted: 2025-08-17T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwYfFMHVzVyuhQFdMt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxaBDG77cwlFTS-1Ox4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzUuC0NA5yU6OFDmF94AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugx8sjZXm1A1ydsIgh54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxHwWjz_9Kh5zFrLNp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxBScCp17c0Czvh5fR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzAHrislxZ1paCuwUB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy06hydIuwlfoY6EOB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxF9C444nMrHGdc5g14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx_dtugAD-9e439fwl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"resignation"}
]
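The raw response above is a single JSON array with one object per comment, each carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and indexed for lookup by comment ID; `index_codings` is a hypothetical helper name, and the sample is trimmed to two records from the response above:

```python
import json

# A raw batch response from the coder model: one JSON array, one object
# per comment (trimmed to two records from the response above).
raw_response = '''
[
  {"id":"ytc_UgzUuC0NA5yU6OFDmF94AaABAg","responsibility":"company","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxHwWjz_9Kh5zFrLNp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
'''

# The four coding dimensions from the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a batch response and index the codings by comment id.

    Raises ValueError if a record is missing its id or any dimension,
    so malformed model output fails loudly instead of silently.
    """
    index = {}
    for record in json.loads(raw):
        comment_id = record.get("id")
        if not comment_id:
            raise ValueError(f"record without id: {record!r}")
        missing = [d for d in DIMENSIONS if d not in record]
        if missing:
            raise ValueError(f"{comment_id} missing {missing}")
        index[comment_id] = {d: record[d] for d in DIMENSIONS}
    return index

codings = index_codings(raw_response)
print(codings["ytc_UgzUuC0NA5yU6OFDmF94AaABAg"]["emotion"])  # approval
```

Validating every record before indexing makes it easy to catch a model response that drops a dimension or an ID, rather than discovering the gap later during lookup.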