Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Hey, a quick note to the viewer. Will AI replace her one day? Yes, and I suppose…" (ytc_UgwfQaVjO…)
- "Elon wants to further his Artificial intelligence agenda (which by his own words…" (ytc_UgxUgpEm4…)
- "Many people thought their jobs could never be replaced by AI. Now, they are rea…" (ytc_Ugx05m0FB…)
- "they kept saying 12 codes of collapse is the one book that connects all the dots…" (ytc_UgwA_uxA1…)
- "Dude built an AI chatbot with the primary intention of having it be used as a re…" (ytc_Ugx7rwzJI…)
- "They should use AI to do jobs that are dangerous for humans, or to do jobs that …" (ytc_UgwXNU-XY…)
- "No one who says they have a 40 hour work week actually has a 40 hour work week. …" (rdc_dv0h9ay)
- "Remember when that "ai" was trained to recognize cancer imaging, but it made the…" (ytc_UgyY4UGRm…)
Comment
> I agree. The girl doesn't know what shes talking about. There are many reasons why sharing personal data with AI is risky and needs to be weighed carefully against its benefits.
> But her points/explanations have nothing to do with it, aside from the fact that you can refuse to let your input be used as training material as option.
> To most people their own minds are their biggest threat anyway.
Source: youtube · AI Moral Status · 2025-08-13T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwxrTzoiaWuQL8NzXl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwOMVqWe6G_dMZXHL94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw_dpzI-mbykYAlsD94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz7LtcudTObch4Yu_V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwkHHosMpBLErEfN2x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzyiwstRSUIRd7e6kV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugw-HpkQJ1vjICK-vuV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugyr3RU63o7QeIfaAMZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgywUhh4ZCbYo1UOEpl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx4J9VoiMJSEROW7YZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
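For downstream processing, a raw response like the one above can be parsed and each record checked against the coding scheme before it is stored. The sketch below is a minimal, hypothetical validator: the allowed value sets are assumptions inferred only from the values visible on this page, not from the full codebook.

```python
import json

# Allowed values per coding dimension. These sets are assumptions,
# collected only from values that appear in this dump; the real
# codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "industry_self", "liability"},
    "emotion": {"mixed", "indifference", "approval", "outrage"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a string comment ID...
        if not isinstance(rec.get("id"), str):
            continue
        # ...and a known value on every coding dimension.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = """[
  {"id":"ytc_Ugw-HpkQJ1vjICK-vuV4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_example_bad","responsibility":"robot",
   "reasoning":"consequentialist","policy":"none","emotion":"approval"}
]"""
print(len(validate_codings(raw)))  # → 1 (second record has an unknown value)
```

Rejected records can then be logged by ID and re-queued for the model, which is cheaper than discarding the whole batch when one entry is malformed.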