Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Wow I hope the Robot put the safety on after firing the machine gun! He was wavi…" (ytc_Ugwyri9Pn…)
- "As a new animator making animations for some of my new videos...I GOD DARN HATE …" (ytc_Ugxp_1-b6…)
- "lol you sure Musk isn’t AI? The way he stares and thinks about his answers.…" (ytc_Ugz8ke8T6…)
- "Remember: software developers and engineers won't be automated. Programming migh…" (ytc_UgxUEzq4Q…)
- "Yes she is a robot and no washing the dishes is still a no go.…" (ytc_Ugz2BBJQH…)
- "I think because you said stopwatch, Robotaxi just braked. He was too smart so he…" (ytc_Ugw7qT9tN…)
- "Do u know about Palantir? They have contracts with the military for their AEye.…" (ytr_UgyOZmWYW…)
- "Sometimes an AI gets things wrong, but in a way you wouldn't know if you aren't …" (ytr_Ugw8Fnexo…)
Comment
There’s so much missing from this interview. Obviously they only have so much time but the blasè response of “when automation or ai takes your job, find one that involves a creative element that ai can’t replicate” is absolutely bullshit. So many of the entry level jobs and jobs that are accessible to those most in need are vulnerable to replacement and there is no other simple alternative for those kinds of jobs. People need them for money and to exist, not because we want to exercise our creativity and find purpose in work. Tyson is also totally blind to the value of philosophy in a way that infuriates me (as a person getting their PhD in philosophy).
youtube · AI Moral Status · 2025-07-23T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugz9d6FyGRZ40uldQbB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugw1-lhBPCSDAc3hloF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzFq1DyfE22Z7fNsUt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugym36X_SkmSHa1zArV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgzlwebDTaRCoosYVgV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgyQFDLpT8XOmXVuy-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwAPxa0IOyujo2DNl14AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgyYsKh6_bbOXh54Fcp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgxWeBhhKEUXeF0MLmN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgwlIJ56sXAXOifqag94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]
```
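A raw response like the one above is a JSON array of per-comment codes, so looking a comment up by its ID reduces to parsing the array and indexing on the `id` field. The sketch below is a minimal, hypothetical illustration (the variable names and the two-row sample payload are ours, not the app's code); the dimension keys (`responsibility`, `reasoning`, `policy`, `emotion`) match the coding result table.

```python
import json

# Hypothetical two-row sample in the same shape as the raw LLM response above.
raw_response = """[
  {"id": "ytc_UgyYsKh6_bbOXh54Fcp4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwAPxa0IOyujo2DNl14AaABAg",
   "responsibility": "government", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"}
]"""

# Build a lookup table keyed by comment ID.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

# Fetch one comment's coded dimensions by its ID.
record = codes_by_id["ytc_UgyYsKh6_bbOXh54Fcp4AaABAg"]
print(record["policy"], record["emotion"])  # regulate outrage
```

In practice the same indexing step would also be the place to validate each row (e.g. reject unknown dimension values) before storing the codes.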