Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "As an artist who just struggles with digital art and prefers painting and drawin…" (ytc_UgwjgFz81…)
- "Your concept of machine learning is too far off to correct. I don't have time fo…" (ytc_Ugz2IJym1…)
- "But don't you think forward scanning lidar would have prevented those? I mean, a…" (ytr_UgzaaQAN2…)
- "This is where you are not looking ahead far enough. Soon humans won’t be involve…" (ytr_UgwrDKeQF…)
- "On Facebook I activated my facial recognition. So far I've been identified as 2 …" (ytc_Ugw32UXmF…)
- "Big Tech Companies: “you guys, AI is absolutely NOT coming for your jobs” [proc…" (ytc_Ugy3aJ0tE…)
- "Just want to give an example as a developer, Today almost all engineers use chat…" (ytc_Ugwdy1Xqa…)
- "So many assumptions assume it to become \"evil\". What if it reaches a point of i…" (ytc_UgwC8w229…)
Comment

> Matthew Jaroslawski I don't exactly agree with your reasoning, but an agi level AI will soon become asi, then we it's a role of the dice of whether the new sky net wants to keep humans as pets

Source: youtube · AI Moral Status · 2017-02-23T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
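Each coded comment carries four closed-set dimensions. A minimal validation sketch follows; the allowed category lists are inferred from the values visible on this page, not from an authoritative codebook, so they are assumptions:

```python
# Allowed values per dimension, inferred from the examples on this page
# (assumed, not an official codebook; the real scheme may have more categories).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"unclear", "none"},
    "emotion": {"fear", "indifference", "approval"},
}

def invalid_dimensions(row: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items() if row.get(dim) not in allowed]

coding = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "unclear", "emotion": "fear"}
print(invalid_dimensions(coding))  # -> [] (all dimensions valid)
```

A check like this catches a model that drifts outside the category scheme before the row is written to the results table.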
Raw LLM Response
```json
[
  {"id":"ytr_UgiNKzzbbtFetXgCoAEC.8PL4_BkYKhH8PL7-06PJyx","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgiNKzzbbtFetXgCoAEC.8PL4_BkYKhH8PLC4xIffA-","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ughcmty2iMMsFHgCoAEC.8PL3q3krKEk8PL7m7PJYg9","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UggRPiq5dwY9P3gCoAEC.8PL2mhsalPI8PLABX7lpAr","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgiOaXexrKY_SXgCoAEC.8PL12S-v8fO8PL5FFgDJRd","responsibility":"ai_itself","reasoning":"contractualist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugjrf2Y85YKyVngCoAEC.8PL-Q9tvEeN8PL4sqa1Ty2","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugjrf2Y85YKyVngCoAEC.8PL-Q9tvEeN8PL7cIsRhFh","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugjrf2Y85YKyVngCoAEC.8PL-Q9tvEeN8PL92CZfAg3","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgjIvUEfE5r063gCoAEC.8PKz6lrvdh48PLB9sOe8Te","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UggWa3AcKd7cHXgCoAEC.8PKyWZiGd_R8PL60F186iM","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
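The raw response is a JSON array of coding objects keyed by comment ID, so the "look up by comment ID" view above can be implemented by indexing the batch once. A minimal sketch, using hypothetical IDs rather than the real ones from the batch:

```python
import json

# Hypothetical raw model output in the same shape as the batch shown above
# (the IDs here are placeholders, not real comment IDs).
raw_response = """[
  {"id": "ytr_example1", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_example2", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"}
]"""

# Index the batch by comment ID for constant-time lookup per comment.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

print(codes_by_id["ytr_example1"]["emotion"])  # -> fear
```

In practice a real response may fail to parse or omit an ID, so a production loader would wrap `json.loads` in error handling and log any rows that cannot be matched back to a comment.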