Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (look up any sample by its comment ID):
- `ytc_UgwPrzSRC…` — "I think a problem with AI art and its perception is: Character vs Aesthetics M…"
- `ytr_UgzcLc-7z…` — "Honestly, i think you are wrong here. The reason why i say this, is because i ac…"
- `ytc_UgwBZKOGC…` — "I think copyrighting AI art is dumb. However, saying "This tool makes things way…"
- `ytc_Ugx5rra10…` — "My friends still think AI will take their jobs, and we'll have universal income.…"
- `ytc_Ugx3MG22w…` — "A year later tesla launch theor first human robot. Too much for marking his word…"
- `ytc_UgzjuZlys…` — "In my opinion, a conscious being is going to enjoy doing certain things. Humans …"
- `rdc_jpqsar5` — "My company is structured with: Architects (staff engineers), seniors, mid level,…"
- `ytc_Ugxxesh8D…` — "This is just another example of what we do as humans (or at least many of us) an…"
Comment (youtube · AI Responsibility · 2025-07-25T02:0…)

> I love that people believe in the near future they’ll be feasting from their efforts when ai was found not created and once it’s given enough it won’t need us we’ll be ants 🐜 stupidly and pointlessly existing .. soon enough we won’t understand ai and I promise it’ll make a language humans can’t read eventually ai will self replicate and self upgrade to a point where we have no idea of its intentions and no control whatsoever .. and you better prey they don’t decide to end humans cause it would be hilarious how badly we would lose 😂
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwjC5vDO45ybtOOKfJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyci7inHV6ys3tmchJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzPNZqtYVgvRFZu3-R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzPjQu6KFj0gDTAz_Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3MmsXnD3c8T8lCA14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxXLCY2E1rG_QcPE6h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzbIGi1Fz8kmciU1m14AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw07y6zCzFuMYxaOoV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwxD0GyApZNuC3m1kp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzbdSTOxS3rCMYlW_l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
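Because the model returns one JSON array per batch, looking up the coding for a single comment means parsing the array and indexing it by `id`. A minimal sketch in Python (the field names follow the JSON above; the function name and the two-entry sample are illustrative, not the actual pipeline code):

```python
import json

# A truncated batch response, using two entries from the array above.
raw_response = """
[
  {"id": "ytc_UgzPjQu6KFj0gDTAz_Z4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwjC5vDO45ybtOOKfJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

def index_codings(response_text: str) -> dict:
    """Parse a raw JSON array of codings and index the rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codings = index_codings(raw_response)
print(codings["ytc_UgzPjQu6KFj0gDTAz_Z4AaABAg"]["emotion"])  # fear
```

The same index supports the comment-ID lookup shown at the top of this page: one parse per batch, then constant-time retrieval of any coded comment.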