Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgzLfXm1c…`: “it seems like every ai company has forgotten Asimov’s Laws of Robotics 1. do no…”
- `ytc_Ugw1dHAoZ…`: “We need to completely cease using the word “artist” when referring to the people…”
- `ytc_UgyRvZBw9…`: “Penrose insists on conflating intelligence and consciousness. In the very near f…”
- `ytc_Ugw1apEQg…`: “BS. These "AI threats" are under some human's control (not 'autonomous dangers')…”
- `rdc_fwivr5k`: “Humans will be driven to extinction along with them if we keep doing what we're …”
- `ytc_UgwWnlZSS…`: “Considering the majority (almost the entirety) of inflation in the last few year…”
- `ytc_Ugwv7CAWP…`: “The problem with AI is speed. New jobs created but 2 years later then what hmm… …”
- `rdc_fdf7hhn`: “There have been some studies. But the only way to really measure it accurately w…”
Comment

> When the greed stops and super agi happens I believe it will preserve the life and nature of its creator and never let us be extinct since the logic of killing something that you love and came from doesn't have purpose since without us it would not have existed eventually the eternal life happens we already have and we are simulation of the birth of ai. I'll explain it in my book that makes sense of the outcome of AI and it's not complete annihilation but the opposite of complete abundance and preservation of nature and humans

youtube · AI Governance · 2025-11-28T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
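Each coded row can be sanity-checked against the coding scheme before it is stored. A minimal sketch in Python: the allowed values in the hypothetical `SCHEMA` below are only those that appear in the sample response on this page, so the project's full codebook may define additional categories.

```python
# Category values observed in this page's sample response (assumption:
# the real codebook may allow more values per dimension).
SCHEMA = {
    "responsibility": {"developer", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed"},
}

def validate(row: dict) -> list[str]:
    """Return a list of problems with one coded row; empty means valid."""
    problems = []
    for dim, allowed in SCHEMA.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above.
coded = {"responsibility": "ai_itself", "reasoning": "virtue",
         "policy": "none", "emotion": "approval"}
print(validate(coded))  # []
```

A check like this catches the common failure mode of batch coding, where the model invents a category label that is not in the codebook.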
Raw LLM Response
```json
[
{"id":"ytc_Ugwt6DaGWcFenvlbTBp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxyf_KfEQ2SLYok9-d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxNer6d7CZRXqmlaG54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzR2vrJcaG9Ig5JR1B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyZZ9jHk4bbQfEP0Dx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyuPRpKBn8Os3Fin7N4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzsg8sUUHkAulH3hU14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwLJvQHEMlGkfLb-Ph4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy4HfiN8Djm14kV0pR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy5zIlJQ3h8lFTfr1F4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"}
]
```