Raw LLM Responses

Inspect the exact model output for any coded comment: look a comment up by its ID, or browse the random samples below.
- "@domsob92 All I'm saying is that we should start worrying when the AI is trained…" (ytr_Ugyf0IgGH…)
- "You dare question the algorithm the algorithms judgments are just & final ask no…" (ytc_UgyQFTJRR…)
- "In my opinion. I think Google wants to create sentient Ai but I believe that wil…" (ytc_UgxqsQxyI…)
- "You should have added that not a single company is making a profit from AI excep…" (ytc_UgyG-B-ls…)
- "Also every door where you can go in the security cell has a switch which is only…" (ytr_Ugx7PVNXn…)
- "People shouldn't be allowed to make music with two turntables if they don't know…" (ytc_UgzCn4ked…)
- "We live in a time where regulations are still not caught up with the speed techn…" (ytc_Ugz-NMDVZ…)
- "13:05 I mean, if it's constantly being changed to the extent that it gives compl…" (ytc_UgwOxdqdQ…)
Comment
The best part is that the government has in its immense benevolence created a drone AI. During the simulations, it was given points for targets killed, and when the operator told it not to kill the last target, the AI really wanted those points so it tried to kill the operator for keeping it from killing. When they finally added in negative points for killing the operator like its earning the house cup, it simply started attacking the coms array so nobody could tell it what it could or couldn't kill.
Source: youtube · Posted: 2023-06-04T13:5… · ♥ 9
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
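Each coded comment carries four dimensions, and a downstream consumer will usually want to check that the model returned only legal values. The full codebook is not shown on this page, so the value sets below are an assumption reconstructed solely from the outputs visible here; a minimal validation sketch under that assumption:

```python
# Assumed value sets, inferred only from the codings visible on this
# page -- the real codebook may contain more values per dimension.
OBSERVED_VALUES = {
    "responsibility": {"government", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in OBSERVED_VALUES.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in observed set")
    return problems

# The coding shown in the table above:
coded = {"responsibility": "government", "reasoning": "consequentialist",
         "policy": "none", "emotion": "fear"}
print(validate(coded))  # []
```

A record with a misspelled or missing dimension comes back with one problem string per failing dimension, which makes batch QA over thousands of codings straightforward.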
Raw LLM Response
```json
[
{"id":"ytc_UgwAdX1__IjWMZyYnJx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzrUJwcRXgNCv-gjwF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzh9QJipWukBD3S30h4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwFaixwCojXMeEVqfp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQcWaXoVIsso5GGxt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw3k8YnRjgybofKYoF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzMVzkMYPQKENvEMz14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyZZGD_omVl2xDcxsd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzGyJ3CEGv7q1JkvFB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxxg1eNNbG7pDyloah4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
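The raw response is a JSON array in which each object carries the comment `id` alongside its four codings, so the "look up by comment ID" view reduces to indexing the parsed array by `id`. A minimal sketch of that lookup, using two rows copied from the response above (how the real tool stores or serves these records is not shown here):

```python
import json

# Two rows copied verbatim from the raw batch response shown above.
raw = """[
{"id":"ytc_UgwAdX1__IjWMZyYnJx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzh9QJipWukBD3S30h4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

# Build an id -> record index; duplicate ids would silently keep the
# last occurrence, which is worth checking for in a real pipeline.
by_id = {row["id"]: row for row in json.loads(raw)}

coding = by_id["ytc_Ugzh9QJipWukBD3S30h4AaABAg"]
print(coding["emotion"])  # fear
```

Because the model is asked to echo the input `id` in each object, a joined record survives batching: one prompt can code many comments, and every answer is still attributable to its source comment.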