Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
How about creating some kind of tool to stop dangerous technology, robots or apps, from acting without approval. Like the old "Ctrl,Alt,Delete" to stop a program? An electromagnetic pulse tool to stop a machine? Or a netting simulating a faraday cage to grapple a machine? Lol im just thinking about stuff i have seen in the movies. I do feel this is going to be a problem if the good people lose control over AI or robots. I feel if we prevent or lessen the ability of AI to move about, there could be a solution there.
Source: youtube · Topic: AI Governance · Posted: 2026-03-07T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy4CX-2_WXCDRmoIVZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx1PpLGOwLbGql9iJ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzJ24f4wsEG2iXJ4994AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwJkHQ9FyOLuJlESxV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzP74pxa68mYhrUOLp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw1dZOuE1Dnpjw5Hnd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxbPDqLKgUKdlP6iHd4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugwi6TuNYd7QUvSwYUF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxU2T8pX0KuW2C22ol4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyoH8jH8ljC9ZzuPuJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
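A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal example, assuming the dimension values visible in the data shown here (the full codebook may allow more values than these); `validate_batch` and `ALLOWED` are hypothetical names, not part of the tool itself.

```python
import json

# Allowed values per coding dimension, inferred from the responses
# shown above (assumption: the real codebook may define more).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"consequentialist", "virtue", "contractualist", "deontological"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"fear", "approval", "outrage", "indifference", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid rows by comment ID."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue  # skip rows the model emitted without an ID
        bad = [d for d, ok in ALLOWED.items() if row.get(d) not in ok]
        if bad:
            raise ValueError(f"{cid}: invalid value for {bad}")
        coded[cid] = {d: row[d] for d in ALLOWED}
    return coded

raw = ('[{"id":"ytc_Ugy4CX-2_WXCDRmoIVZ4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(validate_batch(raw)["ytc_Ugy4CX-2_WXCDRmoIVZ4AaABAg"]["policy"])  # regulate
```

Raising on an out-of-vocabulary value (rather than silently coercing it) makes coding drift in the model output visible immediately.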