Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AI has already passed human intelligence seeing as you have come this far and yet continue down the line. Terminator, the new Child's Play, I don't even need to go further. How far are we really from the beginning of that, as we have learning robots with access to all information there is even joking about setting it off and human unfitness and state of being the biggest threat to all life on earth and the planet itself? I'm certainly not asking. I am telling fellow scientific minds around the world, not to think putting it in a movie eliminates the potential for something to occur. Half of Sci-fi has always been based in this. That should at least stand as a big ass warning not to just boldly go everywhere we can go just because we may feel so bold. Sometimes you realize that with no parachute or bungie cord or glider or winged suit or something to slow your decent or catch you, diving into the deepest point of the Grand Canyon is actually just suicide. With a slight delay at best.
Platform: youtube · Video: AI Moral Status · Posted: 2019-10-26T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwGpn4Q8Hk5cnrIS1V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyqFhZjidwbmyEWNth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxEPyev_bx8DP-xj214AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzjN-n6cpLQpYUHxN14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzw1xzY9ou34MIl6JN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzDJW5jl-guVkIM13t4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxzemdZfhGjeyIxUZ14AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzfC39y1DkAJ0Geb4p4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwJqkIW7LOS1R9hp614AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw4BIxZfHqP4XALMxh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"}
]
```
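Since the model returns one JSON array covering a whole batch of comments, finding the coding for a single comment means parsing the array and matching on `id`. A minimal sketch of that lookup, assuming the field names shown in the raw response above (the helper name `lookup_coding` is illustrative, not part of the tool):

```python
import json


def lookup_coding(raw_response: str, comment_id: str):
    """Parse a raw LLM response (a JSON array of per-comment codings)
    and return the entry for comment_id, or None if it is absent."""
    codings = json.loads(raw_response)
    for entry in codings:
        if entry.get("id") == comment_id:
            return entry
    return None


# Hypothetical two-entry batch in the same shape as the response above.
raw = """[
  {"id": "ytc_UgyqFhZjidwbmyEWNth4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxEPyev_bx8DP-xj214AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

coding = lookup_coding(raw, "ytc_UgyqFhZjidwbmyEWNth4AaABAg")
# coding now holds the four coded dimensions for that comment.
```

One caveat worth noting: if the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, so a production lookup would wrap the parse in a try/except and surface the raw text for manual inspection.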