Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
When you hear a very young man say "T-Seven-Two Tanks" instead of the "T-Seventy…
ytc_UgznhmfRA…
That's the scary part, even if you wanna go stardew valley you still need to inc…
ytr_UgwKXhh7P…
Fellow SWE here. I think AI will make each SWE more productive in a big way, whi…
ytc_UgyMkE8kt…
They said that 'drawing' minor characters is more difficult because there's less…
ytc_Ugzr6PCM1…
If AI is going to get exponentially intelligent.... I think it won't bother des…
ytc_UgwgXXex0…
You have misunderstood the video. Training AI on human books is legally ok. But …
ytr_Ugyu3xxaf…
This is silly because a.i needs humans to run them...🤷 Turn off the electricity …
ytc_Ugzz1BDl-…
Now is a good time to start a movement that values authentic human connection, p…
ytc_Ugy8644Wp…
Comment
I'm all for a.i intelligence. I will say quite a few words here. There has to be some sort of fire wall or code. I'm no computer wiz but for a.i to learn from us especially in the future is scary and dangerous. A.i is epic and brain tickling. For an a.i to learn good is epic!!! But this world is filled with bad. The good is few and far between. How does a programmer safe guard this knowing the a.i can ultimately out class in technical computer skills? They learn. They develop. They share between each other, wifi or wireless 5g or something better in the future. Ai is amazing but keep them separate. Don't let them share through wireless link
Platform: youtube
Video: AI Moral Status
Posted: 2023-05-26T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_UgzFBLGfI9QiezcXcKN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzn7Xr4e__v1nEauhN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy9fQtem9BZZ2QhQvt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx6TUg3p5DuljgEpZN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyiBeiypvvOUjzK5054AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugzmp80PP_jNDkJp8214AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy0Hzfc9e3-easvBeN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxhp-kGPNOzVMwq2BR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwbfmkRoQ_7MlOBZo54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxq_FjqcICgv8mvx2V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
```
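The raw response is a JSON array with one coding record per comment ID. A minimal Python sketch of how such a response could be parsed into a per-comment lookup and validated; the allowed values per dimension are inferred from the table and samples above, so the full code books are an assumption:

```python
import json

# Allowed values per coding dimension, inferred from the examples above
# (assumption: the actual code books may contain more categories).
DIMENSIONS = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "indifference", "approval", "unclear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, validating values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        # Keep only the known dimensions; reject values outside the code book.
        coding = {dim: rec[dim] for dim in DIMENSIONS}
        for dim, value in coding.items():
            if value not in DIMENSIONS[dim]:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = coding
    return coded

raw = ('[{"id":"ytc_UgzFBLGfI9QiezcXcKN4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_UgzFBLGfI9QiezcXcKN4AaABAg"]["policy"])  # → regulate
```

Validating against an explicit code book catches the common failure mode where the model invents a label outside the allowed set, rather than silently storing it.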