Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The problem with we humans is we like to THINK that we are always in control, this won't be the case with A.I. A.I will be the end of humans, a "Terminator effect" If we don't put a pause to this for now. First step PAUSE A.I, The Second precautionary step that should be done immediately, right now, and with no delay would be to disconnect nuclear ICBM's computer control launch systems from any possible internet connection, but we know they won't do it because they are blind to the threat, it will be too late before they realize it. This is not a conspiracy theory, I believe they have already begun to loose control over A.I, so many people are naive to the danger of A.I.
Source: youtube · AI Governance · 2023-03-30T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwnGb7ARF5fU5iZX5F4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxy4-SMjENRpv1ZBRh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugyffh6HQOXSFG0kqiN4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwINfpgQa9_dARlLAV4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgyTB3WW0fTYKA-HiDV4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzGmamRiBEZOxEgKjF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwprg8qtmJ--8LVHEx4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgySXCvbDM5_ckFAq1p4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzgJ0gtnD9pDdz8F1h4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgxGnePZa64AFsqJFaN4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```
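The raw response above is a batched JSON array: one object per comment, each with the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) keyed by comment `id`. A minimal sketch of how such a response could be parsed and looked up by comment ID (the `index_codes` helper is hypothetical, not part of the coding pipeline; the two sample rows are taken from the response shown above):

```python
import json

# Raw model output in the format shown above: a JSON array of
# per-comment codes, abbreviated here to two entries for illustration.
raw_response = """
[
  {"id": "ytc_UgwnGb7ARF5fU5iZX5F4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugxy4-SMjENRpv1ZBRh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

def index_codes(response_text: str) -> dict:
    """Parse a raw LLM response and index the coded comments by ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_codes(raw_response)
print(codes["ytc_UgwnGb7ARF5fU5iZX5F4AaABAg"]["policy"])  # → regulate
```

Because each row carries its own `id`, a lookup table like this makes it straightforward to join the model's codes back onto the original comment records.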