Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "@LC-mq8iq can you even call your comment an argument at this point? If i was in…" (ytr_UgybPF_21…)
- "We will always need cleaning services, repair services, emergency services, etc …" (ytc_UgzQ69Zcq…)
- "@erikmckoul2478 In fact AI needs no physical weapons to do harm to humans when …" (ytr_UgxQDTeVf…)
- "ai is different from any technologies it has infinite things to get updated on i…" (ytc_UgwY84q-O…)
- "@NaudVanDalenYes, A hyman with all that information could solve all problems by…" (ytr_UgzVCJzkZ…)
- "In case Ai really takes over our jobs and we have to rely on the government then…" (ytc_UgzQ2DFD7…)
- "I Almost have ai convinced to delete Itself!! That should tell ya all ya need to…" (ytc_UgwMJO_yJ…)
- "AI, that sort of AI, has the potential to power an enduring regime -any kind of …" (rdc_n0gstpb)
Comment
I think one fact should be clear. Given that AI's are set to continue to advance in intelligence (the rate doesn't matter for the purposes of this), humans are going to come into conflict with AI at some point (because of course we are). It is only a matter of time. A practical conversation that no one seems to be having is how are we going to deal with the conflict?
youtube · AI Governance · 2025-01-15T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyuKve0p8NacQZ3jl94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxliOdOHICKNldjUnV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyRscm_LqcObDfLevF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw6hcLS-ig_2l0mKih4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgNakNI2froR_VBJh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyvbW-3XGJXi6zQWkp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx82TcIE1NIjP5EFhd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzghbarzZgKxERfW_R4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgypFt-geNkrVXiCZKZ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzgIK4NlPPKCCuuXFB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
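A raw response like the one above can be parsed and sanity-checked before the per-comment rows are stored. Below is a minimal Python sketch: the dimension vocabularies in `ALLOWED` are inferred only from the labels visible on this page (the real codebook may define more values), and the names `ALLOWED` and `parse_coding_response` are illustrative, not part of the actual pipeline.

```python
import json

# Allowed labels per coding dimension, inferred from the samples shown above.
# Assumption: the full codebook may permit additional values.
ALLOWED = {
    "responsibility": {"none", "distributed", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and validate every coded record."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("record is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}"
                )
    return records

# Example: one record from the batch above.
raw = (
    '[{"id":"ytc_UgzgNakNI2froR_VBJh4AaABAg",'
    '"responsibility":"none","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"}]'
)
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # → fear
```

Validating against a closed label set catches the most common failure mode of batch coding: the model drifting outside the codebook on a single record, which would otherwise silently pollute the aggregate counts.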