Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I have definitely thought about this a lot as well, specially after watching thi…" (`rdc_j4xb0fb`)
- "Lmao... When you look at your hands or at text in dreams the dream generator in …" (`ytc_UgysgTk8b…`)
- "Expert Systems are real AI- they take a lot of work but produice good rather the…" (`ytc_UgwGGlJFf…`)
- "Yes, you’re going to automate jobs and your job will be taken by a person with o…" (`ytc_Ugx3bUUPt…`)
- "@craigbaker593and this is why ai art is a problem. People like you exist and tak…" (`ytr_UgyieiUoz…`)
- "Doctors who will use ai for their work will crush those who won't.. applies to a…" (`ytc_UgxWXCl6-…`)
- "Think about how many sickos will mutate it pretending it’s real . Till finally h…" (`ytc_UgxuYbsjs…`)
- "Yes true, because there will become a point where humans will get too comfortabl…" (`ytr_UgwOxkToV…`)
Comment

> What future do humans even want though? As a US resident, looking at the world around me it would appear that we want a world where all the resources are fully utilized and that those people that are willing to engage in that utilization are rewarded without regard. Is the future that humans really want - the future we are building for ourselves - really one where future humans will thrive? That's a question I have about AI bringing about the end of humanity - have we done a great job at making a utopia thus far?

youtube · AI Governance · 2025-12-04T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyiA9v4pgY3e4FumYZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyHpYeAINzYkW8mjdh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyeIif_inWmSeJOIGF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyfcOsZT7Cts-m5DP54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx30Hn3ZJJ_8LndRf14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyaUstVfbcHNoTT7pl4AaABAg","responsibility":"elite","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwgnYZkp7AAjFa_cpV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxgQOlkSzYjtADLAgF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwVyRI5sEUItR7zGQl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw9P0E5nuAu9-_anD54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
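The lookup-by-comment-ID step above can be sketched in Python: parse the raw LLM response as a JSON array, validate that each record carries the four coding dimensions plus an `id`, and index the records by ID. This is a minimal sketch, not the tool's actual implementation; `index_by_id` and `EXPECTED_KEYS` are hypothetical names, and the sample records are copied from the response shown above.

```python
import json

# A raw LLM coding response: a JSON array of records, one per comment.
# (Two records copied from the sample output above.)
raw_response = """
[
  {"id":"ytc_UgyeIif_inWmSeJOIGF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw9P0E5nuAu9-_anD54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]
"""

# The coding schema implied by the records: four dimensions plus the ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and index its records by comment ID.

    Raises ValueError if the model emitted malformed JSON or a record
    is missing one of the expected coding dimensions.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as exc:
        raise ValueError(f"model output is not valid JSON: {exc}") from exc

    by_id = {}
    for rec in records:
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {sorted(missing)}")
        by_id[rec["id"]] = rec
    return by_id


coded = index_by_id(raw_response)
print(coded["ytc_UgyeIif_inWmSeJOIGF4AaABAg"]["emotion"])  # resignation
```

Validating the keys at parse time catches the common failure mode where the model drops a dimension or renames a field, so a bad record fails loudly instead of showing up as a blank cell in the coding table.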