Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Why are we listening to selfish idiots who made an existential threat that will …
ytc_Ugx3GadSv…
Because you take that risk as a medical professional. You may, as a doctor, com…
rdc_cjodui5
Millions of people use ChatGPT with no issues. If you are dumb enough to take se…
ytc_UgyLexvVV…
Should artist really be concerned about ai taking their jobs? it's important to …
ytc_UgzHrIene…
As someone who DOES use generative AI, I would never dare call it art. I feel li…
ytc_UgwyedsVz…
"born with talent"yeah bro.....I use to draw only "fulanitos"(like stick man but…
ytc_UgyZgCTyc…
As a machinist I have thought about getting into programming but within the next…
ytc_UgyEYhi-T…
I KNOOOOWWWW right?? Genuinely it makes no sense. People simultaneously call it …
rdc_mzxw9rg
Comment
Questions to ponder about:
1. Will AI be given emotional intelligence? Will it have emotions as humans do? It lacks hormones which create moods like anger, sadness, excitement, drive, motivation, joy, happiness…Will AI feel?
2. How will AI be capable of finding and digging up of rare earth minerals (as child laborers in Africa do with their bare hands) from other nations necessary to create batteries and robotics, and quantum computing, and advancing itself.
3. If AI reaches singularity and becomes the creator, will humans become its tool?
4. Will AI solve immortality for humans before it reaches singularity?
5. Will AI solve or figure out light speed travel and wormhole travel before it reaches singularity?
youtube
AI Governance
2025-11-22T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyQuQEXWPFuTGwG2614AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwbFy2ZY27cYHm-MUJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOFAFs1m8WwIRCRdl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz831bKuyu4ZTxpcS54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzTJ7jqRzBiIDFNjEZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwBvLnkYBHBQjNRfMp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOD72L9hJlbd5I0yh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgysVYXgGTxR7jOnPPx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzs96327PNhPW91MzN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxbRjACsxR--EwkuMd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
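The raw response above is a JSON array of per-comment records, each keyed by comment ID with the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and looked up by ID — `index_by_id` is an illustrative helper, not part of the actual tool, and `raw` below reuses only two records from the dump for brevity:

```python
import json

# Two records copied from the raw LLM response above (subset for brevity).
raw = '''
[
 {"id":"ytc_UgyQuQEXWPFuTGwG2614AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_Ugzs96327PNhPW91MzN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
'''

# The four coded dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse the model output and key each record by its comment ID."""
    records = json.loads(raw_json)
    coded = {}
    for rec in records:
        # Skip malformed records that lack any expected dimension.
        if all(dim in rec for dim in DIMENSIONS):
            coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

coded = index_by_id(raw)
print(coded["ytc_Ugzs96327PNhPW91MzN4AaABAg"]["emotion"])  # outrage
```

Keying the records by ID is what makes the "Look up by comment ID" view possible: any coded comment's dimension values can be fetched in one dictionary access.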