Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I saw one of those driverless trucks on a cross country highway on a road trip o…" (ytc_Ugy6BrXKV…)
- "@nancypelosi2627 a lot of the time people higly reccomend being behind driverles…" (ytr_UgwgzEuVj…)
- "I think that it's wrong for someone to make something with AI and then claim tha…" (ytc_UgwpgTcgv…)
- "You literally used ai to make a fkin text. And no one is saying you gotta study …" (ytr_Ugx1JU5yD…)
- "I totally get your concern! The conversation with the AI highlights the importan…" (ytr_Ugzb_2Fm_…)
- "tl;dr read the damn comment it's not hard / also ai bros see art as a product and …" (ytr_Ugz9PTUEG…)
- "We need a new and improved financial system. Thinking lazily, the first thing th…" (ytc_Ugxv1SA2L…)
- "The nightmare of 'near future for humanity' is called 'technology dictatorship' …" (ytc_UgzA9bNTh…)
Comment
If humans were to absorb just 1% of what these AIs are trained with, they would go on to create things no one ever thought was possible.
But these LLMs absorb vast amounts of data which humans developed over the years, yet they still only regurgitate answers that suit you, not think outside the box, because there's no box to think outside of.
There's nothing more super intelligent than the human brain.
youtube · AI Governance · 2025-11-22T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyQuQEXWPFuTGwG2614AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwbFy2ZY27cYHm-MUJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOFAFs1m8WwIRCRdl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz831bKuyu4ZTxpcS54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzTJ7jqRzBiIDFNjEZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwBvLnkYBHBQjNRfMp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOD72L9hJlbd5I0yh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgysVYXgGTxR7jOnPPx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzs96327PNhPW91MzN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxbRjACsxR--EwkuMd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
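The raw response above is a JSON array with one record per comment, each carrying the four coding dimensions. As a minimal sketch, such output might be parsed and validated like this; the allowed category values below are inferred from the codes visible on this page, not from a documented codebook, and `parse_codes` is a hypothetical helper:

```python
import json

# Allowed values per dimension, inferred from the codes shown above
# (an assumption; the real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"user", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgzTJ7jqRzBiIDFNjEZ4AaABAg","responsibility":"unclear",'
       '"reasoning":"mixed","policy":"unclear","emotion":"approval"}]')
codes = parse_codes(raw)
print(codes[0]["emotion"])  # approval
```

A validation step like this catches the common failure mode where the model invents a category outside the codebook, so bad codes fail loudly instead of silently entering the dataset.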