Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples (click to inspect):

- ytc_UgxdJywT4…: "Just tried the Clever AI Humanizer and it's impressive! 100% free is a huge bonu…"
- ytc_UgxkB1H5c…: "AI has helped me become an artist because I can ask it something like, \"Which gl…"
- ytr_UgwUALI19…: "@Il_panda Not good writing, technically perfect writing that lacks style. I've …"
- ytc_Ugz7PmjDB…: "Yes, the risk is real — but not just because AI is powerful. The deeper risk is …"
- ytc_Ugz76ZS76…: "People need to WAKE UP FAST to all the lies that we are told and stop believing…"
- ytc_Ugw8PGe7m…: "We don't know of anything that can permanently learn at an exponential or lineal…"
- ytc_Ugzw90_lk…: "As the creator, man had the chance to NOT create something that will exceed him,…"
- ytc_UgyHWz8MM…: "In the not-so-distant future, the world had become increasingly reliant on advan…"
Comment
Easy answer to stop AI: get enough ppl to refuse to use it, assign a stigma to it, akin to the Scarlet Letter and users of it, and the tech companies will most likely shrink away much like what happened to cigarette companies and from the stigma of smoking, presently relative to what it was like 50yrs ago.
Of course the probability of that actually occurring is near zero - let’s be serious here. But at least it’s still possible (not probable) regardless of how infinitesimally small said chance is lol
youtube · AI Governance · 2025-12-07T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyDXEgSpJstWwVlQb54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgybH8K8zW88UlnzmwZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzyKnoF91hKRI084Ut4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxh7QKDzdLUaFEDQ3J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzKZaN4Q0_tTRu12UN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz6qkPpR9DzOM6VmS14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx0WrBpVjMARChTgIV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzCchvF4PiN3gX2XYl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwCU8dhkEq_BHWXMzZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9lAFmMVu74piZrWN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
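The lookup flow this page describes — parse the raw LLM response, then fetch one coded record by its comment ID — can be sketched in Python. This is a minimal illustration, not the tool's actual implementation; the function name and the validation step are assumptions, but the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the two sample records come from the raw response shown above.

```python
import json

# A raw LLM response for one coding batch: a JSON array of coded comments.
# These two records are copied from the response above.
raw_response = """
[
  {"id": "ytc_UgzyKnoF91hKRI084Ut4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyDXEgSpJstWwVlQb54AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions plus the comment ID, per the table above.
EXPECTED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(response_text):
    """Parse a raw response and index coded records by comment ID.

    Raises ValueError if a record is missing any expected dimension,
    so malformed model output is caught before it is stored.
    """
    records = json.loads(response_text)
    index = {}
    for rec in records:
        missing = EXPECTED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {sorted(missing)}")
        index[rec["id"]] = rec
    return index

coded = index_by_id(raw_response)
print(coded["ytc_UgzyKnoF91hKRI084Ut4AaABAg"]["emotion"])  # resignation
```

Indexing by ID makes the "look up by comment ID" operation a constant-time dictionary access, and the validation step rejects any batch where the model dropped a dimension.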