Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgyOhxTAt…`: "Don’t think anyone got consent before they started harvesting human stem cells t…"
- `ytr_Ugzb3UHvH…`: "@bapa39 If you were to pit a person who’s sole source of information was ChatGPT…"
- `ytr_UgwoVNEJP…`: "_I didn’t see a random guy in the Waymo…_ You would have if you had seen a Waym…"
- `ytc_UgyrghViD…`: "Chinese people are aware AI advancement but since its open and being used for pu…"
- `ytc_Ugy9nCytX…`: "Yeah right 😂I want to see robot picking cabbage, 5 o'clock in the morning, under…"
- `ytc_Ugwt2VqYl…`: "until the software can identify all hazards and not act irregularly leading to h…"
- `ytc_Ugzwr2g1K…`: "Let’s make it clear that AI isn’t displacing jobs. Employers are. They had way m…"
- `ytr_UgyC7hJRJ…`: "@stevenbeebe35 Then I'm sorry my friend but you have no idea how AI works, it d…"
Comment

> Those on the pro side already lose based on it being a slippery slope fallacy. Classic way to prove this is by using examples of past technology. Who's to say someone with a fully automatic weapon will not just go around constantly killing? Or a rocket launcher? Or a nuke? Because we have systems in place that guard rail against these actions. If it's possible for there to be an existential threat in the pros perspective, then there is equally just enough chance for humans to develop defensive systems around AI. The pros perspective essentially is suppressing technology by default.

Source: youtube · AI Governance · 2025-02-16T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_UgyUGgUlM0sPlrHlY5B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyXWgnfVXMyJcEoaAN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy19qng_0mU1MAslHN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyqwOM9j68xGe2aCDV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz-XTIElT6SaBW8e1p4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxLKvjVULwWpr_9zLB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy9wRiF9AtUw3XsMs94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw-YUD7igKvxKOso-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxSIcgpqCmm1DQ2PDJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz7pjPwQp36kXUqWMJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
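A response like the one above has to be parsed and validated before its records can back the per-comment "Coding Result" view. The sketch below is one minimal way to do that; the function name is hypothetical, and the allowed value sets are only those observed in this batch (the real codebook presumably defines fuller vocabularies).

```python
import json

# Dimension vocabularies observed in the sample batch above; extend
# these per the actual codebook (this is an assumption, not the schema).
DIMENSIONS = {
    "responsibility": {"none", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "outrage", "fear", "mixed", "resignation", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and reject incomplete or off-codebook records."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded records")
    for rec in records:
        missing = ({"id"} | DIMENSIONS.keys()) - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing {sorted(missing)}")
        for dim, allowed in DIMENSIONS.items():
            if rec[dim] not in allowed:
                raise ValueError(f"record {rec['id']!r}: {dim}={rec[dim]!r} not in codebook")
    return records
```

Indexing the validated records by `id` (e.g. `{r["id"]: r for r in parse_coding_response(raw)}`) then supports the comment-ID lookup shown at the top of this page.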