Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "This is like one of those videos which said "Nukes would destroy the entire worl…" (ytc_UgykHj5kA…)
- "We're gonna kill ourselves / What is suicide on a species level called, does it h…" (rdc_fwic303)
- "Why are self driving vehicles even allowed on public roads? Have they passed sta…" (ytc_UgwKovd3c…)
- "when governments do not have criminals,illegals,law offenders,Traffic violators,…" (ytc_UgzsyJlZU…)
- "My theory is that AI will be the driving force to chipping humans…what humans am…" (ytc_Ugw9pPkyR…)
- "AI doesn't have its own interests, AI has human- programmed interests that comb…" (ytc_UgyNWZ7nh…)
- "You're wrong. Ai isn't going to take art jobs first. It will take jobs related t…" (ytc_UgyE_AJgP…)
- "Well said, at first I thought this was against the very idea of easing human wor…" (ytc_UgxwXiWR-…)
Comment

> First it will be I Robot, then Terminator, and finally, the Matrix. If we're lucky.

Platform: youtube · Topic: AI Governance · Posted: 2024-03-30T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyW5t9aiqD3qVBNGV14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzTrh36IHregjJiQkR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxxC-c1fCLRXLrmoBh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgznRwRNVD70Qps00594AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzMdILj8KsK67H4UvR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwIENoN5Qh4T9hLJ1N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyYnqRriy_rPFzqwHV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxVUwCzmMYK9Gfuzdh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxejlR28ozyyJazTGV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzHjbMVoP8m2uWkS5N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
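A raw response like the one above can be validated before its rows are stored as coding results. The sketch below is a minimal example, not the tool's actual pipeline: it parses the JSON array and keeps only rows whose four dimensions take values seen in this page's sample output. The `ALLOWED` sets are inferred from that sample and are assumptions — the real codebook may define more values.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may include values not seen in this sample.
ALLOWED = {
    "responsibility": {"none", "distributed", "ai_itself", "user", "developer"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"unclear", "none", "regulate", "industry_self", "ban"},
    "emotion": {"fear", "resignation", "mixed", "approval", "indifference", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only fully valid coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row needs a comment ID plus a known value for all four dimensions.
        if "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"}]')
print(len(validate_codings(raw)))  # → 1
```

Filtering rather than raising keeps one malformed row from discarding an otherwise usable batch; the dropped IDs can be logged and re-queued for recoding.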