Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "I’m in my late 30’s. I work in IT at a pretty high level. Lately, I’ve been doin…" (rdc_mthe4jr)
- "some people can only afford a free ai therapist 🤷♂️ any therapy is better than …" (ytc_UgzUzW-do…)
- "An AI model doesn't get bored so it will also not care if it's turned off.…" (ytc_Ugz3oQNyF…)
- "@noth606 Well, I my opinion, that seems to be a very good method of machine lear…" (ytr_UgyvaVyyS…)
- "The role of most humans and successful people will be to become good AI communic…" (ytc_Ugz2dX8oG…)
- "The "Echo" story us a JOKE. A.I. or 'algorithms' cannot, will not EVER Attain co…" (ytc_UgzWZ5ceH…)
- "For this an any technology one death is too many. I wonder what is the average o…" (ytc_UgzcMDePj…)
- "43:11 "Taking your idea to another AI with a different history... actually helps…" (ytc_UgyX7eo-u…)
Comment

> Well we can all learn from the Anime. Sword Art Online: Alicization. If we design an A.I that learn from the ground and up. Maybe it will understand us Humans better. And not destroy us. Hopefully. But instead help us

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2023-07-07T02:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwzsvT0R8QaLUyNUIx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxtb04KfZMnmnwKn4x4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugx5zeaDZkjk9MLbG_54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzOy-FBaa-ajhMXoMZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugykacu35_BkzEzD8hN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyEH-u-qtlab019hkB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxl48isDCChCc0PY594AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgzKB-h2aVWk7JxR03x4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzR0GPh_1T1t2ExF0h4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzX0cewoR8nHZWY4Td4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
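The ID-based lookup described above can be sketched in a few lines of Python: parse the model's raw JSON batch response and index it by comment ID. This is an illustration, not the tool's actual implementation; the two entries in `raw_response` are copied from the response shown above.

```python
import json

# Two entries copied from the raw LLM response above (truncated for brevity).
raw_response = """
[
  {"id": "ytc_Ugxl48isDCChCc0PY594AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzKB-h2aVWk7JxR03x4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
"""

# Index the coded rows by comment ID so any single coding can be inspected.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_Ugxl48isDCChCc0PY594AaABAg"]
print(coding["policy"], coding["emotion"])  # industry_self approval
```

Note the first entry matches the Coding Result table above (policy `industry_self`, emotion `approval`), which is how a raw response can be reconciled with the coded dimensions for one comment.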