Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I think alot of people don't have fears for AI because we aren't really surround…
ytc_UgwRWCEAq…
I don't understand why this isn't done more. It doesn't seem necessary to chop t…
rdc_deuf4x7
The ai was being blatantly racist and was using stereotypes to put black people …
ytr_UgyYwbHGY…
I just want to point up that the cruise in wall-e was a luxury cruise, so whatev…
ytc_Ugw8CLkBf…
if she will get smarter over time then if he will learn how to build a robot lik…
ytc_UghriRFYh…
This is typically missing the entire point.
First, we don't have intelligent AI,…
ytc_UgzMkwZBw…
I'm always imagining the AI to be another person so I'm starting with a hello an…
ytc_Ugx3fBodO…
What are the odds of Euromaidan developing into full-on Syria-style civil warfar…
rdc_cfks86a
Comment
For me the key measurement of when we reach GAI is when AI can conceive and create the questions and AI can do this better than a human. Once they can ask the question they will very quickly think of questions humans can not conceive of and at that point for me we have GAI.
youtube
2025-10-21T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyyduUax8aXAaxWBCZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxkegbUjbDGXRs1R7l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxZwugD7yGkSe4x25B4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySTUwVMxYZqUVI6s14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyarYjr72BHEDgJpNx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxKlsyJZTcEhJRysnx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwkiZi6ACF1Ypjv2_B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx3h8OillGziL183h14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgySBj6_hE2RJPw6AZ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxw4lP_wFRhxbG6B-V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
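The raw response above is a JSON array in which each record carries a comment `id` plus the four coded dimensions from the result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and tallied is below; the field names come from the response itself, but the function names are illustrative and the set of legal values per dimension is inferred only from this one sample, so validation here checks field presence rather than a closed vocabulary.

```python
import json
from collections import Counter

# The four coded dimensions, as they appear in the raw LLM response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw model output (a JSON array of coded comments) and
    verify each record has an id plus every dimension."""
    records = json.loads(raw)
    for rec in records:
        missing = [k for k in ("id", *DIMENSIONS) if k not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} is missing {missing}")
    return records

def tally(records: list[dict]) -> dict[str, Counter]:
    """Count coded values per dimension, e.g. how many comments
    were coded emotion == 'fear'."""
    return {dim: Counter(rec[dim] for rec in records) for dim in DIMENSIONS}
```

For the ten-record array above, `tally(parse_raw_response(raw))["emotion"]` would yield 3 `outrage`, 3 `fear`, 2 `indifference`, and 2 `approval`.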