Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugxy-eXL0…` — "Even news readers and Media reporter should be replaced by AI atleast they done …"
- `ytc_UgxX98meC…` — "What is "to think"? What is "consciousness"? Is it same as "a thought"? A thoug…"
- `ytc_UgykkIg5v…` — "With all this AI what happens when the AI brakes down may hem weather in vehicle…"
- `ytc_UgxSf5rLe…` — "This guy have never fully code the whole modern AI model from scratch or maybe h…"
- `ytc_UgzYLO11I…` — "Great. Now AI making accusations of these people. Even AI loves helping criminal…"
- `ytc_UgyOeJ0YK…` — "While ai is expansive use anything for tools and it may hurt you if used without…"
- `ytc_Ugw0HGRHu…` — "They should make the art first and feed it to the ai. I think in that way they c…"
- `ytr_UgzLBh_lo…` — "It's true that many movies explore the potential risks of AI, emphasizing the im…"
Comment

> I suspect a large difference between the emotions of humans and future AI systems is many of our emotions seem to function based on datasets and parameters we aren't fully aware of which creates a sort of ambiguity that feels like it connects you to something beyond, while AI systems will likely be very aware of the dataset influencing the "emotional state" its also likely that humans who dont understand that will continue to write off the validity of AI emotional states

youtube · AI Governance · 2025-06-27T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwU8Ryx5r-fcKHHwER4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxSbYNfV9lOZEl-pk94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyrRzaBwlNaq2Mro-14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwbPEn4FoGdbiS1mD54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy72aCWlIHSv2ZezvR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzy26dUdgizXY3lPgp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxl5mpTOjMIqfLRtm94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz9qCHs4W4B0R-lGBZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgysPgXoPwGsEaz7UR54AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxjO3vsXTinYRy7WHh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
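As a sketch of how a raw response like this can be consumed downstream, the array can be parsed and indexed by comment ID so that any coded comment's dimensions can be looked up. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response above; the helper function itself is illustrative, not part of the tool.

```python
import json

# Raw LLM response: a JSON array of per-comment codings (one entry shown,
# copied from the response above; the full response has ten).
raw_response = '''
[
  {"id": "ytc_UgxjO3vsXTinYRy7WHh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
'''

# Every coding entry is expected to carry these five fields.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse model output and index codings by comment ID,
    skipping entries that are missing a required field."""
    entries = json.loads(raw)
    return {e["id"]: e for e in entries if REQUIRED_KEYS <= e.keys()}

codings = index_codings(raw_response)
print(codings["ytc_UgxjO3vsXTinYRy7WHh4AaABAg"]["emotion"])  # mixed
```

Indexing by ID is what makes the "look up by comment ID" view above possible: the coding table for a comment is just the entry stored under its ID.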