Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgxxoifgO…`: "AI sucks ass, can't do the work of humans. It's a tool, a very expensive tool th…"
- `ytc_Ugxeg_VGN…`: "If he had created an AI art generator that only took His original art and use th…"
- `ytr_UgxHmcNM2…`: "I can see how interacting with AI like Sophia might feel a bit unsettling! The b…"
- `ytr_UgwGMdpyo…`: "@kaitlyn__L Just because you don't like an AI telling you what you already kn…"
- `ytc_UgyilRKEd…`: "Until they have AI that can walk across a rough construction site and inspect re…"
- `ytc_UgzoSuNz5…`: "4:59 Hold on. Isnt AI trained off of human art? How can you call the fuel of yo…"
- `ytc_UgwPCZCzk…`: "So basically you just showed how the Tesla driving A.I. broke traffic law. Lane…"
- `ytc_Ugw76PseD…`: "This is why I feel like Ai art is just a "isn't it cool how far technology has g…"
Comment

> The concept of conciousness is probably the most important and oldest question of all. What does it mean to be a "concious" being. Its obvious it's an emergent property. It's also obvious that there are different levels of conciousnesses. Think about the people you know or have have known, some are more concious than others; it's a continuum. A.I. could certainly become conscious; but it will be weird, and not warm and fuzzy, at all.The key question will be, what is it's motivation? What is it trying go achieve? Certainly not the usual human things. No one really knows, but I guess we'll find out.

Source: youtube · AI Moral Status · 2025-11-16T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyl1bbxT41Twv_gybF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxYNWrJB-5bterLdbh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgykOAPCSK7ylJOnMlF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy0NXu74PXPoC-kdRl4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz8y7p23jiOgpGoHFJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugx42mjM90G8kVrDjsx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzbjCPUvMbe-L37fPd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyb9RmsrStwn4X4uiR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxc4de9QznC4mDe6-p4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy8bFxaXe6-Nd-0XuF4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"}
]
```
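Since the model returns one JSON array per batch, the lookup-by-comment-ID step can be implemented by parsing the array and keying each record by its `id` field. The sketch below is a minimal illustration, not the tool's actual code: `index_by_id` is a hypothetical helper, and `raw` is a two-record excerpt of the response shown above (the full array has ten records).

```python
import json

# Excerpt of the raw batch response above (assumed format: a JSON array of
# records, each with id/responsibility/reasoning/policy/emotion fields).
raw = """[
  {"id": "ytc_Ugyl1bbxT41Twv_gybF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugy8bFxaXe6-Nd-0XuF4AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]"""

def index_by_id(payload: str) -> dict:
    """Parse a batch coding response and key each record by its comment ID."""
    return {record["id"]: record for record in json.loads(payload)}

codes = index_by_id(raw)
print(codes["ytc_Ugy8bFxaXe6-Nd-0XuF4AaABAg"]["emotion"])  # indifference
```

Looking up the last ID this way yields the same values rendered in the Coding Result table above (responsibility: unclear, reasoning: mixed, policy: unclear, emotion: indifference), which is presumably how the table is populated from the raw response.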