Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
i’m so sick of this narrative that all people are OWED the opportunity to be cre…
ytc_Ugy-BpLGS…
The market will not take care of it, but it's really a question of how quickly t…
rdc_kt9bijh
There's all sorts of things that Ai can do that humans can't; Coding animated SV…
ytc_UgyNG5bdZ…
Defeatism is the easy way out. Don't let AI companies kill you and your family. …
ytc_UgxG34KTL…
@ArreiosWasHere And still we dont compete in AI nor in the new space. Where are …
ytr_UgwQ1zl8i…
@41-Haiku But in order to rank people in terms of credibility on AI risks, the p…
ytr_UgzlL3VQZ…
I would argue that AI is being used for harm, a recent case that happend was wit…
ytc_Ugx6ZkXt4…
There’s should be some pretty good ways to test if Nightshade is working or not:…
ytc_UgxANmEAM…
Comment
"What would be missing for a AI to be person-like but not a person?"
I think the answer lies in consciousness (as opposed to, say, the idea of the soul). Is the AI *conscious*? An AI that passes the Turing Test could easily pass as being person like, but lack consciousness.
How do we figure out if an AI is conscious? I think this is the big question, and I have no idea. Can we even build a conscious AI? Can consciousness arise from man-made, inorganic, "artificial" processes? I'd assume theoretically, it could. Practically, however, we may never get there.
youtube
2016-08-09T15:0…
♥ 69
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
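Each coded comment carries four categorical dimensions. A minimal sketch of validating one record against the category vocabularies, assuming the allowed values are those visible on this page (the actual codebook may define more categories; `invalid_fields` is an illustrative helper, not part of the tool):

```python
# Allowed values per dimension, inferred only from the codes visible on
# this page -- the real codebook may be larger.
VOCAB = {
    "responsibility": {"none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"mixed", "indifference", "approval"},
}

def invalid_fields(record: dict) -> list:
    """Return the coded dimensions whose value falls outside the vocabulary."""
    return [dim for dim, allowed in VOCAB.items()
            if record.get(dim) not in allowed]

# The coded comment shown above:
record = {"responsibility": "none", "reasoning": "deontological",
          "policy": "unclear", "emotion": "mixed"}
print(invalid_fields(record))  # []
```

A check like this catches codes the model invents outside the codebook before they enter the analysis.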
Raw LLM Response
[{"id":"ytc_Ugi1yh8I7Gqn_3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgjXrER6VVXb0XgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgijjsMpDmX7z3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Uggfndq2J7NnqngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghYgtQFZIyZQHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghH74kqk0AFK3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Uggxw68cAiW-M3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiKQ4yawP8DjngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghOhj5JFXSTzngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh42eOQdjXgzXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"}]
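The raw response is a JSON array of per-comment code objects, which supports the lookup-by-comment-ID view above. A sketch of how such a payload could be parsed and indexed (the `index_codes` helper is illustrative; the two records are abbreviated copies from the response above):

```python
import json

# Two records copied from the raw LLM response shown above.
RAW_RESPONSE = """[
  {"id": "ytc_Ugi1yh8I7Gqn_3gCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Uggfndq2J7NnqngCoAEC", "responsibility": "none",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"}
]"""

def index_codes(raw: str) -> dict:
    """Parse the model's JSON array and index each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(RAW_RESPONSE)
print(codes["ytc_Uggfndq2J7NnqngCoAEC"]["reasoning"])  # deontological
```

Indexing by ID also makes it easy to spot duplicate or missing IDs when reconciling a batch of responses with the comments that were sent for coding.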