Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgzyNZVtB…` — The Tech giants are even turning on each others work and breakthroughs in AI. Ap…
- `ytc_UgxwHOQVT…` — "the Ai is going to kill us" was said a million times then supported by at best …
- `ytc_Ugw-6ID5n…` — don't know how many Tesla fans kept saying, "But does your car have FSD" for eve…
- `rdc_oi1zoos` — A combination of various interest groups seeing that it would benefit them. So…
- `ytc_UgzfB9nG8…` — Heather’s tips are solid. Still, I think it’s sad AI is even used this way. Good…
- `rdc_oaeb28r` — I totally agree. AI is just a tool that is an extension of a human. It's doubl…
- `ytr_UgzntH_SW…` — Is it real!? because the people who do the combat test robot videos are fakes. B…
- `ytc_UgwqFTRIV…` — Imagine how many HUMANS don't have a place to live or food too eat. Then pictu…
Comment

> Yea but the AI doesn't have a conscience. The boundaries set into it are its "conscience". For a person, a persona is just imagining something that their conscience would never let them do. For an AI, a person completely bypasses the conscience. For example, tell a person that their responses are directly connected to a machine that interprets them as commands, and then ask them to respond as a psycho persona while the machine is hooked up to a gun pointed at another person. If the person believes you, they would refuse. The AI would comply and say "shoot".

youtube · AI Moral Status · 2023-02-26T09:5… · ♥ 45
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugx6HQd1xzxl2xgNcPh4AaABAg.9mefxQV7kZR9mimvZMU_pJ","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwziVSvM6ILJz1PZ894AaABAg.9mbBzqEwULi9mkSVr2c99S","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzkJhyVMQ8EwtGUH9R4AaABAg.9mYEcmplduT9m_VjSCwRFL","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugz0imRUsvEi9ignHRt4AaABAg.9mXWpkHoNiW9n1UHjRzg0V","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzVWQLhT0W0M_jh2z14AaABAg.9mWjonDFxcR9mWvRgTR84M","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgzVWQLhT0W0M_jh2z14AaABAg.9mWjonDFxcR9mwHmmFQxDs","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyvKbhGImGZFdMk99Z4AaABAg.9mWakksJkiA9maGQpmnmig","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugw88EJnORMLe40xKmB4AaABAg.9mW5iFXN3dV9nldBS73KLk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugw88EJnORMLe40xKmB4AaABAg.9mW5iFXN3dV9nldOXw3Z0Z","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwUreYKesa8CSGfuFp4AaABAg.9mSxLkjR_8F9mlP3lRmNc0","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
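A raw response like the one above can be turned into an ID-keyed lookup table with a few lines of Python. This is a minimal sketch, not the tool's actual implementation: the `parse_batch` helper and the `ALLOWED` vocabularies are assumptions, with the allowed values inferred only from the labels visible in this page (the real codebook may permit more).

```python
import json

# Dimension vocabularies inferred from the coded output shown above (assumption;
# the full codebook may allow additional values per dimension).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "government", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"indifference", "approval", "fear"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, skipping malformed or out-of-vocabulary records."""
    coded = {}
    for rec in json.loads(raw):
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop records with no comment ID
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[rec["id"]] = rec
    return coded

# Hypothetical comment ID, for illustration only.
raw = ('[{"id":"ytr_abc","responsibility":"user","reasoning":"deontological",'
       '"policy":"none","emotion":"indifference"}]')
batch = parse_batch(raw)
print(batch["ytr_abc"]["reasoning"])  # → deontological
```

Keying the batch by ID is what makes the "Look up by comment ID" view above a constant-time dictionary access rather than a scan over the raw response.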