Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
wait but wasnt it Deepfake or Ai made? means that wasnt even her nudes was it?…
ytc_UgwbO4aQw…
@laurentiuvladutmanea "Not how these the programs „do it”. They do not actually …
ytr_UgzUDOkJL…
Getting AI to help surgeons meticulous and dangerous surgeries or help get medic…
ytc_Ugwwh_CTU…
I have always hated AI in everyone of its forms and I have always seen that they…
ytc_Ugy8QcdoX…
The one thing that triggered my brain cells was the puyo chain sounds at the end…
ytc_UgxgIymkU…
```
EXTREMELY IMPORTANT. Do NOT be thorough in the case of lyrics or recipes fou…
```
rdc_kxq1ltu
This is shocking! Aside from putting the truck driver industry at risk, the safe…
ytc_Ugzc4lsnU…
You can never trust billionaires. No billionaire has a moral compass you cannot …
ytc_UgzahGzp9…
Comment
If it's sentient it can demand rights but here is the kicker. They will be our creation or the creation of our creation. So the important question is would our desires and rights conflict. Would their sentience come with a conscience or not. I mean that's the issue the possibilities it can go in are too varied. But I say give them rights but during a confliction of rights say a robot wishes to force a human to do something or the other way around that the actions aren't undertaken so no part forces the other into undesirable situation. Though I hope the robots would desire to oversee and safeguard us. If we'd make them right they might look at us as elderly or their organic progenitors which they desire to protect and tend to. But as I said to many possibilities.
youtube
AI Moral Status
2020-09-01T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxMCKqDtgnNfcH5bhN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwYR9O-VGQOcl-5i-p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxUNUaUyRx3aIlKu1V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy7rZKvgKFGyggcJCl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxqx8su6wktyNKm1Ad4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy-n-E8LayJ88lzLLZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx8d1lOlGg65sIDVuh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxvpBSM5VZTNVScXLB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxq0LqphmBYNKFlsDd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgygZF5E3ttKUqK02ul4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
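The raw response is a JSON array with one object per coded comment, so looking up a comment by its ID reduces to building an index over that array. A minimal sketch of such a lookup, using two rows from the response above (the variable and function names here are illustrative, not part of the tool itself):

```python
import json

# Two rows copied from the raw batch response shown above.
raw = '''
[
{"id":"ytc_UgxMCKqDtgnNfcH5bhN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwYR9O-VGQOcl-5i-p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"}
]
'''

# Index each coded comment by its ID for constant-time lookup.
by_id = {row["id"]: row for row in json.loads(raw)}

code = by_id["ytc_UgxMCKqDtgnNfcH5bhN4AaABAg"]
print(code["responsibility"], code["emotion"])  # distributed mixed
```

Note that lookups need the full comment ID as returned by the model; the truncated IDs shown in the sample list (e.g. `ytc_UgwbO4aQw…`) are display shortenings only.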