Raw LLM Responses
Inspect the exact model output for any coded comment: look one up by comment ID, or open one of the random samples below.

Random samples
- "This highlights a difference with capitalist thought, and socialist thought, aut…" (ytc_Ugxteb5S8…)
- "I dont particular like this guy. What are the military do/doing with a.i.......H…" (ytc_UgxJzqoP9…)
- "But those who are in control of this AI are programming into it a myriad of fall…" (ytc_Ugzr_2VsW…)
- "Yeah it's true Americans are stupid. They could have had a smart Asian president…" (ytr_Ugw89SDXd…)
- "In the 1920s it was automated machines, in the 1950s it was robots and in the 19…" (ytr_UgzE30HAQ…)
- "We use CoPilot in our division. I known other divisions are using Claude and gat…" (rdc_ohuf5po)
- "Look, i love new technologies as much as the next guy. I consider myself a tech …" (ytc_UgwOCWqbw…)
- "I actually made and essay for school and this guy was a main point on how ai is …" (ytc_Ugy51BQ7Q…)
Comment
So to me the argument is pared down to a single premise: the more complex an 'object' is, the more likely it is to be identified as a 'subject' (sentient). Continuing: the more complex the subject, the higher its intelligence. Etc.
This leads me to two conclusions: 1) Neither I nor a robot nor an object of any level of complexity is sentient; it is always a sense echo or other cognitive error. 2) All things, from a triangle to a human body, have varying levels of complexity and ergo varying levels of 'existenceness.' (r/o Is existence scalar?)
I'll avoid 1) as it's a dead end.
2) If true, would this mean the more complex a being, the greater its 'importance'? I.e., a bacterium is extremely simple and we kill millions an hour without a thought. A simple being such as a fish we have dominion to kill and consume, premised on their being less than humans (as it is illegal to kill humans), so allowable. Then a cat: more complex, a pet now, a companion, but one we will euthanize if the vet bill is too high. Then to the greater complexity of a human: now all rights, protections and freedoms are provided (in theory). Let's be trivial and slap on an IP or 'importance points' scale:
Bacteria=1 IP
Fish=3 IP
Cat=5 IP
Human=10 IP
So now humans, while yes constrained to a complexity of 10, have the capacity for grouped communication and the limiting/expanding factor of time.
A human at complexity level 10 (an average healthy 100-IQ human at 25) can collaborate with others, but more importantly this complexity of 10 can be compounded through time. If I collaborate on a project and make a 10-part A, and another makes a 10-part B, and so on, we humans could create an object of complexity >10. And if complexity is the scalar found at the base of other methodologies which do not mention it, it is the one absolutely necessary and sufficient cause to reach sentience.
Anyways. So if our 10s all perform a scientific "force multiplier," collaborating and essentially trading time for complexity, then could not humans create a sentient subject which is not only alive as per the above musings but also 1,000,000x our intelligence, due to wetware vs. firmware?
Who are they? Where do they belong? Do we have any more right to order them to work than a fish or bacterium has to order us around? If they are superior sentient beings, and we historically tune our moral duty toward someone or something equal to or above us, we could put ourselves in a situation where it is immoral not to obey objects we created.
youtube · AI Moral Status · 2020-06-11T16:3… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
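The dimension table above is a rendering of one coded record. A minimal sketch of how such a record could be formatted for display (the `record` dict and `render_table` helper are hypothetical; field names follow the table above):

```python
# Hypothetical coded record for one comment; field names mirror the
# dimension table above, values are illustrative.
record = {
    "responsibility": "none",
    "reasoning": "mixed",
    "policy": "none",
    "emotion": "indifference",
    "coded_at": "2026-04-27T06:24:59.937377",
}

def render_table(rec: dict) -> str:
    """Render a coded record as a two-column Markdown table."""
    rows = ["| Dimension | Value |", "|---|---|"]
    rows += [f"| {key.replace('_', ' ').title()} | {value} |"
             for key, value in rec.items()]
    return "\n".join(rows)

print(render_table(record))
```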
Raw LLM Response
```json
[
{"id":"ytc_UgxJm3E-y9TrZz9ZQyB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzIklbBYFK_wAcCJ3h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzl0909SCqhcqHvu3d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz0unC-Zl0FFY3fD-x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxwGdNyZq4xxa8UqVx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyysRPT99NXxnZX0Ql4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyylHmqBVEN4ysZzyF4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwXLI3Ga06JIPqCcbF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz8zJOWozqD_HM4NiJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSOH3Ousge2LRAiWJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
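A downstream consumer would typically validate a raw batch response like the one above before merging it into the coded dataset. A minimal sketch, assuming the allowed category values are exactly those seen in the sample output on this page (the real codebook may include more; `validate_response` is a hypothetical helper):

```python
import json
from collections import Counter

# Allowed values per dimension, inferred from the sample response above
# (an assumption; the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "developer", "user"},
    "reasoning": {"deontological", "consequentialist", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check each row against the codebook."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError(f"row missing comment id: {row}")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {value!r}")
    return rows

# Usage with a one-row response (illustrative id):
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"indifference"}]')
rows = validate_response(raw)
counts = Counter(r["reasoning"] for r in rows)
print(counts)  # Counter({'mixed': 1})
```

Validating eagerly keeps malformed model output (a misspelled category, a dropped field) from silently entering the coded table.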