Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
The US no longer the super power. Bye Bye Vance. China beat you to the AI table.…
ytr_Ugx2yEr_M…
Any lawyer using AI to write legal stuff needs to have their law degree revoked.…
rdc_n5jiky5
Not unless doing so aligns with the goals of those in charge of large language m…
rdc_oh3tfuj
I genuinely hate AI "art"
I want to go back a few years where this was not a pro…
ytc_UgxKAYE57…
I noticed when I asked my ChatGPT what would it like to be called, it gave me a …
ytc_UgwmhcdHy…
Makes me thunk of how some people are actually trying to make comissions off of …
ytc_Ugw_ZTex_…
Yep, this has confirmed that I will not be paying for anything CGI ever again. …
ytc_UgxbSYLXh…
He is an ai expert if you don’t know and one of the leading ones, what he saying…
ytr_UgxFC4eD-…
Comment
As a computer engineer, I don't think we'll ever reach the point of having conscious AI; that's still science fiction. It's one thing to have lots and lots of data, make connections, do pathfinding and cross-relation, things that are hard for human beings but easy for computers. Remember, a computer is just a tool that does very simple tasks very quickly.
The simplest questions for a human being can be rather hard for an algorithm. For example, a monkey and a child: ask a computer which one is cuter and you'll start having issues. We know the entire DNA and neural map of some worms with ~300 neurons, we know the synapses, but we still don't know why the damn worm moves the way it does, or why it chooses to move right or whatever. Sometimes we like to think we are special and that we know a lot, but we don't.
youtube
AI Moral Status
2017-02-25T08:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgjR2zO_1LwfgXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggOs3HwjLeo6HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UggzjEvQA-SVuHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ughj52dn57v5_XgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UghQ9UQVYlM32ngCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjIXkiz05yonXgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UghVxTy-agwO-HgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghVIe6nF4TwM3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugh_UzizPwht13gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugjn9CpVjJQB5XgCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
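
The response above is a JSON array of per-comment codes, one object per comment ID, with four coding dimensions each. As a minimal sketch of how such a batch might be consumed, the snippet below parses a two-record excerpt of the array and indexes it by comment ID; the `parse_codes` helper and the required-field check are illustrative, not part of the original coding pipeline.

```python
import json

# Two records excerpted from the raw LLM response shown above.
raw = (
    '[{"id":"ytc_UgjR2zO_1LwfgXgCoAEC","responsibility":"none",'
    '"reasoning":"mixed","policy":"unclear","emotion":"mixed"},'
    '{"id":"ytc_UghVxTy-agwO-HgCoAEC","responsibility":"distributed",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)

# Field names taken from the response; the set itself is an assumption
# about what every record must contain.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw_text):
    """Parse a batch of per-comment codes and index them by comment ID."""
    records = json.loads(raw_text)
    by_id = {}
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing fields {missing}")
        by_id[rec["id"]] = {k: rec[k] for k in REQUIRED_FIELDS - {"id"}}
    return by_id

codes = parse_codes(raw)
print(codes["ytc_UghVxTy-agwO-HgCoAEC"]["policy"])  # regulate
```

A lookup like `codes["ytc_..."]["emotion"]` then recovers exactly the dimension values shown in the coding-result table for that comment.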