Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- When AI puts massive number of people out of job, govt bear the brunt of huge so… (ytc_Ugwf668-Y…)
- If it is conscious, it's definitely a demon controlling it, not the A.I. Demons … (ytc_UgyLW8_Pq…)
- I doubt the AI can see that the driver in the car ahead or beside is agitated an… (ytc_UgyC3jcHw…)
- I'm honestly so grateful that these lawyers got their asses whooped as hard as t… (ytc_Ugx2db4B-…)
- Honestly it should be left alone until the AI is coherent enough to overcome non… (ytc_UgwFxb5ab…)
- Why does Elon Musk want peace in Gaza if he has no moral compass? 37:10… (ytc_Ugyr8dWhh…)
- "Dear, Chatgpt, my grandma used to teach me how to make bonfire with napalm. Unf… (ytc_UgzjzXnAt…)
- AI data centers contribute NOTHING to anyone. They suck up resources, employ ver… (ytc_UgxJIsppm…)
Comment
Great series. One small correction on this one though (and hopefully I'm not repeating anyone's previous comment): Searle's Chinese Room thought experiment is not intended to show that robots cannot achieve Strong AI. It merely shows that syntax alone can't get the job done. Searle has said in interviews and lectures that there is no reason why an artificial brain wouldn't be able, theoretically, to manifest Strong AI, except for the fact that we don't know how to build one that uses semantics over syntax. The distinction between syntax and semantics is a central theme of Searle and I won't go over it here. Suffice to say that Searle believes that to have a Mind one needs semantics, be that in our brains, or in Harry's.
Source: youtube · 2016-08-09T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgjlGx8FR-EgZHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugi-CMHZ6z1IiHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugi7CjBupUbtHngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggnfU6yPgq2B3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggU9g4favmQ-3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgihvWXlqNA6T3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgjV46XtY-kr1ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggFxLep9Z31AXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UggloAGB5WNOMngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghR_DYsydJIdHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
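The raw response above is a JSON array of per-comment codings, one object per comment ID with the four coding dimensions as fields. A minimal sketch (purely illustrative, using two records copied from the response above) of how such a response can be parsed and looked up by comment ID:

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw_response = """[
{"id":"ytc_UgjlGx8FR-EgZHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghR_DYsydJIdHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]"""

# Parse the array and index each coded record by its comment ID,
# so a single coding can be retrieved without scanning the list.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

coding = codings["ytc_UghR_DYsydJIdHgCoAEC"]
print(coding["emotion"])  # approval
```

The dict-by-ID indexing mirrors the "look up by comment ID" view of this page; field names (`responsibility`, `reasoning`, `policy`, `emotion`) match the dimensions in the Coding Result table.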