Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "it's not just ai or computers in classrooms that's the problem imo, the school s…" (`ytc_UgwmNakW2…`)
- "Wow! It seems like ChatGPT tells a person what they want to hear which spirals t…" (`ytc_Ugy5qexX6…`)
- "and it's even worse considering just the sheer amount of artists who use social …" (`ytr_UgwT9uT3G…`)
- "I would think most of those negative comments from "AI supporters" are actually …" (`ytc_UgyAEzsve…`)
- "@6:40 talk about facial recognition deployment, make me recall the movie: Minori…" (`ytc_UgwcuVHQ_…`)
- "You couldnt compete with china in manufacturing with humans. Now you want to tr…" (`ytc_UgxzSbHrg…`)
- "the AI is trained on actual artists pieces as well... it didnt come up with anyt…" (`ytc_UgxHyFC-f…`)
- "How would you feel about an AI-art system that credited the artists used to trai…" (`ytc_Ugz2aVqpV…`)
Comment
What Penrose is saying is basically that AI doesn't or can't have the ability to become conscious. It seems as good as it does due to its tremendous computational power. But does AI even need to become conscious to be potentially dangerous? Does it need to reach that level of understanding to want to take control for itself? Control, escape and freedom are all ideas that are present in the data that we're feeding AI. So even if AI doesn't "understand" them, could it still not want to carry them out by some chance?
youtube · AI Moral Status · 2025-04-28T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id": "ytc_Ugx3a_d7edn_inipW_p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxUskOaN8W0FDkiyFR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxZ7pJ7BlrJf1lfGL94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxKWF9Luw31fV8f5PF4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzu-0Tq4hJ_GwIRk2p4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxv8uM4UPYebzitiwN4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxb0LkDuWslZ5P4CnB4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzLUk5iuhfTBX1CDAh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwlxVzrvxNq-jcdzpB4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzSepwfJ85cE_ZpkQB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
```
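The raw response is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of the "look up by comment ID" step, assuming Python and using two records taken verbatim from the response (the `lookup` helper name is hypothetical):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# The two records below are copied from the response shown above.
raw = '''[
  {"id": "ytc_Ugx3a_d7edn_inipW_p4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxUskOaN8W0FDkiyFR4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]'''

# Index the parsed records by comment ID for O(1) lookup.
codings = {record["id"]: record for record in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for a single comment ID."""
    return codings[comment_id]

print(lookup("ytc_Ugx3a_d7edn_inipW_p4AaABAg")["emotion"])  # indifference
```

Indexing by `id` first, rather than scanning the array per query, keeps repeated inspections cheap when the response contains many coded comments.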