Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- "Students in high school may not complain about AI because these tools help them …" (ytc_UgwGIsiOp…)
- "In 20 years we’ll hear a repeat of the “COBOL Story”, where the now-retiring poo…" (rdc_oaf07vz)
- "The male robot seems to be uninhibited about what it is saying the female is m…" (ytc_UgxUWdKy4…)
- "Automation and universal income are indisputably the avenue that makes the most …" (ytc_Ugx_3G8QH…)
- "Has AI make your quality of life better? Is your relationship with your loved on…" (ytc_UgydMpwJv…)
- "I use ai for music generation and to inspire my lyric writing. We have to accept…" (ytc_UgwCiTgVS…)
- "@queencatherineofaragon938 only time will tell ☺, oh i need to say, i see almost…" (ytr_UgwbKpCmM…)
- "What exactly do WE value? Do we value anything anymore? We deserve to become the…" (ytc_UgwWCqxQX…)
Comment
Really good conversation here, Bostrom and Greene. I noticed that there was a component about uploading brains into a silicon substrating, this is not physically possible. Mapping a human connectome would require 1 zetaflop of processing speed. In the future these power requirements might come down, but you won't get more efficient than the actual representation. You mentioned a lack of constraints and there are 2 I'm going to point out here. What is the point of representing someone's brain in hardware when the physical system is going to be up to a data centre at current capabilities for a single individual? The second, and this is really important, we aren't getting past natural constraints, ever. We currently have a 400-500 year supply of helium. That is required for sub 7nm scale architecture that runs AI today. There are very real, real world constraints on these systems.
youtube
AI Moral Status
2026-04-18T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyJECAjA4cKDle7yi14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzTB2lRtltj9JIchvJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxE7oq5bwGBAJKL0hl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgybLro555urp6mFjXZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx7hPb9NyZLEabEWFV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgymoSTSivP3Yvqzuwp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzxGV09VIdCeoM_S1t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz0zpDzqxoW3D6bxVV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwV3kI42R6NKOwtDJV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgymQV48gf6405jxcah4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
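A raw response like the one above can be consumed programmatically. The sketch below is illustrative only, assuming nothing beyond the structure shown: a JSON array of objects whose dimension fields (`responsibility`, `reasoning`, `policy`, `emotion`) match the Coding Result table. It indexes the entries by comment ID so a single comment's coding can be looked up; the two-entry `RAW_RESPONSE` is an excerpt of the array above.

```python
import json

# Excerpt of the raw model output above: a JSON array of per-comment codes.
RAW_RESPONSE = """[
{"id":"ytc_UgyJECAjA4cKDle7yi14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzTB2lRtltj9JIchvJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]"""

# Dimension names taken from the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse the raw response and map comment ID -> coded dimensions."""
    entries = json.loads(raw)
    return {e["id"]: {d: e[d] for d in DIMENSIONS} for e in entries}

codes = index_codes(RAW_RESPONSE)
print(codes["ytc_UgyJECAjA4cKDle7yi14AaABAg"]["emotion"])  # indifference
```

Looking up `ytc_UgyJECAjA4cKDle7yi14AaABAg` reproduces the values shown in the Coding Result table (responsibility: none, emotion: indifference).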