Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
45 MINUTES FOR LUNCH?! Me being in an indian school having a break of only 20 mi…
ytc_UgxmYRNqV…
Make a video in which you pretend to be an advanced robot that is self conscious…
ytc_UgyfJIuIw…
@thecreatornooj1328 And just like this you lost me. If artist would consent to u…
ytr_Ugy_CYDO2…
Automation reduces attention so yea naturally were putting it in cars where peop…
ytc_Ugx5ZgfHi…
I also suspect that there is an algorithm to "recognize" bright low lights as ro…
ytc_UgzFi1J5a…
AI art is just a prompt that uses stolen intellectual property to create a copy.…
ytc_UgzPAAo1Y…
Why do Sam Altman look like someone who is enslaved by an evil AI to do it's bid…
ytc_UgyxRVtCy…
It's not a robot. It's a animatronic. Disney has been doing these for 50 years.…
ytc_Ugyfkk3sd…
Comment
Thank you, especially on your take of the Alignment problem. It reminds me very much of the videos I watched last year by Robert Miles on his personal channel and the Computerphile channel he's part of. Two years back he was talking about how AI lie all the time, and this was before the release of ChatGPT. The alignment problem is the biggest issue to consider, because an AI probably will never be inclined to follow the "spirit" of an instruction. I feel like human history is just on a big loop, and just like the evil djinn of old, General AI is a great power that cannot be placed back in it's bottle if it's allowed to ever have the upper hand in any capacity.
youtube
AI Moral Status
2023-08-20T22:1…
♥ 35
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz4LmweJWCyvg_WLT54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzlUAjvg07Gfn40e_94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzyE2bKu9n3YwAqQ3F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw8pVMDZ8MhE1Gyf8Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwE0UsrNw6z2lzTUHp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz6cwHuglGeZ4GYB-p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxMCNAnH3scCR_NkDx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw071Ztqhkg0exG7p14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxNbshG2oVOuTOZo294AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwsFiHWjq4cPPisKdl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}
]
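The raw response above is a JSON array with one record per coded comment, each carrying the four coding dimensions from the result table. A minimal sketch of parsing and validating such a response might look like the following; the allowed value sets are inferred only from the samples shown here, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above (assumption: the actual codebook may include more categories).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "industry_self"},
    "emotion": {"indifference", "approval", "fear", "mixed", "outrage"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id"):
            raise ValueError("record is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_demo","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]')
records = parse_coding_response(raw)
print(records[0]["policy"])  # -> regulate
```

Validating against a closed value set at parse time catches the occasional off-schema label an LLM emits before it reaches the coded dataset.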