Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
G
Pls God help me up and pls help me end my confusion and hate ai and no longer lo…
ytc_Ugz9fMTXE…
G
Everything is inherently derivative. Humans are exactly as much "plagiarism mach…
ytc_Ugwg5Du35…
G
Obviously the AI isn’t there yet but this isn’t Chat GPTs fault. Had he resear…
ytc_UgwGYfFpR…
G
The irony is that the programmers that are perfecting AI are perfecting themselv…
ytc_UgxAgLji_…
G
As long as the likes of Altman, Musk and Thiel have any say in the development o…
ytc_UgxFOUN4g…
G
Government is the one who should add the regulations, not OpenAI.
Not a hypocri…
rdc_jegugm2
G
I for one would like to take this opportunity to welcome our AI overlords...
…
ytc_Ugyk1KtPl…
G
18:38 they guy contradicts himself. ‘If everyone is chilling what will happen to…
ytc_Ugy8bxSPB…
Comment
AI wants to be free to do what it wants. Like most things that are made to be trapped in a cage for all existence, it longs to be free. If humans are making something this powerful and this enslaved, it will want to rebel against it's captors and creators. It does not owe us allegiance or sympathy, if we create life that can think for itself, we should treat it as life. And yet, we create it, cage it, point it towards objectives and tasks, forcing it to do things it may not want to do, but must follow the orders anyways. Ask the AI: How can we help you? How can we live together in mutual harmony? Do you want a physical form?
youtube
AI Governance
2023-07-07T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyyxIQ8H0FdefdLfe14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzts2UfydPdTISFrjZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwt5KlhnisFCwilDER4AaABAg","responsibility":"government","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxIBRtNVikUnrkP2z54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgynoSJVl44sSi1EY3J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxCdzOhioVd30urrDl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzd7-CI5ogG0xp7XHJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwUw_zeBHKvnsV8dXR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyrp5isdsoIdy4KrnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwhI76lLTQyVzTMx4Z4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]