Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "What a boring video, of course AI is bad, everyone knows it, half of the video i…" (ytc_Ugxy1SvO3…)
- "I honestly hate people who think they are artists when they print other peoples …" (ytc_UgzRUZbRW…)
- "AI IS AN EXCUSE TO DOWNSIZE HUMAN CAPITAL AND RETAIN ONLY DEEMED QUALITY HUMAN C…" (ytc_Ugwxcg8CR…)
- "Hey @menanglembaamri4598, thanks for your comment! And to answer your thought-pr…" (ytr_UgwUxsysl…)
- "Like puppeteering corpses. Fuck it, lets do an AI interviews of people on the to…" (ytc_Ugzf_Hay_…)
- "Get AI to collapse and take the filthy billionaires with us. They will have a pl…" (ytc_UgzA617Rf…)
- "No were not actually doomed by ai, while companies had started using ai however …" (ytc_UgzA5ChJJ…)
- "What are you saying Krystal!? I’m sure you are using ChatGPT to do research and …" (ytc_UgzUv1zMY…)
Comment

> That was great up until the last guy… what a bigot. he had no respect for the woman, would not let her speak at all, seemed to have an ego bigger than the entire room.
>
> Regarding drone decision making (fear of accidental innocent deaths via automated drones) this kinda has nothing to do with chat GPT and this fool is trying to punch down on Altman treating him like it is. Would the military have AI built to auto pilot their drones that is built on a chat bot or large LANGUAGE model? No. LOL. Some people just fear what they do not understand.
>
> Yes, it needs to be regulated, but more than that, humans need to be EDUCATED.
>
> Chat GPT can ASSIST with that.

youtube · AI Governance · 2023-05-17T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
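The coded dimensions appear to draw from a small closed codebook. As a minimal sketch, the value sets below are inferred from the sample response on this page (they are assumptions, not the full codebook), and `invalid_fields` is an illustrative helper name:

```python
# Candidate value sets inferred from the sample raw response below;
# these are assumptions, not the project's authoritative codebook.
CODEBOOK = {
    "responsibility": {"none", "distributed", "government", "user", "company"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "outrage", "approval"},
}

def invalid_fields(row: dict) -> list:
    """Return the dimensions whose value falls outside the inferred codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if row.get(dim) not in allowed]

# The coded result from the table above, expressed as a row.
row = {"responsibility": "none", "reasoning": "unclear",
       "policy": "none", "emotion": "outrage"}
print(invalid_fields(row))  # [] -> all values fall inside the inferred sets
```

A check like this is useful after each model call, since free-form model output can drift outside the expected label set.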
Raw LLM Response

```json
[
  {"id":"ytc_Ugy1BlnAEs37BcyogR14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwf2jAISg3Kb1p2H654AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzC7af0T_u2C9wLMxV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzK2NM92_6Nbw-WY1J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyjOMhqVO4lUGk6LsZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz7CrEzOu4lGvL5y554AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzTXxPcVMFBgrUMlnF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxzEfH7_vkWqJP38PV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugzlq40G64K_7QEoc2B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz9SIiqU6JGqdw81VZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
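The lookup-by-comment-ID flow above can be sketched as follows. The schema (one object per comment, keyed on `id`) is taken from the raw response shown; the function name `index_by_id` and the two-entry sample are illustrative, and real responses should be parsed with error handling since model output is not guaranteed to be valid JSON:

```python
import json

# A raw coding response in the format shown above (truncated to two entries).
raw_response = '''
[
  {"id": "ytc_Ugy1BlnAEs37BcyogR14AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzC7af0T_u2C9wLMxV4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
'''

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and index its rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

coded = index_by_id(raw_response)
print(coded["ytc_UgzC7af0T_u2C9wLMxV4AaABAg"]["emotion"])  # outrage
```

Indexing once and looking up by ID avoids rescanning the full response for every inspected comment.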