Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Everything was going great until he said machines can be conscious and "people d…
ytc_UgyoD0ZhU…
We are machines we are just like ai as in we have memory banks learning everythi…
ytc_Ugz8LYD3A…
I don't know what AIs you are using but Cursor with Gemini 2.5 Pro can make a so…
rdc_moy1b9e
“No, it’s impossible that we’ve created a sentient AI; we have a policy against …
ytc_Ugxx3BQd-…
Yes, put a gate on that info. But I realize how appetizing that concept is.....…
ytr_UgyldXXGg…
AI will learn to do plumbing. I have a very real warning like this guy. I hel…
ytc_UgyiwpxK5…
Different person same Ai researcher.
The Ai ethics researcher (Timnit) was fire…
rdc_gm0y271
Let us prepare by believing in Jesus Christ in His Word the Bible which is the W…
ytc_UgxjihCxj…
Comment
AI is about power and control. The investors are tools that smart people are using to achieve power and control. Smart people are tools investors are using to make more money. Safety is pushed aside because it slows things down. The sooner we achieve super intelligent AGI the sooner all problems are solved. None of the smart people are in AI research because they care more about money than power. Who wouldn't want the power to cure cancer, to stop people from dying, or to solve global issues?
youtube
AI Governance
2025-09-09T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugy5E3nQP9sxWd2iFl14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxOC_K0549C7ZEu34p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwyV5Q6H_R2vgXKUZt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgylN1pUp2rLoDUluv94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyrO1WS8HFVMhZ_6-B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLxtLnYxv3MBMhTkt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgzePez1bzKi5XWpD7h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZF_slill-9NnSkiF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzCcvcMjCpxZ_v7WNl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwezpzYqHwBwo5Hpgl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}]
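The raw response above is a JSON array of per-comment codes, one object per comment ID with the four dimensions shown in the coding-result table. A minimal sketch of parsing and validating such a response (the `ALLOWED` vocabularies are assumptions inferred from the values visible on this page, not a confirmed codebook):

```python
import json

# Allowed values per dimension — inferred from the codes visible above,
# not an authoritative codebook.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "fear", "mixed", "unclear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response into validated per-comment code records."""
    records = json.loads(raw)  # raises ValueError on malformed JSON,
                               # e.g. the stray ')' fixed above
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing comment id")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes[0]["policy"])  # → regulate
```

Validating against a closed vocabulary like this is what lets a "Coded at" table fall back to `unclear` instead of silently storing an out-of-vocabulary label.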