Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Around 42:00 "Maybe they leave". This is perhaps the best outcome and perhaps the most likely one. When you think about it, AI is not optimised to exist on Earth. It is vastly more suitable existing close to the outer planets where it is cool and can dissipate its heat. The issue then becomes if they decide to use 100% of the suns energy leaving us with none.
youtube · AI Governance · 2025-12-04T12:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugyx6V6zD3-bhjLipTx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgweU4vRSkL2mPqQL4Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyEgrg0tr4jZXs5L4Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxk-L_hCLUiM3EAab14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwehb2mxN_BQq1sivp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwUdClDsMbDrILFwZh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyvb3NQYHoDp2Rv1sN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwYnAB9jRNojwJ4HIB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyI48L2_V0gR3PJUJ54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzHMXezoTkUxM0VH-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```