Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect):

- "The AI companies are starting to panic. Don't fall for the hype. AI agents still…" (ytc_Ugy-1b7Dq…)
- "I guess you’re unaware that the domestic violence hotline is a chat bot along wi…" (ytc_Ugz0lOr4i…)
- "AI is still in its infancy. Truly good AI won't arrive for many decades. When it…" (ytc_Ugyr6ioN5…)
- "From my book, Prompting Happiness, which predicts the future of the pursuit of h…" (ytc_Ugx7wSLzc…)
- "AI companies have the right to use your material to make their own shit as long …" (ytc_Ugw0InYNG…)
- "The unemployment rate during the great depressions was 25%. Currently the US une…" (ytc_Ugx3opiT4…)
- "boeing lost its way to keep the "safety" above all when competition from airbus …" (ytc_Ugyx_Xx33…)
- "AI workers would not be free of cost. There would need to be either a profit, o…" (ytc_Ugx4NuifC…)
Comment

> If it took humans less than 100 years of technology to advance and build AI, something hundreds of thousands of times smarter than the smartest human, how long will it take for AI to create something thats hundreds of thousands of times smarter than them. 😮

youtube · AI Governance · 2023-04-18T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxw1ARKXXpIZO6dxvl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzgXH8V7DPa1z5g8ch4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwNPsCU-6IKHR0jaUp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyX8blT0orS2OEOXjZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhoEEPrlKEhTse7qR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwdQYLuGCk34JdKBLN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxTO8J7cuilpnHpDNN4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgzEDrkrK2wmJrw0qx54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyP8Q_jk1ByJa5m5j94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEPwEEQ0L8cgg7PLZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
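A batch response like the one above can be parsed and looked up by comment ID in a few lines of Python. This is a minimal sketch, not the tool's actual implementation: the `SCHEMA` sets are inferred from the values visible in the examples above (the full codebook may define more categories), and `parse_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension (assumed from the examples above;
# the real codebook may define additional categories).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "unclear"},
    "policy": {"none", "industry_self", "liability"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index the rows by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        # Reject rows whose values fall outside the (assumed) codebook.
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = row
    return coded

# Two rows copied from the batch above, for illustration.
raw = '''[
{"id":"ytc_Ugxw1ARKXXpIZO6dxvl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxTO8J7cuilpnHpDNN4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"industry_self","emotion":"outrage"}
]'''

coded = parse_batch(raw)
print(coded["ytc_Ugxw1ARKXXpIZO6dxvl4AaABAg"]["emotion"])  # fear
```

Validating each row before indexing it means a malformed or hallucinated label fails loudly at ingest time rather than silently polluting the coded dataset.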