Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific response by its comment ID.
Random samples
- "These people act like some of the best artists in history weren't disabled peopl…" (ytc_Ugz_lOHjG…)
- "Do you have a job that won't be replaced? well, than i got bad news for you... B…" (ytc_UgzwyUJgC…)
- "While their method is apparently valid, their sample size is close to zero, and …" (rdc_jskyuig)
- "If anyone here actually read the article from Anthropic, you would see that the …" (ytc_UgxObEbvI…)
- "AI art is the same as a person putting a ready meal into a microwave and saying …" (ytc_Ugwb-8TgA…)
- "Thank you, this video has given me some hope for the future and I really need so…" (ytc_Ugw6jlHOo…)
- "He's not the OG, AI began in the 1950s. He's also not the godfather. He is one o…" (ytc_UgxmIXlgp…)
- "This is inevitable. One day everyone would be able to build their own terminator…" (ytc_UggYWv6ER…)
Comment

> Assume Very clever people will apply AI to most pressing problems:
> - The environment and climate change
> - improving productivity, economic efficiency and remove mundane work with automation
> - reduce crime and poverty.
> - get rid of human suffering such as war, disease, starvation and mental health problems.
>
> Wouldn’t an intelligent agent to solve all the above problems, on the basis of pure logic, recommend to get rid of (most) people?

Source: youtube · AI Governance · 2025-06-25T21:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzjHhZP_sVQ-HsYUUl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxieAyXjJpKA1Lvk-B4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzSLHwoPLGzsBwBM254AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz7YRLsoRkibqPlxKN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyBpFJmWdO9-A2-poJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyDhR7t9dqJu1Mtuc94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgztSgqqDkDX4QMI4td4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxgDGyAc-KhT0sONct4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz6I7ZG3kfd772bK7l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgysNaICClKUlbKPkxR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
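The look-up-by-comment-ID workflow described above can be sketched in a few lines of Python. The JSON structure is taken from the raw response shown here (only the first two records are inlined for brevity); the variable and function names are illustrative, not part of the actual pipeline:

```python
import json

# A slice of the raw LLM response above: a JSON array of coded records,
# one per comment, keyed by the "id" field.
raw_response = '''
[
  {"id": "ytc_UgzjHhZP_sVQ-HsYUUl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxieAyXjJpKA1Lvk-B4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "fear"}
]
'''

# Index the coded records by comment ID so each lookup is a dict access.
records = {rec["id"]: rec for rec in json.loads(raw_response)}

# Look up the coding result for one comment, as the panel above does.
code = records["ytc_UgzjHhZP_sVQ-HsYUUl4AaABAg"]
print(code["reasoning"], code["emotion"])  # consequentialist approval
```

The dict built from the parsed array mirrors what the "Coding Result" table displays: the first record's values (consequentialist, approval) match the table for this comment.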