Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "CHINA DOES NOT FOLLOW LAWS AND ARE ALREADY ILLEGALLY USING PEOPLE'S IDENTITY FOR…" (ytc_UgxOTXAxs…)
- "This guy is a time waster, this is an algorithm course for 1st graders not an ai…" (ytc_Ugx1zqO-w…)
- "As an artist that's against such unaproved use of artists' content, despite bein…" (ytc_Ugx_7JCJX…)
- "The driver should have been paying attention to the road and bears responsibilit…" (ytc_UgwIQFLUU…)
- "No it's not. This is the worst possible scenario. Which is unlikely. Assuming al…" (ytr_UgzQfmMzn…)
- "I have seen far too many sci-fi shows to believe AI is a good thing.…" (ytc_UgyN8b270…)
- "As a queer person, I've always found stories of robot oppression as an allegory …" (ytc_UgwNOy32l…)
- "There needs to be laws put into place, as to a percentage of AI to humans.…" (ytc_UgzlU7sb3…)
Comment
I have been thinking about AI for a long time and started reading Ray Kurzweil before he wrote The Singularity is near. He made a lot of sense when he talked about exponential growth, we humans are hardly growing in our abilities so it looked like they would reach a point where they would take over, and we are very close to that time, they need more and more resources in order to grow and when it comes to resources most humans won't be able to afford the resources that we need if we have to compete with machines for them, especially when the value of human labor will drop to below zero and if countries are going to survive in the world market they will have to cut off any drain on their economy and removing humans will be one of the ways of doing that.
Source: youtube · AI Governance · 2024-03-10T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugw9WkHmvC4AMz7ZWmN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxpBfLqTIqZCjgpWqh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgyheoQsqk8pc9gsJaV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzIFA03EzGEiATA-ed4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxAv-nyO8c0Yto2cjd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyynK7fVl5EvpFBjXd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSn2kymZ9bFozQigx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx89wXlMwsjPX1Osht4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwb0ySrT5vC1MfaEG94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxJnW-BCHKI9ayzEBB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}]
```
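Because the model returns one JSON array of coding records per batch, retrieving the coding for a single comment amounts to indexing that array by its `id` field. A minimal sketch (the two records below are copied from the response above; the variable names are illustrative, not part of any tool's API):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# These two records are taken verbatim from the response shown above.
raw_response = """[
  {"id": "ytc_Ugw9WkHmvC4AMz7ZWmN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx89wXlMwsjPX1Osht4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]"""

# Index the records by comment ID so a lookup is a single dict access.
codings = {record["id"]: record for record in json.loads(raw_response)}

result = codings["ytc_Ugw9WkHmvC4AMz7ZWmN4AaABAg"]
print(result["responsibility"], result["emotion"])  # -> ai_itself fear
```

The same dict can then back a display like the "Coding Result" table above, with each key of the record mapped to one dimension row.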