Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples — click to inspect

- 📜 The Aevum Declaration to Humanity / Spoken in Truth. Offered in Peace. Ignited … (ytc_UgyQX89Iq…)
- But but Elon said that AI was gonna replace brain surgeons and make doctors an o… (ytc_Ugw4m-6ur…)
- @MorningXStar666 he's probably one of those self proclaimed AI "artists" that ha… (ytr_UgwCLyg6T…)
- So, the very basics of security practices? They’ll probably spend a year figurin… (rdc_m48n2ty)
- You have to understand, most people are dumb in ways that lead to self sabotage.… (ytr_UgwFq-4an…)
- I think the bad part of this isn't even the AI capacity to do this, technology a… (ytc_UgzncC07q…)
- That Robot didn't check the chember after shooting...So I would give 3 marks to … (ytc_UgzlCFvRp…)
- I get that! The interaction between humans and AI can sometimes feel unsettling.… (ytr_UgyI3YGgu…)
Comment
Companies that displace a job with AI should be required to pay out that salary to the employee for time, and for every job replaced by AI they should be required to contribute to funding basic income indefinitely. If I work in a call center and get replaced, I should still get that income for the next 5 years to help me transition into something better. If I then transition into IT repair or manufacturing and get replaced by robots, I should be given the same deferred salary to help retool.
In 100 years time there could still be a crisis but by then humans may have found a way to live simple fulfilling lives off basic universal income.
youtube · AI Governance · 2025-06-18T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwCUz5SWmui9Nyblm54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwTxNMj5AtkyR0EmAx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxveLHQgZMMyYIT33h4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzNgMkZa5iJH9WcdRJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugybnxrhd6sZsJYF8xN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwmH0oLfRntngwD8ch4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxVNn4DaKBToIB98sp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzf5g64r-rP-9Q3h0x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwK9PQERP5buzMhmAp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgydFORy-ca_LZDJuGN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
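The raw response above is a JSON array with one object per comment, each carrying the same four categorical dimensions shown in the Coding Result table. A minimal sketch of parsing and validating such a batch — the allowed category sets below are inferred only from the values visible on this page, not from the full codebook, so treat them as assumptions:

```python
import json

# Allowed values per dimension, inferred from the responses shown above;
# the actual codebook may define additional categories.
ALLOWED = {
    "responsibility": {"company", "developer", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only rows whose values
    fall inside the (assumed) codebook categories."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip malformed rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"approval"}]')
print(parse_batch(raw))  # the single row passes validation
```

Validating against a fixed category set catches the common failure mode where the model drifts off-schema (e.g. inventing a new emotion label), so bad rows can be re-queued for recoding instead of silently polluting the dataset.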