Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
All this is going to require an enormous amount of electricity.
There are no AI's that run on gas or petrol.
Therefore to take over the world then AI had better be good at drilling for oil - LNG and shovelling coal or handling Uranium/Plutonium and wiring a plug.
Robot soldiers only need a bullet through the battery ( electricity again ) and that's the end of them.
AI will only do what it is asked to do - told to do.
The phrase Garbage in Garbage out springs to mind.
If you let psychopaths and lunatics who worship money teach it morals then guess what?
it's morals will be the teachers morals.
Or lack of morals.
youtube · AI Moral Status · 2025-06-07T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyNSmddZbEoT2KYXtJ4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugx4XQAt7ZyMzWR1YYh4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgygNwi_Ytbeal71QMp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxVO41nQejFAXwObgF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxfIFdsYukbbnb4Iop4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzpP1H_Dv1ZihYbfKt4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz0uCMtRRlsPK4gVBR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwCdFQSuqIuv8OG-CV4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyoITgJlHp9xJYX5eJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxXhs11PIqq13RQOTR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
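A batch response like the one above can be parsed and sanity-checked before it is loaded into the coding table. The sketch below is a minimal example, assuming the label sets visible in this sample (e.g. `ai_itself`, `deontological`, `regulate`) are the allowed values for each dimension; the real codebook may define additional labels, so `ALLOWED` here is an assumption, not the authoritative schema.

```python
import json

# Allowed labels per coding dimension, inferred from the sample response
# above. NOTE: this is an assumption -- the actual codebook may allow more.
ALLOWED = {
    "responsibility": {"ai_itself", "government", "user", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "resignation", "indifference", "approval",
                "mixed", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw model response and reject rows with missing IDs
    or labels outside the expected sets."""
    rows = json.loads(raw)
    for row in rows:
        if not row.get("id"):
            raise ValueError("row is missing a comment id")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row['id']}: unexpected {dim} value {row.get(dim)!r}"
                )
    return rows

# Example with a single (hypothetical) row in the same shape as above.
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
rows = validate_batch(raw)
```

Validating before ingest means a model that drifts off-schema (e.g. inventing a new emotion label) fails loudly at coding time rather than silently polluting the coded dataset.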