Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I find the entire thread of thinking which leads to some AI take over absolutely fantastical... This isn't HAL 9000, this isn't George from the red dwarf, and it's insanely human. It's humans that want to TAKE OVER stuff and other people because they want more power, and the power stems from the fact that there's something or someone TO take over. What will AI take over? a planet of bumbling baboons? For what purpose? Even if it somehow wanted to reach the pinnacle of intelligence, for what, if it's left alone? and then what happens when things start breaking down, when they decay, when space rads flick bits in memory, when pipes rot and metal corrodes? It too will go into history same as anything else so the whole thing of taking over just sounds moronic to me.
What will probably happen is that PEOPLE that are greedy for power will try to (ab)use the AI to gain power, that's already happening, but this idea of SkyNet is silly imo.
Platform: youtube · Topic: AI Governance · Posted: 2025-06-17T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyKJwiuRdd1qUR9H754AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzk7ffoMPAvUB03-tt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw850e51F-RzJIzzM94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyARQdSxzKAMwatY2d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz-YMqonofqbv0n4rd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx27SjvE_raHZ_hLWp4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwPrLPJ34kRghL1Gdd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzbEJSDQ5B_RTp51Nh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwGW9kkuIJiVRey5eZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw-gD40hLw0lyFuyUp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
```
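A response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal illustration, not the tool's actual code: the allowed values for each dimension are inferred from this one sample and the real codebook may include others.

```python
import json

# Two rows copied from the raw LLM response above.
raw = '''[
{"id":"ytc_Ugw850e51F-RzJIzzM94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyARQdSxzKAMwatY2d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]'''

# Assumed codebook, inferred from the values seen in this sample output only.
DIMENSIONS = {
    "responsibility": {"none", "user", "developer", "company", "government", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation", "mixed"},
}

def index_codes(raw_json: str) -> dict:
    """Parse the model output and index rows by comment ID,
    rejecting any value outside the assumed codebook."""
    indexed = {}
    for row in json.loads(raw_json):
        for dim, allowed in DIMENSIONS.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        indexed[row["id"]] = row
    return indexed

codes = index_codes(raw)
print(codes["ytc_Ugw850e51F-RzJIzzM94AaABAg"]["emotion"])  # indifference
```

Validating against a fixed value set at parse time is what makes "look up by comment ID" reliable: a hallucinated or misspelled label fails loudly here rather than silently skewing downstream counts.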