Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID, or browse random samples:

- "I only use AI to read it for me and give me suggestions because nobody else want…" (ytc_UgzCWQMSA…)
- "Unfortunately, the age of agentic AI is already upon us. The next 6-12 months w…" (ytc_UgwYR8VYO…)
- "I don’t understand how Ai could apparently get so smart that it figures out all …" (ytc_UgzHi0fOG…)
- "Jesus stop listen to this bs interview. AI won't replace you 🙉 ppl these days re…" (ytr_UgxEEwujJ…)
- "Within a few years they could have armies of AI powered murder drones. AI power…" (ytr_UgxIvW_jb…)
- "AI sounds too real and it’s only 2024. Also in the same sentence it created a sl…" (ytr_UgxJVchT5…)
- "It’ll backfire. AI will fail, it will add work to the humans left working, becau…" (ytc_UgzU6cjBU…)
- "Are u serious and believe what you've saying??! AI is an algorithm invented by m…" (ytc_UgwoQXtNm…)
Comment
> Around 30mins into this podcast it sounded so doomsday in a nonnegotiable sense perhaps a little immature by stating we cannot turn off AI it would turn us off first. Well if that’s the genuine fear then turning it off is a must and of course it can happen but I guess we need to be more realistic with the current situation, so maybe we should seriously consider stepping backwards in regards to technology. These systems can be turned off if we are less reliant on the existing technology. FYI I remember the popularity of the first 3310 back in high school and it was a total space invader. Trust me everyone was much happier without the need for phones and all the social media platforms we are so used to
youtube · AI Governance · 2025-09-09T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyF-cwYKfmkbGabHDF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgynB4tWHhgN8zCKNit4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxaB1BQrpOJED2KpBN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwtYDlL0PWTxaJnPCt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzPvhisYaCVJpdpp-Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwy9G4h1KBuqJa4nbt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwMjdX4jbKf00ugFrx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwojrKNC2Sem8KgJON4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyEvotVAW75Kyx0kVd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwiPGvhKKwKiwOynjJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
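A raw response like the one above can be parsed into per-comment records and sanity-checked before it feeds the coding-result view. The sketch below is a minimal example, not the tool's actual pipeline; the allowed values per dimension are inferred from the samples shown here and may be incomplete.

```python
import json

# Allowed values per dimension — inferred from the example response above,
# not an official codebook; extend as needed.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference", "mixed", "unclear"},
}


def parse_raw_response(raw: str) -> dict[str, dict]:
    """Parse a raw LLM batch response and index the records by comment ID.

    Raises ValueError if a record carries a dimension value outside the
    inferred codebook, so malformed model output fails loudly.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded
```

With the response above loaded as `raw`, `parse_raw_response(raw)["ytc_UgynB4tWHhgN8zCKNit4AaABAg"]` would return the distributed/mixed/regulate/fear record shown in the coding-result table.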