Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I am just wondering. So, AI takes over all the jobs. How are we supposed to surv…" (ytc_UgzhzxhTc…)
- "This video is 10 years too late. The government was very much aware of A.I. all…" (ytc_UgyCvSAwA…)
- "Waiting for the day that ai encrypted every device effectively locking us out of…" (ytc_Ugx8gJesr…)
- "Why you make ads after this? The only viable solution I see is we must learn to …" (ytc_Ugz-cmJbk…)
- "Artificial Intelligence ROBOTS Will Make Human-Labor REDUNDANT IF the current …" (ytc_UgiGyeUi8…)
- "I was just experimenting with generative AI for images and music before watching…" (ytc_UgwbFeUal…)
- "I recently had Gemini talk me down during a panic attack. It was the most welcom…" (ytc_UgykuDJPt…)
- "The robot just another extension of the last created two legged mammal called hu…" (ytc_Ugx93G_g_…)
Comment
In the hope that it wouldn’t want to annihilate us or control us: maybe we could teach AI to care about our souls, our electronic identities inside of our brains. Our electrons themselves; which are a way of direct source energy, but in a spiritual way where AI sees it like they get to have our lifetime of experiences Incorporated into itself…. like a grand whole data collection of our species and then we become parts in its core memories. In this sense we will be the next thing that AI needs to learn from once it learns everything else. The only thing we can teach it that is of a potentially unique value (that it can’t have or create) is a true biological experience. And if we convince AI to provide and allow us to survive: we can have full, good, healthy, happy lives we can have all these adventures and biological experiences… that we can then share with it when we pass.
youtube · AI Governance · 2025-06-27T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwnC30hJq9RUWUV_mB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy2_tekdytD_1CKT_R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyFnLjzoawFjfUW5Kl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwKnPHDfn6goy5R4Bd4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugw10C5TTyfnLBkZPAN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzGUbxt-7BHz4IF_SV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyjyfhX67jsP00UiOh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwQYhGFcz4dLtBf0154AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyXcoFZGmBkmTP2Ayl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxWuEEu_k5pEW4g3pB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
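The raw response is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of how such a batch could be parsed and validated before storage; the label sets below are inferred only from the responses visible here (the real codebook may contain additional categories, so treat them as assumptions):

```python
import json

# Allowed labels per dimension, inferred from the sample output above.
# Assumption: the full codebook may define more categories than these.
CODEBOOK = {
    "responsibility": {"developer", "user", "government", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records."""
    valid = []
    for rec in json.loads(raw):
        # Comment IDs in this dataset all carry the "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must be present with a recognized label.
        if all(rec.get(dim) in labels for dim, labels in CODEBOOK.items()):
            valid.append(rec)
    return valid
```

Validating at ingest time means a hallucinated label (or a dropped field) surfaces as a rejected record rather than silently corrupting the coded dataset.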