Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytr_UgzlxSh13…`: "@laurentiuvladutmanea3622 Most art is driven by the unemotional needs of a payi…"
- `ytc_UgzEd9z0b…`: "Spend a few days running stable diffusion locally and the limitations of this te…"
- `ytc_UgwB7BdQf…`: "This is just clearly a piece of opinion, which is fine. But teslas autopilot sur…"
- `ytc_UgwM7-hyI…`: "So what is it that they "know" is going to happen with AI.. complete job loss.. …"
- `ytc_UgxAZBhdi…`: "Oh i found some interesting key factors quite similar with Facebooks Meta ai. S…"
- `ytc_Ugzp0W5FN…`: "I don't think it's this complicated. Just be a good person and treat others, AI …"
- `ytc_Ugxrbs8zl…`: "Fr this is why my dad and i always say please and thank you to ai 😂 (Also just c…"
- `ytc_UgxsV0Eut…`: "What u say is right but all people are not artist those who want their photo to …"
Comment

> robots will be designed in all sorts of shapes and sizes, fit for purpose (so not even the plumbers are safe) however, if none of us have jobs we won't be able to afford whatever the goods or services the bots are providing so its all a bit of a moot point. I think the one thing that might save us goes something like this..... We have created this AI in order for it to serve us and help us solve problems, create things etc. It will take away our reason to live, but because of that, it will have no purpose, hopefully it sees this as a big enough problem to its own existence and it works to optimise our lives, not end them or make them redundant.

youtube · AI Governance · 2025-12-04T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
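A table like the one above can be rendered directly from a coded record. The sketch below assumes a flat dict with one key per coding dimension plus a `coded_at` timestamp; the field names are an assumption based on the JSON shown further down, not a confirmed internal schema.

```python
def render_coding_table(rec: dict) -> str:
    """Render one coded record as a two-column markdown Dimension/Value table.

    Field names (responsibility, reasoning, policy, emotion, coded_at) are
    assumed from the sample output; adapt them to the real record layout.
    """
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", rec["coded_at"]),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)

rec = {
    "responsibility": "distributed",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "mixed",
    "coded_at": "2026-04-26T23:09:12.988011",
}
print(render_coding_table(rec))
```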
Raw LLM Response
[
{"id":"ytc_Ugx68Xp57MkZcWQhtMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw9N9EU2ECr2BBkSoN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgweiMNjLEVvTlitmUJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy3iGtMeTwW_p4xAb54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgweG2tUuUoox0ZcbyF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxfzyIAQ3d4VKtOYR94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgRpT8ZghF6p5N0L94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgymW6uvFy5iHDwnI-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKbY26j5cjDE3pSYF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyBGf_sUEITCUiU4Hp4AaABAg","responsibility":"user","reasoning":"unclear","policy":"ban","emotion":"outrage"}
]
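The raw response is a JSON array of per-comment codes, so it can be parsed and sanity-checked before the values reach the result table. A minimal validator is sketched below; the allowed category sets are inferred from the values visible in this sample output, not from the actual codebook, so treat them as assumptions.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "distributed", "developer",
                       "company", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"resignation", "fear", "mixed", "indifference",
                "outrage", "approval"},
}

def validate_coded_batch(raw: str) -> list:
    """Parse a raw LLM response and check every record against SCHEMA.

    Raises ValueError on an unexpected id prefix or category value, so a
    malformed model response fails loudly instead of polluting the results.
    """
    records = json.loads(raw)
    for rec in records:
        if not str(rec.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = validate_coded_batch(raw)
print(len(coded))  # prints 1
```

Rejecting the whole batch on the first bad record is the simplest policy; a production pipeline might instead collect invalid records for re-coding.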