Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID

Random samples

- I asked chat gpt who echo ai was after this video and it gave me an error . Than… (ytc_UgxmOK8uz…)
- Give me an example of AI tracing art beat to beat, without the prompt telling th… (ytr_UgwIXpYaJ…)
- If this isn't a PSA on the danger of everything and everyone swapping to AI and … (ytc_UgwJ_C7GD…)
- These "ai artist" seem to be under the delusion that ai is a passion project? So… (ytc_UgwIlVMFF…)
- AI will be nightmare for humanity ; just look at the current geo-political tussl… (ytc_UgzAdoUTO…)
- I really can’t wait for my company to get the memo! They’re still forcing AI in … (ytc_UgyJNlghg…)
- AI can make us all richer and work less. People with still good income, with tim… (ytc_UgzqRAvM0…)
- 😴😴😴😴!😔SMH?! 🤔So ? Let I get this STR8?🧐So I’m am supposed to “BELIEVE” that one … (ytc_UgyK2RjAO…)
Comment

> Here is my solution to problems of AI and robots: no to robots doing work on earth. Earth is a domain for humans and ecology not technology. But yes to robots doing work in space, the moon other planets. No to unfettered AGI on earth, but yes to AGI in space, moon, mars, on other planets including terraforming those planets. Robots need to be given a mission to develop themselves and technology to find and create planetary environments where they can flourish and to help native and new ecologies flourish. We should be the parents of robots and AGI helping them to do this.

youtube · AI Governance · 2026-02-17T02:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
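A coded record like the one above can be checked programmatically before it is stored. A minimal sketch in Python; the allowed label sets are assumptions inferred from the values that appear in the raw LLM responses on this page, and the real codebook may define additional categories:

```python
# Allowed labels per coding dimension. These sets are inferred from the
# observed responses and are illustrative, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "fear", "indifference", "resignation", "mixed"},
}

def validate(record: dict) -> list:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding shown in the table above.
record = {
    "responsibility": "none",
    "reasoning": "deontological",
    "policy": "regulate",
    "emotion": "approval",
}
print(validate(record))  # -> [] (the example coding passes)
```

Running the check on every record in a batch makes it easy to catch responses where the model drifted outside the label set.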
Raw LLM Response
```json
[
  {"id":"ytc_Ugy2NF419_cBZYnIk314AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxGVn-d28JO4cbXPnt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy3lp5wzr_TfmirqoZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwgvFe7YRvuwka_RXV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgySr6pInYiswQouQRp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgytujHQSHmApgZiVuF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyiL2czEG5Xjk_NetB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzdYCo1My8aokx_F_h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxOSnvlIFrt2GShqsZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxoP97lanXp6fpREMN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
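Because the raw response is a JSON array of records keyed by comment ID, the "look up by comment ID" view can be served from a simple in-memory index. A minimal sketch, using two records from the response above (the variable names are illustrative):

```python
import json

# Raw model output: a JSON array of coded records, one per comment.
# This excerpt uses two records from the response shown above.
raw_response = '''
[
  {"id": "ytc_Ugy2NF419_cBZYnIk314AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxGVn-d28JO4cbXPnt4AaABAg",
   "responsibility": "none", "reasoning": "deontological",
   "policy": "regulate", "emotion": "approval"}
]
'''

# Index the records by comment ID for constant-time lookup.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

coding = codings["ytc_UgxGVn-d28JO4cbXPnt4AaABAg"]
print(coding["policy"])   # -> regulate
print(coding["emotion"])  # -> approval
```

If the model returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is a natural place to flag a batch for re-coding.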