Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgxWswHzy…: "The future is so dystopian. When artists have stopped creating new art and inste…"
- ytc_Ugw_Wakxi…: "Buttom line: "We're all fallible" and Ai would do alot better than humans in th…"
- ytc_Ugzg-AKez…: "It would need to be an extremely complex robot capable of critical thinking and …"
- ytc_UgyyypPNm…: "One aspect it's not discussed about this ai issue is the copyrights, ip constrai…"
- ytr_UgyTeZAFs…: "As a creative person who buys things like these I am so angry when I buy somethi…"
- ytc_Ugz-P1ZoS…: "Congrads, you just taught the Ai the experience of firing a weapon - even regula…"
- ytc_Ugw7VNxc4…: "humans and robots are scarily identical, its just we are organic mechanical comp…"
- ytc_UgzCog85A…: "I was so happy when I first first heard about AI poisoning tech. They are obviou…"
Comment
Perhaps that's a way to control/quota power usage with the goal of limiting AI capability? Limit the size of data centers. Control that in the same manner as we control fissile material. Not even state actors should be allowed to exceed these limits. 🤷‍♂️
youtube · AI Governance · 2025-12-14T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxa5ByEDwNrftNdBeN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxQLNRCoFVcAmcwETR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxCWA83kPrOA5W5IH54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzzKsXKnsMiK7UIN0Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzYlJEGQU9-dmWO-Od4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz8NR4lH3GDG3kBMJ54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxM5_wAiiecwXFLxQJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxA8DqJ0Z2yDIFK56x4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwta7Msvkq4AFYkXdt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzjNcgSLoAyAFC87XN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
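
Because the raw response is a JSON array of coded rows, it can be checked programmatically before the codes are stored. The sketch below is a minimal validator, assuming the allowed values are exactly those that appear in the sample output above; the real codebook may define additional categories, so `ALLOWED` is an illustrative placeholder, not the authoritative schema.

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# This is an assumption for illustration; consult the actual codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"fear", "outrage", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with unknown codes."""
    rows = json.loads(raw)
    for row in rows:
        # Comment IDs in the samples start with ytc_ (comment) or ytr_ (reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
    return rows

# Example with a single well-formed row (hypothetical ID).
raw = ('[{"id":"ytc_example","responsibility":"government",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]')
rows = validate_batch(raw)
print(len(rows))  # prints 1
```

A malformed row (for instance, an emotion code outside the set) raises a `ValueError` naming the offending comment ID, which makes failed batches easy to trace back to the source comment.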