Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- ytr_UgwJYc94o…: This kind of specialized ChatGPT is exactly what we are building at Remble! 🙂 we…
- ytc_Ugz7gLsd-…: Pausing at 39 seconds to say that my observations so far are that a lot of video…
- ytc_Ugweva5fX…: There is a term that is not used much anymore to describe a trite saying: "a bro…
- ytc_UgysS1wJJ…: You just need socialism.... Stopping dancing around. Automation is fantastic. It…
- ytc_Ugyzt1miR…: I’ve used my organization’s AI tool to organize information to help me get start…
- ytc_UgzPv5sjH…: just one question, how does the AI company still do profits, since costs sinc so…
- ytc_Ugy_x8Tt_…: Possibly the strongest proof that AI "Being Inspired" is NOT the same thing as h…
- ytc_Ugw0jkl4S…: This paired with the dude who used AI to write his case AND the AI just straight…
Comment

> I think the energy constraints will do the most to limit AI. Even with Nuclear. Biology has had a long time to create efficient energy systems. We still don't have effective understanding of how it all works. AI will do as humans do destroy things before they fully understand them.

Source: youtube
Topic: AI Governance
Posted: 2025-09-04T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx_bV1jwLAjuNilkOl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxeb0e3BsIISpa6Qr54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"confusion"},
  {"id":"ytc_UgwTDdEgXsZ7_fOv1OV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgztFGr4QwQqe2QA7kR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyfMH21s_XWjLyY2Sx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwWStSA1qosnBpGQvR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyjkcyiXxGHu13gCAt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwx79llVT16gbB0P6Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxsGLDsfs5jZMktyDh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxJXvUbV2lGnUpDP-B4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
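A response in this shape can be validated before the coded values are written back to the dashboard. The sketch below is a minimal, hypothetical parser: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the output above, but the allowed value sets are only inferred from the samples shown here and are an assumption, not the full coding schema.

```python
import json

# Value vocabularies inferred from the sample response above -- an
# assumption for illustration, not the project's complete codebook.
DIMENSIONS = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "confusion", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-vocabulary values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Example with a single (hypothetical) record:
raw = '[{"id":"ytc_example","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
records = parse_coding_response(raw)
print(records[0]["emotion"])  # fear
```

Failing loudly on an unknown value is a deliberate choice here: a model that drifts outside the codebook should surface as an error at ingestion, not as a silently miscoded row.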