Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I noticed that alexa thing too! I dont have one, as its not reliable enought. My…" (ytr_UgwljomVA…)
- "I mean. I hate to say it, but I don’t think people will actually take it serious…" (rdc_hm8jpf7)
- "There was a few drawing i did where i had an idea, but i was jsut struggling so …" (ytc_Ugx9Ub2Xo…)
- "I'm an attorney and gpt is a powerful tool, but obviously it's not a search engi…" (ytc_UgzLf1gOJ…)
- "AI induced psychosis. Your "conscious" AI does not exist between prompts. It on…" (ytc_UgyNhUFFy…)
- "Charlie’s art may not be conventionally “good” art, but it has far more characte…" (ytc_UgxnBSakL…)
- "Sarah Biffin, a Victorian woman who got supposedly got commissions from Royalty,…" (ytc_Ugwqh0jIL…)
- "It’s mind blowing that people are getting annoyed that autonomous driving doesn’…" (ytc_UgybIE4qx…)
Comment

> In 2026, it is estimated that 90% of AI's total electricity use comes from responding to users, not from the initial training. A single AI search uses about 10 times the electricity of a traditional Google search.
> Better restrict use when bare necessity!
> Good lesson!

Source: youtube · Topic: AI Governance · 2026-04-06T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw_QaHrI5fPnDFnli94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxdXSl18IVbQQoGDSt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxEo584FaMC2JMAB0p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzH00a3R4BtLPAXR2l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz3eXOV8hZtADyARup4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwmvabpZMWC60aESCZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyXO0rMVWbMgokTUaR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwTLEXm0dF8NZp79GV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw1apBEy8QqyemTK7V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgykW0vEWGS-jA2lYV94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
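A batch response in this shape can be parsed and sanity-checked before the codes are loaded into a dataset. Below is a minimal sketch; the allowed values per dimension are inferred only from the codes visible in this output, so the real codebook may contain additional categories, and `parse_coded_batch` is a hypothetical helper name:

```python
import json

# Allowed values per coding dimension, inferred from the codes seen in
# this response; the actual codebook may define more categories.
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"approval", "fear", "outrage", "indifference", "resignation", "mixed"},
}

def parse_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes
    fall inside the known schema (malformed codes are dropped)."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items())
    ]

# Hypothetical single-record batch for illustration:
raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"approval"}]')
print(len(parse_coded_batch(raw)))  # 1
```

Dropping (rather than repairing) records with out-of-schema values keeps the pipeline strict: any hallucinated category surfaces as a missing row that can be re-coded, instead of silently polluting the counts.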