Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "We can create apps using just AI in 4 hours Tea app gets both their FIREBASE db…" (ytc_UgysvYNrm…)
- "I don’t think Geoffrey meant “to be a pumbler” literally, a robot with AI can de…" (ytc_Ugykh-a_T…)
- "Coworker and family: Why do you look so fake Joanna? AI Joanna: My responses ar…" (ytc_UgzIpaj8B…)
- "I'll look into a report on DevOps. What do you exactly mean by AI integration?…" (rdc_oi15ryu)
- "For being smart these idiots are creating the worst enemy of our future. Serious…" (ytc_Ugzjv6s5A…)
- "OpenAI just needs to reference the controversy with Musk's Grok. It was clearly …" (ytc_UgzIH_Osg…)
- "I'll coment first. Using AI for r*pe and k*lling people sounds a lot like "Dunge…" (ytc_UgweFDses…)
- ""AI is inevitable..." just so true as "AI using against AI is inevitable...". S…" (ytc_UgwEJBKQ0…)
Comment
I think an important variable is missing in this equation: resources. In the near future, there won’t be enough energy for AI to take all jobs or to build the machines needed to replace us all. It will take decades and decades, maybe even centuries and centuries, to solve this problem.
youtube · AI Governance · 2025-06-16T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyAiTOedrBS8WNTDGd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxqLOJHMpGxwaQbFtZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyY04CCzB8EuCV5_bF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxTvpJrg-VRAsZ6zpJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyf_7ygdN7dVADAw6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKQ8402Egi5bDRRfF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz8_ThM8byOBjplkQR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwfkGfYHhmjfE6sTPd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzB8GwtjR1rjEJbOhR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwnAFwiAX2Nn3_VMhV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
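The raw response above is a JSON array of per-comment codings, while the panel displays a single comment's result looked up by its ID. A minimal sketch of how such a lookup could work, assuming only the field names and values visible in the response above (the `ALLOWED` vocabularies and function names are illustrative assumptions, not the tool's actual implementation):

```python
import json

# Assumed coding vocabularies, inferred from the values seen in the
# raw responses above; the real tool's schema may differ.
ALLOWED = {
    "responsibility": {"distributed", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "mixed", "resignation"},
}

def index_codings(raw_response: str) -> dict:
    """Parse the model's JSON array and index rows by comment ID,
    skipping rows with a missing ID or out-of-vocabulary values."""
    indexed = {}
    for row in json.loads(raw_response):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            indexed[cid] = row
    return indexed

# Hypothetical one-row response, mirroring the format shown above.
raw = ('[{"id":"ytc_abc","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytc_abc"]["emotion"])  # fear
```

Validating each row against the allowed vocabularies before indexing guards against the model emitting a label outside the codebook, which would otherwise surface silently in the dimension table.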