Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “What pipe dream is that, expect a company to give money to the person it's autom…” (ytr_UgyaXQkS6…)
- “You can use a tool to do anything conceivable by man for good or evil. A yard r…” (ytc_UgyeUB1VX…)
- “we dont need a turing test to tell that a chatbot using responses gathered from …” (ytr_UgxDs5KSr…)
- “I would rather trust a robot with a .45 automatic over my cousin Richie any day …” (ytc_UgxEyrOBX…)
- “Always get payment up front! Almost Every. Single. Time that I've worked pre-pay…” (ytc_UgwgBtC4R…)
- “I just wish that they don't have robots and AI because that takes away from the …” (ytc_UgwwYmm8I…)
- “The ai is trained on material scraped from the internet, even the worst places. …” (ytr_Ugyz4UwoM…)
- “It's really a problem of population density, or the lack of in our cities. NYC f…” (ytc_Ugz4Qttw8…)
Comment

> Really solid breakdown. The “AI reliability tax” is real — we’re seeing it too. For founders, the key is starting small: one painful workflow, measure results, keep a human in the loop, and scale only when it’s actually saving time.

Source: youtube · AI Responsibility · 2025-10-06T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzl0O_xyjzvHi9ceUB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyrxWdV56yB-pzzoQV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgybV2udL0niqpMvepV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyKgwIjouK7CSjhK7V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwR4Vzl8FgLyzNc9kR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxNUBWrI3tASfJIjc94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxhbNoFwCcGJbNDz6R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx5UrqcBU4jX4GOlt14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyT0Z7NvQ1rvs0e5714AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyVTwOYFRMfntGHzNB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
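A raw batch response like the one above has to be parsed and checked before the codes are stored, since the model can return malformed rows or values outside the codebook. The sketch below shows one way to do that, assuming the dimensions and values visible in the Coding Result table and the responses above; the full codebook is not shown here, so the allowed-value sets are an assumption, and `validate_batch` is a hypothetical helper, not part of any shown pipeline.

```python
import json

# Allowed values per coding dimension. ASSUMED from the values that
# appear in the sample output above; the real codebook may differ.
CODEBOOK = {
    "responsibility": {"user", "company", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed rows.

    A row is kept when it is a dict with an "id" field and every
    coding dimension holds a value from CODEBOOK.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # malformed row: skip rather than guess
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid
```

Dropping bad rows (instead of raising) lets the rest of a batch land while the rejects are re-queued for a retry, which is the usual trade-off when coding comments at scale.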