Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgyO36Rq0…` — "Somthing the situation didn't consider that if all cars were self driving then…"
- `ytr_UgypNrS74…` — "@babybatbailey03 "Don't cry little AI 'artist', Uncle Computer got enough AI slo…"
- `ytr_UgwyeDOJt…` — ""These AI experts and superforecasters say that the AI systems that we do not un…"
- `rdc_j438nhj` — "I would guess psychologists, writers, non surgical doctors, lab researchers, and…"
- `ytc_UgzzclaiD…` — "Innovation will make AI digital art indistinguishable from human digital art but…"
- `ytc_UgyTCRqdI…` — "I didn't complete this skit, the captions are wild... and to some sense, hyperb…"
- `ytc_UgwpRJrxD…` — "I was dreaming about watermelon and when i weak up and open my phone to answer n…"
- `ytc_UgzGOz84F…` — "There are more than 8 billions human in the world...just took 1 mistake to make …"
Comment

> We need to use AI, not using AI is like refusing to use stack overflow 5 years ago. But we need to use it smartly, to not do the job instead of you but to teach you, to learn why it's the best solution.

- Source: youtube
- Posted: 2025-09-25T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwOaYclmw6hzZ9aic54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwImB_UQIWaT7aUieN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz5XKlvRBxYvjBbOKl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyf83Zqk0RZXi6jazR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzbDaqE3pquBYHMa7B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxP7z-Nvn2yWQJ0lGN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxOkCaNsg8y2fhyyx94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwIuOVCDESs38qQz2R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxQmh8pi6kudxygF-Z4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxdw08m8QRVwcMB8dB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
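The lookup-by-comment-ID flow above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the JSON field names and the two IDs are taken from the raw response shown here, while `index_by_comment_id` is a hypothetical helper name.

```python
import json

# Raw LLM response: a JSON array of per-comment codings, abridged
# to two entries from the batch shown above.
raw_response = '''
[
  {"id": "ytc_Ugz5XKlvRBxYvjBbOKl4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwImB_UQIWaT7aUieN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
'''

def index_by_comment_id(response_text):
    """Parse a raw batch response and key each coding by its comment ID."""
    codings = json.loads(response_text)
    return {row["id"]: row for row in codings}

index = index_by_comment_id(raw_response)
coding = index["ytc_Ugz5XKlvRBxYvjBbOKl4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # prints "user approval"
```

Indexing by ID makes each lookup O(1), which matters when one batch response covers many comments and the inspector fetches them individually.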