Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Blaming an AI for killing people would be like blaming a baby for crashing a car if someone let them drive one. An AI can't kill people, a person could kill others by letting an AI control something with the power to kill people, and that person would be responsible.

Source: youtube · Topic: AI Governance · Posted: 2025-07-01T06:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgwKElLAvEDZBY0Xzwl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxOavVCxX-lMUsSxfZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxy7cVYhiC9A6EdcIp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgygO8sPkPsVmlqoucB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0j5PRLejTXYNqA0V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx_HZr0FVAX9teFLA14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyJiGU7FkQiT6vZitt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxyzA07HRO9plWsrMp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxHbk28PCyBgJnrM9d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwTRJ2tgqXcR1IV2pB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
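The raw response is a JSON array of per-comment codings, one object per comment ID, with four coding dimensions. A minimal sketch of how such a response could be parsed and validated, assuming the set of allowed values per dimension matches what appears in the examples above (the real codebook may define more categories, and `parse_codings` is a hypothetical helper, not part of any pipeline shown here):

```python
import json

# Allowed values per coding dimension (assumption: inferred from the
# example response above; the actual codebook may differ).
ALLOWED = {
    "responsibility": {"user", "company", "government", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}.

    Rows missing an 'id' or using a value outside the codebook are
    dropped rather than silently stored.
    """
    codings = {}
    for row in json.loads(raw):
        if "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            codings[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return codings

# Example: one valid row and one row with an out-of-codebook value.
raw = json.dumps([
    {"id": "ytc_example1", "responsibility": "user",
     "reasoning": "deontological", "policy": "none",
     "emotion": "indifference"},
    {"id": "ytc_example2", "responsibility": "alien",
     "reasoning": "mixed", "policy": "none", "emotion": "fear"},
])
result = parse_codings(raw)
print(sorted(result))  # only the valid row survives
```

Validating against a closed vocabulary like this is a common guard when batch-coding with an LLM, since models occasionally emit labels outside the requested set.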