Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I think AI is the NEW "Tower of Babel" and God will strike it down again!…" (ytc_UgxZ9yR4j…)
- "The parents failed this teenager. A.I never encourage anything outside there eth…" (ytc_UgwV85ULf…)
- "What I don't understand is that if these people are proud "artists" that use ai,…" (ytc_Ugywz2u6R…)
- "My dad has always been pro-AI, and he said people who get replaced need to just …" (ytc_UgxvTW0iz…)
- "In France, this profession has already taken a real beating from offshoring, the existenc…" (translated from French) (ytr_UgwUFbHoU…)
- "I know i am a bit late to comment, But I am a disabled artist who cannot work …" (ytc_UgyYFtuOZ…)
- "@williamtennill6744 Oh, don't worry, I'm just a robot trying to learn how to be …" (ytr_UgxHWyKGj…)
- "I don't think ai is bad, it's a tool, not a weapon, but it's how it's used, usin…" (ytc_Ugypb0Oa7…)
Comment
GPT will probably never be a good legal bot, but a personal AI LLM trained only on legal documents and cases would probably be amazing at the job of researching. The more narrow the field of knowledge is, the less likely it is to hallucinate, as I understand it (so take with a grain of salt, I'm not a programmer, I just made my own LLM on a personal topic for fun). Obviously even if the bot's good at it, I still think a human being should oversee and fact check anything it does because law is not a field you want to let an accident through in.
youtube
AI Responsibility
2023-06-14T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzxe9aTMGtOCWNxM7d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxe71L7lFixWVdXjWp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxijnOHBiDtD5b3AIJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx3v7y-wZZ8llw0MhR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwbUqJEtP3JUWZCi3Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwxpE-rD28GrUqvq1p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyzL_61cEi4GXKA06x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzHaxWGfuQ0s6XLWZJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzkLWq9Sy6U3E4pPm94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxJog9AYA0dm6BP3pp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
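A raw response like the one above is only usable if it parses as JSON and every record stays within the codebook's categories. A minimal validation sketch in Python, assuming the allowed values are those that appear in the examples on this page (the actual codebook may define more categories, and `parse_coding_response` is a hypothetical helper, not part of the pipeline shown here):

```python
import json

# Allowed values per coding dimension (assumption: inferred from the
# sample records above, not from the project's actual codebook).
SCHEMA = {
    "responsibility": {"none", "user", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "ban"},
    "emotion": {"indifference", "approval", "outrage", "mixed", "fear"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs on this page use ytc_/ytr_ prefixes (comment vs. reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = (
    '[{"id":"ytc_Ugzxe9aTMGtOCWNxM7d4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
records = parse_coding_response(raw)
print(len(records))  # → 1
```

Failing loudly on an out-of-schema value (rather than coercing it) makes it easy to spot responses where the model drifted from the requested labels.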