# Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a response by comment ID.
## Random Samples

- "Love your show, but a little disappointed in this episode. This is just straight…" (`ytc_UgxfdTbH1…`)
- "I kind of love whoever your AI girlfriend is. She's so funny <3 <3 <3…" (`ytc_UgxWPVTva…`)
- "So there is a chance i might experience a apocalypse in the future ai domination…" (`ytc_Ugxsq1OL3…`)
- "A lack of personnel has never stopped politicians from waging war. Soldiers are …" (`rdc_dwvgl96`)
- "Oh lets create this monster and hope we can controll it .wtf is wrong with you…" (`ytc_Ugz4eVmaG…`)
- "Okay so what are schools doing about it? why dont they ban laptops in class and …" (`ytc_UgyKFWZ9N…`)
- "the biggest problem we all face, even now, and increasingly in the near future, …" (`ytc_Ugw7VTBJ9…`)
- "Louis, have you heard of data poisoning in regard to AI models? Essentially ther…" (`ytc_Ugw9oWh_k…`)
## Comment

> Ai is only worth for all the information put into it. Information that has copyright rights and that was and is stolen. If I violate copyright rights, I go to jail. But these companies do it without any restrictions. Ai is a lie. Unfortunately those that put billions on it, like certain countries and companies, don't want to lose money and need to get a return of investment no matter what. Some would definitely kill for it.

Source: youtube, posted 2025-03-18T02:0…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
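The four coding dimensions above can be sanity-checked programmatically. The sketch below validates a single coding record; the allowed-value sets are only the values that appear on this page, not the full codebook, so treat them as placeholders.

```python
# Allowed values per coding dimension, limited to what is observed in this
# sample (assumption: the real codebook likely contains more values).
OBSERVED_VALUES = {
    "responsibility": {"none", "company", "government", "ai_itself", "developer", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "liability", "none", "industry_self"},
    "emotion": {"indifference", "outrage", "resignation", "fear"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems found in one coding record; empty means it passed."""
    problems = []
    for dimension, allowed in OBSERVED_VALUES.items():
        value = record.get(dimension)
        if value is None:
            problems.append(f"missing dimension: {dimension}")
        elif value not in allowed:
            problems.append(f"unexpected {dimension} value: {value!r}")
    return problems

# The record coded for the comment above should pass cleanly.
print(validate_record({"responsibility": "company", "reasoning": "deontological",
                       "policy": "liability", "emotion": "outrage"}))  # []
```

A check like this catches the most common LLM-coding failure mode: the model inventing a label outside the codebook.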
## Raw LLM Response
```json
[
{"id":"ytc_Ugzt2VJDX89-5OCUsct4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyMi4wJTK_UWQpc9AF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyZetTb6f1V5YyFZ9R4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzutm1XDSNSvlbMXjB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgweZmRxDLPdgXkvdkl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzM-tCHD9aGIq-aQnJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwvy_a7lQHB8LEQmAl4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwrmHAHcekTdE9UdJt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwjPkxEDUeuj_pXr6p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz2WLScBLRGbt1Yyo54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"}
]
```
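A response in this batch format can be indexed by comment ID to support the lookup described at the top of the page. This is a minimal sketch assuming the JSON array shape shown above (field names are taken from the sample; the two inline records are copied from it for illustration).

```python
import json

# A trimmed-down raw response in the same shape as the batch above.
raw_response = '''
[
  {"id": "ytc_Ugzt2VJDX89-5OCUsct4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyMi4wJTK_UWQpc9AF4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
'''

def index_by_comment_id(payload: str) -> dict[str, dict]:
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(payload)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgyMi4wJTK_UWQpc9AF4AaABAg"]["emotion"])  # outrage
```

Keying by `id` makes "look up by comment ID" an O(1) dictionary access rather than a scan of every batch.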