Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a record by its comment ID, or inspect one of the random samples below.
- Open AI? Ok, now I'm going to stop using chat gpt because I won't support murder… (ytc_UgwWhxi1g…)
- This is so stupid you can't automate plumbing, it's absolutely not possible now … (ytc_UgyQaoCEt…)
- I believe evil man wants to one day eliminate humans and replace them with ai 🤖,… (ytc_Ugya4eWyi…)
- There is no point in what is clearly not a 'full' self driving car, as you right… (ytc_Ugy_81W7O…)
- I think venues should start suing people that use ai art. And have it boldly lis… (ytc_UgwQDNf5n…)
- Funny to hear the British guy ask, “Can I be replaced by Ai?” As if he’s not Ai,… (ytc_UgxIMq7eh…)
- I think level 2 self driving is more dangerous than anyother level. Its like eve… (ytc_UgzjT8Ye3…)
- "My (Insert creative artsy job) cannot be automated! Learn to code blue collar … (ytc_Ugx7soOWX…)
Comment
Depending on how you want to look at this: It may serve as a cautionary tale of how not to use ChatGPT, but it may also serve as a starting point of how tools and methodology can be better developed so this doesn’t happen again (whether it’s likely or not is besides the point). I’m sure someone is going to look at this and believe they can develop better methods.
youtube · AI Responsibility · 2023-06-10T21:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyHdGtF1cG8R2ZuNkJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzvfXphDvaFTv5bLEh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz9oHtJxccnAkjoqQp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxxUcKQ0duzWGVXfKF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYLIHAIqy3C2TrTdp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzBcNhOZsC35HlKtVF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyOdk24JViBh4knif14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw9s80_6Y8jUC8jh694AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxkWAnjAIWTdCq7_cN4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgznQqbq89cHqYKDDi94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
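The lookup-by-ID workflow above amounts to parsing a batch response like this one and indexing the coded dimensions by comment ID. The following is a minimal sketch, assuming the response is valid JSON with exactly the field names shown; the function name `index_codings` and the truncated two-entry payload are illustrative, not part of the tool.

```python
import json

# Illustrative two-entry payload with the same schema as the raw response above.
raw_response = """
[
 {"id":"ytc_UgzvfXphDvaFTv5bLEh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_UgyHdGtF1cG8R2ZuNkJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
"""

# The four coding dimensions shown in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a batch LLM response and index each coding by its comment ID."""
    rows = json.loads(raw)
    # Missing dimensions map to None rather than raising, so partial rows surface visibly.
    return {row["id"]: {d: row.get(d) for d in DIMENSIONS} for row in rows}

codings = index_codings(raw_response)
print(codings["ytc_UgzvfXphDvaFTv5bLEh4AaABAg"]["policy"])  # industry_self
```

In practice a model can return malformed JSON, so a production version would wrap `json.loads` in error handling and flag responses whose IDs do not match the batch that was sent.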