Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "SO how long will it take for AI to realize there is no need for human beings? -S…" (ytc_UgxWEBpvY…)
- "Poisoning doesn't really work against training loras. All poisoning does is to c…" (ytc_UgwKkbPJi…)
- "Did you miss the article where they talked about how Claude was used in the capt…" (rdc_o7pw5pk)
- "In the Bible according to Matthew 24:22 CEV If God doesn't make the time shorte…" (ytc_UgzZcrDNm…)
- "There were many hearings at the capital about the dangers of AI and big tech and…" (ytc_Ugz5pAMOK…)
- "The bad part is a human did this to this dude I’m sure if this robot hit you it’…" (ytc_UgxSfKbrr…)
- "If AI were super intelligent, or godlike, then it will eventually make a \"perfec…" (ytc_Ugw0Lmqm-…)
- "i agree and disagree, with the youtuber, i watch to enjoy, so if it was scripted…" (ytc_UgzGOTUYM…)
Comment
> Yeah
> The problem is not ai or art by ai
> But if the data that has been used to train the model is not copyright free plus if the model isn’t trained using ethical ai best practices that is a huge problem.
> The basic model behind these ai is simple and started with a very research oriented outlook in the beginning. But it isn’t supposed to go this way! This is not fair, not ethical and maybe should be illegal as well.
> Btw I myself is in ai industry. So ai industry isn’t about stealing someone’s work. It’s some company and some individual who is like this.
Source: youtube, "Viral AI Reaction", 2022-12-27T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgxT_lkWRnSXbNPldwF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzOCF8iJG5ucfIs2DJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwH9q64edTEM2dMAEd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzKZi05AKbGCbk16Ph4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwEehaEhe5Simgft-B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"sadness"},
{"id":"ytc_Ugwu-vPWq4JCXPdP2xZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgxFe769mFbkrywVjON4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxVsCPOcBR0labuO3x4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxVusvfiNYhJKXcdTR4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwQuErrdZOrYjmXKoR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
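The raw response is a JSON array, one object per coded comment, which makes the per-ID lookup above straightforward to reproduce. Below is a minimal sketch of that parsing and validation step in Python; the allowed label sets are only those observed in the samples on this page (the full codebook may define additional values), and `parse_batch` is a hypothetical helper name, not part of any shown tool.

```python
import json

# Label sets observed in the raw responses on this page; the real
# codebook may include other values (assumption).
ALLOWED = {
    "responsibility": {"none", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "indifference", "outrage", "fear",
                "sadness", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, validating every dimension."""
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        for dim, values in ALLOWED.items():
            if row[dim] not in values:
                raise ValueError(f"{comment_id}: unexpected {dim}={row[dim]!r}")
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Example: look up one coded comment from a single-element batch.
raw = ('[{"id":"ytc_UgzKZi05AKbGCbk16Ph4AaABAg",'
       '"responsibility":"company","reasoning":"deontological",'
       '"policy":"regulate","emotion":"fear"}]')
print(parse_batch(raw)["ytc_UgzKZi05AKbGCbk16Ph4AaABAg"]["policy"])  # regulate
```

Keying by comment ID mirrors the "look up by comment ID" workflow above, and the validation step catches any batch where the model drifted outside the codebook before it reaches the results table.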