Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- ytc_Ugys9ps5C…: "Oh cut this nonsense. AI direct overtake is distraction from the real and loomin…"
- ytc_UgybtxctX…: "You can have an AI anchor that reads the prompts, its not some jobs its most job…"
- ytc_UgzGDG9Hk…: "Anyone that believes ai is dangerous doesn’t believe in unalienable truths. Anyo…"
- ytc_Ugxnv_UJU…: "Training an AI model on an artist's work is fair use, and what any artist should…"
- ytr_UgzTCzZZ9…: "Well, there actually are incredible uses for AI. For example: a group of scienti…"
- ytc_UgyRXD6g1…: "I don't agree and am a user of Anthropic tools. Look, we've been here before wi…"
- ytc_Ugx3SV4UB…: "Me using Google gemini web for learning / And you have ever try? Just trry / You a…"
- ytc_UgyUQ9tvy…: "The only problem here is the legality of it. People just shouldn't be able to pr…"
Comment (youtube · AI Jobs · 2023-09-26T00:4… · ♥ 5)

> The various lengths of trailers and different situations semi drivers have to deal with is not something ai can handle. Self driving cars arnt even where they should be let alone semi trucks. Planes have pilots in them and they fly on autopilot 90 percent of the time. They still have pilots. Most trains still have conductors in them and they are on tracks.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgwFcIhmPBmzyBG3Tud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyNPvsIHMO0cUzw9SB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugyn3b6hw7hOcXdae8R4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwFsRRwIid_EZ3zA2F4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyYl1dPUYCHOoXoy3V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyxrXRiIfnyfJoYJgZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz2lHsIl7RWse1adnl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyFyN3SwIo-PupZvZJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfJ0tPTF0K4Ny3SsF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxOqEnBCCIFcPNAnYB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}]
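A raw response like the one above is a JSON array of per-comment codes. A minimal sketch of how such a batch could be parsed into a lookup table keyed by comment ID (the function name `parse_coding_response` and the `"unclear"` fallback for missing dimensions are illustrative assumptions, mirroring the dimension values shown in the Coding Result table):

```python
import json

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    codes) into {comment_id: {dimension: value}}.

    Missing dimensions fall back to "unclear"; entries without an
    "id" are skipped rather than failing the whole batch.
    """
    dimensions = ("responsibility", "reasoning", "policy", "emotion")
    coded = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            continue  # skip malformed entries
        coded[cid] = {dim: entry.get(dim, "unclear") for dim in dimensions}
    return coded

# Two entries taken from the response above:
raw = (
    '[{"id":"ytc_UgwFcIhmPBmzyBG3Tud4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"ytc_UgyNPvsIHMO0cUzw9SB4AaABAg","responsibility":"none",'
    '"reasoning":"unclear","policy":"unclear","emotion":"unclear"}]'
)
codes = parse_coding_response(raw)
```

Note that `json.loads` would reject the response exactly as the model emitted it if the array's closing `]` were malformed, so a production pipeline would likely wrap the call in a `try`/`except json.JSONDecodeError` and mark the whole batch as uncoded.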