Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
It wouldn't even be a problem if it was a Pro-AI ad but these are anti-human ads…
ytc_Ugz58h8Ur…
I'm an artist and I sympathize and understand the struggle in this time of risin…
ytc_UgzinjDfc…
AI is developed by idiots. Without spiritual, companionship and love, AI become…
ytc_Ugwd-MsB_…
So while we are making a God, or DemiGod of sorts, can't we have an understandin…
ytc_UgzHj2EQ7…
@adamchurvis1
Did you understood what you said? Then please, explain to me what…
ytr_UgyuUTz-3…
ChatGpt is not scary, humans are! Just as a hammer isn't scary but a insane huma…
ytc_UgwgYcgQV…
I'm glad you talked about the photography example, but I'm still looking for a b…
ytc_Ugx-7gnDX…
I traumatize them ❤
I kinda felt bad because I made one AI eat somebody to prove…
ytc_Ugz1-XYwI…
Comment
It’s interesting to hear this perspective but as great as AI is at coding humans write that code and those who make those frameworks and build things with it is exactly why hackers even exist. So, without the need of coders because “ai is so great at it” the whole security industry wouldn’t exist.
I think you might have missed that perspective. Again, I could be projecting as well. But, if you agree with my pov then AI being great at coding and not leaving “holes” then wouldn’t there then also be less of a need in the security sector as a whole? Besides the social engineering avenues that seem like the future of cyber more so if this is the case.
youtube
Viral AI Reaction
2025-04-09T04:5…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwVkyLDsYW57NVMi_p4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw-lbSbk4VFZVbCNFV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgypGpq7pNabgOSAhg14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyI4Fks8NfPnfGFh294AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgximqkQ_kNDc9uHu794AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzbo3gtk4YmWxRza0F4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxrgE8XlNfKZncyVph4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwNXiSBMKzBX-AHW1l4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzJ9VnyUoH2QKvqdxR4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugyuw9elZxjzG2Ky8VN4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
```
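The raw LLM response is a JSON array with one object per coded comment, using the same four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). A minimal Python sketch of how such a batch could be parsed and tallied; the two comment IDs (`ytc_A`, `ytc_B`) and the allowed-value set are illustrative assumptions, not part of the tool:

```python
import json
from collections import Counter

# Hypothetical raw LLM response: a JSON array of per-comment codes.
# Field names mirror the coding dimensions above; IDs are placeholders.
raw = '''[
  {"id": "ytc_A", "responsibility": "developer", "reasoning": "deontological",
   "policy": "none", "emotion": "mixed"},
  {"id": "ytc_B", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]'''

# Emotion labels observed in this batch (assumed codebook for the check below).
ALLOWED_EMOTIONS = {"indifference", "resignation", "approval", "mixed", "fear"}

codes = json.loads(raw)

# Sanity-check each record before tallying, so malformed model
# output fails loudly instead of skewing the counts.
for rec in codes:
    assert rec["emotion"] in ALLOWED_EMOTIONS, rec["id"]

# Aggregate one dimension across the batch.
emotion_counts = Counter(rec["emotion"] for rec in codes)
print(dict(emotion_counts))  # → {'mixed': 1, 'fear': 1}
```

The same pattern extends to the other three dimensions by swapping the key passed to `Counter`.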