Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Also it is not 100% accurate and it helps to have a broad knowledge on your topi…" (ytc_UgyPgOvYN…)
- "FSD has been around for a while with Waymo and other companies. And if you’re ta…" (ytr_UgwFKhqdy…)
- "Congratulations on ur mission of turning childrens into bots...really china is u…" (ytc_Ugxi7cgqv…)
- "People are already trying to make awful AI movies, I bet games aren’t that far b…" (ytr_UgwijsAWD…)
- "Yes, this video is very misplaced and shallow in its outlook ! Take my friend w…" (ytr_UgxHbeT7X…)
- "they might need to hybride this. and let the passenger drive if they choose to w…" (ytc_Ugx3Zwoch…)
- "If humans kill millions over economic wars, that’s normal. If AI suggests killi…" (ytc_UgzWpjaul…)
- "I think you dont know how ai works or you dont understand language. It just pull…" (ytc_UgwIzR9VB…)
Comment
3:10 well, that's because of the kind of energy used and its consequences. Yeah, AI eats a lot of energy when training, since it's doing it at super speeds. In my opinion it's not AI per se that causes the problem but how everyone uses it; AI, in the grand scheme of things, would be useful, one of the best tools of humanity, and if misused, as is happening, it'd mean our end. I'd not hate AI just because of what it is, nor those who created it or those who use it, since it'd be meaningless; just pointing out better ways to use it, really. And technically, if people still want to use AI but you give that AI poisoned images, then you're extending the period for when they need to be trained, which will further strain the energy sources, so it'll produce more CO2, etc.
youtube · Viral AI Reaction · 2024-11-04T00:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
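The table above is a rendering of a single coding row. A minimal sketch of how such a row could be turned into this markdown table; the dict keys match the fields in the raw response below, and the `coded_at` field name is an assumption:

```python
# One coding row, values taken from the table above.
# The coded_at key is assumed; the other keys appear in the raw LLM response.
row = {
    "responsibility": "user",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
    "coded_at": "2026-04-27T06:24:59.937377",
}

# (display label, dict key) pairs, in table order.
labels = [
    ("Responsibility", "responsibility"),
    ("Reasoning", "reasoning"),
    ("Policy", "policy"),
    ("Emotion", "emotion"),
    ("Coded at", "coded_at"),
]

lines = ["| Dimension | Value |", "|---|---|"]
lines += [f"| {label} | {row[key]} |" for label, key in labels]
print("\n".join(lines))
```

Keeping the label-to-key mapping in one list means adding a coding dimension only touches one place.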
Raw LLM Response
[
{"id":"ytc_UgwN3u-zTfRi_5y_iLV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzvj87ZgMRgmFYlb854AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzCah5IFFObLBh6SL94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyWG20oAhQ2WfEHd1R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyQEaoJFGBOgdmeQtF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgwT0LRsaubg0dJqr914AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyobfhgmYvWxv73I9V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwOy6d0LAS5cu_cB6B4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw-31s9vqAoOQPmx9V4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwU7nWq3C0W7CLy7CV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
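The "look up by comment ID" step can be sketched as parsing the raw response as a JSON array and scanning it for the matching `id`. This is a minimal illustration, not the tool's actual implementation; the `lookup` function name is hypothetical, and the sample payload is trimmed to two rows copied from the response above:

```python
import json

# Two entries copied from the raw LLM response above, trimmed for brevity.
raw_response = """
[
  {"id": "ytc_UgwN3u-zTfRi_5y_iLV4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzCah5IFFObLBh6SL94AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""


def lookup(raw, comment_id):
    """Return the coding row for comment_id, or None if it is absent."""
    rows = json.loads(raw)
    return next((r for r in rows if r["id"] == comment_id), None)


row = lookup(raw_response, "ytc_UgzCah5IFFObLBh6SL94AaABAg")
print(row["emotion"])  # indifference
```

A real pipeline would also want to handle malformed model output (e.g. wrap `json.loads` in a try/except), since nothing guarantees the raw response is valid JSON.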