Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Even this so called expert know nothing about AI. in the future AI. will become…" (`ytc_Ugz-S_0AE…`)
- "I'm really glad AI exists, it makes a clear divide between the insane and delusi…" (`ytc_UgxWBswtk…`)
- "Would you place your life in Ai making decisions off a camera monitor? I don’t…" (`ytc_UgwUzbLLZ…`)
- "The weakness of robot is they didnt know real human trick.Robot useing program …" (`ytc_UgyY5JfAJ…`)
- "As long as it doesn't tell me which space to take in the parking lot.…" (`ytc_UgyxlS7-C…`)
- "A better example would be a person plowing the field replaced by a driverless ha…" (`ytr_UgwWL5dhJ…`)
- "The fact this guy hesitated about pressing the button to end ai, knowing that hi…" (`ytc_UgwK8rWcV…`)
- "Way back when, photo photography was considered "cheating" compared to the raw t…" (`ytc_UgyAAWATy…`)
Comment
Not even a question if ai is going to be more valuable long term. Mostly in robotics and finding new medicine for sick people. There are also other factors as to why the us is investing what seems to be an unreasonable amount into ai. That is for strategic global reasons, if the us wants to continue to be the land with the most innovation, best weapons and the best prosperity, they have to be the best in ai. If china gets murderous AI robots before the us, the us is cooked. Investing because the alternative would be to give global power to another nation, doesn't always have to be because of ROI, but rather of the understanding that you have to, to keep your global dominance. But AI will in 20 years have made ROI twenty fold what we spend today no question, but probably not short term. So the stock market is going to collapse once people realise the ROI will come with time
YouTube
2025-12-07T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwCN4r2nR8p7E_e4Z94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxHtTw5bdBWW-HB1Rp4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy3g3IP0wIJ0ib_Z0B4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyfKc9FUPS_kuG5mNF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwqrybJq-47nNrHm1F4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyhxaRvok8EWwEFGS54AaABAg", "responsibility": "none", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzUvEcEn4xSqqZhDMd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwHydWkhajs1LVKExF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy-tZNLfj5D8MGR_HR4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugwbq0KRw9yI_ba2vkB4AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"}
]
```
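A minimal sketch of how a raw response like this could be parsed and then queried by comment ID, as the panel above suggests. The `lookup` helper is illustrative and not part of the tool; the two sample records are copied from the raw response above.

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw = '''[
  {"id":"ytc_UgwCN4r2nR8p7E_e4Z94AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwHydWkhajs1LVKExF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

# Parse the JSON array and index it by comment ID for constant-time lookup.
records = {r["id"]: r for r in json.loads(raw)}

def lookup(comment_id):
    """Return the coded dimensions for one comment, or None if it is absent."""
    return records.get(comment_id)

coded = lookup("ytc_UgwHydWkhajs1LVKExF4AaABAg")
print(coded["emotion"])  # approval
```

This matches the "Coding Result" table above: the second record carries the same values (responsibility `none`, reasoning `consequentialist`, policy `none`, emotion `approval`).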