Raw LLM Responses
Inspect the exact model output for any coded comment, or look it up directly by comment ID.
Random samples

- "Thats a business elon says ai is far more dangerous they why he is creating Soph…" (ytc_Ugy2x0l45…)
- "I'm here just to let you know that I'm with the ones who support you and other a…" (ytc_UgyZtVKlB…)
- "i dont belive in ai because every day new problems rise and they are difrent ai …" (ytc_UgwzP9uKu…)
- "The biggest mistake they made with AI is calling it “artificial intelligence” or…" (rdc_ks2mpva)
- "They talk to slow, there is to much dead air among that conversation. I don’t th…" (ytc_UgznsMzvi…)
- "This is a lot of overhyped stuff, mostly from AI companies themselves. The more…" (ytc_UgyUEgLnR…)
- "Can you say more? Not sure what point you're making? Is it, we shouldn't be doin…" (ytr_UgxpTackW…)
- "Man AI ain't that bad 💔 Just small kids have a problem w it. AI is actually very…" (ytc_UgzelEs3A…)
Comment
Aside from personal views, there's one thing that a lot of people are discounting: AI companies are burning money at unprecedented rates to keep users flowing in and making their AI products better.
At a certain point they'll need to start making money from all those AI, and prices will skyrocket (and we're already starting to see the early signs of this) so it'll be less and less accessible for anyone who's not a big corporation, especially the really good ones.
Not saying that nothing will change, on the opposite, but we as humans are usually extremely bad at forecasting technological developments so it'll probably be very different from what we expect right now...
Source: youtube · AI Jobs · 2025-09-09T07:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugygy7HxdZ7pgdwedK14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEodX0Qf10Pw3XP8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxu9Qe_FNqgxBmMMst4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxbFIu3izZyoW-8-qJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwRxpVip5D8q5KUdHJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxxYiOd0OipQvFeC7B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx81S6CqeKYLSr9hNB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyIE9_9AKKYvCqURON4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzwUvKmmuwZaB7aG514AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzLXxJhMSFLet2ax2F4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
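The "look up by comment ID" step above amounts to parsing the raw JSON array and indexing it by the `id` field. A minimal sketch in Python, using two of the entries shown; the `index_by_id` helper and the missing-field check are illustrative assumptions, not part of the tool:

```python
import json

# Raw LLM response, truncated to two of the entries shown above for brevity.
raw_response = '''
[
 {"id":"ytc_Ugx81S6CqeKYLSr9hNB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyIE9_9AKKYvCqURON4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]
'''

# Fields every coded entry is expected to carry, per the response format above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse the model output and index codings by comment ID,
    skipping any entry missing an expected field."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows if EXPECTED_KEYS <= row.keys()}

codings = index_by_id(raw_response)
coding = codings["ytc_Ugx81S6CqeKYLSr9hNB4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # company fear
```

Keeping the model output as the raw string and deriving the index at display time means the viewer always reflects exactly what the model returned, malformed entries included.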