Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- If you confuse AI "emotion" with the human programmers desire to make you feel l… (ytc_UgzfF2NBH…)
- D. A.I. developes emotion over time (since we know a machine can only perform th… (ytc_UgyqW50LT…)
- there needs to be a term like 'dunning-kruger effect' but for people that think … (ytr_UgzBOXooJ…)
- 100 years… I doubt it would even take a 100 seconds for A.I to figure that out… (ytc_UgyKckLAk…)
- In the first place what were people doing messing around with this stuff? They t… (ytc_Ugxi0HPLx…)
- I disagree with the political consequences of AI. The danger only exist if you d… (ytc_UgwcCdQMp…)
- The real money right now in AI is creating make believe doomsday scenarios and p… (ytc_UgyN2Z0H4…)
- The fairytale version of technofeudalism😂 where they replace our jobs and we get… (ytc_UgyWVsegO…)
Comment
I think people are instilling more fear about AI right now than anything. It'll be a while before all of that will actually roll out. The tech actually needs to be bought, implemented, agreed upon, and accepted by normal businesses. We're not talking about Musk. We are talking about trucking companies or nursing. You just can't transform and sell your products without people accepting it. Look at the electric car. You would expect that by now everyone is driving one. But we aren't, because our infrastructure and the oil companies won't allow it.
This video is fear mongering.
Look at it this way: with AI we should have no need to speak to people on the phone for customer service. That should be the fastest thing to be fixed, BUT we still have to wait 55 minutes to talk to a real person when we could just use an AI.
Bernie isn't taking into account that companies will only buy this if they can implement it, afford it, and change their whole structure.
youtube
AI Jobs
2025-10-08T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzt8W5WJsBDujH34St4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzsrUbntcrMARe3Mxx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw58cS2bme8H4imUBN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwNtLERytfX4bJLTi54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz3Us_16hSnm5Ol6UZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxqLUZ0mifhzImsTz94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugz6xDhGDdwwgv25fHZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxlLjuZ2YyHTefYC4d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugy1CBdZCWvXW7Gdc354AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgyUFJM99coMdrhMwYN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"approval"}
]
```
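The raw response is a flat JSON array, so it can be parsed and sanity-checked directly. The sketch below is a minimal example, not the tool's actual code: it loads a batch (truncated to two records here for brevity), checks each record's dimension values against an allowed set, and builds an ID index for lookup. The `ALLOWED` sets are inferred only from the values visible in this batch; the real codebook may define additional codes.

```python
import json

# Two records in the shape of the raw LLM response above (the real batch has ten).
raw = '''
[
  {"id":"ytc_Ugzt8W5WJsBDujH34St4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzsrUbntcrMARe3Mxx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
'''

# Allowed codes per dimension, inferred from this batch alone (an assumption:
# the full codebook may contain codes that simply don't appear here).
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "ai_itself", "government"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "mixed", "approval"},
}

def validate(records):
    """Return (comment_id, dimension, bad_value) for every out-of-codebook value."""
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append((rec.get("id"), dim, value))
    return problems

records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}  # supports "look up by comment ID"
print(validate(records))  # empty list means every code is in the expected set
```

Keeping validation separate from parsing makes malformed model output easy to surface: a record with a misspelled or invented code shows up as a `(id, dimension, value)` triple instead of silently entering the coded dataset.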