Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@dannanamanna What then would be it's goal? AI is developed by humans. If humans…" (ytr_Ugzc28NqK…)
- "I like to use AI as a little toy to cure boredom, generate wacky text, amalgamat…" (ytc_UgxApQtRb…)
- "OK I'm not worried about the current Government, provided they obey the laws and…" (ytc_UgwJ4rJha…)
- "Hope ai goes furthur / That way u can be one man studio and make my own anime fr…" (ytc_UgzoPjkl9…)
- "Not just Trump. Did you see Ursula's AI first speech in the EU? / We're going to …" (ytc_UgxNl41UR…)
- "There are sometimes huge water reserves underneath desert sand. Did Ai did purp…" (ytc_UgxS3F690…)
- ""full Self driving coming next year" Elon 2016 2017 2018 2019 2020 2021 2022... …" (ytc_UgyrOqFgr…)
- "This guy is straight up lying and selling fear 😂 / Humans are dangerous — just loo…" (ytc_UgxN3clOK…)
Comment
See you all here in 2030 saying the same shit 😂 unless they are going to laser us all.down, it won't happen, complete collapse would happen! So do the robot's and AI pay tax? Do they give a damn? No! Will they consume products also no! Are the big companies going to pay more tax to soak up the gap from nobody working? Also no! It is possible to replace humanity with robots, yes, will benefit the world in the long term, absolutely not! Killing your tax payers and consumers is suicide. Trying to predict the future is the same as telling someone where the stock market will do this year, they are lying nobody knows just guess work.
But if im wrong im going to purchase magnets, stick them on all robots and watch them freak out and lose data 😂😂
Source: youtube
Posted: 2026-02-09T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwilHFLtyP555PM-ux4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwIPj32Xu22O5lIhKd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6Klcr99V1oMnpM6R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz8qcq5k9dzE-4Xbzd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0yECUaBdPa6d3YJ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxIwM8PPCx4kiTRFT14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwjiXp3xnQXI3AXDzt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgziwyL6EuRvDSCMj6t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwvw2mB9lo9Tk2gOhR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxsPNHqcSKTF-57PWd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
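The lookup described above ("look up by comment ID") can be sketched as a small parser over a raw response like the one shown. This is a minimal sketch, not the pipeline's actual code: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the sample IDs come from the response above, while `lookup_coding` is a hypothetical helper name.

```python
import json

# Abbreviated copy of the raw LLM response above: a JSON array with one
# coding record per comment, keyed by comment ID.
raw_response = """
[
  {"id": "ytc_UgwilHFLtyP555PM-ux4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxIwM8PPCx4kiTRFT14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def lookup_coding(response_text, comment_id):
    """Parse a raw response and return the coding record for one comment ID,
    or None if the model did not code that comment."""
    records = json.loads(response_text)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgxIwM8PPCx4kiTRFT14AaABAg")
print(coding["emotion"])  # fear
```

Returning `None` for a missing ID (rather than raising) makes it easy to flag comments the model silently dropped from its response array.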