Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I vote we unalive trillion/billionaires and AI/roboticists. They destroyed the w…" (ytc_UgyyVmaLP…)
- "You can put a constraint on the self-driving car that it is not allowed to follo…" (ytc_Ugib-iFxI…)
- "Why does it feel like things are not going to end well for most human? I know we…" (ytc_UgxRRg-5u…)
- "I think interviewer could not understand what Sir Roger was trying to say, may b…" (ytc_Ugz4VVyqM…)
- "Any 3D design created by AI will fail as soon as it gets to the quality tests.…" (ytc_UgwFuWF2C…)
- "Hmm. So, this is answer why when im browsing they check if im human or robot 😢…" (ytc_Ugx6WTyf0…)
- "We can spend our time enjoying the fruits of the AI. We can all do everything on…" (ytc_Ugz5sO8zI…)
- "AI "art" is like canned goods or fast food meals. And if you consume a lot of th…" (ytc_UgzGI9sWU…)
Comment
It is absurd to ask that the Americans stop pursuing High AI.
There are countless others that are NOT American who are also pursuing it. Even the most powerful crime lords and Dictators are logically going to be trying to achieve it because the one who does achieve it AND puts it to proper use (can use it to out do other AI's and others over all) will basically become the most powerful person on Earth.
To ask the USA to not pursue it is to literally give ANY and EVERY other person who is NOT a part of the USA an advantage over getting to High AI first.
youtube · AI Governance · 2023-04-18T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
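Each coding result is one record across the four dimensions above. A minimal sketch of checking such a record against the label vocabulary, assuming the value sets observed in this batch of responses (the sets are illustrative, not an exhaustive codebook):

```python
# Label sets observed in this batch; illustrative, not an exhaustive codebook.
DIMENSIONS = {
    "responsibility": {"none", "government", "company", "developer",
                       "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "approval"},
}

def validate(record: dict) -> list[str]:
    """Return the names of dimensions whose value falls outside the known set."""
    return [dim for dim, allowed in DIMENSIONS.items()
            if record.get(dim) not in allowed]

coded = {"responsibility": "none", "reasoning": "consequentialist",
         "policy": "none", "emotion": "approval"}
print(validate(coded))  # []
```

A non-empty return value flags a record the model coded with an out-of-vocabulary label, which is worth surfacing before aggregating results.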
Raw LLM Response
```json
[
  {"id":"ytc_UgyfblKVfB_mqwzMvLl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwyLITVL5YapM0gFNx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy8CEOKb-jRXtkTW8l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgziqxK97902O0qlU894AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxjSPRtGFQIx4Dmuph4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyctdq7IhhaXHK3qDh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyWe22vxAklxeTmGpt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw7RcmGntBw21YldKx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy3kErENBazddJ_X-J4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxbp1y1cNE7QJTFXmx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
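Because the raw response is a JSON array of per-comment records, looking a comment up by its ID (as this panel does) is a single parse-and-index step. A minimal sketch, assuming the record fields shown above; the variable names are illustrative and the array is truncated to two records for brevity:

```python
import json

# Raw model output, as in the response above (truncated to two records).
raw = '''
[
  {"id": "ytc_UgyfblKVfB_mqwzMvLl4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwyLITVL5YapM0gFNx4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
'''

# Index the batch by comment ID so any coded comment can be fetched directly.
records = {rec["id"]: rec for rec in json.loads(raw)}

coded = records["ytc_UgwyLITVL5YapM0gFNx4AaABAg"]
print(coded["emotion"])  # fear
```

Building the dict once per batch makes every subsequent ID lookup O(1), which matters when inspecting many comments against large response files.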