Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "You wouldn't be this critical if I hadn't used Ai" Yeah I like to be nicer to p… (ytc_Ugw-2OtJp…)
- Which is why we are so dangerous. So yes, actually, be scared. Unconscious AI > … (ytr_UgwEy1ktR…)
- Let's make it simple, AI output can't be art, is just content, and should stay l… (ytc_UgyU9VXQc…)
- Theyre both ai The second one is obvious but in the first one, parts of her face… (ytc_Ugxy5spFJ…)
- You are all very brave for watching and listening. We are the few- who can thi… (ytc_UgzhElowQ…)
- @Poodonkis314those uses would not require the amount of data centers that are go… (ytr_UgwaIElKx…)
- Hilarious. Chatgpt5 sent me on a wild goose chase opening a Workspace account. S… (ytc_UgzAfvDrc…)
- They should add a button labeled "second opinion" or something that prompts the … (rdc_mukf35o)
Comment
Here's why you shouldn't be scared about AI
Case 1. They'll never become smarter than us | very unlikely
Case 2. They'll become smarter and solve almost all problems but due to physical limitations they can't advance to singularity | likely to happen
Case 3. They'll reach singularity. Everything you know will instantly cease to exist and you wouldn't even know it | I don't think it will happen
Case 4. They'll destroy humanity, after all we're all doomed to the climate change and overpopulation. It's just a matter of time. | this will happen with a very high chance
In conclusion *nothing matters.*
youtube · AI Moral Status · 2023-08-20T19:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz8TjFfF1KbP-SLbfd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwduKgbYlwr8rpXyQ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylIWsTHYs2v5xc8554AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugza24H3yRtt0p0wRDZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxzG8zOKGZF9Bse01Z4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzPwHJCbEpISc9j1qZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzArzDu0tgB5SPh33d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9XJ5rDAwaraxseGN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyJVMYav0z6qa0Jfr14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxRVIPjPcmwVNdM-QR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
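A raw response like the one above can be loaded and sanity-checked before it feeds the coding-result table. The sketch below is a minimal illustration, not the pipeline's actual code: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown, while the validation logic and the two inline sample rows are assumptions for demonstration.

```python
import json
from collections import Counter

# Two rows copied from the raw response above, used here as sample input.
RAW = '''[
  {"id": "ytc_Ugz8TjFfF1KbP-SLbfd4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyJVMYav0z6qa0Jfr14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]'''

# The four coding dimensions plus the comment ID, as seen in the response.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> list[dict]:
    """Parse the model's JSON array, keeping only rows with all required keys."""
    rows = json.loads(raw)
    return [r for r in rows if REQUIRED_KEYS <= r.keys()]

codings = parse_codings(RAW)
emotion_counts = Counter(r["emotion"] for r in codings)
print(len(codings), emotion_counts["fear"])  # → 2 1
```

Dropping rows with missing keys (rather than raising) is one possible policy; a stricter pipeline might instead re-prompt the model when a row fails validation.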