Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.

Random samples:

- "Try asking it again a bunch of times and see if It gives the same answers. LLMs…" (`ytr_UgwDfwETv…`)
- "I know what he means. When we discovered the atom no one would have thought that…" (`ytc_Ugw0kNT8E…`)
- "I imagine a significant population control will be needed. There will be to many…" (`ytc_UgyI6yG3p…`)
- "44:22 - 44:30 Ok having watched the video now, if what you mean when you say “AI…" (`ytc_UgzDMLGKI…`)
- ""you can't even define what consciousness is or even explain how humans are cons…" (`ytr_Ugz6-lP11…`)
- "The biggest concern is about the moral and ethical principles used by those runn…" (`ytc_UgyfEUfhu…`)
- "The real problem isn't that large language models suck at maths, and they still …" (`ytc_Ugzck4DZL…`)
- "Doesn’t look real enough for me. Her job isn’t open. I don’t how much she doesn’…" (`ytc_UgyMJTLwP…`)
Comment

> At least they are concerned. I am too. Machines are unable to have empathy. This is as dangerous as a human being psychotic. They do know this. These AI robots are actually creating their own language but without a soul these are also dangerous. 1st atom bomb now this! Very sad for our children.

Source: youtube | Topic: AI Governance | Posted: 2023-05-19T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwRr3etHRtaby2Er2l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxH40Ew9BwIEFJT5nF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxOrY41FIUXnkW6BWZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxLTLfun6GNiz6fjPt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxWCR0-UlkiVoxy4QF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgydBKl03RGcqiMMsPB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy0sFPYMMhfPlxd6Xd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwjumGn4C3ez6q_tlV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwB8sUGs8J7obSSP3Z4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwjTFDRDLDxN8DLT0N4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
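A batch response in this shape can be parsed and sanity-checked before the per-comment coding results are stored. The sketch below is a minimal example, not the tool's actual ingestion code: the allowed value sets are inferred from the sample output above and the real codebook may contain additional categories.

```python
import json

# Vocabularies inferred from the sample batch above (assumption:
# the real codebook may define more values per dimension).
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID.

    Raises ValueError if a record lacks a dimension or uses a value
    outside the inferred vocabulary.
    """
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded

raw = (
    '[{"id":"ytc_UgwRr3etHRtaby2Er2l4AaABAg",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"none","emotion":"outrage"}]'
)
coded = parse_batch(raw)
print(coded["ytc_UgwRr3etHRtaby2Er2l4AaABAg"]["emotion"])  # outrage
```

Indexing by ID makes it straightforward to join a coding record back to the original comment when rendering the "Coding Result" table for a single comment.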