Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `rdc_jktck42`: My husband presented my initial symptoms of a rare disease (Anti Synthetase Synd…
- `ytc_Ugx9W9nDj…`: I honestly think AI could help a lot, especially if we use tools like Pneumatic …
- `ytc_UgxXve59p…`: Bro, I’ve been doing this shit too with ChatGPT and I’ve been getting them hemme…
- `rdc_d8aytq2`: I honestly foresee a rise in conversions of cars from manual to self-driving. An…
- `ytc_UgxhNcfTY…`: Good art will always prevail. If you can’t get success with AI competition then …
- `ytc_UgzEnw0K4…`: Well jokes on the ai / It cannot replicate the art of normal human errors and mist…
- `ytc_UgyKgJXMt…`: It's new compression algorithms. If you are concerned about environment it's awe…
- `ytc_Ughe1hk1M…`: nope. he's not a robot. he's a figment of the matrix. I stumbled over crash cour…
Comment
All this AI talk has got me thinking of the TV series Person of Interest again.
To create The Machine or Samaritan.
One to save us or control us.
What happens when it glitches and doesn't take accountability for its actions? Would it tell us that we are wrong and then become Skynet? Would it then enslave us as batteries?
This comes down to ethics. Just because you can do something doesn't mean you should do it.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted at | 2024-01-30T03:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxt24K7wrZ6VTC9VT94AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzRsOO_HboUkgGXnpx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx0jyGQ5nArHq61CyV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzCBjB7BxlyOtpq8N54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxlFRKTyx9E8XuIVGF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzBrxku7icoduZqAh54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgydJ6UjoO6N18aJust4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugycza7bCmNvIuCZBOl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyFlZ2DUj8h5hznWuR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgweCcz5BPxx0i7R8I94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
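Since the raw response is a JSON array of records keyed by comment ID, the "look up by comment ID" view can be reproduced offline with a few lines of Python. This is a minimal sketch: the `lookup` helper is illustrative (not part of the tool), and only two records from the response above are inlined for brevity.

```python
import json

# Two records excerpted from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_UgydJ6UjoO6N18aJust4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzBrxku7icoduZqAh54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def lookup(records, comment_id):
    """Return the coded record for a comment ID, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw_response)
match = lookup(records, "ytc_UgydJ6UjoO6N18aJust4AaABAg")
print(match["responsibility"])  # → ai_itself
```

The same pattern extends to validating a batch: any record whose four dimension values fall outside the expected label sets can be flagged before the codes are stored.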