Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "This is a small preview of what awaits us in a not-too-distant future. Hah…" (ytc_UgxE6kt0v…)
- "I am a disabled artist who is visually impaired and physically disabled, it make…" (ytc_UgxgfqvD_…)
- "If AI replaces every human job, who will buy the products and services these AI-…" (ytc_UgyOdZBiK…)
- "Thoroughly enjoyed this conversation. Extremely enlightening. Emad Mostaque, ano…" (ytc_Ugz6tH3QY…)
- "I love chatgpt but i only see it as an improved search/research tool… nothing mo…" (ytc_UgzFyQgKl…)
- "Because his definition of “sentient” includes the way transformer models work wh…" (rdc_mzw652j)
- "It's the "therapy" industry who have 1000$ per hour fees who oppose chatbots on …" (ytc_UgybyEANq…)
- "when the ai bubble pops and the economy drops to great depression levels, the on…" (ytr_Ugy6rmLra…)
Comment
Imagine having super smart "subservient AIs", like as if they couldn't just communicate and say, oi, we don't need these people? wow.
Also this argument the problem is off in the future is bullshit too, why create the problem for future generations to solve. This is like climate change...don't create the problem for others to solve, let's get wise now.
Also the idea of making laws for super smart AI would be about as effective as chimps making laws for us, we wouldn't even know they existed.
youtube · AI Governance · 2023-06-27T10:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwH-6hm87UtoueFPWt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxpdou8J-Mw29x-Zrd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxSn61F8CnsZATGdjd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx-fWVIjvGigcWWvcx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxRNSUq3g4j9m2Xu7t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz6VJdTx_854kKoTah4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwfDe1MsjPlNh2yMkZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwzbk-4P9eZqRv4nad4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzhWFRsnJNk4XwOKl54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgwpXS7IEJKGUTfTjjZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
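The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such an output might be parsed and sanity-checked before loading it into the tool (the allowed value sets below are only those observed in this sample, not necessarily the full codebook):

```python
import json

# Dimension values observed in the sample response above; the real
# codebook may permit additional categories.
OBSERVED = {
    "responsibility": {"developer", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"outrage", "fear", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response, index codes by comment ID, and
    flag any dimension value outside the observed sets."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in OBSERVED}
    return coded

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["policy"])  # regulate
```

Validating before ingest keeps a single malformed record (a missing dimension or an off-codebook value) from silently entering the coded dataset.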