Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
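A lookup like this can be sketched in a few lines. Assuming each raw LLM response is a JSON array of coded records keyed by an `"id"` field (as in the "Raw LLM Response" section below), an in-memory index gives O(1) retrieval by comment ID:

```python
import json

def build_index(records: list[dict]) -> dict[str, dict]:
    """Index coded records by comment ID for O(1) lookup."""
    return {rec["id"]: rec for rec in records}

# A raw LLM response is a JSON array of coded records; parse it
# and index by ID. The record below is copied from the response
# shown later on this page.
raw = '[{"id":"ytc_Ugw1l4rfpXg_CVJQZqt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
index = build_index(json.loads(raw))
print(index["ytc_Ugw1l4rfpXg_CVJQZqt4AaABAg"]["emotion"])  # fear
```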
Comment
I have a large imagination, people could use AI to generate tons of money… this could lead to more inflation making people not keep up with the rates. The more advanced AI gets, they could use it to make videos, for instance making a video impersonating someone making a crime… first get someone that looks closely similar to that person. Same body style, height, hair, etc. Then that person would make a crime scene, they will try to input a “deep fake” look up deep fakes for more insight, its scary close and we’re in the beginnings stages, AI could be generated to make voices to sound like that person also. AI can be really scary I’m sure there’s more ideas than just mine.
Source: youtube · Topic: AI Governance · Posted: 2023-09-28T06:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
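A table in this shape can be rendered from a coded record plus its coding timestamp. The sketch below assumes the layout shown above (one row per dimension, "Coded at" last); the helper name `to_table` is hypothetical:

```python
from datetime import datetime

def to_table(record: dict, coded_at: datetime) -> str:
    """Render one coded record as a Markdown dimension/value table
    (assumed layout matching the Coding Result table above)."""
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at.isoformat()),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)

# Values taken from the Coding Result shown above.
record = {"responsibility": "user", "reasoning": "consequentialist",
          "policy": "none", "emotion": "fear"}
table = to_table(record, datetime(2026, 4, 26, 23, 9, 12, 988011))
print(table)
```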
Raw LLM Response
```json
[
  {"id":"ytc_Ugw1l4rfpXg_CVJQZqt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxW5k36qSq8QE5wp2t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwI_7_j0EUAy-jUHk54AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy1tD_8JLMvieATtiF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwslrGnJgEE7ledlEV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwQbCKR6KnwgaKDAVp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyY0iyOdJ7ZKQ1bQvp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxUqs_bTtjqMbr4Qgh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwT3II7yM4m4uZDCm54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxPIrdKlLytc7NNahB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
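Responses like this are worth sanity-checking before they are trusted as codings. The sketch below validates a record against the label vocabularies observed in this response; these allowed-value sets are inferred from the visible output, not an official codebook, so treat them as assumptions:

```python
# Allowed values per dimension, inferred from the codes visible in the
# raw response above (assumption: not an exhaustive codebook).
ALLOWED = {
    "responsibility": {"user", "developer", "government", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "approval", "outrage", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    if not record.get("id"):
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        if record.get(dim) not in allowed:
            problems.append(f"bad {dim}: {record.get(dim)!r}")
    return problems

# This record is copied from the raw response above; it should pass.
record = {"id": "ytc_UgyY0iyOdJ7ZKQ1bQvp4AaABAg", "responsibility": "user",
          "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
print(validate(record))  # []
```

A record with a label outside these sets (or a missing `id`) comes back with one problem string per bad field, which makes failures easy to log against the comment ID.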