Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "We are at the beginning of AI, while the investments are in the billions and tri…" (ytc_UgzWjQ6ju…)
- "Even if we don't know exactly what sentience is, maybe we can come up with some …" (ytr_Ugwmkluq8…)
- "That's an intriguing question! The relationship between concepts like reincarnat…" (ytr_Ugx8EjnS0…)
- "they'll just say that one day, with advancements of AI technology, mankind no lo…" (ytc_UgzvBrkdH…)
- "did you really just tell us Native people to “learn from history” while ignoring…" (ytr_UgwnQFtkq…)
- "How about we program self driving vehicles to measure large vehicles outnin fron…" (ytc_UghqMvbGk…)
- "@caseyrickeyart credits are given for remixes, the originality is not disturbed,…" (ytr_UgxNVeiaX…)
- "I loved the video, but this doesn’t seem to be a difficult self-driving challeng…" (ytc_UgyxTEMiR…)
Comment

> The AI 2027 scenario from that BBC piece is chillingly detailed—AGI by 2027 spiraling into superintelligence, mass job displacement, then rogue extinction by 2035. It's a stark reminder of alignment risks and unchecked acceleration in AI development. Genuinely concerning how little emphasis there seems to be on robust safeguards before we hit those milestones, especially with fintech and societal systems in the crosshairs. Thought-provoking watch. 🚨🤖

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2026-02-25T21:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy_BxWIzW48C8tOHlB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxlIwijgiYmoUYe0VF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwU9brjyXaQQB8chgp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwGCn0Yy-d3VhSR-mB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugw7Rq7fChMg0dtZZFV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxFYhSLIVkY6Dlu3oh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw_PFy4UVHHuKdFu5l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxPylUV2bS3_0LUCg14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugws-9lw50vSnMNX15t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw4pNHPNMzXrF4T7Wp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
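A batch response like the one above can be parsed and sanity-checked with a short script. This is a minimal sketch: the per-dimension vocabularies below are inferred from the values visible in this sample, not from a published codebook, so the real coding scheme may allow additional categories.

```python
import json
from collections import Counter

# Allowed values per dimension (ASSUMPTION: inferred from the sample
# output above; the actual codebook may define more categories).
ALLOWED = {
    "responsibility": {"developer", "user", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "approval", "resignation"},
}

def validate(records):
    """Return (id, dimension, value) triples that fall outside the known vocabularies."""
    bad = []
    for row in records:
        for dim, vocab in ALLOWED.items():
            if row.get(dim) not in vocab:
                bad.append((row.get("id"), dim, row.get(dim)))
    return bad

# Hypothetical single-row payload standing in for the model's raw output.
raw = '[{"id":"ytc_example","responsibility":"developer",' \
      '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
records = json.loads(raw)

print(validate(records))                           # [] when every value is in-vocabulary
print(Counter(r["emotion"] for r in records))      # quick distribution check
```

A check like this catches the common failure modes of LLM coders (invented labels, missing keys) before the rows are written back to the database.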