Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- Exactly! It's even pretty weird to me that many artists claim that their art "on… (ytr_Ugx3nN3Sg…)
- Thank you for bringing this up, Bernie. High time for more attention to this top… (ytc_UgxPS2NUz…)
- Also you have to figure out the energy to support these Ai systems.. where are y… (ytr_Ugz8_LAtQ…)
- Okay, hot take time. AI is a tool. So like any tool, using it to speed up or eas… (ytc_Ugzu2KmQN…)
- "Fully autonomous in 2020!" - The Muskrat is such a bullshitter he gives Traitor… (ytc_UgwBU0VYH…)
- If people are worried about OpenAI then they need to research Expang Iron Robot … (ytc_Ugw5LaLlZ…)
- You’re absolutely right! You didn’t just approach the problem from a *practical*… (rdc_oh18zl2)
- Aseguremosnos de alzar nuestro nivel de consciousness! As ChatGPT says “ I am t… (ytc_UgxjgWiJi…)
Comment
Automation isn't new, it's just jumped from rail to road, autonomous trains have been a thing since the 80s. The only real solution is just to not allow companies to replace paid positions. Automate them if they believe it will improve safety or productivity but still require someone, an ACTUAL PERSON, get paid for it. Either pay into universal income pool or something of that nature, everyone will still get paid for "their work" but humanity will be able to defer most of the actual work to machines, they aren't alive, they don't have feelings and therefore can be exploited in this way without moral issues, at least for now, AI development is somewhat concerning.
youtube · AI Jobs · 2025-10-31T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw3qOemnnZHT7t_KrB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzOOeia2leIjlFv-Dp4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzEoVkMc2AJyaStP9J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwI-3CQH_2T3KuFNiF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy-qoiAOcqBwDG5yR14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzwQ83aaR7BE7ckpup4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyKLPC-b2LEjJDDAR14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy0NGJ0qPvcvtJcBcJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzPruSyFRNtxAFW74d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzkpGEXZecply7mjPF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
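A batch response like the one above is only usable downstream if every record carries a valid code on each dimension. As a minimal sketch of how such a response could be checked, the snippet below parses the JSON and rejects records with missing IDs or out-of-vocabulary codes. The `CODEBOOK` sets are an assumption inferred solely from the values visible in this response; the project's actual codebook may define additional categories, and `validate_coding` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Allowed codes per dimension, inferred from the values observed in the
# response above. ASSUMPTION: the real codebook may contain more categories.
CODEBOOK = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference",
                "resignation"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and fail loudly on malformed records."""
    records = json.loads(raw)  # raises json.JSONDecodeError on broken output
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in CODEBOOK.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec['id']}: invalid {dim} value {rec.get(dim)!r}")
    return records

# One well-formed record passes validation.
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"contractualist",'
       '"policy":"regulate","emotion":"mixed"}]')
print(len(validate_coding(raw)))  # → 1
```

Validating before storage means a hallucinated category or truncated JSON surfaces at coding time rather than silently skewing the coded dataset.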