Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Just imagine how many future hits will be AI generated and we will never know. G…" (ytc_UgyQD1FpH…)
- "Stupid me thought driverless cars meant a whole different road system would be c…" (ytc_UgyQh1-px…)
- "Passing AI pictures off as art should really be get you in as much trouble as tr…" (ytc_UgwmgdO4w…)
- "AI is very useful; it improves the productivity of many industries and services.…" (ytc_UgwRQkMtF…)
- "If that's the case, then there is literally no other group who should be dominat…" (ytr_UgzMbU0_2…)
- "lol the ai isn't just a tool he used to make the drawing himself. the ai made th…" (ytc_UgzF0qUJX…)
- "@matthew_berman GREAT follow-up question. Bitcoin is completely "online" but req…" (ytr_UgyGOTZHV…)
- "Robots need rights our technology might be so advance in the next few years we …" (ytc_UgxHnrQw-…)
Comment
Due to the fact that the efficiency coefficient is always less than 1.0, or in other words, the cause is always greater than the effect, a programmer cannot create software that can completely replace it. This software will always have lower complexity and functionality than its creator. So called AI systems will always be less efficient in terms of organization and energy consumption than their creators.
Any idea of a computer super intelligence created by humans does not correspond to the reality in which we live. Such an idea can only be another scare tactic by the globalists.
youtube · Cross-Cultural · 2025-12-19T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzNanwxKvr6SC5jjpF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwdDgxfpblCdsBuixB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzxIP0B3IikW-gRrN54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxiQR75jYl3whVFNAV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRlzu5yCdkE3suKpt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzchm61pYTnKpprcvN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxJdX69KOMGCTB3dix4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6YgprDhEu47ZRz3t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyjBJ8s5A3utn1vxzV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgziLpUv-e2DNwqo93h4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
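A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal, hypothetical example: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself, but the allowed value sets are inferred only from the values visible in this dump; the actual codebook may define more.

```python
import json

# Two records copied from the raw LLM response above.
RAW_RESPONSE = """[
{"id":"ytc_UgzNanwxKvr6SC5jjpF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJdX69KOMGCTB3dix4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}
]"""

# Value sets observed in this dump only -- an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "government", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "resignation", "approval"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw response and index records by comment ID,
    rejecting any dimension value outside the observed codebook."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

coded = parse_coding(RAW_RESPONSE)
print(coded["ytc_UgxJdX69KOMGCTB3dix4AaABAg"]["emotion"])  # indifference
```

Indexing by ID mirrors the "Look up by comment ID" feature of the page: once parsed, retrieving any coded comment is a dictionary access rather than a scan of the array.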