Raw LLM Responses
Inspect the exact model output for any coded comment. A coding can be looked up by its comment ID, or browsed from the random samples below.
- "There's a big difference. We humans have a soul, a spirit, and a heart. We have …" (`ytc_UgyG4r2Y_…`)
- "Is universal income similar to Public Aide? If so, how would that replace a well…" (`ytc_UgyWEuaWN…`)
- "Yes but ai is no longer llms it become physical with physical ai in 2026…" (`ytr_UgzBEHqQb…`)
- "I saw a video yesterday that my father sent me that you couldn't even tell was A…" (`ytc_UgzBHMx8_…`)
- "Rise and shine Mr Freeman, Rise and Shine, not that I wish to imply you have bee…" (`ytc_UgxM6lZfx…`)
- "People defending multi billion dollar companies´ right to profit in the middle o…" (`rdc_grsbwd8`)
- "Hmmmmm, so using algorithms to prevent crime is dystopian but kicking people off…" (`ytc_Ugy38AEC1…`)
- "Because the AI feeding the speech is programmed to respond as if it was another …" (`ytr_Ugzg6Mfa6…`)
Comment
I see what Sal is trying to do and, at first glance, it does appear to be an efficient method of teaching, but aren't we missing the actual societal changes here? For example, when you break down the Great Gatsby to an AI interpretation and presentation, you (without knowing it) begin the process of eroding the human artist behind the work. You begin (perhaps unconsciously) to project intelligence/agency/omnipotency into a machine. In short, you eliminate the presence of the author/artist whose experience and talents (etc.) created the work to begin with! Hence, do you really understand the consequences of incorporating AI into education? I don't think Sal does or many others who endorse this unchecked AI into our society.
Platform: youtube
Posted: 2023-09-03T02:4…
Likes: 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxLEGw-aH6OcIlZuXV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzh7ZkZDh1ZaL-LILR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzHP61b76DDZfTL-WB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzWzWRBO2mr2RNgRFd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzVkx6mdZ2ZoP4NXtd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwT0zn6B-16TjnSIld4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxJLhqsFLJV6p7DCkt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwDDNPjEVMgAPrQ4qZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzDvS1A4D3_q_YWVP94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxY4ZHLUTOVsfl6tt14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
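Because the raw response is a plain JSON array, looking up the coding for a specific comment ID takes only a few lines. A minimal sketch in Python, with the array shortened to the one entry that matches the coding-result table above (the variable names here are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, as shown above
# (trimmed here to a single entry for brevity).
raw_response = """
[
  {"id": "ytc_Ugzh7ZkZDh1ZaL-LILR4AaABAg",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "unclear", "emotion": "fear"}
]
"""

# Index the codings by comment ID so any comment can be looked up directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one coding by its comment ID.
coding = codings["ytc_Ugzh7ZkZDh1ZaL-LILR4AaABAg"]
print(coding["emotion"])  # fear
```

The same dictionary-indexing step works unchanged on the full array, since every entry carries a unique `id` field.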