Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples (click any to inspect):

- "It really does seem like there is someone in the room who is controlling the rob…" (ytc_UgyH-WYPu…)
- "Uber's existence is dependent on the availability of autonomous vehicles by the …" (ytc_UgyyO3IPx…)
- "AI will become like cocaine in the future, where people especially those emotion…" (ytc_UgwJEw1lL…)
- "My dad sent me a picture and told us it was ai and didn’t tell lies to us.…" (ytc_UgyafS0_Z…)
- "AI is ran by selfish people that want to bring humanity years back. Each year th…" (ytc_UgxYGlWaO…)
- "> Yep, the excitement over ChatGPT isn't because of what it currently is, rat…" (rdc_j8c7tq1)
- "🎯 Key Takeaways for quick navigation: 00:03 🚗 Overview of the Video - The video…" (ytc_Ugyq9U-AK…)
- "thats kinda the fault of the internet, if the ai learns from all of us, is going…" (ytc_Ugzz6Euw3…)
Comment
Where we need to be worried about AI is the unintended consequences of well-intended pragmatic thinking turned horribly wrong. The Shirky principle states “institutions will try to preserve the problem to which they are the solution.” AI takes what information and misinformation it can find online and makes it credible. One question I asked it was how did the Industrial Revolution impact the American Revolution. What it came back with is as follows...."While the Industrial Revolution did not directly cause the American Revolution, it did have a significant impact on the latter. The Industrial Revolution led to the growth of industries in the United States, which in turn led to the growth of cities and towns..." How is it possible that an event that occurred a century later impacts an event that occurred a century before?
Source: youtube, 2024-09-02T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id": "ytc_UgxKK5YotBktVlNtDVp4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugwm23aCeFafiKqgWhh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyD6B58u_n-uAoib7J4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyJuEl0gakIQRx0nxF4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugzvk4ljBvFClh6HyN14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw4o5NCerdctUihyjd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugzpzh8-Ab1btOQpGRR4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxsVnzn_5kcmWeNF3J4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwZlA7uDX2cZBmR2cx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyKuCClanocWSLslRp4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"}
]
```
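The batch response above is a single JSON array, one object per comment, keyed by the same four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and indexed by comment ID, with a basic sanity check on the labels: the dimension names come from the table above, but the allowed value sets here are an assumption inferred only from the values visible on this page, not a definitive codebook.

```python
import json

# Dimension names match the Coding Result table; the allowed value sets
# are ASSUMPTIONS inferred from the labels visible in this response.
DIMENSIONS = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    rejecting any value outside the expected label set."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return by_id

# Usage with one record from the batch above:
raw = ('[{"id":"ytc_Ugwm23aCeFafiKqgWhh4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytc_Ugwm23aCeFafiKqgWhh4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what lets the lookup field above resolve a comment's coding directly from a stored batch response.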