Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- @BluecoreG Ai sucks though. As a singular tool it can be very useful, but it ca… (ytr_UgyjJ6wsq…)
- Wasting Penrose's time imo. If you want to believe so completely and fully that … (ytc_UgxLalFwq…)
- The movie Wallie comes to mind. All the humans a fat and living in a virtual rea… (ytc_UgxakMv_h…)
- It would be very very very nice to have an economist and AI expert be in the pod… (ytc_UgzmDPuDg…)
- Where are the international guardrails to civilise AI so its harm is reduced. Go… (ytc_Ugzf9p7qV…)
- Who's here after Grok said Elon is fitter than Bruce lee and can beat Mike Tyson… (ytc_UgyupEzkR…)
- The battle is far from over guys, we still gotta find a way of artists being abl… (ytc_Ugw3kk-6e…)
- Does anyone think AI would have worn a shirt that still has the creases showing,… (ytc_Ugzra4iRQ…)
Comment
What is the aspiration of mankind? To have white collar jobs in the now contrived worlds of law and accountancy - both easilly to be taken over by AI completely, just for preserve some form of self-esteem, is meaningless. So let it happen, and that freed up human brain power can be re-tuned to poduce the next Mozart or Shakespear. But it will lead to economic collapse as we rip out busyness from the daily lives of humans. But 100 years ago, Henry Ford summed up the essence of a similar dilemma even then - shunning the use of too many robots in his factories, as they don't buy his cars.
We have created an artificial way of life - is it worth preserving - probably not.
Source: youtube · Posted: 2026-02-12T22:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzoGEzrZ04dH0QSKKl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwbIcGmgsKA7hhuvfx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw8mZpln6KfYEXT9CB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwhygt0NbliESaY0Vp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyifskYkxF13r9UCbd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyPFg3mI6ySGPUOb254AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyGzuwAAaFrosw9X7J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwa-R1JxLYIe496Upt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxHu-fRYwhE7h4YTKR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHmwZ5uJqK5vnHGmB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
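The per-comment lookup above can be sketched as follows — a minimal example assuming the batch format shown in the raw response (a JSON array of objects with an `"id"` field plus the four coding dimensions; the `index_codings` helper and the two-row sample are illustrative, with rows copied from the response above):

```python
import json

# Abbreviated copy of the batch format shown above: a JSON array of
# objects, each carrying a comment ID plus the four coding dimensions.
RAW_RESPONSE = """
[
  {"id": "ytc_UgzoGEzrZ04dH0QSKKl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugwa-R1JxLYIe496Upt4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
"""

def index_codings(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index each coding row by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(RAW_RESPONSE)
# Look up by comment ID, as the page does:
print(codings["ytc_Ugwa-R1JxLYIe496Upt4AaABAg"]["emotion"])  # resignation
```

Indexing by `id` once, rather than scanning the array per lookup, is what makes "look up by comment ID" cheap when a response contains many coded comments.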