Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "What an interesting analogy. --I wouldn't want ai to be so obsessed with effica…" (ytc_Ugx2HVW7t…)
- "The multi-modal chat bot example is an example (explanation) of Hegel. There is…" (ytc_UgxMu8mqC…)
- "I mean what he completely fails to understand (maybe on purpose) is that people …" (ytc_UgyLbOXwO…)
- "I don't expect much philosophical or moral consistency. I think his position is …" (rdc_jkfnj5r)
- "It's getting to the point where all these companies are going to automation. You…" (ytc_Ugw7iXn9b…)
- "These are all over SF and I absolutely despise them. One Waymo ran my friend off…" (ytc_UgybHvnZx…)
- "As a writer and visual artist, I feel so insulted whenever anyone claims AI isn'…" (ytc_UgwfIINnN…)
- "So Google is what PipePiper from Silicon Valley would have become if they didn't…" (ytc_UgygKDDc5…)
Comment

> This is going too far. I am so against this. I drove Explorer and sensors didn't work I hit my husband's car, almost hit a kid and never sensed an animal either. Now you're wanting trucks with no human drivers and they can cause fatalities. What about last minute lane changes, detours, weather, debris on road etc. This is sad to take so much time, money and investment into AI over appreciating humans and paying them. They drive so many hours and people depend on stores being filled with merchandise and deliveries etc.

youtube · AI Jobs · 2025-05-31T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwmgL5emcvEFBvC0QB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyRe0G8P7yFNkJsBtd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyirvOvTRVPkQQw3_t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzUCrBcrMt_6MUmuip4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyw3JokkxouKXevDsF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx_OFo6ibQnOYdzMhZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzFWyol0Ud9WnWSROF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOSSVOy5XEGKzmBRd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgygyAt2KcXchjRbR7J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz48na8nvmFyTOTzkR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
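A response like the one above has to be parsed and checked before the codes are trusted, since an LLM can emit malformed JSON or values outside the codebook. The sketch below is a minimal validator, assuming the allowed values per dimension are exactly those visible in the output above (the real codebook may define more categories, and `validate_batch` is a hypothetical helper name, not part of the pipeline shown here):

```python
import json

# Allowed values per dimension, inferred only from the codes visible in the
# raw response above -- an assumption, not the full codebook.
SCHEMA = {
    "responsibility": {"company", "user", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "resignation",
                "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with an "id" and one allowed
        # value for every coding dimension.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
print(validate_batch(raw))  # the single record passes all checks
```

Records that fail a check are silently dropped here; a production coder would more likely log them and re-prompt the model for just the failed IDs.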