Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- @Flashbax7 It is known and reported many times to "hallucinate" answers... I u… (`ytr_Ugx_1lhmO…`)
- Same. Our PE owners recently told leadership to get an account from every employ… (`rdc_ofhtc77`)
- "What are we going to do if robot starts demanding their rights" Are those amer… (`ytc_UgxySr3f0…`)
- I agree and that is why it appeared that the debate was skewed, somehow, in the … (`ytr_UgytTT4UF…`)
- As a paid user of ChatGPT, I don't see it as an intelligence, but a BOT programm… (`ytc_UgyxAh_wp…`)
- ChatGPT isn't smart enough to have the answer to so many of these questions 😂😂😂 … (`ytc_UgxSu6qC4…`)
- AI creating AI is when AI destroys humans. The movie iRobot is actually going t… (`ytc_Ugy_A965T…`)
- It's really interesting to hear Sundar Pichai's perspective on the future of AI.… (`ytc_Ugzl8GwKO…`)
Comment (youtube, 2024-03-18T23:3…):

> I’ve applied to jobs that had the bells and whistles when it comes to the automated HR. Jobs that I was over qualified for. It’s so frustrating to know that it could be simple as a computer thinks I’m suited for another position and that’s why I didn’t get the job…even though I could probably do both.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugx8Vf6tq7e7KwROyol4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyIp5qaq4xTpcxe6p94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyUFpq4NTJl_6SaMIN4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzbnvpGiVQJ6krWKER4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_Ugy-l9SPDXbn15AsJbV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxNNVKg8XS3yWZSEu94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgzQVTUYxNjjscYWhlh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyzX7pfvCV0ulaTewF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwGOuWRlSs0aopHr1N4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgwJabKkowIWXbgFj354AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
```
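The raw response is a plain JSON array with one object per coded comment, so mapping a comment ID back to its coded dimensions is a straightforward parse-and-index step. A minimal sketch, assuming field names as they appear in the response above (the `index_codings` helper is hypothetical, not part of the tool; only two entries are inlined for brevity):

```python
import json

# A shortened batch response: each element codes one comment on the four
# dimensions used in the table above (responsibility, reasoning, policy, emotion).
raw_response = """
[{"id": "ytc_UgyIp5qaq4xTpcxe6p94AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
 {"id": "ytc_UgzbnvpGiVQJ6krWKER4AaABAg", "responsibility": "company",
  "reasoning": "virtue", "policy": "regulate", "emotion": "resignation"}]
"""

def index_codings(raw: str) -> dict:
    """Parse a batch response and index the coding objects by comment ID."""
    return {entry["id"]: entry for entry in json.loads(raw)}

codings = index_codings(raw_response)
coded = codings["ytc_UgyIp5qaq4xTpcxe6p94AaABAg"]
print(coded["responsibility"], coded["emotion"])  # company outrage
```

Indexing by ID is what makes the "look up by comment ID" view possible: each coded comment in the corpus resolves directly to the model's JSON object for it.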