Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
This person who posted this is either stupid for believing that the Woman on her…
ytc_UgwmhUG99…
i've saw some bullshit in museums (mind you, national galleries and staple museu…
ytc_UgysKUExR…
The first guy lives in a completely different reality than the rest of us. Imagi…
ytc_UgyJ1fvZ5…
i really needed this today 🥹🖤 i categorically reject AI, and completely refuse t…
ytc_Ugy9pbRxa…
So it begins...
You can label it a "human problem", but these LLMs are very con…
ytc_UgxrQY-eN…
twaddle! the BBC gets worse and worse. I'm a Brit and ashamed of the BBC
The …
ytc_UgxIBjkmF…
There is absolutely no "skill" that goes into AI. Pick up a pencil and learn to …
ytc_UgxgxgUZR…
Also face recognition are not as accurate as we think for brown and black people…
ytc_UgwKplNmy…
Comment
Companies aren't going to stop developing facial recognition tech because of some moral code. The reasoning is that if I don't develop it, someone else will.
So the only way to put a stop to it is for government intervention. But should we? Should we crawl back into our caves because we can't properly use a tool?
And when you bring up the argument of helping to catch criminals, forget about government regulation that would ban the technology.
So the argument that we should be having is how do we properly use that technology to not cause more harm to people.
youtube
2020-07-11T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgylHfXz9Y7ulraAchl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzFmhvk3sTkR6shmKB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxgD3khjON7yCzyU-x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwAASWsYppaDNvCGIF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugz55X3uWBnA6OuLKYV4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyrqB9DbygaTFxbQhd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyzJ3lUOAkjhRRZFh54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwAoUtYS81izHxOSdR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwdqoo_vX1AT1xgJ9d4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz82OOQj6-uKq7zzJ14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
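The raw response above is a JSON array with one record per comment, so looking up a coded comment by its ID reduces to parsing the array and matching on the `id` field. A minimal sketch (the `lookup_coding` helper is hypothetical, not part of the tool; the field names and the example record are taken from the response shown above):

```python
import json

# One record copied from the raw LLM response above; the real response
# holds a batch of such records in a single JSON array.
raw_response = '''[
  {"id": "ytc_UgwAASWsYppaDNvCGIF4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"}
]'''

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the record for one comment ID."""
    records = json.loads(raw)
    return next((r for r in records if r.get("id") == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgwAASWsYppaDNvCGIF4AaABAg")
print(coding["responsibility"], coding["policy"])  # company regulate
```

Matching on the full `ytc_…` ID (rather than the truncated previews shown in the sample list) is what ties each coding-result row back to its source comment.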