Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_Ugy9G7qsL… — "Oh, We don't know ! / Newton / Euler / Lagrange / Legendre / Gauss / Cauchy algori…"
- ytc_UgyCA3oaa… — "I asked chatgpt who should lead a relationship between a succesful man and a wom…"
- ytc_Ugyuwt97S… — "Agreed, I tried using it to make a flaming snake, somehow I got a snake with 4 d…"
- ytr_UgxPxzApN… — "You need it on an international scale to work though. Otherwise, whichever count…"
- ytc_UgyLzrBkc… — "This robot has many data circuit. / One to be pleasant / One to be sad. / One to be a…"
- ytr_UgwSydpdc… — "AI IS the worst. AI videos, images, music, voices. There are imposters pretendin…"
- ytc_UgzuloiXX… — "Chapter: Why would a.i. go rogue? / - it's not that a.i. would go rogue, it's tha…"
- ytr_UgyK04ssS… — "@qdlaty23 i was generally speaking about junior developers, those are gonna be i…"
Comment

> After all cars are made autonomous (which will come with time) the cars should be able to communicate with one another, so if someone was to jump in front of a car, sensors could detect them and alert the cars to either side of them to move lanes to allow the car with people in front of it to move without harming others. Now, of course this is a very simple solution, but with the geniuses at Google (and no doubt other companies working with them to develop this technology) I think they could design a system that goes beyond anything we could imagine today. Eliminating human error from driving would save countless lives.

Platform: youtube | Topic: AI Harm Incident | Posted: 2014-05-26T20:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgxDaMBOSiNJD2siXy94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyN5DcdkN4_CPV9W4N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxJgB9bBLAVLTAPPcl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyvkgE0W9UcAxjR8s54AaABAg","responsibility":"company","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugimzyh73Mem3ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugj-0SjwbhJOkngCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgiSbOcYg8LpjHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjzNHMI5Dl8n3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiKPIaGrg1XJHgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UggDy9xEJWdA5HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
```
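The raw response is a JSON array of per-comment records, one object per coded comment. A minimal sketch of how such an array could be parsed and indexed to support lookup by comment ID, using Python's standard `json` module (the two records below are copied verbatim from the output above; the variable names are illustrative, not part of any real pipeline):

```python
import json

# Two records excerpted from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgxDaMBOSiNJD2siXy94AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgjzNHMI5Dl8n3gCoAEC", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Parse the array, then build a dict keyed by comment ID so any
# coded comment can be fetched in O(1) by its "id" field.
records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

print(by_id["ytc_UgjzNHMI5Dl8n3gCoAEC"]["emotion"])  # → fear
```

The same dict serves the four coding dimensions (responsibility, reasoning, policy, emotion) from the table above, since each record carries all of them.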