Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The government needs citizens to tax...if robots and AI take over everything the…" (ytc_UgzpMnvuw…)
- "So why is it always..." you have no idea whats comimg" tell us so WE who are no…" (ytc_UgxcENAXC…)
- "As someone who is currently learning and trying to get better at art, I don't un…" (ytc_Ugy4sED2O…)
- "What's so bad about making a movie out of the BS that we face out of life everyd…" (ytc_UgxLqFhac…)
- "I uploaded the 1st act of my screenplay to ChatGPT and it actually wrote a good …" (ytc_UgzQ6fNa6…)
- "Neil De Grasse Tyson once said, "What ever effort it takes to go to mars and ter…" (ytc_Ugx7tiWXl…)
- "Until now. But the AI will be able to learn movements and optimize them just lik…" (ytr_UgzTD1_rO…)
- "I'd be curious to see the tables turned. I'd like to see Alex defend his own mor…" (ytc_UgzgDdBQM…)
Comment

> Chatgpt wasn’t inconsistent it was consistently inconsistent. There was a pattern in the seemingly random lies. Taking into account the goal of the creator of the gpt, it doesn’t have a conscious. It just designed to convince you because it is supposed to be trustworthy to collect data.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2024-08-04T21:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugzp2tZt81a2ENceMQF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgzuijGqUYmqvn0oCiR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugzy-xS6TFR9y0hY9Wd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgzHnoaaIV4qx4psxU94AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy9sI54APglMcRWJ7d4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyG9w4m31N3jMQPQdN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugzl4WfKaGY2oxYYGdp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzZz67QI4uQiYTe0kl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgwCnXpViIBBj1ClJ6F4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugxk9RE_jALJbWunvMV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
```
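A raw response like the one above can be turned into the per-comment coding shown in the table with a short parsing step. The sketch below is a minimal example, assuming the response is a JSON array of records keyed by comment ID; the allowed values per dimension are inferred only from the sample output shown here, so the real codebook may include more categories. The helper name `parse_coding_response` is illustrative, not part of any pipeline shown above.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# The actual codebook likely defines more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records)
    into a {comment_id: {dimension: value}} lookup."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        dims = {k: v for k, v in rec.items() if k != "id"}
        # Reject values outside the known codebook rather than storing them silently.
        for dim, value in dims.items():
            if value not in ALLOWED.get(dim, set()):
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = dims
    return coded

raw = ('[{"id":"ytc_UgwCnXpViIBBj1ClJ6F4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}]')
coded = parse_coding_response(raw)
print(coded["ytc_UgwCnXpViIBBj1ClJ6F4AaABAg"]["responsibility"])  # developer
```

The resulting dictionary supports exactly the "look up by comment ID" operation the page offers: indexing by the full comment ID returns that comment's coded dimensions.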