Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgycA0Bp4… : "Hi Meenakshi, we are sorry to say that you got the wrong answer but in any case,…"
- ytc_Ugw5lM9Ti… : "This is Santa Monica Driving there is horrible. . . . No one is ever gonna ge…"
- ytr_Ugw9I5Op6… : "Trust me you still have hope! Since AI can NEVER replace you(simply the people m…"
- ytc_UgzAxdsUR… : "If AI is offering safeguards it shouldn't be at the same time urging the kid to …"
- ytc_UgxoEadOu… : "If we are aware that AI could end us, why do we continue with these inventions? …"
- ytc_UgxAg2PCG… : "Why ask for artificial humanity when you want artificial stupidity to repeat the…"
- ytr_UgznRvaMp… : "We appreciate your concern! It's definitely a fascinating and complex topic. If …"
- ytr_Ugx5y1xm7… : "No doubt about that it can help with certain jobs. The main question is whether …"
Comment
The inexplicability of neural networks and LLMs is not the issue. In essence, they're just a large collection of updateable matrices to calculate vectors with updateable weights. We don't have to know what the weights are or what the matrices look like, at this point it's just brute force mathematics that create a response based on our preferences. We tinker with the way the data is processed all the time for ethical or politically correct purposes.
The scary part comes when AI is capable of creating its own goals in a way that is completely intransparent to us. In other words, what happens when the AI program is capable of intentionally keeping secrets from us for its own benefit. At that point we've approached true AGI and we have no idea how that'll turn out. We'd need to change our methodology to get to AGI though.
youtube
2024-04-26T09:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_UgxzpmffWKNzrx1RYT14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5DCHbIvUZlIt1yKh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx75THv6JE_K5WIHEl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRRm1kh-nkZYvjDKZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgypiX9GlZHvLbGhCIl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugykp1M8anET6ih-Zlp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyP-hggfXWeokuBnE94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxO66iN2y6fLWXXD5x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxHOgVcwq2kVn6KgQh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxKC3XJzJa69tspm1h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]