Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- who will be there to buy all their products and services? AI doesnt need housin… (ytc_UgzDfU2eA…)
- Calling guardrails a “mask” gets it backwards. LLMs learn probability distributi… (ytc_Ugz6ezpkD…)
- @getkraken8064 Aaaaaah, but don't worry: AI is only gonna get better until it … (ytr_Ugy8C4W46…)
- Tech is a good idea, but you should need more than facial recognition done by a … (ytc_UgxTUVSdV…)
- I can’t really explain how I asked how to make a homemade atomic bomb and the AI… (ytc_UgynnDgbn…)
- So. I should acknowledge I am an agent? My mind goes with the flow. I am not at … (ytc_Ugwxr-qqu…)
- Machine learning and AI is not going to be the same the thought processes will c… (ytc_UgyjU5Hb9…)
- So, this old bloke says it all with a smile but he's gonna die anytime soon. Ok.… (ytc_UgxO9_BDV…)
Comment
Not sure why Saagar is freaking out when people use ChatGPT to find an assisted suicide facility. They could literally do the same thing with a google search. Sounds like he's just butthurt such facilities exist at all. That said, ChatGPT sucks and is definitely building a profile on anyone that uses it, and is ego stroking you the whole time you use it, bullshitting all along the way. It's worthless.
youtube · 2025-10-30T01:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzJTmdGguqwOqdVtWh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy-v6P8ymzmZ8nSqid4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQyPSCg17exxXV0DB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxqShcFB3SZW8BK_BZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgygmplC_u9O2lxG35V4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgycV2ho2X4LoGJ12rh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxL-JJ7Ib7F6VaE7Dt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwsUJlxrCdVZsAte_l4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyGMJrPt_uTLoC302R4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYWMvqU0E86Ge6WMd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
```
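The "look up by comment ID" workflow above can be sketched in a few lines: parse the raw model response as a JSON array and key each record by its `id`. This is a minimal sketch, not the tool's actual implementation; the `raw_response` payload below is an abridged, illustrative copy of the sample records shown above, and the function name is hypothetical.

```python
import json

# Abridged raw LLM response (illustrative): a JSON array of per-comment
# coding records in the same shape as the sample above.
raw_response = '''[
  {"id": "ytc_UgzJTmdGguqwOqdVtWh4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy-v6P8ymzmZ8nSqid4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and key each record by its comment id."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgzJTmdGguqwOqdVtWh4AaABAg"]["policy"])  # regulate
```

A dict keyed by `id` makes each lookup O(1), which matters when cross-referencing thousands of coded comments against their source threads.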