Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Honestly it is safer and better to not make AI conscious. LLM like chat GPT is …" (`ytc_UgyLEK3CE…`)
- "@oo--7714 You say this as you defend an automatic system that is supposed to cre…" (`ytr_UgxtFzV1k…`)
- "U don’t need to get consent from someone to use an image that was posted publicl…" (`ytr_UgxL-U8bc…`)
- "Yeah but you got to understand that Tommy Gun is .45 ACP in caliber and it doesn…" (`ytr_Ugw8YGHBf…`)
- "Just like self driving, it's the demos that are easy and to close that last 5% o…" (`ytc_Ugzt0HKJh…`)
- "Who cares about AI. Go read revelations, we have been told by our creator exactl…" (`ytc_UgyRGgshG…`)
- "> If they are looking to expand into autonomous cars, then wait until Toyota/…" (`rdc_dfudx95`)
- "@livia2via Why I can't commision stuff: Cause I don't have the money? I am barel…" (`ytr_Ugw-A7yfM…`)
Comment
> This car is driven by a computer. Sometimes, computers don't (always) work the way they're EXPECTED to - which is why they need repairs or software updates/patches, etc.
>
> The fact that UNLIKE any other vehicle out there (right now), for the most part, these vehicles can get you (safely) to where you need to go AND back without any engagement/involvement whatsoever. Waymo is still in it's beginning stages.
>
> The jobs/careers of professional, Uber/Lyft, and taxi drivers might not be as "safe" as they THINK they are.
youtube · AI Harm Incident · 2025-03-22T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgycphhweL3H1_TGluR4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw0TYHVGiURQO_m7yN4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxM52seW1-UHY_Jked4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwTv49M-hB-B73er_d4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzDYrg2ejHr9dt5M8p4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyIkLTOej87JpFWay94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxoBqiDQTTiv1QJDXd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwJhzs7_SG-G20n02p4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxR8vG33hjebCbDm8h4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwkgS3658VdbedXmT14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"}
]
```
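The raw response is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and looked up by ID (using two entries taken from the response above; the dict-by-id indexing is an illustrative choice, not necessarily what the tool itself does):

```python
import json

# Two entries copied from the raw LLM response shown above.
raw = (
    '[{"id":"ytc_UgxM52seW1-UHY_Jked4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"approval"},'
    '{"id":"ytc_UgwTv49M-hB-B73er_d4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]'
)

# Index the batch by comment ID for per-comment lookup.
codes = {row["id"]: row for row in json.loads(raw)}

row = codes["ytc_UgxM52seW1-UHY_Jked4AaABAg"]
print(row["emotion"])  # prints: approval
```

The same lookup would back a "look up by comment ID" view: fetch the stored raw response, index it once, and display the row matching the requested ID.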