Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- I’m a machine learning engineer and researcher. Very well explained. Tip to the … (ytc_UgxX-Ayn7…)
- evil want to denigrate they will get the answer they want as far as what ai is… (ytc_Ugxpdvlv8…)
- We also need to worry about AI programmed by communists, many of whom see humani… (ytc_UgxR_UkD-…)
- “People think Ai is neutral, safe, & under human control, none of that is true” … (ytc_Ugzo_3CuI…)
- If unemployment is sky high , no one will have money to buy houses , cars, iPhon… (ytc_Ugxnze3RC…)
- She is brilliant ,, He is obsessed with Sam Altman spending huge amounts of ti… (ytc_UgxOGlajP…)
- Anyone can draw, it’s a skill. Some come with talent but most of us it’s years o… (ytc_Ugyjf101p…)
- Are they going to REPLACE Consumers as well? Who will buy the things AI makes s… (ytc_UgxpSqiyC…)
Comment
Usually, I’d be worried about stifling innovation, but honestly, the AI development industry desperately needs to be stifled, at least until humanity figures out what we’re doing.
In the end, I doubt this will have much of an impact on the global AI arms race, but it might just slow down competition just enough to allow AI companies abroad some breathing room, in order to implement better safety systems.
The EU AI safety board should also act as an advisory body for foreign businesses, before their own country’s government gets their shit together, as this is a truly global issue.
youtube · AI Responsibility · 2024-09-23T09:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw1-kDZfUOgwn9Xmf14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZD2nmS4Njfg_0HKd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgyOKjtBE92ElsntcFl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy4Oj8yRRp0Rb3hJnp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwWzMwtOexLgKkZzg54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugy4b6M9EZKJ9fuey4p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxGqgg7BN5t7zT-MAl4AaABAg","responsibility":"government","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwfOvGoqQe5Pj4RndZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyBxwf47HDfUH4QJoB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwW9m4xKh9Yvjn9RrV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
```
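A minimal sketch of how a raw batch response like the one above could be parsed and indexed by comment ID. The four dimension names (`responsibility`, `reasoning`, `policy`, `emotion`) and the value sets below are taken from the output shown here; the full codebook may include additional categories, so treat `ALLOWED` as an assumption.

```python
import json

# Value sets observed in the coding output above; the real codebook
# may define more categories (assumption for this sketch).
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def parse_batch(raw_json: str) -> dict:
    """Parse one raw LLM batch response and index rows by comment ID,
    skipping any row with a value outside the known codebook."""
    coded = {}
    for row in json.loads(raw_json):
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[row["id"]] = row
    return coded

# One row copied from the batch above.
raw = '''[
 {"id":"ytc_UgwWzMwtOexLgKkZzg54AaABAg","responsibility":"company",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

coded = parse_batch(raw)
print(coded["ytc_UgwWzMwtOexLgKkZzg54AaABAg"]["emotion"])  # fear
```

Validating against the codebook before indexing catches the common failure mode where the model invents a category label that downstream tallies would silently count as a new value.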