
Thread

2 tweets

1
What is it called when it doesn't matter whether arguments are true, because both sides have already decided what they want to do and the arguments conveniently support that? That's the vibe I get from Sam Altman calling for AI regulation and the US Congress going along with it.
2
There has to be a term for this in game theory / cognitive science / rhetoric / politics.