A lot of this is why I get itchy when people start talking about how AI is going to revolutionize science.
It's like hardware finally made the neural net models from the '60s-'90s useful, and we've been finding ways to beat every problem into their solution space ever since. If the only tool in your toolbox is a convolutional neural network, every problem starts to look like a classification problem. Machine learning is cool as hell and has an incredibly broad range of potential impact... but people need to read articles like this one before they throw money at someone claiming that their fucking keras app is going to solve every need in the market. Good post.
The problem is a different one from what people want to talk about: there's a bottleneck, either in speed or in authority, and someone wants to code their way out of it. All the AI resume review stuff is exactly that: every department worth its salt already says that HR has no idea what to hire for, so HR says "well, but we've got this skookum software that will pare the applicant pool down to useful prospects," and every department says "great, you go chase that squirrel," fully knowing that the positions are going to be filled by people they know and wondering, again, why the fuck HR exists. Now, where's that list of AI training fails?
It feels like a lot of this tech comes from people who are relatively young and have no idea of all the complexities that go into the jobs they're replacing. Ditto for all the "fully automated" folks who have no idea how farming works. And if you grow up thinking you're smart and "often right," it's pretty easy to solve a different, easier problem and then insist you've solved the original one, because everyone who disagrees with you doesn't have any legible criticisms.
In fact, it's so good I'll copy the recipe here:

- Look at a complex and confusing reality, such as the social dynamics of an old city
- Fail to understand all the subtleties of how the complex reality works
- Attribute that failure to the irrationality of what you are looking at, rather than your own limitations
- Come up with an idealized blank-slate vision of what that reality ought to look like
- Argue that the relative simplicity and platonic orderliness of the vision represents rationality
- Use authoritarian power to impose that vision, by demolishing the old reality if necessary
- Watch your rational Utopia fail horribly