

Most of the AI industry is currently stuck in a kind of uncanny valley: it’s close enough to fool people who don’t care about details, or who so desperately want to make money that they deny the reality that these AIs aren’t actually good at very much.
But so much money has been invested that companies are desperate to make it generate revenue and profit, so they keep shoving it into things, hoping theirs will be the one the public finally latches on to.
It’s also management types that really bought into it. The kind of managers who don’t know shit and make impossible requests, or think something simple is hard and something hard is simple, because they don’t actually know much about the jobs they’re managing. But they do have the power to direct those under them to use the AIs, and to get rid of or dismiss anyone pointing out that the emperor has no clothes.
Right now, they are hoping to find the substance that will keep the AI bubble from popping. But IMO the problem is fundamental to the big-data approach to AI: throw a ton of data at a generic correlation engine and hope it ends up smart.











Or not very emotionally intelligent, if you will.