*As is common on the ol' blog here, we're less interested in what they're saying than in how they say it. And this is some pretty good stuff; this intensified jargon is a healthy sign that a tech community is hand-waving less and, instead, figuring out what it's really talking about.
*That's not a great sign for investors. Because deep learning does have limits and isn't divine fairy-dust, there's gonna be another AI winter of reduced funding. Likely not so harsh a winter as earlier ones, with the killing frosts. More of a milder, foggier, Global Warming AI Winter.
*Also, this more modest, plug-and-play kind of AI is the sort of thing that the street can find uses for, which oughta be pretty interesting.
https://venturebeat.com/2020/01/02/top-minds-in-machine-learning-predict-where-ai-is-going-in-2020/
(...)
Depending on how you gauge it, PyTorch is the most popular machine learning framework in the world today. A derivative of the Torch open source framework introduced in 2002, PyTorch became available in 2016 and is growing steadily in extensions and libraries.
This fall, Facebook released PyTorch 1.3 with quantization and TPU support, alongside Captum, a deep learning interpretability tool, and PyTorch Mobile. There are also things like PyRobot and PyTorch Hub for sharing code and encouraging ML practitioners to embrace reproducibility.
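(A quick hedged illustration of that sharing-and-reproducibility angle: PyTorch Hub lets you pull a published, pretrained model in a couple of lines. A minimal sketch, assuming the standard entry points the pytorch/vision repo exposes.)

```python
import torch

# Entry points a repo publishes via its hubconf.py
print(torch.hub.list('pytorch/vision'))

# Download and instantiate a pretrained ResNet-18 from PyTorch Hub
model = torch.hub.load('pytorch/vision', 'resnet18', pretrained=True)
model.eval()  # inference mode, so shared results are reproducible
```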
In a conversation with VentureBeat this fall at PyTorch Dev Con, PyTorch creator Soumith Chintala said he saw few breakthrough advances in machine learning in 2019. (...)
This year, Google’s and Facebook’s open source frameworks added quantization support to boost model speeds. In the years ahead, Chintala expects “an explosion” in the importance and adoption of tools like PyTorch’s JIT compiler and compilers for neural network accelerator hardware, like Glow.
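(To make the quantization talk concrete: below is a minimal sketch of post-training dynamic quantization as it shipped in PyTorch 1.3. The toy model is our own assumption; torch.quantization.quantize_dynamic is the real API.)

```python
import torch
import torch.nn as nn

# A toy model standing in for something real (assumption for illustration)
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# Convert Linear weights to int8; activations get quantized on the fly.
# The result is a smaller model that runs faster at inference time.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 128)
print(quantized(x).shape)  # same interface as the float model
```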
“With PyTorch and TensorFlow, you’ve seen the frameworks sort of converge. The reason quantization comes up, and a bunch of other lower-level efficiencies come up, is because the next war is compilers for the frameworks — XLA, TVM, PyTorch has Glow, a lot of innovation is waiting to happen,” he said. “For the next few years, you’re going to see … how to quantize smarter, how to fuse better, how to use GPUs more efficiently, [and] how to automatically compile for new hardware.”
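(And the compiler side of that quote, sketched: torch.jit.script compiles a model to TorchScript, the intermediate representation that fusers, mobile runtimes, and backends like Glow can then optimize and run without the Python interpreter. The tiny module here is an assumption for illustration.)

```python
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(32, 2)

    def forward(self, x):
        # Pointwise ops like these are the kind of thing fusers can merge
        return torch.relu(self.fc(x)) * 2.0

scripted = torch.jit.script(TinyNet())  # compile to TorchScript
print(scripted.graph)                   # inspect the compiled IR
scripted.save("tinynet.pt")             # a deployable artifact, no Python required
```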
Like most of the other industry leaders VentureBeat spoke with for this article, Chintala predicts that in 2020 the AI community will place more value on measures of model performance beyond accuracy, and begin turning its attention to other important factors: the amount of power it takes to create a model, how output can be explained to humans, and how AI can better reflect the kind of society people want to build....
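(On the explain-it-to-humans point, here's a minimal sketch using Captum, the interpretability tool mentioned above. The toy classifier is an assumption; IntegratedGradients is Captum's actual attribution API.)

```python
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

# A toy classifier standing in for a real model (assumption)
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 3))
model.eval()

ig = IntegratedGradients(model)
x = torch.randn(1, 4, requires_grad=True)

# Attribute the class-0 score back to the 4 input features:
# which inputs pushed the prediction, and by how much?
attributions, delta = ig.attribute(x, target=0, return_convergence_delta=True)
print(attributions)  # per-feature contribution scores
print(delta)         # approximation error of the attribution
```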