The Link Between Infinite Computing and Machine Learning
At the Formlabs Digital Factory event in June, Carl Bass used the phrase Infinite Computing in his keynote. I’d heard it before, but I liked it in this context and it finally sparked a set of thoughts which felt worthy of a rant.
For 50 years, computer scientists have been talking about AI. However, in the past few years, a remarkable acceleration of a subset of AI (or a superset, depending on your point of view) now called machine learning has taken over as the hot new thing.
Since I started investing in 1994, I’ve been dealing with the annual cycle of the hot new thing. Suddenly, a phrase is everywhere, as everyone is talking about, labeling, and investing in it.
Here are a few from the 1990s: Internet, World Wide Web, Browser, Ecommerce (with both a capital E and a little e). Or, some from the 2000s: Web Services, SOAs, Web 2.0, User-Generated Data, Social Networking, SoLoMo, and the Cloud. More recently, we’ve enjoyed Apps, Big Data, Internet of Things, Smart Factory, Blockchain, Quantum Computing, and Everything on Demand.
Nerds like to label things, but we prefer TLAs. And if you really want to see what the next year’s buzzwords are going to be, go to CES (or stay home and read the millions of web pages written about it).
AI (Artificial Intelligence) and ML (Machine Learning) particularly annoy me, in the same way Big Data does. In a decade, what we are currently calling Big Data will be Microscopic Data. I expect AI will still be around as it is just too generally appealing to ever run its course as a phrase, but ML will have evolved into something that includes the word “sentient.”
In the meantime, I like the phrase Infinite Computing. It’s aspirational in a delightful way. It’s illogical, in an asymptotic way. Like Cloud Computing, it’s something a marketing team could get 100% behind. But, importantly, it describes a context that has the potential for significant changes in the way things work.
Since the year I was born (1965), we’ve been operating under Moore’s Law. While there are endless discussions about the constraints and limitations of Moore’s Law, most of the sci-fi that I read assumes an endless exponential growth curve associated with computing power, regardless of how you index it.
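To make the scale of that exponential concrete, here is a minimal sketch of the arithmetic behind it, assuming the common two-year doubling formulation of Moore’s Law (the doubling period varies by telling, so treat the numbers as illustrative, not historical):

```python
def moores_law_growth(years, doubling_period=2.0):
    """Relative compute capacity after `years`, assuming capacity
    doubles every `doubling_period` years (an idealized model)."""
    return 2 ** (years / doubling_period)

# 50 years of doubling every 2 years: 2^25, roughly a
# 33-million-fold increase in capacity.
growth = moores_law_growth(50)
```

Run for any horizon you like; the point is that a modest-sounding doubling rule compounds into numbers that start to feel, from any fixed vantage point, effectively infinite.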
In that context, ponder Infinite Computing. It’s not the same as saying “free computing,” as everything has a cost. Instead, it’s unconstrained.
What happens then?