Machine Learning is Dead! Long live Machine Learning!

Businesses today generate enormous amounts of data. Over the years we have gone from processing enterprise data with business intelligence tools, to taming the storage demands of Big Data with faster and newer cloud technologies, and now we are entering the era of Machine Learning, and of course everyone is doing AI!

As part of this article, I wanted to look at why machine learning has suddenly met an untimely death, at least according to Gartner! You will see what I mean when we review the trends.

For an article I'm writing, and out of curiosity about the trends in this area, I decided to review the well-known "Hype Cycle". So, I went to the Gartner resource on emerging technologies, which is released each year and tracks a number of new innovations.

Understanding the trends (and scratching my head at the same time)

Let's explore the "Hype Cycle" by Gartner, but first a bit of an introduction. Every year they release the "Gartner Hype Cycle for Emerging Technologies". It covers an array of innovations that jump from Smart Dust to IoT Platform to Autonomous Vehicles and so on, giving audiences an understanding of where each innovation sits within five categories, notably (taken from Gartner):

  • Innovation Trigger: A potential technology breakthrough kicks things off. Early proof-of-concept stories and media interest trigger significant publicity. Often no usable products exist, and commercial viability is unproven.
  • Peak of Inflated Expectations: Early publicity produces a number of success stories — often accompanied by scores of failures. Some companies take action; many do not.
  • Trough of Disillusionment: Interest wanes as experiments and implementations fail to deliver. Producers of the technology shake out or fail. Investments continue only if the surviving providers improve their products to the satisfaction of early adopters.
  • Slope of Enlightenment: More instances of how the technology can benefit the enterprise start to crystallize and become more widely understood. Second- and third-generation products appear from technology providers. More enterprises fund pilots; conservative companies remain cautious.
  • Plateau of Productivity: Mainstream adoption starts to take off. Criteria for assessing provider viability are more clearly defined. The technology’s broad market applicability and relevance are clearly paying off.

They are portrayed in a graph that has two axes: Expectations vs. Time.

Now, I'm not going to go into the depths of the hype cycle itself, but you can at least see that it's aptly named: many new technologies claim to offer amazing possibilities, so I assume Gartner wants to dispel some myths and see whether they are solving real business problems or are just hype. Hence the name, the "Hype Cycle". On this occasion I am not going to delve much deeper into the methodology (you can read that on Gartner's website); I am using it mainly to analyse the trends around Machine Learning.

2015 Hype Cycle

Machine learning is up and running. In 2014, Big Data and Data Science appeared on the hype cycle, so there was no separate entry for Machine Learning at that time. In 2015, however, Machine Learning appears on what seems a very busy Hype Cycle. It's sitting very nicely on the down curve, close to moving into the "Trough of Disillusionment".

Compared to Data Science, which in 2014 was in the "Innovation Trigger", and Big Data, which was in the "Peak of Inflated Expectations", I guess someone decided to merge the two and put Machine Learning somewhere in the middle. Huh? Was this a finger-in-the-wind calibration? Let me know your thoughts.

2016 Hype Cycle

As you can see from the Hype Cycle below for 2016, Machine Learning has been positioned at the top of the "Peak of Inflated Expectations". This is a bit odd compared to 2015, as it has gone backwards, yet it carries the same expectation that it will reach the plateau in two to five years (see the colour notation in the key to the right of the graph).

In 2016, the industry was coping with fast technological change in the area of data and business intelligence. The Machine Learning use cases that existed were very much in marketing, around customer targeting and recommendation engines, as well as in banking, looking at churn, fraud and the like. There are more, and you can add those in the comments.

The main challenge that data scientists faced back then when attempting to run predictive models was data quality; more time was spent cleaning data than doing data science. We also had many organisations suffering from data fogginess: "there is so much data that we don't know what to do with it!" That's still an issue in 2018, as is data quality, and it will continue to be until companies realise that data is a prized asset, just like their people. Back to the Hype Cycle. Let's move to 2017.

2017 Hype Cycle

In 2017, we see Machine Learning move slightly down the curve (getting closer to where it was in 2015), still in the "Peak of Inflated Expectations", and we see the introduction of Deep Learning. Deep Learning probably arrived thanks to a couple of companies doing headline-grabbing work in the realms of "AI".

The first company, Google, ran its deep learning experiment with AlphaGo. The AlphaGo experiment demonstrated a machine's ability to play the ancient Chinese game of Go. The AlphaGo algorithm learned from a dataset of approximately 100,000 Go games to build its understanding of the game, its moves and its rules. DeepMind, the arm of Google that built the AlphaGo algorithm, used Deep Learning (a subset of Machine Learning) to beat the world champion at Go. The experiment was an absolute success and raised the bar for Artificial Intelligence to go mainstream.

The second company to release a new innovation was Microsoft. The innovation was another experiment: a Twitter bot that was meant to grow up like a teenage girl. While Google blew Go players' minds across the world, Microsoft saw its bot blow up in its face! Microsoft's attempt failed miserably when it launched the AI bot "Tay.ai". Within a day of launch, Tay.ai had become a very nasty bot (and that's probably understating how nasty it was).

Whereas AlphaGo learned from 100,000 games and then mastered the rules, Tay.ai learned from the barrage of comments on Twitter and started mimicking her followers. It's a brutal education when an algorithm gets fed words it doesn't understand; machines are not human, and context is still very difficult for a machine to learn and grasp.

I suspect deep learning got its place on the hype cycle due to these and, of course, other examples, these probably being the most prominent. I may have gone a little off-track from machine learning, but I wanted to give you that backstory of why deep learning, as a subset of Machine Learning, ended up on the hype cycle.

So, now let's get back to the matter of Machine Learning, where most of the applications are being built, typically under the label of Artificial Intelligence.

2018 Hype Cycle

In 2018, something strange happens on the Hype Cycle. Looking at the 2018 curve, we expect Machine Learning to be heading further south towards the good old "Trough of Disillusionment".

Whoosh – as if by magic and the clicking of Mary Poppins’ fingers, machine learning has now vanished into thin air!  Huh?  What happened?  We no longer need it?  Virtual assistants and other innovations still remain on the curve, but not Machine learning.  I don’t have a clue why, do you? Maybe someone was napping that day at Gartner when the report came out.

Nonetheless, when so much is being discussed about Machine Learning, by far the most common application as a subset of AI, how can it go missing from the depths of the hype cycle? Somebody, I'm sure, will have a very good explanation for it all. It certainly hasn't gone into the "Plateau of Productivity" yet! Clearly someone has…

What are your thoughts?  Has Machine Learning been lost forever, or will it turn up back in the Peak with Deep Learning in 2019?