PERSPECTIVES

The Rule of Least Power in Data Analytics – Part 2

In Part 1 of Kirk Borne’s “The Rule of Least Power”, we learned how a computer programming principle can be applied to data science modeling using machine learning algorithms. In this part, you’ll learn about four examples that are highly relevant to today’s data-driven business goals.

Written by Dr. Kirk Borne

Examples of the Rule of Least Power in Analytics

1) A customer-facing services organization was trying to find an early warning signal of customer attrition — to identify those customers who were on the verge of taking their business elsewhere — and then reach out to those customers with a little bit of extra customer care.

The data science team could have built a large data infrastructure, full of esoteric predictive analytics algorithms and ensembles of models. Instead, the team tried a simple approach first.

They decided to look at the web logs (website usage histories) of customers who had already left, compared with a control set of customers who had not. They found a simple signal in the data and built a model from it. It was a very simple, yet very successful, model. What was the simple model? The team simply counted how many times a customer visited and clicked on their account webpages in the month preceding their departure. Customers who left had a much higher web visit rate than customers who stayed. So, the data science team implemented an alert algorithm. When the algorithm alerted a customer service agent that a customer was in the high-click-rate category, the agent simply reached out to the customer and offered some “just-in-time” advice, information, and other assistance related to their accounts. The customer retention rate rose, the attrition rate dropped, and the simple “count the web clicks” model was a huge success.
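To make the idea concrete, here is a minimal Python sketch of a “count the web clicks” model of this kind. It assumes the web logs have already been parsed into simple event records; the field names, the “account” page label, and the midpoint threshold rule are illustrative assumptions, not the team’s actual implementation.

    import statistics

    def monthly_account_clicks(web_log, customer_id, month):
        # Count one customer's account-page clicks in a given month.
        # Hypothetical record fields: customer_id, month, page.
        return sum(1 for event in web_log
                   if event["customer_id"] == customer_id
                   and event["month"] == month
                   and event["page"] == "account")

    def click_threshold(churned_counts, retained_counts):
        # A simple cutoff: the midpoint between the mean click rates
        # of the churned group and the retained control group.
        return (statistics.mean(churned_counts)
                + statistics.mean(retained_counts)) / 2

    def customers_to_contact(current_counts, threshold):
        # Flag customers whose recent click rate is in the high-risk band.
        return [cid for cid, clicks in current_counts.items()
                if clicks >= threshold]

An agent would then simply work through the flagged list, which is about as explainable as a churn model can get.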

2) A mobile phone services provider wished to cross-sell a mobile roaming package to customers traveling internationally, and to do so before each customer left the country, thus avoiding a situation where the customer buys the roaming package from a competing service provider elsewhere. They identified customers who were at the international departure terminal of an airport, simply from the GPS location data that the phone automatically provides via the cellular network. The mobile provider was able to offer a discount on a “just-in-time” roaming package to those internationally bound travelers prior to their departure. A high percentage of the customers responded and accepted the offer. The simple location-based model, based on metadata that was already within the mobile device signal packet, was a huge win for the company, with almost zero additional promotional or marketing costs.
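A sketch of how such a location trigger might look in Python follows. The terminal coordinates, the one-kilometre geofence radius, and the send_offer callback are all hypothetical placeholders; only the idea of matching device location against departure terminals comes from the example above.

    import math

    # Hypothetical (name, latitude, longitude) list of departure terminals.
    DEPARTURE_TERMINALS = [
        ("Terminal-A", 40.6441, -73.7822),
        ("Terminal-B", 51.4723, -0.4887),
    ]
    GEOFENCE_KM = 1.0  # illustrative geofence radius

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two points, in kilometres.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
        return 6371.0 * 2 * math.asin(math.sqrt(a))

    def maybe_offer_roaming(customer_id, lat, lon, send_offer):
        # If the device is inside a departure-terminal geofence, push the offer.
        for name, tlat, tlon in DEPARTURE_TERMINALS:
            if haversine_km(lat, lon, tlat, tlon) <= GEOFENCE_KM:
                send_offer(customer_id, package="international-roaming",
                           trigger=name)
                return True
        return False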

[Image: sensor analytics at micro scale]

3) Streaming data from sensors in engines usually has discrete modes of behavior: specific frequencies and ranges for the different sensor readings. When one of these frequencies, ranges, or mean values suddenly changes or begins to drift inexplicably, that can be an early warning sign of an undesirable outcome: engine failure or engine malfunction. Industries are using deviations in simple statistical metrics (mean, median, variance, skew) of digital signals as a prompt to schedule “just-in-time” prescriptive maintenance. These industries save money in two ways with such a simple model: (a) servicing a component before an undesirable event occurs; and (b) significantly reducing the amount of scheduled preventive maintenance for components that are working just fine and do not need servicing.
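As a concrete illustration, here is a minimal Python sketch of such a statistical drift check on a single sensor channel. The window size and the three-sigma rule are illustrative choices under these assumptions, not settings from any particular deployment.

    import statistics
    from collections import deque

    class DriftMonitor:
        # Compare a rolling mean of recent readings against a healthy baseline.
        def __init__(self, baseline_readings, window=100, sigmas=3.0):
            self.baseline_mean = statistics.mean(baseline_readings)
            self.baseline_stdev = statistics.stdev(baseline_readings)
            self.recent = deque(maxlen=window)
            self.sigmas = sigmas

        def update(self, reading):
            # Add one reading; return True once the recent mean has drifted
            # more than `sigmas` baseline standard deviations from normal.
            self.recent.append(reading)
            if len(self.recent) < self.recent.maxlen:
                return False  # not enough readings in the window yet
            drift = abs(statistics.mean(self.recent) - self.baseline_mean)
            return drift > self.sigmas * self.baseline_stdev

A True result would simply queue the component for “just-in-time” servicing, in place of a fixed preventive-maintenance calendar.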

4) An electronics retail store chain many years ago was selling the hot items of the day: video cassette players/recorders (VCRs) and camcorders. That’s obsolete technology these days, but it wasn’t 20+ years ago. The store tried upselling a camcorder to a customer when the customer bought a VCR, but the response to the store’s offer was weak, at best. The store then looked at their customer data from a different but simple perspective. They looked at the association of VCR purchases with future camcorder purchases. They discovered that some customers who bought a VCR came back several months later to buy a camcorder. The likely explanation was that those customers realized they could make their own home movies to show on the family VCR. So, the store began sending discount coupons to customers 4–6 months after their VCR purchase, to capture the customer’s attention “just-in-time” as they were starting to consider a camcorder purchase. It worked! A significantly higher percentage of customers accepted the offer and returned to the store to make the purchase, and the time-shifted marketing campaign was a great success.
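Here is a minimal Python sketch of that time-shifted targeting logic. The 4–6 month window comes from the example; the purchase-record fields and the 30-day month approximation are illustrative assumptions.

    from datetime import timedelta

    def coupon_due(vcr_purchase_date, today, lag_months=(4, 6)):
        # True if today falls 4-6 months (approximated as 30-day months)
        # after the VCR purchase.
        lo = vcr_purchase_date + timedelta(days=lag_months[0] * 30)
        hi = vcr_purchase_date + timedelta(days=lag_months[1] * 30)
        return lo <= today <= hi

    def customers_to_mail(purchases, today):
        # Customers who bought a VCR 4-6 months ago but no camcorder yet.
        # Hypothetical record fields: customer_id, product, date.
        vcr_dates = {p["customer_id"]: p["date"]
                     for p in purchases if p["product"] == "VCR"}
        camcorder_owners = {p["customer_id"]
                            for p in purchases if p["product"] == "camcorder"}
        return [cid for cid, bought in vcr_dates.items()
                if cid not in camcorder_owners and coupon_due(bought, today)]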

You can see several recurring themes in all these examples:

  • They all had a declarative outcome as the initial requirement on the analytics task;
  • They all used easily accessible data;
  • They used “small data”;
  • They achieved business value and good ROI from simple models, perhaps even an algorithm “of least power”;
  • The models were simple enough that they could easily be adjusted and updated if the initial implementations failed (which means they followed a “fail fast in order to learn fast” DataOps strategy); and
  • They were all “just-in-time” implementations, which was made possible by simple data, simple models, simple implementations, and simple ROI metrics.

Getting the most value from data analytics is not about fancier algorithms. It is really about ease of use and ease of explainability, both of which deliver trusted outcomes in a timely manner. So, if you are looking to deploy fast, learn fast, and earn fast with data analytics, then the rule of least power might be just what the doctor of analytics ordered. Data makes that possible.


Now, take the conversation to Twitter! Agree or disagree with this perspective on data analytics? Want to ask Kirk a question? Tweet @KirkDBorne using the hashtag #datamakespossible right now!
