Bayes Analytic for Communications Companies

I have been working on analytic components which may provide benefits to AT&T Wireless or similar companies. AT&T is particularly interesting because they have multi-year relationships with their customers but do such a poor job of targeting communication (I say this based on personal experience). AT&T marketing is actually alienating customers when it could be adding value and building loyalty. A good example is a customer who just purchased two phones on a plan that has never had more than two lines: for several months afterwards, AT&T sends advertisements for new phone models, even though they just locked that person into a 2-year contract before they can upgrade. The person is unable to buy a new phone, so the ineffective communication builds buyer's remorse that is permanently associated with AT&T. The person is relatively unlikely to add another line, and the focus on new phone models just makes them dissatisfied, whereas AT&T probably could have sold extra chargers, a data plan, and other items that user would likely need. They are simply throwing away these sales because they do not have the Bayes Analytic engine helping to determine which offers the users will find beneficial.

This failure of AT&T’s marketing group is representative of first-generation engines, which make an ultra-simple association: a person who buys a phone will buy another phone. They are right, but the time delay between upgrades is longer than one year. The first-generation marketing engine missed the more complex correlation and as such sent the wrong offers to the customer, which increases the buyer's remorse. In contrast, there is a distinct correlation where buyers of smart phones start looking at the next-generation phone about 1 to 1.5 years after the initial purchase. The Bayes Analytic engine can see this kind of correlation and make better decisions even before the marketing person thinks to ask the specific questions that would allow them to deduce the same output.

I have been focusing on components that intersect Bayes statistical forecasting and genetic algorithms. My goal has been to find option transactions that are likely to meet specific goals, such as rising 23% over a 7-day period. This is a challenging task because most experts endorse the “efficient market hypothesis,” which stipulates that stock prices move at random. An interesting side effect of this ability to predict success is that the engine can be fed with dozens to thousands of offers for a given customer to find the offers they will find compelling, which can then be used the next time anybody in the company is in contact with the customer to make the most valuable offers. After this step these customers can be grouped by the offers they are likely to find attractive, which can allow the marketing company to discover customer segments they were previously unaware of.
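The offer-ranking and segmentation step above can be sketched in a few lines. This is a minimal illustration, not the Bayes Analytic engine itself: the `score` function below is a hypothetical stand-in for the engine's acceptance-probability estimate, and the customer and offer names are invented for the example.

```python
from collections import defaultdict

def score(customer, offer):
    # Hypothetical stand-in for the engine's P(customer accepts offer).
    affinity = {("two_lines", "extra_charger"):   0.81,
                ("two_lines", "data_plan"):       0.74,
                ("two_lines", "new_phone"):       0.07,
                ("single_line", "new_phone"):     0.35,
                ("single_line", "data_plan"):     0.22,
                ("single_line", "extra_charger"): 0.18}
    return affinity.get((customer, offer), 0.0)

offers = ["new_phone", "extra_charger", "data_plan"]
customers = ["two_lines", "single_line"]

# Rank every candidate offer per customer, then group customers by their
# single most compelling offer -- each group is a discovered segment.
segments = defaultdict(list)
for cust in customers:
    ranked = sorted(offers, key=lambda o: score(cust, o), reverse=True)
    segments[ranked[0]].append(cust)
```

Note how the two-line customer lands in the "extra charger" segment rather than the "new phone" segment, which is exactly the behavior the AT&T example above calls for.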

Stock forecasting should be more difficult than predicting customer behavior, where you have a rich data set. You know a fair amount about a customer, including things they have purchased, where they have shopped, when they tend to shop, etc. We can use these to deliver statistics which help predict, with a high degree of confidence, whether they would be likely to accept specific offers, based on offers other customers with similar characteristics have accepted. This should make it possible to target specific customers with the specific offers they have the highest probability of accepting. This is not clustering, but we may be able to use clustering to improve it even further.
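To make the "predict acceptance from similar customers" idea concrete, here is a toy naive Bayes estimate of P(accept | customer features) built from counts over past customers. The feature names and the tiny history table are purely illustrative assumptions; the real engine would use far richer data.

```python
from collections import defaultdict

# Toy history: (customer features, did they accept the offer?).
history = [
    ({"device": "smartphone", "shops": "evening"}, True),
    ({"device": "smartphone", "shops": "evening"}, True),
    ({"device": "smartphone", "shops": "daytime"}, False),
    ({"device": "basic",      "shops": "evening"}, False),
    ({"device": "basic",      "shops": "daytime"}, False),
]

def naive_bayes_accept_prob(customer, history, alpha=1.0):
    """P(accept | features) via naive Bayes with Laplace smoothing."""
    counts = {True: 0, False: 0}
    feat_counts = {True: defaultdict(int), False: defaultdict(int)}
    for feats, accepted in history:
        counts[accepted] += 1
        for k, v in feats.items():
            feat_counts[accepted][(k, v)] += 1
    total = counts[True] + counts[False]
    scores = {}
    for c in (True, False):
        p = counts[c] / total  # class prior
        for k, v in customer.items():
            # smoothed per-feature likelihood (2 values per feature assumed)
            p *= (feat_counts[c][(k, v)] + alpha) / (counts[c] + 2 * alpha)
        scores[c] = p
    return scores[True] / (scores[True] + scores[False])

p = naive_bayes_accept_prob({"device": "smartphone", "shops": "evening"}, history)
```

An evening-shopping smartphone owner scores well above 50% here because similar customers in the history accepted, which is the core mechanism the paragraph describes.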

My recent tests delivered a 95.14% success rate at selecting option transactions that subsequently met the goals. These were filtered from a set of 10,976 Facebook CALL transactions which ultimately had a 14% success rate over the same time frame. Overall the system tends to yield a greater than 93% success rate but has the ability to select a larger set with relaxed constraints when needed. A 90+% success rate in predicting both market direction and a minimum relative amount of movement is particularly interesting since the leading experts claim stock prices follow a random walk.
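For readers who want the arithmetic behind those numbers, the lift of the selection over the 14% base rate works out as follows (using only the figures quoted above):

```python
# Figures quoted in the text.
n_transactions = 10_976      # Facebook CALL transactions in the pool
base_rate = 0.14             # success rate of the whole pool
selected_rate = 0.9514       # success rate among the engine's picks

# Lift: how much better the engine's picks did than random selection.
lift = selected_rate / base_rate

# Expected successes if you had picked from the pool at random.
expected_base_successes = round(n_transactions * base_rate)
```

So a randomly chosen transaction from that pool succeeded about 1 time in 7, while the engine's picks succeeded roughly 6.8 times as often.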

Example of sub select selection rates from the Bayes analytic engine

Think of these goals as something you could re-cast for marketing, e.g.: customer buys a drill with a price under $65.

I was careful to ensure the data used in the training set stopped 15 days prior to the test data set, and I used relatively recent data for the test set. Since the rule is for 7.5 days, only options purchased in the last 7 days are considered part of the input set. This gives us a 7-day gap between the training and the test data sets. Using non-overlapping data ensures that we are not cheating by effectively allowing our training to look into the future.
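The non-overlapping split can be sketched as two date predicates with a buffer between them. The concrete dates below are illustrative assumptions, not the actual test dates; only the 15-day and 7-day offsets come from the text.

```python
from datetime import date, timedelta

today = date(2013, 9, 2)                        # illustrative "current" date
train_cutoff = today - timedelta(days=15)       # training purchases stop here
test_window_start = today - timedelta(days=7)   # test purchases start here

def in_training(purchase_date):
    # Training set: purchases at least 15 days old.
    return purchase_date <= train_cutoff

def in_test(purchase_date):
    # Test set: purchases within the last 7 days.
    return purchase_date >= test_window_start

# Buffer between the last training purchase and the first test purchase,
# so training labels never depend on test-period outcomes.
gap_days = (test_window_start - train_cutoff).days
```

Any purchase date satisfies at most one of the two predicates, which is the "no cheating" guarantee the paragraph describes.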

Thanks Joe Ellsworth


Sep-2-2013 – What I forgot to mention when I wrote the original post is that AT&T has lots of quality-of-service data available which, when combined with geo-location data, could be leveraged in the engine. I proved this is possible with an application called GPS-SAFE, which allowed me to isolate problem zones in the Nextel network to within a couple hundred feet and to the cell that was having trouble. I was even able to deduce which hand-offs from cell to cell were problematic, along with which devices were most likely to have problems in marginal coverage areas.

Recruiting new customers is an expensive process, so it is worth a significant investment to retain those customers, especially as they approach key renewal periods, but AT&T clearly does not execute on this critical business focus. As an example, I have a high failure rate of calls on the island near my home. I had to nearly beg AT&T to sell me a micro cell, and they have never offered me an upgrade even though their cell crashes multiple times per month, which causes new lost calls. Their quality of service has been so poor that they will most likely lose me as a customer during the next renewal cycle.

I strongly suspect there is a distinct correlation between lost customers and those customers who lose more than 20% of their calls in the areas where they spend over 70% of their waking hours. There is also a likely correlation to those who lose over 20% of their calls in areas where they commonly spend time during primary commute hours. For those in both categories, customer loss is nearly guaranteed.

I am in a group of premium, high-profit customers with multiple smart-phone devices paying for full data plans. Standard business practice requires retention of high-margin customers to be a primary goal when protecting net ROI for the business. Customers like me are the customers that AT&T should be aggressively contacting to sell boosters and amplifiers and to tune their networks.
In an informal survey of other business professionals, I have not found a single person out of dozens who has received this kind of unsolicited service from AT&T, and yet it would have a significant impact on retaining me as a customer. It seems that AT&T simply doesn’t have the ability or the will to use this extra data, combined with an advanced engine like Bayes Analytic, to protect their profits from this premium, high-profit niche of people. AT&T needs the Bayes Analytic engine to give them the insight needed to better target premium customer retention.
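The churn correlation suspected above can be written down as a simple rule. The thresholds (20% dropped calls, 70% of waking hours) come from the text; the field names and the exact tiering are illustrative assumptions, and a real engine would learn these correlations from data rather than hard-code them.

```python
def churn_risk(customer):
    """Classify churn risk from call-drop rates in key locations."""
    home_risk = (customer["home_area_call_drop_rate"] > 0.20
                 and customer["home_area_time_share"] > 0.70)
    commute_risk = customer["commute_call_drop_rate"] > 0.20
    if home_risk and commute_risk:
        return "near-certain loss"   # both categories: loss nearly guaranteed
    if home_risk or commute_risk:
        return "high risk"
    return "normal"

# Illustrative customer matching the situation described in the post.
me = {"home_area_call_drop_rate": 0.25,
      "home_area_time_share": 0.80,
      "commute_call_drop_rate": 0.30}
```

Customers flagged "near-certain loss" are exactly the ones the post argues should be proactively offered boosters, amplifiers, and network fixes.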
