What a Local Traffic Snafu Teaches About Artificial Intelligence in ...

11/13/19

The DC suburbs are a case study in NIMBYism. Lots of communities try to limit through-traffic via all sorts of means:  speed bumps, one-way streets, speed cameras, red-light cameras, etc.  The interaction of one of these NIMBYist devices with GPS systems is a great lesson about the perils of artificial intelligence and machine learning in all sorts of contexts.  Bear with the local details because I think there's a really valuable lesson here.

The story takes place in Chevy Chase Village, one of the fanciest suburbs of DC (and not my home). Much of it lies between two major parallel thoroughfares, but for a mile there is only one cross-street connecting these thoroughfares because of NIMBYist zoning, and the connecting streets on either end of that mile are grossly inadequate for the traffic flow. That one cross-street in between is the lovely Grafton Street, the home of George Will, among others.

Now, Grafton is a two-way street, but it is only a true cross-street for eastbound traffic.  While traffic can go east or west for the length of the street, westbound traffic is prohibited from turning onto one of the thoroughfares (Wisconsin Avenue, a/k/a MD-355, in the map below). 

[Map: Grafton Street and its intersection with Wisconsin Avenue (MD-355)]
Despite the signs prohibiting a turn onto Wisconsin from Grafton, some westbound drivers seem to turn anyhow. (The poor saps never see the cleverly hidden camera that snaps a picture of their license plates....)  These drivers are breaking the law, but the turn is much faster than any of the other routes. And some of them have GPS apps running that seem to use some sort of artificial intelligence or machine learning, such that if one route is faster than another, the app will reroute other vehicles to that faster route.  So what started happening last week?  There was a sudden uptick in westbound traffic turning onto Wisconsin from Grafton because some GPS app learned that it was faster than the other routes.  It didn't matter that there were signs saying "no turn" and that there was a traffic camera.  It didn't even matter that the Chevy Chase Village police parked two black-and-whites at the intersection to grab the scofflaws. The GPS kept sending more and more folks westbound down Grafton.  (I often take Grafton eastbound and saw a lot of Virginia plates headed westbound, which was a giveaway that these weren't locals but folks relying on GPS... also that they weren't Mercedes or Audis...)

Eventually, the Village got the County Police to post an electronic sign that flashed "GPS APPS ARE WRONG.  NO EXIT TO 355". (See pictures below.) Whether because of the sign or a phone call to one of the app makers, the problem seems to have subsided.  But the episode holds some important lessons about AI.

[Photos: the electronic warning sign at Grafton Street]

What the Grafton Street cut-through teaches is that AI only learns what it is programmed to learn--in this case, what is the fastest route from point A to point B. If AI isn't programmed to look for legality, it won't. And if AI learns to copy past behavior, it just compounds past errors.
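If it helps to see the point in code, here is a toy sketch in Python (route names and times are made up for illustration) of a router whose only objective is speed. Legality simply isn't part of what it optimizes, so the illegal cut-through wins every time:

```python
# Hypothetical, simplified router: it scores candidate routes purely on
# observed travel time. Nothing in the objective knows or cares whether
# a turn is legal, so an illegal-but-fast cut-through is always chosen.

def fastest_route(routes):
    """Pick the route with the lowest average observed travel time."""
    return min(routes, key=lambda r: r["avg_minutes"])

candidate_routes = [
    {"name": "Connecticut Ave detour", "avg_minutes": 14.0, "legal": True},
    {"name": "Grafton St cut-through (illegal turn onto Wisconsin)",
     "avg_minutes": 9.5, "legal": False},
]

best = fastest_route(candidate_routes)
print(best["name"])  # prints the illegal cut-through: legality isn't in the objective
```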

Understanding these perils of AI is important because AI is increasingly used in credit underwriting. Underwriting models might use neural networks to find patterns, but because a neural network won't spit out a simple linear equation explaining the underwriting, the firm using the AI might not understand why a pattern exists. It might be that those patterns rest--unintentionally--on data that correlates with protected classes under ECOA, the Fair Housing Act, or state anti-discrimination laws.  In other words, AI can be a bit of a Frankenstein.
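To make the proxy problem concrete, here is a toy sketch with entirely synthetic data and a hypothetical feature name ("zip_code_factor"). The model is never shown the protected attribute, but a correlated input carries much of the same signal, so approval rates still diverge by group:

```python
# Toy illustration of proxy discrimination: the protected attribute is
# excluded from the model, but a correlated feature lets the model
# reconstruct much of it anyway. All data below is synthetic.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

protected = rng.integers(0, 2, n)                      # never given to the model
zip_code_factor = protected + rng.normal(0, 0.5, n)    # proxy correlated with the protected class
income = rng.normal(50, 10, n)

# Historical approvals that (unfairly) tracked the protected class
approved = (income + 15 * protected + rng.normal(0, 5, n)) > 60

X = np.column_stack([income, zip_code_factor])         # protected attribute excluded
model = LogisticRegression().fit(X, approved)

# Predicted approval rates still diverge by group, via the proxy feature
preds = model.predict(X)
print("approval rate, group 0:", preds[protected == 0].mean())
print("approval rate, group 1:", preds[protected == 1].mean())
```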

Which brings me to the totally buried lede--complaints about gender discrimination on the new Apple-Goldman Sachs credit card. I have no view on the merits, but I just want to toss out an idea:  I assume that as part of the Apple-Goldman deal, Apple is sharing customer data with Goldman for marketing and underwriting purposes. Exactly what they are doing I do not know.  But imagine that Goldman is figuring out the terms of credit based on a consumer's iTunes Store or App Store purchase history. That's basically what Alibaba does in China in the absence of a credit reporting system.  This would mean that your purchase history determines your creditworthiness.  Rented Mary Poppins?  There's more than a tuppence of credit coming your way.  Rented Movie 43? Credit denied. Bought the Guarneri Quartet's recording of the complete Beethoven string quartets?  Perhaps a different result than if you bought that new Kanye West single. There are likely all kinds of correlations between entertainment purchase histories and FICO scores.  But they are also likely often correlated with protected classes.  If purchase history data is used for underwriting, it could easily result in a mess, much like the Grafton Street cut-through.
