The “Interference” of Phil Jackson

By: Dr. Ikjyot Singh Kohli

So, I came across this article today by Matt Moore on CBSSports, who basically once again has taken to the web to bash the Triangle Offense. Of course, much of what he claims (like much of the Knicks media) is flat-out wrong, based on very primitive and simplistic analysis, and I will point this out below. Further, much of his article seems to be motivated by several comments Carmelo Anthony made recently expressing his dismay at Jeff Hornacek moving away from the “high-paced” offense that the Knicks were running before the All-Star break:

“I think everybody was trying to figure everything out, what was going to work, what wasn’t going to work,’’ Anthony said in the locker room at the former Delta Center. “Early in the season, we were winning games, went on a little winning streak we had. We were playing a certain way. We went away from that, started playing another way. Everybody was trying to figure out: Should we go back to the way we were playing, or try to do something different?’’

Anthony suggested he liked the Hornacek way.

“I thought earlier we were playing faster and more free-flow throughout the course of the game,’’ Anthony said. “We kind of slowed down, started settling it down. Not as fast. The pace slowed down for us — something we had to make an adjustment on the fly with limited practice time, in the course of a game. Once you get into the season, it’s hard to readjust a whole system.’’

First, it is well known that the Knicks have been implementing more of the triangle offense since the All-Star break. All-Star Weekend was February 17-19, 2017. The Knicks’ record before All-Star weekend was, amusingly, 23-34, which is 11 games below .500; this is nowhere mentioned in any of these articles, nor is it mentioned (realized?) by Carmelo.

Anyhow, the question is as follows: if Hornacek had been allowed to continue his non-triangle ways of pushing the ball at a higher pace (what Carmelo claims he liked), would the Knicks have made the playoffs? Probably not. I claim this based on a detailed machine-learning-based analysis of playoff-eligible teams that has been available for some time now. In fact, what is perhaps most important in that paper is the following classification tree, which determines whether or not a team is playoff-eligible:

[Figure: classification tree for NBA playoff eligibility]

So, these are the relevant factors in determining whether or not a team in a given season makes the playoffs. (Please see the paper linked above for details on the justification of these results.)
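
If you want to experiment with this kind of model yourself, here is a minimal sketch of fitting such a playoff-eligibility tree with scikit-learn. This is not the code behind the paper; the file name teamstats.csv, the column names, and the playoffs label are placeholders.

```python
# Sketch: fitting a playoff-eligibility classification tree with scikit-learn.
# "teamstats.csv", the column names, and the binary "playoffs" label are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

df = pd.read_csv("teamstats.csv")                 # one row per team-season
features = ["opp_ast_pg", "stl_pg", "tov_pg", "drb_pg", "opp_tov_pg"]   # assumed column names

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["playoffs"], test_size=0.25, random_state=0
)

tree = DecisionTreeClassifier(max_depth=4, random_state=0)
tree.fit(X_train, y_train)

print(export_text(tree, feature_names=features))  # inspect the learned splits/cutoffs
print("Test accuracy:", tree.score(X_test, y_test))
```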

Here are these predictor variables for the Knicks up to the All-Star break:

  1. Opponent Assists/Game: 22.44
  2. Steals/Game: 7.26
  3. TOV/Game: 13.53
  4. DRB/Game: 33.65
  5. Opp.TOV/Game: 12.46

Since Opp. TOV/Game = 12.46 < 13.16, the Knicks would actually be predicted to miss the NBA playoffs. In fact, assuming the current trends, the so-called “Hornacek trends”, were allowed to continue, one can compute the probability of the Knicks making the playoffs:

[Figure: probability density function of the Knicks’ opponent turnovers per game, used to compute the playoff probability]

From this probability density function, we can calculate that the probability of the Knicks making the playoffs was 36.84%. The classification tree also predicted that the Knicks would miss the playoffs. So, what Carmelo, Matt Moore, and the like are missing is the complete lack of pressure defense and, hence, the insufficient number of opponent turnovers per game. It is therefore completely incorrect to claim that the Knicks were somehow “destined for glory” under Hornacek’s way of doing things. This is exacerbated by the fact that the Knicks’ opponent AST/G before the All-Star break was already quite high at 22.44.
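
To see how a number like that 36.84% can be produced, here is a rough sketch of the tail-probability idea: fit a density to game-by-game opponent turnovers and integrate the mass above the tree’s 13.16 cutoff. The per-game values below are made up, and the sketch applies the idea to single-game values rather than to the season average, so treat it as an illustration only.

```python
# Sketch: estimate the probability that opponent TOV/G clears the tree's 13.16 cutoff,
# by fitting a density to game-by-game opponent turnovers (values below are made up).
import numpy as np
from scipy.stats import gaussian_kde

opp_tov = np.array([11, 14, 12, 13, 15, 10, 12, 14, 13, 11, 12, 16, 13, 12], dtype=float)  # hypothetical

kde = gaussian_kde(opp_tov)                       # kernel density estimate of opponent TOV/G
p_above_cutoff = kde.integrate_box_1d(13.16, np.inf)
print(f"Estimated probability of clearing 13.16 opponent TOV/G: {p_above_cutoff:.2%}")
```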

The question now is: how have the Knicks been doing since Phil Jackson’s supposed interference, and since supposedly implementing the triangle in a more complete sense? (On a side note, I still don’t think you can partially implement the triangle; it needs a proper off-season implementation, as it is a complete system.)

Interestingly enough, the Knicks’ opponent assists per game (which, according to the machine learning analysis, is the most relevant factor in determining whether a team makes the playoffs) from All-Star weekend to the present day is an impressive 20.642. By the classification tree above, this actually puts the Knicks safely in playoff territory, in the sense of being classified as a playoff team, but it is too little, too late.

The defense has actually improved significantly with respect to the key relevant statistic of opponent AST/G since the Knicks started to implement the triangle more completely. (Note that, as will be shown in a future article, DRTG and ORTG are largely useless statistics in determining a team’s playoff eligibility, another point completely missed in Moore’s article.)

The problem, again, is that it is too little, too late at this point. I would argue, based on this analysis, that Phil Jackson should actually have interfered earlier in the season. In fact, if the Knicks keep their opponent assists per game below 20.75 next season (which is now very likely, if current trends continue), the above machine learning analysis would predict them to make the playoffs.

Finally, I will just make this point: it is interesting to look at Phil Jackson teams that were not packed with dominant players. The saying, unfortunately, goes something like: “Phil Jackson’s success had nothing to do with the triangle; it was because he had Shaq/Kobe, Jordan/Pippen, etc.”

Well, let’s first look at the 1994-1995 Chicago Bulls, a team that did not have Michael Jordan, but ran the triangle offense completely. Per the relevant statistics above:

  1. Opp. AST/G = 20.9
  2. STL/G = 9.7
  3. AST/G = 24.0
  4. Opp. TOV/G = 18.1

These are remarkable defensive numbers, which support Phil’s idea that the triangle offense leads to good defense.

NCAA March Madness 2017 Predictions

By: Dr. Ikjyot Singh Kohli

Update (March 18, 2017): In a stunning upset, Wisconsin just beat Villanova. It is easy to see why this happened based on the factor relevance diagram below. To win games, Villanova has relied heavily on moving the ball, while Wisconsin has relied heavily on opposing assists! Wisconsin had a mere 5 assists in the whole game today; great defense by them.

[Figure: factor relevance diagram for Wisconsin and Villanova]

Original Article: March 16, 2017

So, I’m a bit late this year with these, but it’s only the first day of the tournament as I write this (teaching 2 courses in 1 semester tends to take up A LOT of one’s time!). Anyway, I tried to use machine learning methodologies, namely neural networks, to predict who is going to win the NCAA tournament this year.

To do this, I trained a neural network model on the last 17 seasons of NCAA regular-season team data.

The first thing that I found was the set of most relevant predictor variables for a team’s NCAA championship success:

  1. Free Throws Made : 99.99% relevance
  2. Opponent Assists : 55.86% relevance
  3. Opponent Field Goal Attempts : 31.44% relevance
  4. Free Throws Attempted : -83.13% relevance
  5. Opponent Field Goals Made: -69.2% relevance

It is interesting that the most important factor in deciding whether or not a team wins the NCAA tournament is actually free throw percentage. In other words, schools that have a knack for shooting a high free throw percentage seem to have the highest probability of winning the NCAA tournament. (Points 1 and 4 in the list above, taken together, translate to having a high free throw percentage: free throws made enter with a positive relevance, while free throws attempted enter with a negative one.) Obviously, with a neural network the relationship between these predictors and the output is not necessarily linear, so other factors could play a strong role as well.
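
As a hedged illustration of this kind of analysis (and not the model actually trained here), one could fit a small neural network to season-level team statistics and rank the predictors with permutation importance, which is a different relevance measure from the signed relevances quoted above. The file name ncaa_seasons.csv and the column names are placeholders.

```python
# Sketch: small neural network on NCAA team-season data, with permutation importance.
# "ncaa_seasons.csv", the column names, and the binary "champion" label are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.inspection import permutation_importance

df = pd.read_csv("ncaa_seasons.csv")
features = ["ftm", "fta", "opp_ast", "opp_fga", "opp_fgm"]   # assumed column names
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["champion"], test_size=0.25, random_state=0
)

model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)

# Rank predictors by how much shuffling each one degrades held-out accuracy.
imp = permutation_importance(model, X_test, y_test, n_repeats=30, random_state=0)
for name, score in sorted(zip(features, imp.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```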

The neural network structure used looked like this:

Now, for the results:

School Name    Probability of Winning Tournament

Villanova 0.9294916774
Gonzaga 0.8076801
Baylor 0.716319
Arizona 0.5516670309
Duke 0.005617711
Saint Mary’s 0.0048923492
Wichita St. 0.001208123
Purdue 0.001180955
SMU 0.0008327729
North Carolina 0.0006080225
UCLA 0.0003794108
S. Dakota St. 0.0003186754
Oregon 0.0002288606
Princeton 0.0002107522
Wisconsin 0.000206285
Northwestern 0.0001878604
Cincinnati 0.0001875887
Marquette 0.0001828106
Virginia 0.0001532999
Kent St. 0.0001353252
Miami 0.0001338989
Fla. Gulf Coast 0.0001308963
Vermont 0.0001288239
Notre Dame 0.0001278009
Minnesota 0.0001277032
New Mexico State 0.0001276369
USC 0.0001274456
Middle Tenn. 0.0001268802
Florida 0.0001265646
Texas Southern 0.0001265547
Xavier 0.0001264269
Vanderbilt 0.0001262982
Michigan 0.0001261976
East Tenn. St. 0.0001261878
Nevada 0.0001261331
Butler 0.0001260504
Louisville 0.0001260042
Troy 0.0001259668
Dayton 0.0001259567
Arkansas 0.0001259387
Michigan St. 0.0001259298
Oklahoma St. 0.0001259287
Winthrop 0.0001259213
Iona 0.0001259197
Jacksonville St. 0.0001259174
Creighton 0.0001259092
West Virginia 0.0001259032
North Carolina-Wilmington 0.0001259012
Northern Ky. 0.0001259000
Kansas 0.0001258950
Iowa St 0.0001258950
Bucknell 0.0001258945
Florida St 0.0001258939
Kentucky 0.0001258939
Virginia Tech 0.0001258938
Seton Hall 0.0001258937
Maryland 0.0001258936
North Dakota 0.0001258936
South Carolina 0.0001258935
Rhode Island 0.0001258934
Kansas St. 0.0001258933
Mount St. Mary’s 0.0001258932
VCU 0.0001258931
UC Davis 0.0001258929

This neural network model predicts that the team with the highest probability of winning the NCAA tournament this year is Villanova, with a 92.94% chance of winning, followed by Gonzaga with an 80.77% chance, Baylor with a 71.63% chance, and Arizona with a 55.16% chance.

Basketball Machine Learning Paper Updated 

I have now made a significant update to my applied machine learning paper on predicting patterns among NBA playoff and championship teams, which can be accessed here: arXiv Link.

The Trump Rally, Really?

Today, the Dow Jones Industrial Average (DJIA) surpassed the 20,000 mark for the first time in history. At the time of writing (12:31 PM on January 25), it sits at 20,058.29, so I am not sure whether it will close above 20,000 points. Nevertheless, a lot of people are crediting this milestone to Trump’s presidency, and I’m not so sure you can do that. First, the point must be made that it was really the Obama-era economic policies that set the stage for this. On January 20, 2009, when Obama was sworn in, the Dow closed at 7,949.09 points. On November 8, 2016, when Trump won the election, the Dow closed at 18,332.74 points. So, during the Obama administration, the Dow increased by approximately 130.63%. I just wanted to make that point.

Now, the question that I wanted to investigate was: would the Dow have closed past 20,000 points had Trump not been elected president? That is, assuming that the Obama administration’s policies and their effects on the Dow had been allowed to continue, would the Dow have surpassed 20,000 points?

For this, I looked at the DJIA data from January 20, 2009 (Obama’s first inauguration) to November 8, 2016 (Trump’s election). I calculated the daily returns and, using a kernel density method, found that they are approximately normally distributed:

[Figure: kernel density estimate of DJIA daily returns, January 20, 2009 to November 8, 2016]

Importantly, one can calculate that the mean daily return is \mu = 0.00045497596503813, while the volatility of the daily returns is \sigma = 0.0100872666938282. Indeed, the volatility of the DJIA’s daily returns was relatively high during this period. Finally, the DJIA closed at 18,332.74 points on election night, November 8, 2016, which was 53 business days ago.
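
For concreteness, here is a minimal sketch of that step, assuming a hypothetical CSV of DJIA closing prices with a Close column; the kernel density estimate can then be eyeballed against a normal fit.

```python
# Sketch: daily returns, their kernel density estimate, and the mean/volatility used below.
# "djia.csv" with a "Close" column is a hypothetical input file of DJIA closes,
# covering 2009-01-20 through 2016-11-08.
import numpy as np
import pandas as pd
from scipy.stats import gaussian_kde, norm

close = pd.read_csv("djia.csv")["Close"].to_numpy()
returns = close[1:] / close[:-1] - 1.0            # simple daily returns

mu, sigma = returns.mean(), returns.std(ddof=1)
print(f"mu = {mu:.6f}, sigma = {sigma:.6f}")

grid = np.linspace(returns.min(), returns.max(), 400)
kde_density = gaussian_kde(returns)(grid)         # nonparametric density of daily returns
normal_density = norm.pdf(grid, mu, sigma)        # normal fit, for an eyeball comparison
```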

The daily dynamics of the DJIA can be modelled by the standard geometric Brownian motion stochastic differential equation:

dS_t = \mu S_t \, dt + \sigma S_t \, dW_t,

where dW_t denotes a Wiener/Brownian motion process. For the simulations, this was discretized in the usual Euler form, S_t = S_{t-1} + \mu S_{t-1} \Delta t + \sigma S_{t-1} \Delta W_t, with \Delta t equal to one trading day. I ran 2,000,000 Monte Carlo simulations of the DJIA closing price 53 business days after November 8, 2016, that is, on January 25, 2017. The results of some of these simulations are shown below:

[Figure: a sample of the simulated DJIA closing-value paths]
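
Here is a hedged sketch of the Monte Carlo step under the Euler discretization above, using the \mu and \sigma quoted earlier. It runs far fewer paths than the 2,000,000 used for the post, purely to keep the illustration fast, and it also estimates the probability of finishing above 20,000.

```python
# Sketch: Monte Carlo simulation of the Euler-discretized GBM dynamics for the DJIA.
# Uses the mu/sigma quoted above; far fewer paths than 2,000,000, purely for speed.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.00045497596503813, 0.0100872666938282   # daily mean return and volatility
S0, n_days, n_paths = 18332.74023, 53, 100_000        # close on 2016-11-08; 53 business days

# S_t = S_{t-1} * (1 + mu + sigma * Z_t), with Z_t ~ N(0, 1) and dt = 1 trading day
shocks = 1.0 + mu + sigma * rng.standard_normal((n_paths, n_days))
S_T = S0 * shocks.prod(axis=1)                        # simulated closes on 2017-01-25

print(f"mean close: {S_T.mean():.2f} +/- {S_T.std(ddof=1):.2f}")
print(f"P(close > 20,000): {(S_T > 20_000).mean():.2%}")
```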

We concluded the following from our simulation. At the end of January 25, 2017, the DJIA was predicted to close at:

18778.51676 \pm 1380.42445

That is, the DJIA would be expected to close anywhere between 17,398.09 and 20,158.94 points. This range, albeit wide, is due to the high volatility of the DJIA’s daily returns, but, as you can see, it is perfectly feasible that the DJIA would have surpassed 20,000 points even if Trump had not been elected president.

Further, perhaps of more importance is the probability that the DJIA would close above 20,000 points on any given day during this period. We found the following:

[Figure: probability of the DJIA closing above 20,000 points, by trading day after the election]

One sees that there is an almost 20% (more precisely, 18.53%) probability that the DJIA would have closed above 20,000 points on January 25, 2017 had Trump not been elected president. Since, by all accounts, the DJIA exceeding 20,000 points is considered an extremely rare, historic event, a probability of almost 20% is actually quite significant, and it shows that it is quite likely that the Trump administration has little to do with the DJIA exceeding 20,000 points.

Although this simulation covered only the 53 business days from November 8, 2016, one can see that the probability of the DJIA exceeding 20,000 points at the close is monotonically increasing with every passing day. It is therefore quite reasonable to conclude that Trump being president has little to do with the DJIA exceeding 20,000 points; rather, one can really attribute it to the day-to-day volatility of the DJIA!

The Most Optimal Strategy for the Knicks

In a previous article, I showed how one could use data in combination with advanced probability techniques to determine the optimal shot/court positions for LeBron James. I decided to use this algorithm on the Knicks’ starting 5, and obtained the following joint probability density contour plots:

One sees that the Knicks’ offensive strategy is optimal if and only if players get shots as close to the basket as possible. If this is the case, the players have a high probability of making shots even when defenders are playing them tightly. This means that the Knicks would be best served by driving into the paint, posting up, and Porzingis NOT attempting a multitude of three-point shots.

By the way, a lot of people are convinced nowadays that someone like Porzingis attempting 3s is a sign of a good offense, as it is an optimal way to space the floor. I am not convinced of this. Spacing the floor geometrically translates into a multi-objective nonlinear optimization problem. In particular, let (x_i, y_i) represent the (x, y)-coordinates of a player on the floor. Spreading the floor means one must simultaneously maximize each off-diagonal element of the following matrix of pairwise player distances:

d_{ij} = \sqrt{(x_i - x_j)^2 + (y_i - y_j)^2}, \qquad 1 \leq i < j \leq 5,

subject to -14 \leq x_i \leq 14, 0 \leq y_i \leq 23.75. While a player attempting 3-point shots may be one way to address this problem, I am not convinced that it is a unique solution to this optimization problem. In fact, I am convinced that there are multiple solutions to this optimization problem.

The problem is slightly simpler once one realizes that the matrix above is symmetric, so that there are only 10 independent components.
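
As a sketch of what attacking this spacing problem might look like, one can replace the multi-objective formulation with the common scalarization of maximizing the minimum pairwise distance, subject to the box constraints above. This is one possible formulation among many, not the one used in the post.

```python
# Sketch: maximize the minimum pairwise distance among the 5 players,
# subject to -14 <= x_i <= 14 and 0 <= y_i <= 23.75.
# This scalarizes the multi-objective problem; it is one formulation among many.
import numpy as np
from scipy.optimize import differential_evolution

def neg_min_pairwise_distance(p):
    pts = p.reshape(5, 2)                                   # (x_i, y_i) for the five players
    dists = [np.linalg.norm(pts[i] - pts[j])
             for i in range(5) for j in range(i + 1, 5)]    # the 10 independent distances
    return -min(dists)                                      # negate: maximize the minimum distance

bounds = [(-14, 14), (0, 23.75)] * 5                        # box constraints per player
result = differential_evolution(neg_min_pairwise_distance, bounds, seed=0, tol=1e-8)

print("best minimum pairwise distance:", -result.fun)
print("player positions:\n", result.x.reshape(5, 2))
```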

Analyzing LeBron James’ Offensive Play

Where is LeBron James most effective on the court?

Based on 2015-2016 data from NBA.com, we obtained the following data, which tracks LeBron’s FG% as a function of defender distance:

[Figure: LeBron’s FG% by defender distance (NBA.com)]

From Basketball-Reference.com, we then obtained data on LeBron’s FG% as a function of his shot distance from the basket:

[Figure: LeBron’s FG% by shot distance from the basket (Basketball-Reference.com)]

Based on these data, we generated tens of thousands of sample data points and performed a Monte Carlo simulation to obtain the relevant probability density functions. We found that the joint PDF was a very lengthy expression(!):

[Equation: the lengthy closed-form expression for the joint PDF]

Graphically, this is:

[Figure: surface plot of the joint PDF]

A contour plot of the joint PDF was computed to be:

[Figure: contour plot of the joint PDF]

From this information, we can compute where LeBron has the highest probability of making a shot. Numerically, we found that the maximum probability occurs when LeBron’s defender is 0.829988 feet away and LeBron is 1.59378 feet from the basket. What is interesting is that this analysis shows that defending LeBron tightly doesn’t seem to be an effective strategy when his shot distance is within 5 feet of the basket; it is only effective farther than 5 feet from the basket. Therefore, opposing teams have the best chance of stopping LeBron from scoring by playing him tightly and forcing him as far away from the basket as possible.
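
For anyone who wants to reproduce this kind of calculation, here is a hedged sketch of estimating a joint density over (defender distance, shot distance) and locating its peak. The sampled arrays below are placeholders; the actual samples in this analysis were generated from the two FG% tables above.

```python
# Sketch: estimate a joint PDF over (defender distance, shot distance) and locate its peak.
# def_dist and shot_dist stand in for the Monte Carlo samples generated from the FG% tables;
# the gamma draws below are placeholders, not LeBron's actual distributions.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
def_dist = rng.gamma(shape=2.0, scale=1.5, size=20_000)    # hypothetical defender distances (ft)
shot_dist = rng.gamma(shape=1.5, scale=3.0, size=20_000)   # hypothetical shot distances (ft)

kde = gaussian_kde(np.vstack([def_dist, shot_dist]))       # 2-D kernel density estimate

dd, sd = np.meshgrid(np.linspace(0, 10, 200), np.linspace(0, 30, 200))
density = kde(np.vstack([dd.ravel(), sd.ravel()])).reshape(dd.shape)

i, j = np.unravel_index(density.argmax(), density.shape)
print(f"density peaks near defender distance {dd[i, j]:.2f} ft, shot distance {sd[i, j]:.2f} ft")
```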

 

The Relationship Between The Electoral College and Popular Vote

An interesting machine learning problem: Can one figure out the relationship between the popular vote margin, voter turnout, and the percentage of electoral college votes a candidate wins? Going back to the election of John Quincy Adams, the raw data looks like this:

Candidate    Party    Popular Vote Margin    Turnout    Percentage of EC

John Quincy Adams D.-R. -0.1044 0.27 0.3218
Andrew Jackson Dem. 0.1225 0.58 0.68
Andrew Jackson Dem. 0.1781 0.55 0.7657
Martin Van Buren Dem. 0.14 0.58 0.5782
William Henry Harrison Whig 0.0605 0.80 0.7959
James Polk Dem. 0.0145 0.79 0.6182
Zachary Taylor Whig 0.0479 0.73 0.5621
Franklin Pierce Dem. 0.0695 0.70 0.8581
James Buchanan Dem. 0.12 0.79 0.5878
Abraham Lincoln Rep. 0.1013 0.81 0.5941
Abraham Lincoln Rep. 0.1008 0.74 0.9099
Ulysses Grant Rep. 0.0532 0.78 0.7279
Ulysses Grant Rep. 0.12 0.71 0.8195
Rutherford Hayes Rep. -0.03 0.82 0.5014
James Garfield Rep. 0.0009 0.79 0.5799
Grover Cleveland Dem. 0.0057 0.78 0.5461
Benjamin Harrison Rep. -0.0083 0.79 0.58
Grover Cleveland Dem. 0.0301 0.75 0.6239
William McKinley Rep. 0.0431 0.79 0.6063
William McKinley Rep. 0.0612 0.73 0.6532
Theodore Roosevelt Rep. 0.1883 0.65 0.7059
William Taft Rep. 0.0853 0.65 0.6646
Woodrow Wilson Dem. 0.1444 0.59 0.8192
Woodrow Wilson Dem. 0.0312 0.62 0.5217
Warren Harding Rep. 0.2617 0.49 0.7608
Calvin Coolidge Rep. 0.2522 0.49 0.7194
Herbert Hoover Rep. 0.1741 0.57 0.8362
Franklin Roosevelt Dem. 0.1776 0.57 0.8889
Franklin Roosevelt Dem. 0.2426 0.61 0.9849
Franklin Roosevelt Dem. 0.0996 0.63 0.8456
Franklin Roosevelt Dem. 0.08 0.56 0.8136
Harry Truman Dem. 0.0448 0.53 0.5706
Dwight Eisenhower Rep. 0.1085 0.63 0.8324
Dwight Eisenhower Rep. 0.15 0.61 0.8606
John Kennedy Dem. 0.0017 0.6277 0.5642
Lyndon Johnson Dem. 0.2258 0.6192 0.9033
Richard Nixon Rep. 0.01 0.6084 0.5595
Richard Nixon Rep. 0.2315 0.5521 0.9665
Jimmy Carter Dem. 0.0206 0.5355 0.55
Ronald Reagan Rep. 0.0974 0.5256 0.9089
Ronald Reagan Rep. 0.1821 0.5311 0.9758
George H. W. Bush Rep. 0.0772 0.5015 0.7918
Bill Clinton Dem. 0.0556 0.5523 0.6877
Bill Clinton Dem. 0.0851 0.4908 0.7045
George W. Bush Rep. -0.0051 0.51 0.5037
George W. Bush Rep. 0.0246 0.5527 0.5316
Barack Obama Dem. 0.0727 0.5823 0.6784
Barack Obama Dem. 0.0386 0.5487 0.6171

Clearly, the percentage of electoral college votes a candidate wins depends nonlinearly on the voter turnout percentage and the popular vote margin, as this non-parametric regression shows:

[Figure: non-parametric regression of electoral college vote share against popular vote margin and voter turnout]

We therefore chose to perform a nonlinear regression using neural networks, for which our structure was:

[Figure: neural network structure used for the regression (one hidden layer)]

As it turns out, this simple neural network structure with one hidden layer gave the lowest test error, which was 0.002496419 in this case.
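
For anyone who wants to try something similar, here is a minimal sketch of a one-hidden-layer network regressing electoral college share on popular vote margin and turnout, followed by predictions over a grid of turnout values at a fixed 6.1% margin. The data file elections.csv and its columns are placeholders, and this is not the exact model used above.

```python
# Sketch: one-hidden-layer neural network regressing EC vote share on
# popular-vote margin and turnout. "elections.csv" and its columns are hypothetical.
import numpy as np
import pandas as pd
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor

df = pd.read_csv("elections.csv")                       # columns: margin, turnout, ec_share
X = df[["margin", "turnout"]].to_numpy()
y = df["ec_share"].to_numpy()

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(3,), max_iter=20_000, random_state=0),
)
model.fit(X, y)

# Predicted EC share for a 6.1% popular-vote margin over a range of turnouts.
turnouts = np.arange(0.30, 0.80, 0.05)
grid = np.column_stack([np.full_like(turnouts, 0.061), turnouts])
for t, pred in zip(turnouts, model.predict(grid)):
    print(f"turnout {t:.2f} -> predicted EC share {pred:.4f}")
```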

Now, looking at the most recent national polls for the upcoming election, we see that Hillary Clinton has a 6.1% lead in the popular vote. Our neural network model then predicts the following:

Simulation Popular Vote Margin Percentage of Voter Turnout Predicted Percentage of Electoral College Votes (+/- 0.04996417)
1 0.061 0.30 0.6607371
2 0.061 0.35 0.6647464
3 0.061 0.40 0.6687115
4 0.061 0.45 0.6726314
5 0.061 0.50 0.6765048
6 0.061 0.55 0.6803307
7 0.061 0.60 0.6841083
8 0.061 0.65 0.6878366
9 0.061 0.70 0.6915149
10 0.061 0.75 0.6951424

One sees that even for an extremely low voter turnout (30%), at this point Hillary Clinton could expect to win somewhere between roughly 61.08% and 71.07% of the electoral college votes, or about 328 to 382 electoral votes. Therefore, what seems like a relatively small lead in the popular vote (6.1%) translates, according to this neural network model, into a large margin of victory in the electoral college.

One can see that the predicted percentage of electoral college votes depends strongly on both the popular vote margin and voter turnout. For example, if we reduce the popular vote margin to 1%, the results are less promising for the leading candidate:

Pop. Vote Margin    Voter Turnout    E.C. % Win    E.C. % Win (Worst Case)    E.C. % Win (Best Case)
0.01 0.30 0.5182854 0.4675000 0.5690708
0.01 0.35 0.5244157 0.4736303 0.5752011
0.01 0.40 0.5305820 0.4797967 0.5813674
0.01 0.45 0.5367790 0.4859937 0.5875644
0.01 0.50 0.5430013 0.4922160 0.5937867
0.01 0.55 0.5492434 0.4984580 0.6000287
0.01 0.60 0.5554995 0.5047141 0.6062849
0.01 0.65 0.5617642 0.5109788 0.6125496
0.01 0.70 0.5680317 0.5172463 0.6188171
0.01 0.75 0.5742963 0.5235109 0.6250817

One sees that if the popular vote margin is just 1% for the leading candidate, that candidate is not in the clear unless voter turnout reaches about 60%.