A poorer-than-average, but not entirely out-of-the-ordinary, performance this week (given expected fluctuations): 32/54 = 59.3%. A slightly poorer performance might be expected in a week like this one, where a blockbuster deal completely changes part of the league: apparently the Pistons are now unbeatable sans Josh Smith.
Another good week for the NBA predictor: correct predictions for 39/53 = 73.6% of the games played. This is an excellent result considering that our algorithm cross-validated poorly against this week's results from last year. Also, the 76ers won a game – there goes one of our sure-thing predictions.
Predictions for next week are up. We are also working on a nice dynamic visualization for our NBA content – coming soon!
Our NBA predictor did considerably better this week. In fact, it did better than we had even hoped: an accuracy of 40/51 = 78.4%, making up for last week's lull.
Week 4 predictions are now up. This week set a high bar, and our cross-validation indicates that the coming week may be harder to predict, but we are still optimistic.
This was an odd week in the NBA – plenty of upsets to go around. You're right, we're making excuses: our NBA predictor had a tough week. We had a total accuracy of 27/50 = 54%. The summary by point spread is at right. But seriously, who would have predicted back-to-back Laker victories over Atlanta and Houston (aside from die-hard Kobe fans)? We certainly didn't.
For the coming week we have refined our algorithm in two ways. First, we now incorporate a team's away performance into its predicted home performance, and vice versa. Second, we now select our model's parameters with a more thorough and systematic cross-validation method. Predictions are now up!
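For the curious, the parameter-selection loop looks roughly like this. This is a toy sketch, not our actual model: the rating-difference rule and the `home_edge` parameter are stand-ins, and a real model would be refit on the non-held-out folds inside the loop.

```python
def predict(game, home_edge):
    """Predict a home win when home rating plus a home-court bonus
    exceeds the away rating (a stand-in for the real model)."""
    home_rating, away_rating, _ = game
    return home_rating + home_edge > away_rating

def accuracy(games, home_edge):
    return sum(predict(g, home_edge) == g[2] for g in games) / len(games)

def grid_search_cv(games, grid, k=5):
    """Pick the parameter value with the best mean held-out accuracy
    across k folds of the training games."""
    folds = [games[i::k] for i in range(k)]
    scores = {}
    for edge in grid:
        fold_accs = [accuracy(folds[i], edge) for i in range(k)]
        scores[edge] = sum(fold_accs) / k
    return max(scores, key=scores.get)
```

Here each game is a `(home_rating, away_rating, home_won)` triple, and `home_edge` plays the role of a tunable model parameter.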
The first week of our NBA game-outcome prediction experiment is in the books! We had a prediction accuracy of 32/51 (≈ 63%). A summary of the results, broken down by game point spread, is given at right. The point spreads shown are from the actual games, and each accuracy value is the fraction of correct predictions among the games whose point spread falls in that row's range.
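Computing such a summary is straightforward. Here is a short sketch; the bucket boundaries below are illustrative, not the exact ranges used in our table:

```python
def accuracy_by_spread(results, buckets=((0, 5), (6, 10), (11, 15), (16, 50))):
    """Group (point_spread, prediction_correct) pairs into spread
    buckets and report the fraction correct within each bucket."""
    summary = {}
    for lo, hi in buckets:
        in_bucket = [correct for spread, correct in results if lo <= spread <= hi]
        if in_bucket:  # skip empty buckets rather than divide by zero
            summary[(lo, hi)] = sum(in_bucket) / len(in_bucket)
    return summary
```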
Our accuracy this past week was significantly lower than what we achieved on the 2013-2014 historical data. Training on the first 800 games of that season, we achieved approximately 70% accuracy on the remaining 430 games after incorporating momentum into the prediction. This past week, our model had only 70 games from the current season to train on, relying on last year's data to supplement the training. For this reason, we feel 63% is actually pretty reasonable for our first week. In fact, we had a few very good days late in the week — and, of course, we are particularly proud to have correctly predicted this.
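One plausible way to encode the momentum signal mentioned above is each team's rolling win fraction over its recent games. This is purely illustrative (the window size and the exact definition are guesses, not our model's actual feature):

```python
from collections import defaultdict, deque

def momentum_features(games, window=10):
    """For each (home, away, home_won) game, in chronological order,
    return each team's win fraction over its last `window` games
    (0.5 when a team has no history yet)."""
    recent = defaultdict(lambda: deque(maxlen=window))
    feats = []
    for home, away, home_won in games:
        def frac(team):
            wins = recent[team]
            return sum(wins) / len(wins) if wins else 0.5
        feats.append((frac(home), frac(away)))
        # update histories only after emitting the feature,
        # so a game never sees its own outcome
        recent[home].append(home_won)
        recent[away].append(not home_won)
    return feats
```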
Our predictions for the next week are now up as well. This week we have additionally incorporated fatigue into the model, keeping track of the number of games each team has played in the five days preceding a game. When applied to the prior season's data, this feature boosted our accuracy by about 2%.
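The fatigue bookkeeping amounts to a windowed count over each team's schedule. A minimal sketch, assuming a hypothetical `(date, home, away)` schedule layout:

```python
from collections import defaultdict
from datetime import date, timedelta

def fatigue_counts(schedule, window_days=5):
    """For each (date, home, away) game, count the games each team
    played in the preceding `window_days` days."""
    played = defaultdict(list)  # team -> dates of that team's past games
    counts = []
    for game_date, home, away in sorted(schedule):
        cutoff = game_date - timedelta(days=window_days)
        n_home = sum(cutoff <= d < game_date for d in played[home])
        n_away = sum(cutoff <= d < game_date for d in played[away])
        counts.append((n_home, n_away))
        played[home].append(game_date)
        played[away].append(game_date)
    return counts
```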