by Mike Pavlichko

As we continue to investigate the Born Power Index – a new criterion that accounts for 60% of the playoff seeding formula this year – one pattern has emerged:  blowouts appear to be a factor in the ratings for each team, with no cutoff applied.

Bill Born – creator of the index – has said teams are “cut off” at a certain point, though it’s unknown what that point is.

One area athletic director who was involved in the process of choosing a new criterion to use along with power points – as well as NJSIAA Assistant Director Jack DuBois – has said there is a cutoff.  The athletic director suggested the cutoff is 35 points, the margin that triggers a second-half running clock in New Jersey high school football.  However, neither of them has seen the formula for the Born Power Index.  The formula remains known only to Bill Born, who has been running the index for nearly 50 years.

But a close analysis of games has shown otherwise.  It appears blowouts count for the full amount of a margin of victory.  Winning by 50 is better than winning by 40.  That’s a formula that encourages blowouts.

We’ve reached out to DuBois and Born for comment.  We’ll let you know when we hear back.

To understand why, one must understand how the Born Power Index works.  We believe we have found the formula by picking up on a pattern, but there remain a few games that don’t fit that pattern.

The Basics of the Born Power Index

After the Week Zero games, we found a couple key points:

  • The amount one team goes up or down is equal to the amount its opponent moves in the opposite direction.  If Edison plays East Brunswick, and Edison goes down 2.5 points, East Brunswick will go up 2.5 points.  Upon analyzing the numbers, this is a certainty.
  • The amount of increase or decrease in the Born Power Index for each team is relative to the difference between the “spread” – the difference between the two teams’ BPIs – and the margin of victory.  If Edison is rated 10 points higher than East Brunswick, Edison is expected to beat East Brunswick by 10 points.
    • If Edison wins by 10, they covered the “spread” and neither team’s index would change.
    • If they beat East Brunswick by 20, they beat the spread by 10, and they would go up, while East Brunswick would go down.
    • If they beat East Brunswick by 5, they lost to the spread by 5, and they would go down, while East Brunswick would go up.

The question we had was:  how much does a team go up or down?

A simple graphing of all the winners in Week Zero shows that for every 4 points a team beats the spread by, it goes up 1 point in the Born Power Index.

Beat the spread by 10?  You go up 2.5, while your opponent goes down 2.5.  That pattern fit all but 3 of the approximately 80 games in Week Zero.
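
For readers who want to check the math themselves, here is a minimal sketch of that rule in Python.  The function name and structure are our own shorthand for the pattern we found – it is our reconstruction, not Born’s actual formula.

    def predicted_bpi_change(favorite_bpi, underdog_bpi, favorite_margin):
        # Our reconstruction of the observed pattern, not Born's actual formula.
        # favorite_margin is the favorite's margin of victory (negative if it lost).
        spread = favorite_bpi - underdog_bpi        # how much better the favorite is rated
        beat_spread_by = favorite_margin - spread   # positive = covered, negative = fell short
        # One BPI point for every 4 points relative to the spread; the favorite moves
        # by this amount and the opponent moves the same amount the other way.
        return beat_spread_by / 4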

In Week One, we looked at just the GMC and Somerset County teams in action.  Applying the same formula, we correctly predicted 24 of 24 games, with one of those games 0.1 points off, well within the margin of error due to how the numbers are rounded.

Week One Examples:

Carteret at JFK
Carteret’s BPI was 58.6.  JFK was 40.  The “spread” was 18.6, in favor of Carteret.
Carteret won 28-27.  The margin of victory was 1, and they lost to the spread by 17.6.
As we said, you go up or down 1 point for every 4 points by which the result differs from the spread.  So, divide 17.6 by 4.  You get 4.4.
Carteret lost to the spread, and their number went down to 54.2.
JFK beat the spread, and they went up to 44.4.
That math gets the same result the Born Power Index did.
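
Run through our sketch above, the same game looks like this (our own check, using the published numbers):

    # Carteret (58.6) was the favorite over JFK (40.0) and won 28-27.
    change = predicted_bpi_change(58.6, 40.0, 28 - 27)   # (1 - 18.6) / 4 = -4.4
    print(round(58.6 + change, 1))   # 54.2 -- Carteret's new BPI
    print(round(40.0 - change, 1))   # 44.4 -- JFK's new BPI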

Let’s take another one:

JP Stevens at Perth Amboy
This one was an upset.
JP Stevens’ BPI was 46.7.  Perth Amboy was 40.7.  The “spread” was 6 in favor of JP Stevens.
But Perth Amboy won 27-13.  The margin of victory was 14.  That’s a 20 point difference (should have lost by 6, won by 14).
Again, divide by 4, you get 5.
JP Stevens lost to the spread by 5, so they went down to 41.7.
Perth Amboy beat the spread, and they went up to 45.7.

This pattern worked for every GMC and Somerset County game.

Even for blowouts.

A blowout – East Brunswick at Piscataway
East Brunswick’s BPI was 53.5.  Piscataway’s BPI was 82.2.  The “spread” was 28.7 in favor of Piscataway.
Piscataway won 41-0.  That’s well over the suggested 35-point blowout cutoff.  That’s also 12.3 points over the spread.
Divide by 4 and you get 3.1.
Piscataway beat the spread, and went up to 85.3.
East Brunswick lost to the spread and went down to 50.4.
No blowout cutoff was applied to this game.
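
If a 35-point cap on the margin were actually in play, the math would come out differently.  Here is a quick comparison using our reconstruction – the cap itself is hypothetical, based on the figure the athletic director suggested:

    spread = 82.2 - 53.5                  # 28.7, Piscataway favored over East Brunswick
    uncapped = (41 - spread) / 4          # 12.3 / 4, about 3.1
    capped = (min(41, 35) - spread) / 4   # 6.3 / 4, about 1.6, with a hypothetical 35-point cap
    print(round(82.2 + uncapped, 1))      # 85.3 -- matches the published BPI
    print(round(82.2 + capped, 1))        # 83.8 -- what a 35-point cap would have produced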

The same happened in Bound Brook’s 42-0 win over Manville.
Bound Brook was a 19.1 point favorite.
They beat the spread by 22.9.  Divide by 4 to get 5.7.
Bound Brook indeed rose from 52.1 to 57.8.

Want to look at Week Zero?

How about Highland Park’s 41-0 loss to Montclair-Kimberley?
MKA was a 12.1 point favorite.
MKA beat the spread by 28.9.  Divide by 4 to get 7.2.
Sure enough, MKA rose 7.2.  Their BPI went from 16.7 to 23.9.

How about Friday night, in Week 2?

Somerville had an 87.2 BPI entering its game with North Plainfield, which had a 41.6 rating.  That’s a 45.6 point spread.
Somerville won 50-0, beating the spread by 4.4 points.
Dividing by 4, that’s a simple 1.1 points.
The BPI, though, had Somerville going up only .7 points.
Using the multiple of 4, that would have been equivalent to beating the spread by 2.8 points, which would have been a 48.4-0 win.  That seems like a random number to us, and too high, and pointless, to be a blowout cutoff.
Had the 35 point cutoff been used, Somerville actually would have gone down.  They would have lost to the spread by 10.6 points and gone down 2.65 points.
In this case, they went up, just a little less than we thought.
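
Working backwards from the published change shows what the BPI’s numbers imply.  This is our own arithmetic on the published figures:

    spread = 87.2 - 41.6                      # 45.6, Somerville favored over North Plainfield
    expected = (50 - spread) / 4              # 4.4 / 4 = 1.1, what the divide-by-4 rule predicts
    published = 0.7                           # the change the BPI actually showed
    implied_margin = spread + published * 4   # 48.4 -- the winning margin that would explain +0.7
    with_35_cap = (35 - spread) / 4           # -2.65 -- a hypothetical 35-point cap would mean a drop
    print(round(expected, 2), round(implied_margin, 1), round(with_35_cap, 2))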

Our conclusion:  The blowout factor is not in play.

This is the most important point, and another reason Bill Born should release the Born Power Index formula.  No limit on blowouts encourages running up the score.  Of course, coaches have been told blowouts don’t matter to the ratings.  That clearly is not the case.

Ultimately, there are a lot of inconsistencies in the BPI.  But Born has stated repeatedly that the formula never changes, that all games count, and that all teams are treated equally.

We disagree.  If that were the case, we wouldn’t have had a situation like this week, where only 6 of the 20 games involving area teams were within the margin of error.

But we also ran the numbers on 21 games from the Shore Conference – albeit after the fact – using the same formula.  The computer does all the calculations.  We saw 18 of the 21 games come within the margin of error.

Born made a major error (perhaps a typo) when South Brunswick’s rating came out today.  He had the Vikings down to 65.4, a 34.8 point drop, while Piscataway – the team they lost to – rose from 85.3 to 90.1, a difference of 4.8.  After we pointed it out to him, he corrected it.  But we only noticed it after closely examining the numbers and finding a pattern, in addition to it being way off the mark.  Would anyone have noticed if he was only a point off?

Even when corrected, the Piscataway-South Brunswick game didn’t fit the pattern of dividing the margin over or under the spread by 4.  Neither did the Woodbridge-Sayreville game, which had almost the same margin over/under the spread:  28.8 for Sayreville-Woodbridge, 28.9 for Piscataway-South Brunswick.  Interestingly, in both games the change worked out to the margin divided by 6, not 4.

This doesn’t make sense if Born maintains the formula is the same for all teams, because in some strange quirk of fate, a third game had a 28.8 margin relative to the spread:  Old Bridge’s win over Perth Amboy, which was closer than the BPI would have predicted.  In this one, the pattern fit.  The margin was divided by 4, and Old Bridge dropped by 7.2, while Perth Amboy rose.  (Born’s theory here is that Perth Amboy performed better than expected; they beat the spread.)  Three games with essentially the same margin over or under the spread, and two were handled differently from the third.
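
The divide-by-6 observation is just that – an observation from the published changes, not anything Born has confirmed.  The quick check:

    # Piscataway-South Brunswick: Piscataway beat the spread by 28.9 and rose 4.8
    print(round(28.9 / 4, 1), round(28.9 / 6, 1))   # 7.2 vs. 4.8 -- only dividing by 6 matches
    # Old Bridge-Perth Amboy: Old Bridge came in 28.8 under the spread and dropped 7.2
    print(round(28.8 / 4, 1), round(28.8 / 6, 1))   # 7.2 vs. 4.8 -- here dividing by 4 matches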

The only connection between the Sayreville-Woodbridge game and the Piscataway-South Brunswick game is that Sayreville and South Brunswick were both playing their third game.  Neither Old Bridge nor JP Stevens was.  The other games had one team playing its second game and one team playing its third.

But what about Colonia?  They played their third game this week, beating JFK, which was playing its second game.  We were within the margin of error, 0.3 points off.  Same with Bishop Ahr, which played its third game this week, losing to Carteret.  We were only 0.1 points off on that game.  Again, well within the margin of error for rounding.

End result:  too many inconsistencies.  And if no one but Born himself knows the formula, how can teams report discrepancies?

The NJSIAA considered other rankings besides the Born Power Index.  One was from the website MaxPreps.  But word is MaxPreps wouldn’t reveal its formula, and it encouraged blowouts, so the committee that looked at changes to the playoffs decided not to use it.

Sure sounds like the Born Power Index to us.