I've been doing my own dual-meet rankings, but I didn't update them this week. Mostly that's because there wasn't much action; the big Texas Tech-Nebraska dual meet was canceled due to extreme weather, and besides that there were maybe one or two dual/tri/quad meets this week, none of them involving top-level programs.
The national coaches' association has done national computer rankings for several years, but those are nothing more than fancied-up projections of NCAA championship scoring. Once you get past the top ten or so teams, that's not a terribly reliable indicator of how good a team is. It just tells you if they've got any individual stars, which doesn't say anything about the quality (or lack thereof) of the rest of the team.
The new results-reporting system mandated by the NCAA has allowed the coaches' association to do something new, though. Now they are using the same formula to do regional rankings, which project what would happen if nine separate regional championships were held. The result is much less about one or two big stars and a lot more about overall quality. It's great for comparing teams within a region; rarely has a lower-ranked team beaten a higher-ranked team in a dual meet. Unfortunately, it doesn't give you any way of comparing teams from separate regions. Why? Well, for one reason, the regions are not balanced in the number of teams. It's a lot easier to run up a big score in the Mountain region (18 men's teams) than it is in the Mid-Atlantic region (31 men's teams). But even more importantly, the regions are not balanced in competitiveness. For example, the West region puts up far better marks than the Northeast region.
The solution is pretty easy, though. Total up the national ranking points for each region, and use that to pro-rate the regional points for each team. This takes both competitiveness and number of teams into account all at once. The results are more or less an accounting of how well a team would do at an average conference championship meet--which is the kind of meet that really matters for at least 95% of college teams. Here's the top 25 men's teams after such an adjustment, with their USTFCA national rankings in parentheses:
1. Texas A&M (1)
2. Arkansas (7)
3. Nebraska (6)
4. Georgia (22)
5. Texas (29)
6. Oregon (3)
7. Arizona (23)
8. Stanford (10)
9. Arizona St (8)
10. Baylor (12)
11. Tennessee (37)
12. LSU (4)
13. Minnesota (13)
14. Washington St (25)
15. Oklahoma (9)
16. Florida St (2)
17. Florida (5)
18. California (16)
19. Illinois (46)
20. TCU (71)
21. Washington (27)
22. Ohio State (41)
23. Iowa (104)
24. Indiana (11)
25. Louisville (45)
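For the curious, the pro-rating adjustment described above can be sketched in a few lines of code. All the numbers here are made-up for illustration--they are not the actual coaches' association data, and the region names are just placeholders:

```python
# Sketch of the adjustment: scale each team's regional ranking points by
# how strong its region is nationally, measured as the region's total
# national ranking points relative to the average region.
# All figures below are hypothetical, not real USTFCA data.

region_national_totals = {
    "West": 520.0,       # strong region (hypothetical total)
    "Northeast": 260.0,  # weak region (hypothetical total)
    "Mountain": 390.0,   # average region (hypothetical total)
}

average_total = sum(region_national_totals.values()) / len(region_national_totals)

def adjusted_points(regional_points: float, region: str) -> float:
    """Pro-rate a team's regional points by its region's national strength."""
    return regional_points * (region_national_totals[region] / average_total)

# 100 regional points in the strong West region count for more than
# 100 regional points in the weaker Northeast region.
print(adjusted_points(100.0, "West"))       # more than 100
print(adjusted_points(100.0, "Northeast"))  # less than 100
```

The same scaling handles both imbalances at once: a region with few teams or weak teams piles up fewer national points, so every score from that region gets marked down proportionally.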
There are some pretty big differences between the first and second numbers for most of these teams. Texas A&M is #1 no matter how you cut it (and their dual-meet thrashing of Texas becomes even more impressive now), but Florida State drops from #2 to #16. Iowa zooms from #104 to #23. The rumors of Arkansas' death are greatly exaggerated, and so on.
Another application of this analysis is to see who's favored to win each conference.
Major conferences
Big Ten: Minnesota
Big XII: Texas A&M
MPSF: Oregon
SEC: Arkansas
Big conferences
ACC: Florida
Big East: Louisville
Conference USA: Houston
Mountain West: TCU
WAC: Idaho
Mid-major conferences
Big Sky: Eastern Washington
Ivy League: Princeton
Mid-American: Akron
MEAC: Norfolk State
Missouri Valley: Southern Illinois
Southland: Stephen F. Austin
The rest
Atlantic 10: Charlotte
America East: Albany
Atlantic Sun: East Tennessee State
Big South: Liberty
Great West: Utah Valley
Horizon League: Wis-Milwaukee
MAAC: Manhattan
NEC: Monmouth
OVC: SE Missouri State
Patriot League: Navy
Sun Belt: Middle Tennessee
Summit League: Oral Roberts
Southern: Western Carolina
SWAC: Alabama State
It's early yet, and the rankings will change. I'll do an update just prior to the conference championships and see how well it does.
The oldest track & field blog on the internet