As the calendar turns over to a new month, I prepare the metagame update for your perusal. And this April's update is rather unusual. It is the only look at a complete month without Lurrus of the Dream-Den that we're going to get before Streets of New Capenna arrives to shake up Modern. Yes, SNC was legal for the last two days of April, but I didn't see any new cards in the decks posted those days. In addition, there are some more statistical anomalies in the data.
Another Set of Outliers
So, just like March's data, April has outliers. Yes, plural. In defiance of my expectations, UR Murktide outstripped the rest of the field by such a significant margin that I didn't actually bother to run the usual statistical verification. I did have to verify that Hammer Time was an outlier, as it appeared to be; it was just over the line. Interestingly, removing Murktide from the calculations didn't affect them meaningfully, but removing Hammer Time did impact the data substantially, confirming the outlier effect. Specifically, Hammer was covering up Murktide's effect. Both decks' data is still reported and they're in their correct places on the tiers, but they didn't impact the overall analysis.
These outliers were only present in the MTGO data. The paper results had no outliers, though the top decks were really close to the line. This is an odd split in the results, and I'm not sure how it happened. Differences in player preferences are the most logical answer, but I can't prove it. The observed performance of the decks between paper and online isn't really suggestive either. It is what it is, as unsatisfying as I find it right now.
April Population Metagame
To make the tier list, a given deck has to beat the overall average population for the month. The average is my estimate for how many results a given deck “should” produce on MTGO. Being a tiered deck requires being better than “good enough.” Every deck that posts at least the average number of results is "good enough" and makes the tier list. Then we go one standard deviation (STDev) above average to set the limit of Tier 3 and the cutoff for Tier 2. This mathematically defines Tier 3 as those decks clustered near the average. Tier 2 runs from the cutoff to the next standard deviation; these are decks which perform well above average. Tier 1 consists of those decks at least two standard deviations above the mean result, encompassing the truly exceptional performers.
The MTGO Tier Data
In April the average population was 5.92, setting the Tier 3 cutoff at 6 decks, which is still below the usual average, as March was. If this keeps up I'll have to redefine the average cutoff. Tier 3 therefore begins with decks posting 6 results. The STDev was 7.60, which means Tier 3 runs to 14 results. Again, it's the starting point plus a standard deviation to the cutoff, then the next whole number for the next tier. Therefore Tier 2 starts with 15 results and runs to 23. Subsequently, to make Tier 1, 24 results are required. This is closer to average than March's data.
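For readers who want to follow the arithmetic, here's a minimal sketch of the cutoff calculation in Python. The function name and the exact rounding rule (round the average up to a whole number of results, then add one standard deviation per tier, rounding up again) are my own framing of the steps described above; note that in one case below I've rounded by judgment rather than formula.

```python
import math

def tier_cutoffs(avg, stdev):
    """Tier boundaries from a month's average and standard deviation.

    Tier 3 starts at the first whole number of results at or above the
    average; each tier spans one standard deviation, with the endpoint
    rounded up to the next whole result. (Occasionally I round the
    starting point down by judgment when the average is barely over a
    whole number, so treat this as the usual case, not an iron law.)
    """
    t3_start = math.ceil(avg)
    t3_end = math.ceil(t3_start + stdev)
    t2_start = t3_end + 1
    t2_end = math.ceil(t2_start + stdev)
    t1_start = t2_end + 1
    return t3_start, t3_end, t2_start, t2_end, t1_start
```

Running it on April's MTGO numbers, `tier_cutoffs(5.92, 7.60)` gives back the boundaries above: Tier 3 from 6 to 14, Tier 2 from 15 to 23, Tier 1 at 24.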
After the decline in March thanks to the banning, April's data is back in line with pre-ban numbers. January had 502 decks, February had 436, and March only hit 356, but April is up to 437 total decks on MTGO. It's quite the recovery after the drop-off, and it really should have been higher, but a number of events weren't reported. There were also fewer non-Wizards results to add into the data. That said, the total number of decks making the tier list is the same as March at 16, out of 64 total unique decks. The outliers soaking up all the results are to blame.
|Deck Name|Total #|Total %|
|---|---|---|
The gap between the two outliers and the rest of the data is truly enormous. It is also a bit deceptive. I separate the 4-Color Omnath Money Pile decks into Blink and Control variants based on whether they play Ephemerate. It really does change gameplay enough to warrant the distinction. If I didn't do that, the combined Omnath deck would have bridged the gap between Hammer and Cascade Crashers and might have removed Hammer as an outlier. Which has implications for the overall metagame.
The Paper Tier Data
The paper tiers are calculated the same way as the MTGO tiers, just with different data. While more paper events are represented in the data, they rarely report more than the Top 8 (sometimes less). However, that doesn't mean that the overall population is lower. Indeed, paper Modern is far more popular than online, and the data reflects this fact. There were 641 decks in the data, representing 92 unique decks. Anyone who says the metagame is narrow is blind. I initially hypothesized that paper should have more results and decks than MTGO, and it's starting to look like I was right.
Paper's average was 6.97 decks, meaning the starting point is 7 decks. It was unexpected for it to be higher than online, but maybe that's normal. The STDev is 10.89, so Tier 3 runs from 7 to 18 decks. Tier 2 begins with 19 decks and runs to 30, and Tier 1 requires 31 decks. It will take most of the year to know whether these numbers are indicative of what paper Modern "should" look like. 23 decks made the paper population tier, and it's looking like paper's tier list should always be larger than online's.
|Deck Name|Total #|Total %|
|---|---|---|
|Izzet Breach Combo|8|1.25|
Cascade Crashers and Murktide blew the other decks away, but not by quite enough to be outliers. Had I removed Crashers, Murktide would likely have become an outlier, but as they're not outliers when taken together, I didn't remove either.
Worth noting in paper: had I combined the Omnath decks here it would have been the top deck. And possibly shifted the stats enough for it and Crashers to become outliers. Maybe Murktide too, but that's far less likely.
April Power Rankings
Tracking the metagame in terms of population is standard practice. But how do results actually factor in? Better decks should also have better results. In an effort to measure this, I use a power ranking system in addition to the prevalence list. By doing so, I measure the relative strengths of each deck within the metagame. The population method gives a deck that consistently just squeaks into Top 32 the same weight as one that Top 8’s. Using a power ranking rewards good results and moves the winningest decks to the top of the pile and better reflects their metagame potential.
The MTGO Power Tier
For the MTGO data, points are awarded based on the population of the event. Preliminaries award points for record (1 for 3 wins, 2 for 4 wins, 3 for 5) and Challenges are scored 3 points for Top 8, 2 for Top 16, 1 for Top 32. If I can find them, non-Wizards events will be awarded points the same as Challenges or Preliminaries depending on what the event in question reports/behaves like. Super Qualifiers and similar higher-level events get an extra point and so do other events if they’re over 200 players, with a fifth point for going over 400 players. There was only one 4 point event in April and no 5 pointers.
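As a rough illustration, the MTGO scoring rules could be coded up like this. This is a sketch of my reading of the rules above; the function names and the `premier` flag for Super Qualifier-level events are hypothetical labels, not anything official.

```python
def prelim_points(wins):
    """Preliminaries score by record: 3 wins = 1, 4 wins = 2, 5 wins = 3."""
    return {3: 1, 4: 2, 5: 3}.get(wins, 0)

def challenge_points(finish, players=0, premier=False):
    """Challenges score by finish: Top 8 = 3, Top 16 = 2, Top 32 = 1.

    Higher-level events (e.g. Super Qualifiers) and events with over 200
    players earn a fourth point; going over 400 players adds a fifth.
    """
    if finish <= 8:
        pts = 3
    elif finish <= 16:
        pts = 2
    elif finish <= 32:
        pts = 1
    else:
        return 0
    if premier or players > 200:
        pts += 1  # fourth point
    if players > 400:
        pts += 1  # fifth point
    return pts
```

So a Top 8 finish in a 450-player event is worth the maximum of 5 points, which, as noted, didn't happen in April.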
As with the population numbers, points in April were up from March, from 668 to 729. I didn't have to omit an entire week this time, so it makes sense for the numbers to be up. Still not all the way up to February's numbers, but there were some missing Preliminaries and no non-Wizards events.
The average points were 8.85, therefore 9 points makes Tier 3. The STDev was 13.16, which is relatively normal. Thus, add 14 to the starting point and Tier 3 runs to 23 points. Tier 2 starts with 24 points and runs to 38. Tier 1 requires at least 39 points. There was a lot of adjustment of positions inside the tiers this month, but surprisingly, there was no change in which decks made the list from population to power tier.
|Deck Name|Total Points|Total %|
|---|---|---|
The only deck to move between tiers is Mono-Green Tron moving up to Tier 2. Turns out that in a world filled with control decks the old kryptonite has a lot of play.
The Paper Power Tiers
Unlike with population, the paper power data works differently than the equivalent MTGO data. Again, the data is usually limited to Top 8 lists, even for big events. Not that I know how big most events are; that doesn't always get reported. In other cases, decks are missing. SCG Con Indianapolis had a Modern 5K and numerous smaller events, but decks were missing from the Top 32, and the smaller events reported anywhere from 5 to 21 decks for no obvious reason. Applying the MTGO point system just doesn't work when I don't know how many points to award.
Thus, I award points based on the size of the tournament rather than placement. That way I'm being internally consistent with the paper results. When there's a Modern Pro Tour again it would qualify for 3 points, as would Grand Prix or whatever the GP equivalent will be. Star City Modern Opens also award 3 points. SCG 5k-10k and similar events award 2 points. Side events are evaluated based on the number of players and type of event. The purely local events get 1 point. There were a number of events awarding 2 points in April, but only two 3 point events.
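In code form, the paper scoring reduces to a lookup by event class rather than placement. The category keys below are my own shorthand for the event types described above, not official designations; side events would need case-by-case judgment before being slotted into a category.

```python
# Points by event class rather than placement. The keys are my own
# hypothetical labels for the event tiers described in the text.
PAPER_EVENT_POINTS = {
    "pro_tour": 3,     # a Modern Pro Tour, when there is one again
    "grand_prix": 3,   # or whatever the GP equivalent ends up being
    "scg_open": 3,     # Star City Modern Opens
    "scg_5k_10k": 2,   # SCG 5K-10K and similar events
    "local": 1,        # purely local events
}

def paper_points(event_class):
    """Every reported deck from an event earns that event's point value.

    Unclassified side events default to 1 point; in practice I evaluate
    them by player count and event type before assigning a category.
    """
    return PAPER_EVENT_POINTS.get(event_class, 1)
```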
The average points were 9.02. That's close enough to 9 that I rounded down; it just felt too pedantic to place the cutoff at 10. The STDev was 14.84, thus add 15 to the starting point and Tier 3 runs to 24 points. Tier 2 starts with 25 points and runs to 40. Tier 1 requires at least 41 points. There was a lot less movement between the tiers compared to previous months, but Merfolk did fall off Tier 3 and nothing replaced it.
|Deck Name|Total #|Total %|
|---|---|---|
|Izzet Breach Combo|11|1.32|
As they were the main large events in April, the Star City events had a disproportionate effect on the power tiers. Burn saw negligible play in Dallas, which contributed to its fall to Tier 2. I don't think that Amulet Titan could have been Tier 1 except for its enduring popularity with SCG players.
Average Power Rankings
Finally, we come to the average power rankings. These are found by taking the total points earned and dividing them by total decks, which measures points per deck. I use this to measure strength vs. popularity. Measuring deck strength is hard. There is no Wins-Above-Replacement metric for Magic, and I'm not certain that one could be credibly devised. The game is too complex, and even then, power is very contextual. Using the power rankings certainly helps and serves to show how justified a deck's popularity is. However, more popular decks will still necessarily earn a lot of points. Which tracks, but it also means that the top tier doesn't move much between population and power, obscuring whether those decks really earned their position.
This is where the averaging comes in. Decks that earn a lot of points because they get a lot of results will do worse than decks that win more events, indicating which deck actually performs better. A higher average indicates lots of high finishes, while a low average results from mediocre performances and high population. Lower-tier decks typically do very well here, likely because their pilots are enthusiasts, so be careful about reading too much into the results. However, as a general rule, decks which place above the baseline average are overperforming and vice versa. How far above or below that average a deck sits determines how "justified" its position on the power tiers is. Decks well above baseline are therefore undervalued, while decks well below baseline are very popular but aren't necessarily good.
The Real Story
When considering the average points, the key is looking at how far-off a deck is from the Baseline stat (the overall average of points/population). The closer a deck’s performance to the Baseline, the more likely it is to be performing close to its “true” potential. A deck that is exactly average would therefore perform exactly as well as expected. The greater the deviation from average, the more a deck under- or over-performs. On the low end, a deck’s placing was mainly due to population rather than power, which suggests it’s overrated. A high-scoring deck is the opposite.
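The comparison boils down to a couple of divisions. A minimal sketch, using April's actual MTGO totals (437 decks, 729 points) for the baseline and a made-up deck for illustration:

```python
def average_points(total_points, total_decks):
    """A deck's points per result."""
    return total_points / total_decks

def baseline(total_points_all, total_decks_all):
    """The whole field's points per result, used as the yardstick."""
    return total_points_all / total_decks_all

# Hypothetical deck: 40 points from 10 results, measured against a
# field of 729 total points across 437 decks (April's MTGO totals).
deck_avg = average_points(40, 10)       # 4.0 points per result
field_avg = baseline(729, 437)          # roughly 1.67
overperforming = deck_avg > field_avg   # well above baseline
```

A deck that far above the baseline would be undervalued by the population numbers; one sitting well below it is riding popularity rather than performance.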
I'll begin with the average for MTGO:
|Deck Name|Average Points|Power Tier|
|---|---|---|
Congratulations to BGx Yawgmoth for being the highest placing Tier 1 deck! It didn't show up in many events overall, but enjoyed an oversized appearance and win rate in the Challenges. That's all that's necessary to win Deck of the Month (Online).
Onto the paper averages:
|Deck Name|Average Points|Power Tier|
|---|---|---|
|Izzet Breach Combo|1.38|3|
While Yawgmoth is the best placing deck overall, the top Tier 1 deck is Amulet Titan, and thus it wins Deck of the Month (Paper). However, I'm going to asterisk that award, and remind readers that the SCG tournaments have featured a disproportionate amount of Amulet stretching back at least 3 years. It has a very dedicated, regional fan club.
That's a lot of data, but what does it all mean? When Modern Nexus first started, we had a statistical method to combine the MTGO and paper data, but the math of that system doesn't work without the big paper events. I tried. So, I'm using an averaging system to combine the data. I take the MTGO results and average the tier, then separately average the paper results, then average the paper and MTGO results for the final placement.
This generates a lot of partial Tiers. That's not a bug; it's a feature. The nuance separates the solidly Tiered decks from the more flexible ones and shows the true relative power differences between the decks. Every deck in the paper and MTGO results is on the table, and when they don't appear in a given category they're marked N/A. This is treated as a 4 for averaging purposes.
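The averaging scheme is simple enough to show directly. A sketch, with `None` standing in for N/A (treated as tier 4, as noted):

```python
def overall_tier(paper_pop, paper_pow, mtgo_pop, mtgo_pow, na_value=4):
    """Average the paper tiers, average the MTGO tiers, then average
    those two results for the final placement. N/A counts as tier 4."""
    tiers = [t if t is not None else na_value
             for t in (paper_pop, paper_pow, mtgo_pop, mtgo_pow)]
    paper_avg = (tiers[0] + tiers[1]) / 2
    mtgo_avg = (tiers[2] + tiers[3]) / 2
    return (paper_avg + mtgo_avg) / 2
```

For example, a deck that is Tier 3 in both paper categories but absent online averages paper at 3, MTGO at 4, and lands at (3 + 4) / 2 = 3.5 overall.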
|Deck Name|Paper Population Tier|Paper Power Tier|Average Paper Tier|MTGO Population Tier|MTGO Power Tier|Average MTGO Tier|Overall Tier|
|---|---|---|---|---|---|---|---|
|Izzet Breach Combo|3|3|3|N/A|N/A|N/A|3.5|
For the second month in a row, Cascade Crashers and UR Murktide are the only purely Tier 1 decks in Modern. Well done them. Given the continuing flux in the data and the outliers observed on MTGO, that is something to keep an eye on.
Stormy Waters Ahead?
With the apparent settling on top of the metagame by UR Murktide and the arrival of a new set, I expect Modern to have quite a bit of churn in May. Whether that will amount to a drastic change in the metagame I have no idea. I hope there is one, as the indications right now point toward unhealthy metagame share.