Return to Form: January ’22 Metagame Update

We are now firmly into the new year, so it's time for another metagame update. This one is arriving a little later than normal, but that's what happens when the month begins on a Tuesday: I just don't have time to gather and process the data and then complete the write-up in 24 hours. And it's not like I'm missing anything major in the world of Magic by talking about this instead of some new development.

Modern has been in an odd state since last July. On the one hand, there has been considerable innovation, churn, and general change within Modern since the release of Modern Horizons 2. On the other, it has remained remarkably static: Hammer Time has held firmly onto the #1 spot in the metagame, with the same few decks hanging around Tier 1 as well. Will the new year yield a new metagame? Let's dive in and see!

January Metagame

To make the tier list, a given deck has to beat the overall average population for the month. The average is my estimate for how many results a given deck "should" produce on MTGO. Being a tiered deck requires being better than "good enough"; in January the average population was 6.88, setting the Tier 3 cutoff at 7 decks, which is back where it's been for most of the past year.

Tier 3 begins with decks posting 7 results. We then go one standard deviation above average to set the upper limit of Tier 3 and the cutoff for Tier 2. The standard deviation was 12.52, which means Tier 3 runs to 20 results. Again, each tier runs from its starting point up to its cutoff, and the next tier begins one result above that. The standard deviation was lower this month, though still on the high end of normal. Therefore Tier 2 starts at 21 results and runs to 34, and making Tier 1 requires at least 35 results, which, again, is on the high end of normal for post-MH2 Modern.
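For readers who want the arithmetic spelled out, here is a minimal sketch of how those cutoffs fall out of the month's numbers. The function, the choice of population standard deviation, and the rounding are my own illustration of the figures above, not the actual script I run.

```python
import math
import statistics

def tier_cutoffs(results_per_deck):
    """Derive the month's tier boundaries from per-archetype result counts."""
    average = sum(results_per_deck) / len(results_per_deck)  # January: 502 / 73 = 6.88
    stdev = statistics.pstdev(results_per_deck)              # January: roughly 12.52
    tier3_min = math.ceil(average)                           # beat "good enough": 7 results
    tier3_max = round(tier3_min + stdev)                     # one stdev up: 20 results
    tier2_min = tier3_max + 1                                # Tier 2 starts at 21
    tier2_max = round(tier2_min + stdev)                     # and runs to 34
    tier1_min = tier2_max + 1                                # Tier 1 requires 35
    return tier3_min, tier2_min, tier1_min
```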

I should note that it will look like there are several outliers in January's data, but they are not in fact outliers; rather, an unusual number of singleton decks skewed the numbers. And even if those decks were outliers, removing them wouldn't change the tier list's composition, so they don't affect the conclusions.

The Tier Data

Total decks fell in January. This is not January's fault, however: Wizards failed to post the January 29 Challenge, so there's a minor gap in the data. This happens periodically; I'm guessing the auto-updater bugs out. There were also almost no non-Wizards events in January, and I have no idea why. As such, the data consists of 502 decks, down from December's 528. Had the missing Challenge been posted or had there been more non-Wizards events, January would have beaten December easily.

Unique decks were up, from December's 67 to 73 in January, just over a third of which were singletons. I've had updates with more singletons in total, but this is high percentage-wise. Not sure how that happened, but it is what it is. Consequently, the number of tiered decks is also up a bit, from December's low of 13 to 18, which is back in the normal range.

Deck Name | Total # | Total %
Tier 1
Grixis Shadow | 67 | 13.35
Hammer Time | 60 | 11.95
4-Color Blink | 44 | 8.76
Tier 2
UR Murktide | 31 | 6.17
Cascade Crashers | 29 | 5.78
Burn | 27 | 5.38
Jund Saga | 24 | 4.78
Tier 3
Amulet Titan | 16 | 3.19
Belcher | 14 | 2.79
4-Color Control | 13 | 2.59
UW Control | 12 | 2.39
Mill | 11 | 2.19
Yawgmoth | 11 | 2.19
Mono-Green Tron | 10 | 1.99
Ponza | 8 | 1.59
Blue Living End | 8 | 1.59
Dredge | 8 | 1.59
4-Color Bring to Light | 7 | 1.39

I hope my intro was taken as the foreshadowing it was, because Hammer Time has finally been dethroned... by 2017 boogeyman Grixis Shadow. The more things change, the more they stay the same. However, don't rejoice too loudly yet; I don't know if this is a real metagame shift or a hiccup. Grixis has a lot of nostalgia going for it, and that may be the reason it has dethroned Hammer Time after seven months.

Adjustments

Not to say that Grixis didn't do it on merit, mind. Dress Down is extremely powerful against Hammer Time and many other decks. To say nothing of old standby Kolaghan's Command's effectiveness. Add in discard's power against slow strategies and the fast pressure of Ragavan, Nimble Pilferer, and Grixis has game against the field.


All of which may be why I'm seeing a lot of change within the other top decks. Some Hammer Time lists are dropping Sanctifier en-Vec from their maindecks because it matches up so poorly against Dress Down. Instead, they're adding more equipment like Nettlecyst and Kaldra Compleat, which means giving up Lurrus of the Dream-Den. Others are branching out into more colors: red gives reach and Ragavan, black has discard and Dark Confidant, while blue gives Meddling Mage.


Meanwhile, 4-Color Omnath piles are continuing to evolve. The value blink version with Ephemerate was the most popular, but that was thanks to early-month dominance. By the end, the control version with extra counters and answers for Death's Shadow was the most played. I don't know if this will continue or was linked to a specific streamer's results. We'll all have to see.

Power Rankings

Tracking the metagame in terms of population is standard practice, but how do results actually factor in? Better decks should also have better results. In an effort to measure this, I use a power ranking system in addition to the prevalence list. By doing so, I measure the relative strengths of each deck within the metagame. The population method gives a deck that consistently just squeaks into the Top 32 the same weight as one that Top 8's. Using a power ranking rewards good results, moves the winningest decks to the top of the pile, and better reflects their metagame potential.

Points are awarded based on the size of the event. Preliminaries award points by record (1 for 3 wins, 2 for 4 wins, 3 for 5), and Challenges are scored 3 points for Top 8, 2 for Top 16, and 1 for Top 32. If I can find them, non-Wizards events are awarded points like Challenges or Preliminaries, depending on what the event in question reports and how it behaves. Super Qualifiers and similar higher-level events get an extra point, as do other events over 200 players, with a fifth point for going over 400 players. There were two 4-point events in October and no 5-pointers.
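As a rough sketch of how that scoring could be encoded (the function, parameter names, and the way the size bonuses stack are my own reading of the rules above, not the actual tracker):

```python
def result_points(event_type, wins=0, finish=99, players=0, premier=False):
    """Score a single result under the system described above (my own encoding)."""
    if event_type == "preliminary":
        return {3: 1, 4: 2, 5: 3}.get(wins, 0)  # 1/2/3 points for 3/4/5 wins
    # Challenge-style events: points by finishing position
    if finish <= 8:
        points = 3          # Top 8
    elif finish <= 16:
        points = 2          # Top 16
    elif finish <= 32:
        points = 1          # Top 32
    else:
        points = 0
    if points and (premier or players > 200):
        points += 1         # Super Qualifier / 200+ player bonus
    if points and players > 400:
        points += 1         # fifth point for 400+ players
    return points
```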

The Power Tiers

As with population, total points were down slightly in January, but would have easily beaten December had Wizards posted the missing Challenge. There were 872 total points in January compared to December's 889; a single Challenge adds 56 points, for the record. There were also a lot of very large Preliminaries in January, which pushed up the points despite the number of events being down overall.

The average points were 11.94, so 12 points makes Tier 3. The standard deviation was 21.95, which again is on the high end of normal. Add 22 to the starting point and Tier 3 runs to 34 points. Tier 2 starts at 35 points and runs to 57, and Tier 1 requires at least 58 points. The number of decks making the power tiers fell from 18 to 16, because Ponza and 4-Color Bring to Light just don't win events.

Deck Name | Total Points | Total %
Tier 1
Grixis Shadow | 119 | 13.65
Hammer Time | 99 | 11.35
4-Color Blink | 84 | 9.63
Tier 2
UR Murktide | 57 | 6.54
Cascade Crashers | 48 | 5.50
Burn | 44 | 5.05
Jund Saga | 37 | 4.24
Tier 3
Belcher | 28 | 3.21
UW Control | 27 | 3.10
Amulet Titan | 24 | 2.75
Mill | 24 | 2.75
4-Color Control | 22 | 2.52
Mono-Green Tron | 20 | 2.29
Yawgmoth | 16 | 1.83
Blue Living End | 13 | 1.49
Dredge | 13 | 1.49

It's unusual for the top tiers to stay exactly the same between population and power, but here we are. There's a lot of movement in Tier 3 and none in the other two, which makes sense: with fewer overall results, the effect of a few good placings is exaggerated. Belcher being at the top of Tier 3 is important because the deck is very well positioned, and I'm surprised it's not more popular online. In a metagame this fair and this warped around beating fair decks, you'd think the extremely unfair deck would be the right metagame call. I don't know why that isn't happening.

Average Power Rankings

Finally, we come to the average power rankings. These are found by taking the total points earned and dividing by total decks, which measures points per deck. I use this to weigh strength against popularity. Measuring deck strength is hard: there is no Wins-Above-Replacement metric for Magic, and I'm not certain one could be credibly devised. The game is too complex, and even then, power is very contextual. Using the power rankings certainly helps and serves to show how justified a deck's popularity is. However, more popular decks will still necessarily earn a lot of points. That tracks, but it also means that the top tier doesn't move much between population and power, and it obscures whether those decks really earned their position.


This is where the averaging comes in. Decks that earn a lot of points simply because they post a lot of results will do worse than decks that win more events, indicating which deck actually performs better. A higher average indicates lots of high finishes, while a low average results from mediocre performances and high population. Lower-tier decks typically do very well here, likely because their pilots are enthusiasts, so be careful about reading too much into the results. However, as a general rule, decks that place above the baseline average are overperforming and vice versa. How far above or below that average a deck sits determines how "justified" its position on the power tiers is. Decks well above the baseline are therefore undervalued, while decks well below it are very popular but aren't necessarily good.
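Concretely, the averaging is just points per result, compared against a format-wide baseline. A minimal sketch, with the function name and dictionary inputs being my own illustration:

```python
def average_power(points_by_deck, results_by_deck):
    """Points per result for each archetype, plus the format-wide baseline."""
    baseline = sum(points_by_deck.values()) / sum(results_by_deck.values())  # Jan: 872 / 502 = 1.74
    averages = {
        deck: round(points_by_deck[deck] / results_by_deck[deck], 2)
        for deck in points_by_deck
    }
    # Above the baseline: overperforming its popularity. Below: riding population.
    return averages, baseline

# e.g. Hammer Time: 99 points / 60 results = 1.65, below the 1.74 baseline
```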

The Real Story

When considering the average points, the key is looking at how far off a deck is from the Baseline stat (the overall average of points per deck). The closer a deck's performance is to the Baseline, the more likely it is to be performing at its "true" potential; a deck that is exactly average performs exactly as well as expected. The greater the deviation from average, the more a deck under- or over-performs. On the low end, the deck's placing was mainly due to population rather than power, which suggests it's overrated. A high-scoring deck is the opposite.

Deck Name | Average Points | Power Tier
UW Control | 2.25 | 3
Mill | 2.18 | 3
Belcher | 2.00 | 3
Mono-Green Tron | 2.00 | 3
4-Color Blink | 1.91 | 1
UR Murktide | 1.84 | 2
Grixis Shadow | 1.77 | 1
Baseline | 1.74 |
4-Color Control | 1.69 | 3
Hammer Time | 1.65 | 1
Cascade Crashers | 1.65 | 2
Burn | 1.63 | 2
Blue Living End | 1.62 | 3
Dredge | 1.62 | 3
Jund Saga | 1.54 | 2
Amulet Titan | 1.50 | 3
Yawgmoth | 1.45 | 3

Congratulations to 4-Color Blink, the best-performing high-tier deck and therefore the best deck of January 2022! It's interesting, though instructive, that Grixis Shadow was right at baseline, indicating it's performing as expected, while Hammer is well below. Not that Hammer's average points have been anything special for a while now. I'd also expect a deck undergoing a redesign to underperform: it isn't fully optimized anymore, so how is it going to dominate?

On that note, the baseline is quite high by the standards of the past few months. This is not an indication of a shift in the winds but rather a quirk from all the singletons. For some reason, they did very well in a number of Challenges and Super Qualifiers, and having multiple 3+ point finishers skewed the average higher.

Paper's Back

Normally, that would be the end of the metagame update. However, clearly that's not the case, and that's because I have a promise to keep. In the December update, I announced that paper would be returning to the data, and it has. You may rejoice!


Now cease rejoicing, because this will be underwhelming. There's not a lot of data to work with for paper. I can only work with results that get reported, and the only place that has such reports is MTGTop8.com. Over there, nearly every event is reported as just a Top 8, and sometimes even less data is given per event. The data is therefore extremely lacking in granularity and dynamism. Additionally, not every store running usable events is posting, so my results reflect a few major stores more than anything else; specifically, Hareruya is responsible for about a third of the data. Keep that in mind as I report the results, and if anyone knows of other sources, please let me know. I'm mildly desperate.

The Paper Metagame

There were 293 total paper decks posted in January, roughly three-fifths of the online total. As such, the data is less robust and reliable, but it is what it is, and an analyst must work with the available data. Given that almost all the results were just Top 8s (and many were less) and didn't include player counts or final records, I'm not doing a power metagame. I don't have enough information to assign points, so even if I did one, it wouldn't look sufficiently different from the population tiers to justify the effort.

However, this data is significant for backing up a long-held belief. Despite having significantly fewer results than online, the paper metagame consists of 66 unique decks, just 7 fewer than online. It has always been assumed that paper is more diverse than online, and this data backs up that assumption. We'll see if further results corroborate these and prove the belief true. That said, only 15 decks made the paper tiers; again, there are lots of singletons in the data set.

Deck Name | Total # | Total %
Tier 1
Grixis Shadow | 28 | 9.56
Burn | 21 | 7.17
Hammer Time | 20 | 6.83
4-Color Blink | 20 | 6.83
Tier 2
Amulet Titan | 15 | 5.12
UW Control | 15 | 5.12
UR Murktide | 14 | 4.78
Tier 3
Cascade Crashers | 11 | 3.75
Blue Living End | 10 | 3.41
Jund Saga | 7 | 2.39
Yawgmoth | 7 | 2.39
Mono-Green Tron | 6 | 2.05
Eldrazi Tron | 6 | 2.05
Ponza | 5 | 1.71
Humans | 5 | 1.71

Of course, for all that, paper Tier 1 looks extremely similar to online Tier 1. Burn being more popular in paper was a huge surprise; the assumption has always been that it's overrepresented online, but that appears to be untrue. In fact, the overall composition of these tiers is quite similar to the online tiers. I have no idea if this is normal or not.

Composite Metagame

It would logically follow to combine the two lists into one; that's what we did in the old days. However, that's not going to work. The math the old system used doesn't work without major paper events. Maybe that will change when Wizards gets Organized Play back together, but for now, I can't use it. I also can't simply merge the two lists, because the online results vastly outweigh paper and would obscure the "true" result. So for now, I'm just going to report each deck's average tier. This reflects the relative power of decks and the differences between paper and online play sufficiently well.

Deck Name | Online Population Tier | Paper Population Tier | Aggregate Tier
Tier 1
Grixis Shadow | 1 | 1 | 1
Hammer Time | 1 | 1 | 1
4-Color Blink | 1 | 1 | 1
Burn | 2 | 1 | 1.5
Tier 2
UR Murktide | 2 | 2 | 2
Cascade Crashers | 2 | 3 | 2.5
Jund Saga | 2 | 3 | 2.5
Amulet Titan | 3 | 2 | 2.5
UW Control | 3 | 2 | 2.5
Tier 3
Yawgmoth | 3 | 3 | 3
Mono-Green Tron | 3 | 3 | 3
Ponza | 3 | 3 | 3
Blue Living End | 3 | 3 | 3
Belcher | 3 | N/A | 3.5
4-Color Control | 3 | N/A | 3.5
Mill | 3 | N/A | 3.5
Dredge | 3 | N/A | 3.5
4-Color Bring to Light | 3 | N/A | 3.5
Eldrazi Tron | N/A | 3 | 3.5
Humans | N/A | 3 | 3.5

Unsurprisingly, the best decks from online are still solidly Tier 1 in aggregate. Burn moves up by half a tier, reflecting that it's #2 overall in paper but Tier 2 online. Decks that showed up on one list but not the other get an N/A, which is treated as a 4 for math purposes.
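For clarity, the aggregate number is just the straight average of the two population tiers, with a missing entry counted as a 4. A quick sketch of that rule (the names are mine):

```python
def aggregate_tier(online_tier=None, paper_tier=None, missing=4):
    """Average a deck's online and paper population tiers; an absent tier counts as 4."""
    online = online_tier if online_tier is not None else missing
    paper = paper_tier if paper_tier is not None else missing
    return (online + paper) / 2

# Burn: (2 + 1) / 2 = 1.5    Belcher: (3 + 4) / 2 = 3.5
```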

Just the Beginning

I've got a lot of bugs to work out with the paper results. I didn't really know what I'd have, and so failed to make solid plans prior to gathering the data. If anyone has advice, I'm all ears; until then, this is the metagame we faced in January, and may still be facing next time. We'll have to wait and see.
