Computer Simulation to Understand Arena Results - Page 2

Forum Index > Hearthstone General
MarcoBrei
Profile Joined May 2011
Brazil, 66 Posts
March 25 2014 17:18 GMT
#21
On March 25 2014 21:44 obesechicken13 wrote:
On March 25 2014 21:28 MarcoBrei wrote:
On March 25 2014 21:14 obesechicken13 wrote:
I think the actual distributions may be slightly different for arena runs.

http://www.arenamastery.com/sitewide.php
Arena mastery shows some great players so their sitewide stats sit around 4 wins average, but they have a lot of 12s compared to your simulation. This could be because some players legitimately have a greater than 50% chance of winning a certain match.

The average # of wins in a run should be around 3 though. 2.9-3.1 ish.


Arena Mastery doesn't have all the information, right? Just the runs people submit, if I'm correct. If so, many results, especially the bad ones, may be missing, which skews the metric.
About the "50% chance": let me clarify that my simulation does not have this rule. Players win games based on their skill, and skill varies across the population following a normal distribution.

I agree with the first point.

How did you assign skill levels to players? Did you give each a point rating and use that to determine their probability of winning? What did the distribution of skill levels look like? Did you use something like Elo distributions for chess or LoL?


Skill is a number; players have skills ranging from 0 to 10. As expected with a normal distribution, there are many more people with skill around 5 than around 0 or 10, and the falloff is gradual. I know the distribution I made is not perfectly realistic, but I think it is good enough for this test.
When two players face each other, the one with higher skill usually wins, but I introduced some RNG so that the lower-skilled player still has a chance. Say player X with skill 6.5 faces player Y with skill 7.0. The skill considered in the match is something like this:
X: a random number from 5.85 to 7.15
Y: a random number from 6.3 to 7.7
Player Y is more likely to win, but player X still has a chance.
If the players' skills are too far apart, the better player's victory is certain.
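The match rule described above can be sketched in a few lines of Python. This is a minimal illustrative sketch, not MarcoBrei's actual code; the ±10% window width is inferred from the 6.5 → 5.85–7.15 example:

```python
import random

def match_winner(skill_x, skill_y, noise=0.10):
    """Decide a match as described above: each player's effective skill is
    drawn uniformly from a +/-10% window around their true skill. The 10%
    width is inferred from the 6.5 -> 5.85..7.15 example, not stated code."""
    eff_x = random.uniform(skill_x * (1 - noise), skill_x * (1 + noise))
    eff_y = random.uniform(skill_y * (1 - noise), skill_y * (1 + noise))
    return "X" if eff_x > eff_y else "Y"

# Skill 6.5 vs 7.0: Y is favoured, but X wins roughly a fifth of the time.
rate = sum(match_winner(6.5, 7.0) == "X" for _ in range(100_000)) / 100_000
print(rate)
```

Note that with this rule the upset chance depends on the ratio of skills: once the ±10% windows stop overlapping (e.g. 1.0 vs 10.0), the better player always wins, matching the "too far apart" remark.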


mikeymoo
Profile Blog Joined October 2006
Canada, 7170 Posts
March 25 2014 17:33 GMT
#22
On March 26 2014 02:18 MarcoBrei wrote:
Skill is a number. Players have skills varying from 0 to 10. [...] The player Y has more chance to win, but player X still have his chance.

I have issues with this methodology, but you've inspired me to construct my own simulation.
o_x | Ow. | 1003 ESPORTS dollars | If you have any questions about bans please PM Kennigit
MarcoBrei
Profile Joined May 2011
Brazil, 66 Posts
March 25 2014 17:42 GMT
#23
On March 26 2014 02:33 mikeymoo wrote:
I have issues with this methodology, but you've inspired me to construct my own simulation.

Even if my simulation has some flaws, it inspired someone to try similar work, so that's a good thing!
figq
Profile Blog Joined May 2010
12519 Posts
Last Edited: 2014-03-27 09:58:48
March 25 2014 18:16 GMT
#24
Thank you. You did something I was considering doing, but was too busy to do. Basically, I suspected that the way certain people (who are good at arena) advertise arena may be a bit over the top. Like poker, and essentially most things that depend on experience, this looked a bit like a pyramid scheme: they always claim your best value is in arena, but if we actually take *all* players in arena and average out their performance, it turns out that the "bank" (in our case Blizzard) wins more from us than what "we" (the average over all players) would win from constructed. So the argument still stands: you can keep playing arena (do the pyramid work) until you get good enough to actually pull better gold from it than you would from constructed. Individually, for those who are good or become good, there's great value in arena, the best value. But as a whole, I suspected the entirety of players does not benefit from it, and your results pretty much confirm it. Of course, the difference is very small, and arena is fun, so it's still okay. EDIT: also, of course, there's the 100 gold limit in constructed, which becomes an issue if you grind a lot.
If you stand next to my head, you can hear the ocean. - Day[9]
obesechicken13
Profile Blog Joined July 2008
United States, 10467 Posts
Last Edited: 2014-03-25 21:27:30
March 25 2014 21:27 GMT
#25
I don't think anyone's linked it yet, but OP, maybe you should take a look at this other simulation:
http://www.liquidhearth.com/forum/hearthstone/393-monte-carlo-simulation-of-hearthstone-ranking
I think in our modern age technology has evolved to become more addictive. The things that don't give us pleasure aren't used as much. Work was never meant to be fun, but doing it makes us happier in the long run.
MarcoBrei
Profile Joined May 2011
Brazil, 66 Posts
Last Edited: 2014-03-26 00:46:32
March 26 2014 00:45 GMT
#26
Original post updated:

UPDATE - from relevant inputs:

General questions about the program:
+ Show Spoiler +

"general questions about the program"
How did you make the normal distribution of skill?
How did you make the matchmaking?
How did you decide the winner?
How did you make the RNG?


I really don't want to discuss the program's details, simply because it's tiresome. There is no famous algorithm behind this program; I wrote every step of it myself.
Some overview:
Skill is a number: the greater, the better.
Normal distribution: more players have a skill around 5, fewer have a skill near 10 or 0, and it increases and decreases gradually.
Matchmaking: I don't know how Blizzard does it; I just pick a random player and try to find another with the same (or as close as possible) "win balance" (wins − losses). So a player at 7-0 will play against a 9-2 rather than a 9-0.
The winner is decided by comparing skills, but with some RNG so that the lower-skilled player still has a chance: each player's effective skill for the match is drawn from a window around their true skill (the same ±10% rule described in my earlier reply). If the skills are too far apart, the better player's victory is certain.

Important update: I ran more simulations varying the amount of RNG and found that it does not make much of a difference. I also changed the normal skill distribution and still got similar results. Matchmaking, on the other hand, is much more relevant and strongly affects the results.
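The matchmaking rule described above (pair on closest win balance) can be sketched as follows. This is an illustrative reconstruction; pick_opponent is a hypothetical helper, not code from the actual program:

```python
def pick_opponent(player, pool):
    """Matchmaking as described: choose the opponent whose win balance
    (wins - losses) is closest to the player's. Records are (wins, losses)
    tuples; pick_opponent is an illustrative helper, not the real code."""
    balance = player[0] - player[1]
    return min(pool, key=lambda p: abs((p[0] - p[1]) - balance))

# A 7-0 player is paired with a 9-2 (balance 7) rather than a 9-0 (balance 9).
print(pick_opponent((7, 0), [(9, 2), (9, 0), (3, 1)]))  # prints (9, 2)
```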



Comparing this result with...
+ Show Spoiler +

On March 25 2014 21:14 obesechicken13 wrote:
I think the actual distributions may be slightly different for arena runs.
http://www.arenamastery.com/sitewide.php

I don't think arenamastery has all the necessary data, just the runs people submit. If so, many results, especially the bad ones, may be missing, which ruins the metric.
The only way to confirm the results would be to compare them with real data from Blizzard, and I would love to do that.

I know this simulation is not perfect, but I think it can give us a clue about what happens. Some things could be done differently, but so far I don't think anything is "wrong enough" to invalidate the results. That is, we can't conclude that exactly 77.96% of players reach at most 4 victories, but we can be fairly confident the real number is neither 30% nor 90%.


Some aspects the simulation does not consider:
+ Show Spoiler +

On March 25 2014 17:23 RenSC2 wrote:
One further consideration is that each "player" in the simulation is not actually an individual, but instead a single run of the arena. A person who averages 3 wins or less in a run will likely only be able to play in the arena once every couple days. A person in that 4-6 range will be able to run arena about 1-2 times per day. People who average 7+ get to run the arena as much as they want. Some of the really top notch arena people are full time streamers/players and so do a very large number of runs every day and account for many runs in the 7+ club.

So the actual % of people who will average 7+ is less than what the simulation will tell you because a larger portion of those runs are being filled by the same people. Meanwhile, the less than 3 club will actually be a much larger % of people since those people don't get to make nearly as many runs as even the 4-6 people.

Essentially, averaging 5 wins or more per run actually puts you in better standing than the top 22% that the initial post calculated.

The only caveat is that real $ infusions are probably highest in the less than 3 club and those people could be balancing things out a bit.

Still, we live on a forum where being "only" in Masters in Starcraft makes you a "bad" player. Masters was initially supposed to be the top 2% of the active playerbase. We're a very elitist community.



Other simulation
+ Show Spoiler +

On March 26 2014 06:27 obesechicken13 wrote:
I don't think anyone's linked it yet, but OP maybe you should take a read at this other simulation:
http://www.liquidhearth.com/forum/hearthstone/393-monte-carlo-simulation-of-hearthstone-ranking

That one simulates the ladder. It's interesting, but I wanted to try Arena, which I think is more interesting.


Inspiration
+ Show Spoiler +

On March 26 2014 02:33 mikeymoo wrote:
I have issues with this methodology, but you've inspired me to construct my own simulation.


On March 25 2014 11:02 Came Norrection wrote:
You just made me want to write my own simulation.


Feel free to try your own tests! When Blizzard reveals the real numbers, we may see who was more accurate.


C[h]ili
Profile Joined December 2011
Germany, 167 Posts
April 12 2014 17:49 GMT
#27
This is a well done analysis. Thank you very much for the effort you have put into this, and for sharing the results with us. I have two comments:

1. Please share the computer code you have used for your simulation. This allows us to more deeply understand what you have done.

2. My guess would be that the implied distribution of players across "number-of-wins" categories (0-3, 1-3, and so on) depends to some degree on the assumed probability distribution for "skill". If you have time, I would be interested in seeing the implied distribution under different assumptions about "skill".
BenJamesBen
Profile Joined May 2014
0 Posts
Last Edited: 2014-05-09 12:34:34
May 09 2014 03:53 GMT
#28
I was inspired by MarcoBrei's work to write my own simulation. There are some differences between those numbers and mine, but they mostly agree:
  • 50% of players end up with 2 wins or less. Only 50% have 3+ wins. (Compare to MarcoBrei's 51.57%/48.43%)
  • 65.6% end up with 3 wins or less. Only 34.4% have 4+ wins. (MarcoBrei: 66.91%/33.09%)
  • Only about 9% have 7+ wins. (MarcoBrei: 9.01%)
  • I did not see significantly different results when match outcomes were determined randomly (50/50 chance to win) vs. determined by skill (higher skilled player always wins).
Data:
+ Show Spoiler +
0-3: 12.500% (12.500% 0-0 wins, 87.500% 1+ wins)
1-3: 18.750% (31.250% 0-1 wins, 68.750% 2+ wins)
2-3: 18.750% (50.001% 0-2 wins, 49.999% 3+ wins)
3-3: 15.625% (65.626% 0-3 wins, 34.374% 4+ wins)
4-3: 11.719% (77.345% 0-4 wins, 22.655% 5+ wins)
5-3: 8.203% (85.548% 0-5 wins, 14.452% 6+ wins)
6-3: 5.469% (91.017% 0-6 wins, 8.983% 7+ wins)
7-3: 3.516% (94.532% 0-7 wins, 5.468% 8+ wins)
8-3: 2.197% (96.729% 0-8 wins, 3.271% 9+ wins)
9-3: 1.343% (98.072% 0-9 wins, 1.928% 10+ wins)
10-3: 0.806% (98.877% 0-10 wins, 1.123% 11+ wins)
11-3: 0.476% (99.353% 0-11 wins, 0.647% 12+ wins)
12-2: 0.476% (99.829% less or equal, 0.171% higher)
12-1: 0.146% (99.976% less or equal, 0.024% higher)
12-0: 0.024% (100.000% less or equal, 0.000% higher)


I believe that my simulation differs in that it uses "perfect matchmaking", without any randomness. For example, an 0-2 player will only ever get matched with another 0-2 player, and the result is that there will always be one of those players ending with an 0-3 run.
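Under that assumption the whole simulation reduces to halving buckets, since each match between two identical records produces exactly one winner and one loser. A small sketch (an illustrative reconstruction, not BenJamesBen's code) can propagate record fractions instead of simulating individual players:

```python
# With "perfect matchmaking" every match pairs two identical records, so each
# bucket of runs at record (w, l) simply splits 50/50 every round, regardless
# of skill. This sketch propagates bucket fractions instead of individuals.
def record_distribution():
    buckets = {(0, 0): 1.0}   # fraction of runs currently at each (wins, losses)
    finished = {}
    while buckets:
        nxt = {}
        for (w, l), frac in buckets.items():
            for w2, l2 in ((w + 1, l), (w, l + 1)):  # win or lose, 50/50
                target = finished if (w2 == 12 or l2 == 3) else nxt
                target[(w2, l2)] = target.get((w2, l2), 0.0) + frac / 2
        buckets = nxt
    return finished

dist = record_distribution()
print(round(dist[(0, 3)] * 100, 3))  # 12.5 -- matches the 0-3 row above
```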

I've also played around with different player distributions. For example, one assumes that higher-skilled players are overrepresented in Arena (because they find it profitable), that lower-skilled players avoid Arena (buying packs is more efficient for them), and that there's a crop of new, unskilled players playing their free Arena run. The other distribution types I've tried are normal (bell curve) and equal. So far, these different distributions don't seem to affect the results much in the long run, given perfect matchmaking.

I agree that the matchmaking algorithm is the most important factor in the distribution of arena win rates.

I will make my computer source code available for people to view and run. I'm currently looking into where best to host the code. Thanks, MarcoBrei for inspiring this effort.

Edit: added the source code and sample run output.
Amui
Profile Blog Joined August 2010
Canada, 10567 Posts
Last Edited: 2014-05-09 21:32:27
May 09 2014 21:26 GMT
#29
Interesting simulation.

I do agree with the analysis that the people who can consistently hit 7+ wins represent a significantly lower percentage of players, though. There is a small subset of users who can average 7+, and since every eliminated player takes 3 losses, a 7-win run hands out 7 of them: each such run takes out 2 1/3 players on average. Not to mention that they'll earn enough gold to go right back and do it again.

While John Pub might win 2-3 games most of the time, they won't get enough gold for more than one arena run a day, in comparison to a higher level player who might do 5 or more, averaging more than double the wins.

I also think getting to 12 wins requires an element of luck in addition to skill. Some good decks that could hit 6-7 most of the time occasionally run into 3 very strong decks, and sometimes an average deck will hit 12. Unless you've drafted something like 3 Consecrations, 5 Truesilvers, triple Aldor and Tirion in Paladin, or 4 Frostbolts, 3 Mana Wyrms, 5 Fireballs, 3 Flamestrikes, double Polymorph and Pyroblast in Mage with solid supporting cards (which is an element of luck as well), you can predict a good range for a deck, but never be certain.
Porouscloud - NA LoL
BenJamesBen
Profile Joined May 2014
0 Posts
May 11 2014 00:50 GMT
#30
I updated the simulation program: source code and sample output.

We already know what distribution of results to expect overall. However, what people are probably more interested in is the question, what Arena result can an individual player expect to receive based on their skill level? The simulation program now tries to tally this information. There are 14 player skill levels, from 1 (worst) to 14 (best).

Average Skill Level per Result:
+ Show Spoiler +
0-3: n=125000, average level=4.1
1-3: n=187500, average level=6.5
2-3: n=187500, average level=7.6
3-3: n=156250, average level=8.2
4-3: n=117187, average level=8.9
5-3: n=82031, average level=9.7
6-3: n=54687, average level=10.6
7-3: n=35155, average level=11.3
8-3: n=21971, average level=11.8
9-3: n=13426, average level=12.3
10-3: n=8055, average level=12.6
11-3: n=4759, average level=12.9
12-2: n=4759, average level=13.2
12-1: n=1464, average level=13.4
12-0: n=244, average level=13.5

Average Result per Skill Level (whale2 distribution):
Level 1: n=70044, average wins=0.3
Level 2: n=4995, average wins=0.4
Level 3: n=9944, average wins=0.5
Level 4: n=15038, average wins=0.8
Level 5: n=19942, average wins=1.2
Level 6: n=89749, average wins=1.6
Level 7: n=200544, average wins=2.1
Level 8: n=199102, average wins=2.5
Level 9: n=110207, average wins=3.2
Level 10: n=99744, average wins=4.0
Level 11: n=80368, average wins=5.1
Level 12: n=50238, average wins=6.2
Level 13: n=30051, average wins=7.3
Level 14: n=20022, average wins=8.6

The above is using a non-standard distribution of players that I've named "whale2". It assumes that Arena has more higher-skilled players than the normal population, with low-skilled players avoiding Arena but with a number of new, inexperienced players trying out their free run.

Player skill level distribution (whale2):
L1: 7.00%, L2: 0.50%, L3: 0.99%, L4: 1.50%, L5: 1.99%,
L6: 8.97%, L7: 20.05%, L8: 19.91%, L9: 11.02%, L10: 9.97%,
L11: 8.04%, L12: 5.02%, L13: 3.01%, L14: 2.00%,
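Sampling players from such a hand-made distribution is straightforward; here is a minimal sketch using the whale2 percentages quoted above (WHALE2 and sample_skill_levels are illustrative names, not from the actual program):

```python
import random

# The "whale2" percentages quoted above, as weights for skill levels 1..14.
WHALE2 = [7.00, 0.50, 0.99, 1.50, 1.99, 8.97, 20.05, 19.91,
          11.02, 9.97, 8.04, 5.02, 3.01, 2.00]

def sample_skill_levels(n, weights=WHALE2):
    """Draw n skill levels (1..14) from a weighted categorical distribution;
    random.choices does the weighted sampling directly (weights need not
    sum to exactly 100)."""
    return random.choices(range(1, 15), weights=weights, k=n)

levels = sample_skill_levels(1_000_000)
print(levels.count(7) / len(levels))  # close to 0.20, as in the table
```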

With a normal distribution of players, the results instead look like:
+ Show Spoiler +
Average Result per Skill Level (normal distribution):
Level 1: n=1016, average wins=0.2
Level 2: n=5053, average wins=0.3
Level 3: n=16623, average wins=0.5
Level 4: n=43939, average wins=0.8
Level 5: n=91984, average wins=1.3
Level 6: n=150681, average wins=1.8
Level 7: n=191184, average wins=2.4
Level 8: n=190992, average wins=3.1
Level 9: n=150072, average wins=3.9
Level 10: n=91477, average wins=5.0
Level 11: n=44057, average wins=6.4
Level 12: n=16932, average wins=7.9
Level 13: n=4965, average wins=9.5
Level 14: n=1013, average wins=10.8
Hryul
Profile Blog Joined March 2011
Austria, 2609 Posts
May 11 2014 02:22 GMT
#31
On May 09 2014 12:53 BenJamesBen wrote:
I believe that my simulation differs in that it uses "perfect matchmaking", without any randomness. For example, an 0-2 player will only ever get matched with another 0-2 player, and the result is that there will always be one of those players ending with an 0-3 run.

If you take that assumption, there is no need for a computer simulation, since the problem can be solved analytically. As has been done here:

On May 08 2014 02:49 GuntherBovine wrote:
If you assume that every arena match is between two decks with the same record, you can calculate how many decks achieve a certain record. Here are the chances of getting a particular record, the cumulative chance of getting that many wins and the inverse of the first column, which gives you that one out of X decks gets that many wins:
0-3 - 12.50% - 12.50% - 8.0
1-3 - 18.75% - 31.25% - 5.3
2-3 - 18.75% - 50.00% - 5.3
3-3 - 15.63% - 65.63% - 6.4
4-3 - 11.72% - 77.34% - 8.5
5-3 - 8.20% - 85.55% - 12.2
6-3 - 5.47% - 91.02% - 18.3
7-3 - 3.52% - 94.53% - 28.4
8-3 - 2.20% - 96.73% - 45.5
9-3 - 1.34% - 98.07% - 74.5
10-3 - 0.81% - 98.88% - 124.1
11-3 - 0.48% - 99.35% - 210.1
12-2 - 0.48% - 99.83% - 210.1
12-1 - 0.15% - 99.98% - 682.7
12-0 - 0.02% - 100.00% - 4096.0
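GuntherBovine's table has a simple closed form: a run ending at the third loss must have its last game be a loss, with the wins and the first two losses in any order, and similarly for runs ending at the twelfth win. A short sketch (illustrative, assuming the same 50/50 coin-flip matches) reproduces the table:

```python
from math import comb

def p_record(wins, losses):
    """Probability of finishing an arena run at (wins, losses) when every
    match is a 50/50 coin flip (equal-record pairing). A run ends at the
    3rd loss or the 12th win, and the final game is the one that ends it."""
    if losses == 3:                  # ended by the third loss (wins < 12)
        return comb(wins + 2, 2) * 0.5 ** (wins + 3)
    return comb(11 + losses, losses) * 0.5 ** (12 + losses)  # ended by win 12

for w, l in [(0, 3), (3, 3), (12, 0)]:
    print(f"{w}-{l}: {p_record(w, l) * 100:.3f}%")  # 12.500%, 15.625%, 0.024%
```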

Countdown to victory: 1 200!
Came Norrection
Profile Joined March 2011
Canada, 168 Posts
Last Edited: 2014-05-11 04:08:39
May 11 2014 04:06 GMT
#32
I ran my own simulation as well. I don't think my results matter too much, but I did find an interesting fact about how luck and skill are distributed in the game.

One run of my simulation is 10000 players playing, and all players finish their runs (unless the numbers end up odd). My algorithm uses a flat distribution of skill across 100 bins, i.e. an equal chance of any skill from 0-99, about 100 players per bin. I did NOT use a normal distribution, mostly because skill is only determined relative to others, and I already use a normal distribution when deciding who wins. Matchmaking pairs players within 1 win of each other, so a 7-win player can face a 6-, 7-, or 8-win player. I think this is close to what Blizzard uses, since it will match you against people whose arena record differs from yours.

For deciding who wins a game, I use the following formula:
+ Show Spoiler +
winner = A·s + r
A is a factor for how much skill comes into play
s is the difference in skill between the players
r is normally distributed Gaussian noise
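The decision rule can be sketched like this (an illustrative reconstruction; A and sigma are hypothetical knobs, not Came Norrection's actual values):

```python
import random

def winner(skill_a, skill_b, A=1.0, sigma=1.0):
    """Decide one game with the rule above: compute A*s + r, where s is the
    skill difference and r is Gaussian noise with standard deviation sigma.
    A*s >> sigma approaches pure skill; A*s << sigma approaches pure luck."""
    s = skill_a - skill_b
    return "A" if A * s + random.gauss(0.0, sigma) > 0 else "B"

# Pure-skill regime: the higher-skilled player essentially always wins.
print(winner(90, 10, A=1000.0, sigma=1.0))  # prints A
```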


For a game of pure skill, where As >>>> r:
average run length is 2.89
5.3% makes it to 12 wins
the average run length of the top 1% of players is 11.89 with a 79.86% win rate

For a game of about even skill and luck, where As ~ r:
average run length is 2.972
1.92% makes it to 12 wins
the average run length of the top 1% of players is 7.5 with a 71.49% win rate

For a game of pure luck, where As <<<<< r:
average run length is 2.994
0.62% makes it to 12 wins
the average run length of the top 1% of players is 2.835 with a 48.59% win rate

I just find it interesting to see how much luck is perceived in the game vs. my simulation. My guess is that skill is a lot more important than luck, given how players' win rates are distributed.
"The lie is just a great story ruined by the truth."