|
I've always thought it's cool to take what you've learned about math (in school, etc) and apply it to real life. By real life of course I'm talking about video games (well, REAL life too, but I'm just gonna talk about games now). I just had one of these moments so I figured I'd share it.
Recently I've been playing through Vagrant Story, which is a great game by the way. When you attack you have the option to chain multiple hits together, one after the other. So let's say you attack and deal 10 damage... you can just stop there or you can try to get a second attack in on the same 'turn.' You need to time the button press right in order to get the second hit, but if you do, the follow-up deals additional damage.
I'm going to simplify the system a bit. Technically this isn't exactly how it works, but for the sake of discussion: if you deal 10 damage on the first attack, then you'll deal 12 on the second attack, 14 on the third attack, etc. In case you are wondering... yes, there is a reason not to always chain the maximum number of attacks together: your "risk" goes up as you use chain abilities, which makes your accuracy go down. This doesn't affect the upcoming calculation, but it should explain why chaining to infinity is not always preferable.
Time to apply what I've learned previously about sequences and series to this situation... the amount of damage Ashley (the main character) deals after one hit is A (let's say). The first time he attacks we'll call the 0th attack (the first chain attack would be n=1, etc). The amount of damage he does on a given attack is:
A+2n
so if it was the third attack (the second chain attack, n=2) then he would deal A+4 damage. The total amount of damage he has dealt from the first attack through the nth attack is:
SUM from 0 to N of A+2n
where N is the last attack of the chain. So if Ashley attacks once, does 10 damage, then attacks two more times, the total damage is 10+12+14 = 36. In order to get a general formula for how much damage is dealt in total over N+1 attacks, we need to solve that summation. First, I split it into two separate summations, which you can do when the summand is a sum of two terms:
SUM from 0 to N of A plus SUM from 0 to N of 2n
The left sum I will solve in one step. The right sum I will start by pulling out the constant factor of 2:
(N+1)A plus 2*SUM from 0 to N of n
You might recall the famous math problem (what is the sum of all the integers from 1 to 100?). We use the result of that to solve the remaining summation. Read the spoiler for background.
+ Show Spoiler [background of sum of 1 to 100] + Gauss is credited with this story, though I don't think he was the first person to come up with the idea. Anyway, as a child Gauss was in school and his teacher asked the students to add up all the numbers from 1 to 100. Most of Gauss's classmates started furiously adding everything by hand. Gauss just thought about the problem for a minute and then reasoned it out this way: suppose you write all the integers from 1 to 100 in order from left to right. Now suppose you write them all again from 100 down to 1, left to right, directly under the first row. Now, he thought, add each column. In each of the 100 columns you get a sum of 101. 100 columns of 101 adds up to 10100, and 10100 is the sum from 1 to 100 counted TWICE. Therefore half that number, 5050, is the sum of the integers from 1 to 100. His teacher was of course impressed. From this we see that the sum of consecutive integers is (the lowest integer plus the highest integer) times half the quantity of integers being added.
Solving the right summation we get:
(N+1)A plus 2*( [0+N]/2 * [N+1] )
Manipulating the expression:
AN + A + N^2 + N
Manipulating a bit more:
A + N(A+N+1)
Let's test this expression out with the earlier example of attacking for 10 damage, and then doing two follow up attacks. A=10, N=2 so:
damage = 10 + 2(10+2+1) = 36 (correct)
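If you'd rather let a computer do the checking, here's a throwaway Python sketch that brute-forces the sum and compares it against the closed form (the function names are just mine, nothing from the game):

```python
# Throwaway check of the closed form against the brute-force sum.
# A = damage of the initial attack, N = number of chain attacks,
# and attack n (n = 0 is the initial hit) deals A + 2n damage.

def total_damage_bruteforce(A, N):
    return sum(A + 2 * n for n in range(N + 1))

def total_damage_closed_form(A, N):
    return A + N * (A + N + 1)

# The example from above: 10 damage plus two chain attacks -> 36 total.
assert total_damage_bruteforce(10, 2) == total_damage_closed_form(10, 2) == 36

# And it holds for a pile of other values too.
for A in range(1, 30):
    for N in range(0, 50):
        assert total_damage_bruteforce(A, N) == total_damage_closed_form(A, N)
print("closed form checks out")
```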
Sample Problem An enemy boss has 400 HP. Ashley's first attack does 10 damage. How many times will Ashley need to chain attack in order to kill the boss?
A + N(A+N+1) >= HP
10 + N(11+N) >= 400
N^2 + 11N - 390 >= 0
(notice we are applying that cute inequality stuff most of us learned years ago)
Solving the quadratic: N = (-11 + sqrt(121 + 1560))/2 = (-11 + 41)/2, so N = 15
Check: 10+12+14+16+18+20+22+24+26+28+30+32+34+36+38+40 = 400
Therefore, after the initial attack, Ashley will need to chain 15 times. That makes 16 total attacks in the turn.
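If you don't feel like solving quadratics by hand, the same inequality drops into a few lines of code; this is just a rough sketch of the math above, nothing game-accurate:

```python
import math

# Sketch: smallest number of chain attacks N so that the total damage
# A + N*(A + N + 1) reaches the boss's HP, i.e. solve
# N^2 + (A+1)*N + (A - HP) >= 0 and round the positive root up.

def chains_needed(A, HP):
    if A >= HP:
        return 0
    root = (-(A + 1) + math.sqrt((A + 1) ** 2 - 4 * (A - HP))) / 2
    return math.ceil(root)

print(chains_needed(10, 400))   # 15, matching the boss example above
```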
Practically, this doesn't save you that much time over just using trial and error with the math method used in the 'check' above. The real power of using a formula rather than brute-force calculation shows up in extreme cases.
Question 1 Ashley deals 5 damage on this first attack, and follows up with 200 chain attacks. How much damage does this deal?
Question 2 Ashley deals 5 damage on his first attack and needs to kill a boss with an HP of 1,000,000. How many chain attacks will it take?
Challenge: As I said at the beginning, this is not technically how Vagrant Story works. There are different chain abilities that you can use depending on what you've unlocked and how you set up your character. After the initial attack you can use 'heavy shot,' which deals only 70% of the damage the initial attack did. Alternatively, you can use 'temper,' which deals 40% of the damage of the initial attack. You can't use the same chain ability twice in a row, so you could alternate between heavy shot and temper (temper also restores DP to weapons, but we aren't talking about that). The increment of two damage only happens every OTHER chain attack after the first chain attack. So if you deal 100 damage with your first attack, then you will deal 70 with a heavy shot, 40 with a temper, then 72 with a heavy shot, 42 with a temper, 74 with a heavy shot, 44 with a temper, etc. (I'm not 100% sure I have it exactly correct, but that's gotta be close). If anyone comes up with a general formula for the total damage dealt after N chain attacks, where the odd chain attacks are heavy shots and the even chain attacks are tempers, then I'll be very impressed... I haven't tried it yet so I'm not sure how difficult it is.
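For anyone who wants something to check a formula against, here's a quick brute-force simulation of the chain exactly as I described it (so it inherits my uncertainty about the exact in-game numbers):

```python
# Brute-force simulation of the chain as described above (same caveat: the
# exact in-game numbers may be slightly off). The initial attack deals A,
# odd chain attacks are heavy shots starting at 0.7*A, even chain attacks
# are tempers starting at 0.4*A, and each ability gains +2 damage every
# time it is reused.

def chain_total(A, n_chain):
    total = A
    heavy, temper = 0.7 * A, 0.4 * A
    for k in range(1, n_chain + 1):
        if k % 2 == 1:              # odd chain attacks: heavy shot
            total, heavy = total + heavy, heavy + 2
        else:                       # even chain attacks: temper
            total, temper = total + temper, temper + 2
    return total

# The example above: 100 + 70 + 40 + 72 + 42 + 74 + 44 = 442
print(chain_total(100, 6))          # 442.0
```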
|
I have no intention of taking on the questions or the challenge, but I want to say that I totally love Vagrant Story. That was such an underrated game from the PS1 era. Really happy to see people still playing it and giving it the respect it deserves.
|
On April 23 2011 02:06 Rayeth wrote: I have no intention of taking on the questions or the challenge, but I want to say that I totally love Vagrant Story. That was such an underrated game from the PS1 era. Really happy to see people still playing it and giving it the respect it deserves. Yeah I actually got it when it came out and didn't fully appreciate it. A few years ago I played it some more and appreciated it. This time I love it.
It just gets better and better lol
Was considering making some leather hoplite armor and a wood hoplite shield but it's just too annoying to farm leather.
|
Interesting read, although I'd never heard of that game.
BTW, the other day I was trying to determine the winner between two groups of units in StarCraft BW, because I wanted to know why an FD worked, and I came up with this formula:
Uf = Ui - ([Vi]*(Vdamage-Uarmor))/(Uhp*Vcooldown)
Where U and V are homogeneous groups of units, f stands for "final number of units" and i for "initial number of units". This formula calculates the total damage in a round, and if you iterate it (I programmed it in Java) it can determine which group wins. The [Vi] part stands for math.ceil, not math.floor, because a damaged unit still fires during a round.
Of course, the damage count can only be precise in TvT (because Z and P regenerate HP/shields), and it doesn't take damage type or unit size into consideration. It doesn't consider micro or range either, and damage is calculated as if it were optimal (the exact amount of damage needed to kill one unit).
I know it can be improved, but it helps a lot to know how many rines I need to kill two goons.
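The iteration looks roughly like this as a Python sketch (marine and dragoon stats are filled in from memory rather than checked, and all the simplifications above still apply):

```python
import math

# Sketch of the iteration described above. Each "round" applies the formula
# to both groups simultaneously:
#   Uf = Ui - (ceil(Vi) * (Vdamage - Uarmor)) / (Uhp * Vcooldown)
# Like the formula itself, it ignores damage type, unit size, range and micro.

def fight(u_count, u_stats, v_count, v_stats, max_rounds=100000):
    u_dmg, u_armor, u_hp, u_cd = u_stats
    v_dmg, v_armor, v_hp, v_cd = v_stats
    for _ in range(max_rounds):
        if u_count <= 0 or v_count <= 0:
            break
        u_next = u_count - (math.ceil(v_count) * (v_dmg - u_armor)) / (u_hp * v_cd)
        v_next = v_count - (math.ceil(u_count) * (u_dmg - v_armor)) / (v_hp * u_cd)
        u_count, v_count = u_next, v_next
    return u_count, v_count

# Stats from memory, treat as placeholders: damage, armor, hp, cooldown (frames).
marine = (6, 0, 40, 15)
dragoon = (20, 1, 100 + 80, 30)     # shields just lumped into hp here

# How many rines to kill two goons under this crude model?
for n in range(1, 15):
    m_left, d_left = fight(n, marine, 2, dragoon)
    if d_left <= 0 < m_left:
        print(n, "marines beat 2 dragoons, with about", round(m_left, 1), "left")
        break
```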
edit: oh, I almost forgot, the game
|
i'm with that first guy. no intention of doing the questions/challenges, but man, vagrant story is a great game. probably up there in my top 3 of favorite ps1 games next to xenogears
|
challenge: + Show Spoiler + It's just a chain of the first attack with half the number of attacks (rounded up if the count is odd), plus a chain of the second attack with half the number of attacks (rounded down if the count is odd).
example: 40+70+42+72+44... = 40+42+44... + 70+72...
You figured out the regular chain formula already.
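A quick brute-force check that the split really does match the interleaved chain (just a sketch; the 70/40 starting values follow the OP's description, and either order works):

```python
# Numeric check of the split. first_dmg and second_dmg are the base damages
# of whichever two abilities you alternate (e.g. 70/40 or 40/70 from the
# example); each ability gains +2 every time it repeats.

def interleaved_total(first_dmg, second_dmg, n_chain):
    total, a, b = 0, first_dmg, second_dmg
    for k in range(1, n_chain + 1):
        if k % 2 == 1:
            total, a = total + a, a + 2
        else:
            total, b = total + b, b + 2
    return total

def split_total(first_dmg, second_dmg, n_chain):
    n_first = (n_chain + 1) // 2      # odd count rounded up
    n_second = n_chain // 2           # odd count rounded down
    return (sum(first_dmg + 2 * k for k in range(n_first))
            + sum(second_dmg + 2 * k for k in range(n_second)))

assert all(interleaved_total(a, b, n) == split_total(a, b, n)
           for a, b in [(70, 40), (40, 70)] for n in range(50))
print("the split matches the interleaved chain")
```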
|
You can map anything into differential equations, including StarCraft.
|
On April 23 2011 05:39 G3CKO wrote: You can map anything into differential equations, including StarCraft. Can you give examples? Are you exaggerating by saying 'everything'?
|
On April 23 2011 05:44 micronesia wrote: On April 23 2011 05:39 G3CKO wrote: You can map anything into differential equations, including StarCraft. Can you give examples? Are you exaggerating by saying 'everything'?
The rate of income vs. worker count. I can actually work this out; give me a few minutes, as I haven't done diffiques in a while. Differential equations are essentially how the world works, you could say.
|
On April 23 2011 05:39 G3CKO wrote: You can map anything into differential equations, including StarCraft.
You sound like one of my professors at university.
"Everything is quantifiable!"
But it's true, make it abstract enough to see numbers and you're good to go. The only difficulty is combining sets.
Starcraft example: 'Efficiency' is a broad term, but if we can somehow combine all inputs like individual unit micro, cost of said units, etc... then we can compare it. ^^
Edit: Spelling T_T
|
Due to the discrete nature of games, differential equations are (sadly) rarely if ever useful.
Usually you get highly involved algebra.
http://forums.civfanatics.com/showthread.php?t=266397&page=4
+ Show Spoiler +(T_c - T_f) * c > T_c
Where T_c is the number of turns to grow if a new cottage is built, T_f is the number of turns to grow if a new farm is built, and c is the remaining number of cottages to be built
So, finding the values of T_c and T_f:
T_c = G / (B + f)
T_f = G / (B + f + 1)
Where G is the amount of Food required for growth, B is the food bonus of the single Food Resource, and f is the current number of Farms
So,
(G/(B+f) - G/(B+f+1) ) * (C - (n-f-1-1) ) > G/(B+f)
where n is the current pop count and C is the final number of cottages
Now, divide both sides by G and multiply by (B+f):
(1 - (B+f)/(B+f+1)) * (C-n+f+2) > 1
or
(1 - (1 - 1/(B+f+1)) ) * (C-n+f+2) > 1
or
1/(B+f+1) * (C-n+f+2) > 1
or
C - n + f + 2 > B + f + 1
Subtracting (f+1) from both sides gives us:
C - n + 1 > B
or
n < C - B + 1
Resulting in the same formula arrived at by vicawoo.
So, this means, with C=15, B=5, you should Farm when:
n < 15 - 5 + 1
n < 11
Note that this result is independent of both G (the food cost of pop growth) and f (the current number of farms), even though they were used in the inequality.
or
+ Show Spoiler + Let M(n,d[n+1]) be the increase in empire-wide maintenance from adding an (n+1)th city at a distance of d[n+1] from the capital (in this notation, d[i] is the distance from the capital of the ith city).
In the long run, if we increase empire wide maintenance by 4, we would have to switch two tiles to 2 food 2 commerce tiles to maintain equilibrium. However, we get 1 commerce from the new city itself; you may even get an additional 1 if you have a trade route.
So the new city negatively impacts empire-wide production by ( M(n,d[n+1]) - 1 )/tile_c tiles, which would have otherwise yielded tile_y (tile yield). So the cost of the new city in food/hammers is M(n,d[n+1])*tile_y/tile_c. Note: if we give up a riverside mine for a riverside cottage, we are only netting one commerce, so in practice tile_c is the increase in commerce from switching from our production-oriented tile to our commerce tile. If by chance we have a domestic trade route to the new city, we can just pretend the new city costs the empire one less gold coin in maintenance.
Profit = Revenue - Cost
Profit = tile_y[n+1] - ( (M(n,d[n+1]) - 1)*tile_y/tile_c )
We divide our profit by our initial investment, that is 100 production.
Now we compare this to growing our ith city from size p[i] to p[i]+1.
Again, Profit = Revenue - Cost.
Cost = d[i]/29 + n/60 (note that in most cases this cost will be much less than 1).
Profit = yield'_fh + (yield'_c - d[i]/29 + n/60)*tile_y/tile_c
In most of these cases we're debating working a mine or its commerce equivalent. The investment is 2*(10+p[i]) food, with an opportunity cost of 2*(10+p[i])*(f[i]+h[i])/f[i] food and hammers, where f[i] is the food surplus and h[i] is the hammer production of the ith city.
Ok, we have tons of variables. Let's simplify. Since the increase in maintenance from growing is much less than 1 and we're dealing with integer production values, we'll ignore that. We'll assume we are growing to work a mine, that is, a net +2 food/hammer production.
( tile_y[n+1] + ( 1 + tile_c[n+1] - M(n,d[n+1]) )*2/tile_c )/100 > 2/( 2*( 10+p[i] )*( f[i]+h[i] )/f[i] )
So what's the point of all this, since sometimes we may only care about maximizing hammers and other times we may care more about our tech rate? It provides a numerical, commerce-adjusted value for new cities.
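Not that the algebra needs it, but the first spoiler's reduction is easy to sanity-check numerically; here's a throwaway sketch using exact fractions so float rounding can't skew the comparison:

```python
from fractions import Fraction

# Check of the first spoiler's reduction: (T_c - T_f) * c > T_c with
# T_c = G/(B+f), T_f = G/(B+f+1), c = C - n + f + 2 should be equivalent
# to n < C - B + 1, independent of G and f.

def farm_is_better(G, B, f, C, n):
    T_c = Fraction(G, B + f)
    T_f = Fraction(G, B + f + 1)
    c = C - n + f + 2
    return (T_c - T_f) * c > T_c

for G in (20, 30, 44):
    for B in range(1, 8):
        for f in range(0, 6):
            for C in range(5, 20):
                for n in range(1, 25):
                    assert farm_is_better(G, B, f, C, n) == (n < C - B + 1)
print("reduction checks out")
```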
|
On April 23 2011 06:36 igotmyown wrote: Due to the discrete nature of games, differential equations are (sadly) rarely if ever useful. Usually you get highly involved algebra. [...]
This was my thinking but I figured I'd give G3CKO his time to shine lol
|
On April 23 2011 05:44 micronesia wrote: On April 23 2011 05:39 G3CKO wrote: You can map anything into differential equations, including StarCraft. Can you give examples? Are you exaggerating by saying 'everything'? Apparently, the UC Berkeley Starcraft class had some pretty crazy math going on in it. I think one of the recommended prerequisites was Differential Calculus, but I can't seem to access the class's site so I can't confirm.
|
I've done a few projects on math in games! They are pretty damn fun. I remember one of them being applying Markov chains to Monopoly in order to figure out which spots you land on the most (and thus rake in the most money) in the long run. Probability and linear algebra can be applied to soooo many board games it's insane. I think I remember seeing another way to use Markov chains in Risk as well, I just didn't take an interest in that article because I didn't know the game very well =P
Basically, if you're able to classify things into sets, you could probably find out a ton of things that wouldn't usually be obvious combinatorially. If not, you could probably apply a Monte Carlo type approach to at least get probabilities of things happening.
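For anyone curious, the Monopoly idea fits in a few lines. This is only a toy sketch (a 40-square ring, two dice, and just the Go-To-Jail redirect; real Monopoly adds jail rules and cards that shift the numbers):

```python
import numpy as np

# Toy Markov-chain model of Monopoly: 40 squares in a ring, move by the sum
# of two six-sided dice, and landing on square 30 ("Go To Jail") sends you
# to square 10 ("Jail"). Doubles, cards, and stay-in-jail rules are ignored.
N = 40
GO_TO_JAIL, JAIL = 30, 10

# Probability of each two-dice total (2..12).
roll = np.zeros(13)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        roll[d1 + d2] += 1 / 36

# Transition matrix: P[i, j] = probability of ending the turn on j from i.
P = np.zeros((N, N))
for i in range(N):
    for total in range(2, 13):
        j = (i + total) % N
        if j == GO_TO_JAIL:
            j = JAIL
        P[i, j] += roll[total]

# Long-run occupancy = stationary distribution; power iteration is enough here.
dist = np.ones(N) / N
for _ in range(2000):
    dist = dist @ P

# Jail and the squares just past it come out on top in this toy model.
top = np.argsort(dist)[::-1][:5]
print([(int(s), round(float(dist[s]), 4)) for s in top])
```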
I love applied math so much sometimes it scares me.
|
I've done a lot of math for games too. The point that games are discretized, which makes algebra rather than calculus the important tool, is very true. I think this is a good thing; you shouldn't have to use complex math to solve real-world problems.
I remember doing a lot of math for Age of Empires 3 and Age of Empires 2 (villager-seconds calculations), but overall it wasn't a substitute for actually testing out my hypotheses. The nice thing about Age of Empires is that you're working with such large numbers that things tend to behave as if they were continuous, so you can model them using calculus pretty accurately.
More often than not, I and other theorists ended up making tables using basic algebra. There's a Saracen market trading table for Age of Empires 2 on the AoE2 Heaven forums; people figured out a way to make Saracens get more resources by using a market. Cyclohexane compiled a lot of knowledge about Age of Empires 3 into one beautiful quick-reference guide.
Math should be used more often in games. Sometimes it's not enough to know that one unit is better than another; you want to be able to mathematically prove it so that you can handle the situations that experience alone can't teach you. The people who make good timing pushes know exactly how long it takes for upgrades to finish and buildings to build. Even StarCraft 2 pros don't use enough math.
I remember hearing commentators talk about how Idra's opponent had more workers on the same number of bases as Idra. They berated him, but they didn't look at the saturation charts posted on TeamLiquid to realize that Idra was effectively mining just as much with his normal workers (mules excluded) as his opponent, because his bases were just as saturated on fewer workers. Idra was effectively up the cost of 10 drones (500 minerals), 10 supply slots (late game), and 10 larva compared to where a lesser zerg would be.
Using math on games can have its limitations though. Sometimes the math gets too complex, so I just run simulations, or mass games to improve.
These days I'm far more interested in blackjack and poker theory, because I think they're more applicable. It's unfortunate that there are so many people smarter than me in the world; I know that if there were ways to make lots of money through either blackjack or poker, they'd already be doing it.
|