
11MAR-SG310-WCO-V2D1 (Algorithm Analysis & Design)

Project 1 - Methodologies for Analyzing Algorithms: StarCraft

Greg Walls

Methodologies for analyzing algorithms


Measuring a given algorithm's running time depends on a number of factors, and there is no perfect, "one size fits all" method for doing so; in this first section I will discuss a few different methodologies. An algorithm's running time generally increases with the amount of input data, and it also varies with the hardware and software environment: the running time of the same algorithm on the same input will be smaller if the computer has, say, a much faster processor, or if the implementation is compiled into native machine code instead of being interpreted on a virtual machine. (Goodrich & Tamassia, 2002)

Pseudo-Code

One methodology for establishing algorithm correctness is pseudo-code, a mixture of natural language and high-level programming constructs. Pseudo-code describes an algorithm in a way that is intended for human eyes rather than for direct execution, although parts of it may be written in a real programming language. By using pseudo-code we can outline our algorithm and justify its correctness through a simple analysis of the procedures described. "Pseudo-code descriptions [need to be] detailed enough to fully justify the correctness of the algorithm they describe, while being simple enough for human readers to understand." (Goodrich & Tamassia, 2002)

The Random Access Machine (RAM) Model

Going a step further than pseudo-code, the random access machine model assigns a cost to each primitive operation. A primitive operation can be anything from assigning a value to a variable, to returning from a method call, to something as simple as performing an arithmetic operation. It is a low-level instruction whose execution time depends on the hardware and software environment but is roughly constant across operations. The RAM model views the computer as a CPU connected to a bank of memory cells, each of which stores a number, a character string, or an address (the value of a base type). For simplicity, no limit is placed on the size of the numbers that can be stored, and we assume the CPU performs any primitive operation in a constant number of steps, regardless of input size. The end result is a reasonably accurate bound that relates the number of primitive operations an algorithm performs directly to its running time. (Goodrich & Tamassia, 2002)
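As an illustration (not from the cited text), the following Python sketch counts primitive operations for a find-the-maximum loop under the RAM model. Exactly which operations to count as "primitive" is a modeling choice; here each comparison and each assignment counts as one:

```python
def find_max_with_count(values):
    """Return (maximum, number of counted primitive operations)."""
    ops = 1                      # the assignment current_max = values[0]
    current_max = values[0]
    for v in values[1:]:
        ops += 1                 # the comparison v > current_max
        if v > current_max:
            current_max = v
            ops += 1             # the assignment current_max = v
    return current_max, ops

m, ops = find_max_with_count([3, 1, 4, 1, 5])
```

On an already-sorted (ascending) input of size n, every comparison triggers an assignment, giving the worst-case count 2n - 1; the count grows linearly with the input size no matter how we tally the constants.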

Average-Case and Worst-Case Analysis

Using the RAM model we can now write our pseudo-code and then analyze it to find our average-case and worst-case scenarios. An algorithm we design may run faster on some inputs than on others. We can express this difference as an average case, meaning what the algorithm does the majority of the time over random inputs, and a worst case, meaning how the algorithm performs when the input passed to it causes it to run for the longest possible time.
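To make this concrete, a small illustrative Python experiment (not from the paper) with linear search shows the gap between the two cases: the worst case scans the whole list, while the average over random targets scans about half of it:

```python
import random

def linear_search_count(items, target):
    """Return how many comparisons linear search makes before stopping."""
    for i, x in enumerate(items):
        if x == target:
            return i + 1
    return len(items)

items = list(range(100))
worst = linear_search_count(items, 99)   # target in the last cell: n comparisons
avg = sum(linear_search_count(items, random.choice(items))
          for _ in range(10_000)) / 10_000   # roughly n/2 over random targets
```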

Recursive Algorithms

Besides iterative algorithms, there are also recursive algorithms: algorithms that make calls to themselves. In a recursive algorithm a procedure calls itself as a subroutine, with each call solving a sub-problem of smaller size. This lets the algorithm keep breaking the problem down into smaller and smaller pieces until a solution is reached and passed back up through the recursive calls.
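A minimal Python illustration (not from the paper) of this pattern: summing a list by splitting it into halves, recursing on each smaller sub-problem, and combining the results on the way back up:

```python
def recursive_sum(values):
    """Sum a list by splitting it in half and recursing on each sub-problem."""
    if len(values) == 0:      # base case: nothing left to add
        return 0
    if len(values) == 1:      # base case: smallest sub-problem
        return values[0]
    mid = len(values) // 2
    # solve the two smaller sub-problems, then combine their answers
    return recursive_sum(values[:mid]) + recursive_sum(values[mid:])
```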

Big-Oh Notation

Big-Oh notation is used to characterize the growth rate of a function as closely as possible in mathematical terms. Rather than giving an exact processing time, Big-Oh gives an upper bound on how an algorithm's running time grows with its input size. In this paper we will use Big-Oh notation to describe the general running times of certain algorithms employed by a video game (StarCraft).
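For instance, an exact operation count such as f(n) = 2 + 2n is O(n), because the definition only requires constants c and n0 with f(n) <= c * n for all n >= n0. This tiny Python check (illustrative only; c = 4 and n0 = 1 are one valid choice of witnesses) verifies the bound over a range of n:

```python
def f(n):
    # an exact operation count of the kind produced by RAM-model analysis
    return 2 + 2 * n

# Big-Oh definition: f(n) <= c * n for all n >= n0, for some constants c, n0.
c, n0 = 4, 1
bound_holds = all(f(n) <= c * n for n in range(n0, 10_000))
```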

Game Description – StarCraft II


The game I will be discussing in this paper is StarCraft II, a real-time strategy (RTS) game created by Blizzard Entertainment and released for PC in 2010. It is the follow-up to the original StarCraft, which was widely accepted as the benchmark for RTS games due to its exceptional balance between gameplay and strategy and its free online multiplayer. StarCraft II continues the legacy with an installment that has met and surpassed expectations.

Game Plot

StarCraft II's plot revolves around the game's three races: the Terrans, a group of humans exiled from Earth's solar system; the Protoss, a highly advanced alien race built largely on technology; and the Zerg, an infectious alien race that has evolved into a hive-mind species. In the first installment of StarCraft II's trilogy, the player follows Jim Raynor, an ex-sheriff and "Sons of Korhal" soldier turned rebel after a betrayal by his former leader, the now-oppressive Emperor Mengsk. As Jim leads a rebellion to undermine and overthrow the new regime, he must also fight back the Zerg, who are invading and destroying the Terran colonies and settlements. The leader of the alien menace, the Zerg Queen, is Raynor's friend Sarah Kerrigan, who was betrayed by Mengsk and left to be captured and mutated by the Zerg. Raynor is contacted by a Dark Templar of the Protoss, who makes him aware of a set of ancient, powerful relics that Kerrigan now seeks to control. The player must track down these relics to unlock their power, stop the Zerg, and save their race.

Game-play mechanics

The game itself is an RTS. The player has a top-down, bird's-eye view of the game, and the goal is to build and manage a base and an army to achieve objectives. To do this, the player must manage resources and units. The game becomes mostly strategic as the player balances resource income against expenditures on units, buildings, and upgrades to defeat the enemy. This strategic balance has become a game mechanic in its own right, as players learn to balance these factors in their own style. It has even led to professional strategies and algorithms developed by gamers who have mathematically calculated which sequence of actions yields the most units in the shortest time with the fewest resources. These algorithms were derived by clocking build times in the game's mechanics; I will use algorithm analysis to look at these custom-made algorithms, as well as Blizzard's algorithm for unit building. StarCraft II also features an engine that renders cut scenes using the gameplay engine itself, allowing for fewer CG cinematics and more seamless transitions between gameplay and cut scenes. The game additionally features an in-game command center that the player visits between levels, where units can be permanently upgraded for all missions using "credits" acquired while playing through the missions.

StarCraft II – Algorithms
Pathfinding

In a strategy game like StarCraft, the player must navigate many units across a world map from one point to another. The player has little control over how the units pathfind from one point to the next; it is assumed that units will find the most direct path to wherever they are sent. In a geographic three-dimensional world full of obstacles, algorithms had to be built to help units find their way around the map. StarCraft II uses an AI pathfinding technique called flocking, or swarm AI. The effect is coordinated movement like that of a school of fish or a flock of birds. Although details of how Blizzard implemented its pathfinding are not public, the most likely scenario is an advanced algorithm that finds the fewest waypoints and gives units autonomous steering behavior so they smoothly hug their way around obstacles and other units. In StarCraft II, units avoid obstacles and other units (while also flocking together) using steering behavior: logically, each unit has sensors which, on colliding with another unit, signal the unit to turn in an appropriate direction to avoid it. This lets units weave in and out without computing a whole new path or losing momentum; in a worst-case scenario, units can ignore the collision radius, allowing more fluid movement and higher overall movement efficiency.

For example, if a group of units wants to enter an enemy base whose entrance is a bottleneck, all the units need to fit through a small space in the quickest, most efficient manner. The last thing the player wants is to micro-manage every movement of an army of units to get through a small gap; it is preferable for the units to guide themselves through the obstacle, letting the player enjoy the gameplay rather than worry about how the units will move. Effective pathfinding in this scenario means the group anticipates the bottleneck and thins its ranks so the units fit through as soon as they reach it, without bunching up and blocking each other. (sluggaslamoo, 2010)
Pseudo-Code:

Algorithm: findPath(U, L, P)
Input: A unit U at location L with target location P
Output: Unit U moves towards location P from location L

if L == P
    return
else
    move towards P
    if found obstacle
        steer around obstacle towards P
        if can't steer
            wait
    findPath(U, L, P)
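Since Blizzard's actual implementation is not public, the following Python sketch only illustrates the findPath idea above on a toy 2-D grid. The obstacle layout and the steering rule (try the direct step toward the target, otherwise sidestep along one axis, otherwise wait) are assumptions made for the example:

```python
def find_path(start, target, obstacles, max_steps=100):
    """Greedily walk from start toward target, sidestepping blocked cells."""
    x, y = start
    path = [start]
    for _ in range(max_steps):
        if (x, y) == target:
            return path
        # direct step toward the target (each component is -1, 0, or +1)
        dx = (target[0] > x) - (target[0] < x)
        dy = (target[1] > y) - (target[1] < y)
        # preferred move first, then the two sidesteps ("steer around obstacle")
        candidates = [(x + dx, y + dy), (x + dx, y), (x, y + dy)]
        for nxt in candidates:
            if nxt not in obstacles and nxt != (x, y):
                x, y = nxt
                path.append(nxt)
                break
        else:
            break  # can't steer: wait (here, give up for the sketch)
    return path

route = find_path((0, 0), (3, 3), obstacles={(1, 1)})
```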

Big-Oh notation:

f(n) = 2 + n + n
f(n) = 2 + 2n
O(2n) = O(n), since Big-Oh drops constant factors

A pathfinding algorithm is essential in a strategy game. The player does not have the time to micro-manage the movements of an entire army of units; the player must feel like a commander moving an army, with each unit intelligent enough to move intuitively around obstacles to comply with commands.

Genetic Build Algorithm

Although not strictly part of the gameplay, genetic build algorithms are used by gamers to determine how to build in the fastest, most efficient way. They do this by analyzing the algorithms built into StarCraft II for build times and resource gathering; because of their widespread popularity in the community, these algorithms deserve some recognition. A clever programmer realized that StarCraft II has a fixed vocabulary of actions and that the game is deterministic up to a point, and that he could therefore write a program to learn how to perform a build order better. (Lomilar, 2010) A build order is the exact sequence of opening steps taken early in the game to best support the strategy the player is trying to execute. Build orders generally only cover the very early game, because once you have scouted the enemy you have to react to what he is doing and modify the plan as you go. In this way, build orders in real-time strategy games are very much akin to openings in chess: they are the soul of the entire game about to be played. So here are the facts:

1. The program in question optimizes Zerg build orders (the Zerg being one of StarCraft's three races). This is a significant choice because the mechanics of the Zerg are arguably the most difficult to manage, especially for build-order optimization.

2. Of most interest are "rush" build orders, which answer the question "how quickly can I get N units of this type?"

3. There are two primary resources that workers collect in StarCraft: gas and minerals.

4. Zerg also have a third, de facto resource: larvae. Larvae are used to create ALL Zerg units, including workers. So long as you have fewer than three, they regenerate at a fixed rate (note: this means any time spent at three larvae delays all future larva production, which is very bad).

5. Most units require some building to be constructed in order to be "unlocked" (and many of these buildings require others as prerequisites; this is the so-called tech tree).

6. Creating a building costs you the worker who creates it (so the longer you can wait, the more resources that worker can collect before building it).

(Brandy, 2010)
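The larva rule in fact 4 can be illustrated with a toy simulation. The regeneration time and spending cadence below are made-up numbers, not the game's real values; the point is only that sitting at the three-larva cap forfeits future larvae, while spending them promptly keeps production running:

```python
def larvae_produced(seconds, spend_every, regen_time=15):
    """Count larvae gained, spending one every `spend_every` seconds (0 = never)."""
    larvae, produced, timer = 0, 0, 0
    for t in range(1, seconds + 1):
        if larvae < 3:                # regeneration only runs below the cap
            timer += 1
            if timer == regen_time:
                larvae, produced, timer = larvae + 1, produced + 1, 0
        if spend_every and t % spend_every == 0 and larvae:
            larvae -= 1               # a larva is consumed to make a unit
    return produced

busy = larvae_produced(300, spend_every=20)  # spends larvae promptly
idle = larvae_produced(300, spend_every=0)   # caps out at three and stalls
```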

One of the most successful genetic build-order algorithms created is called EvolutionChamber. A genetic algorithm is a type of optimization algorithm that tries to find an optimal solution using a method analogous to biological evolution: you take a batch of initial build orders, score them by effectiveness, eliminate the weakest, and modify the others until one algorithm emerges as the best. The program's input is simply the desired game state. In practice this means "make N units" to determine some rush build order, though it also allows for other kinds of builds, such as making N workers with some defensive structures and a small army. Here are some of the highlights:

1. It’s written in Java using JGAP.

2. A 'chromosome', in this case, is an array of 'actions' that can be done in game (e.g., 1. build a drone, 2. build a drone, 3. build a spawning pool, 4. build an overlord, and so on).

3. Invalid actions (i.e., trying to build a unit you cannot build because you do not have the tech
necessary) are ignored (this allows for “junk DNA”).

4. An action that can’t be done YET (not enough minerals!) causes the simulation to wait until it
can be done.

5. It uses some fairly standard mutation types (deletion, insertion, and one strange one called "overlording").

6. It uses the “many villages” approach where there are several separate populations
evolving independently.

7. Populations that are deemed to be stagnant are annihilated and replaced by a variant of the most successful.

8. The fitness function is really a measure of the distance between the "desired" state and the current state (measured by the difference in resources required to get there), taking into account the time required (less time is always better).

(Brandy, 2010)

A genetic algorithm works like this: it begins with a random initial population, then creates a sequence of new populations, using the individuals in the current generation to create the next. To create the new population, the algorithm performs the following steps:

1. Score each member of the current population by computing its fitness value.
2. Scale the raw fitness scores to convert them into a more usable range of values.
3. Select members, called "parents," based on their fitness.
4. Some of the individuals in the current population that have the best fitness are chosen as "elite"; these elite individuals are passed unchanged to the next population.
5. Produce children from the parents, either by making random changes to a single parent (mutation) or by combining the vector entries of a pair of parents (crossover).
6. Replace the current population with the children to form the next generation.
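The steps above can be sketched as a small Python genetic algorithm. The fitness function here (counting 1-bits in a bit-string) and every parameter are toy stand-ins for illustration, not EvolutionChamber's build-order model:

```python
import random

def evolve(pop_size=30, genes=20, generations=60, elite=2, seed=1):
    """Maximize the number of 1-bits via selection, elitism, crossover, mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=sum, reverse=True)      # steps 1-2: score and rank
        next_pop = [row[:] for row in scored[:elite]]    # step 4: elites survive
        while len(next_pop) < pop_size:
            p1, p2 = rng.sample(scored[: pop_size // 2], 2)  # step 3: fitter parents
            cut = rng.randrange(1, genes)
            child = p1[:cut] + p2[cut:]                  # step 5: crossover
            if rng.random() < 0.2:                       # step 5: occasional mutation
                i = rng.randrange(genes)
                child[i] = 1 - child[i]
            next_pop.append(child)
        pop = next_pop                                   # step 6: replace generation
    return max(sum(ind) for ind in pop)

best = evolve()
```

Because the elites are copied forward unchanged, the best fitness never decreases from one generation to the next, which is why stagnant-but-safe convergence is the typical behavior.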

Pseudo-Code

for all members of population
    sum += fitness of this individual
end for

for all members of population
    probability = sum of probabilities + (fitness / sum)
    sum of probabilities = probability
end for

loop until new population is full
    do this twice
        number = random between 0 and 1
        for all members of population
            if number > probability but less than next probability
                then you have been selected
        end for
    end
    create offspring
end loop
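The selection loop above is fitness-proportionate ("roulette wheel") selection: each individual owns a slice of [0, 1) proportional to its share of the total fitness, and a random spin picks the slice it lands in. A Python translation (the population and fitness values below are made-up examples):

```python
import random

def roulette_select(population, fitnesses, rng=random):
    """Pick one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    cumulative = []
    running = 0.0
    for f in fitnesses:              # build cumulative probability thresholds
        running += f / total
        cumulative.append(running)
    r = rng.random()                 # spin the wheel
    for individual, threshold in zip(population, cumulative):
        if r < threshold:
            return individual
    return population[-1]            # guard against floating-point rounding

pop = ["weak", "average", "strong"]
fits = [1, 3, 6]
picks = [roulette_select(pop, fits) for _ in range(10_000)]
```

Over many spins, "strong" (6/10 of the total fitness) should be selected roughly six times as often as "weak" (1/10).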

Big-Oh notation

Each scan over the population is linear in the population size, so each selection pass above runs in O(n).

Works Cited

Brandy, L. (2010, November). Using genetic algorithms to find Starcraft 2 build orders. Retrieved April 9, 2011, from lbrandy.com: http://lbrandy.com/blog/2010/11/using-genetic-algorithms-to-find-starcraft-2-build-orders/

Goodrich, M. T., & Tamassia, R. (2002). Algorithm Design. Crawfordsville: John Wiley & Sons, Inc.

Lomilar. (2010, October 13). Zerg Build Order Optimizer. Retrieved April 9, 2011, from Team Liquid: http://www.teamliquid.net/forum/viewmessage.php?topic_id=160231

sluggaslamoo. (2010, June 22). The Mechanics of Starcraft 2. Retrieved April 9, 2011, from Team Liquid: http://www.teamliquid.net/forum/viewmessage.php?topic_id=132171
