
SEQUENCING & SORTING
Budi Hartono, ST, MPM
Algorithm is sequencing

 Instructions are performed one by one
 Each instruction is done exactly once
 The processor performs the process exactly the same as instructed
Sequence does matter

Algorithm Sequence_1        Algorithm Sequence_2
Declaration:                Declaration:
  A, B : integer              A, B : integer
Description:                Description:
  A ← 10                      A ← 10
  A ← 2*A                     B ← A
  B ← A                       A ← 2*A
  Write (B)                   Write (B)

Output: 20                  Output: 10
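The two algorithms above can be sketched in Python to confirm that instruction order changes the result (the function names are illustrative):

```python
# Sequence_1: double A before copying it into B.
def sequence_1():
    a = 10
    a = 2 * a   # A <- 2*A (a is now 20)
    b = a       # B <- A
    return b

# Sequence_2: copy A into B before doubling A.
def sequence_2():
    a = 10
    b = a       # B <- A (b captures 10)
    a = 2 * a   # A <- 2*A (too late to affect b)
    return b

print(sequence_1())  # 20
print(sequence_2())  # 10
```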
Example – Calculating the Midpoint

 Two points P1(x1, y1) & P2(x2, y2)
 Midpoint (x3, y3)

x3 = (x1 + x2) / 2        y3 = (y1 + y2) / 2

Algorithm MidPoint
Declaration
  type point : record < x, y : real >
  P1, P2, P3 : point
Description
  read (P1.x, P1.y)
  read (P2.x, P2.y)
  P3.x ← (P1.x + P2.x)/2
  P3.y ← (P1.y + P2.y)/2
  Write (P3.x, P3.y)
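A minimal Python sketch of Algorithm MidPoint, using a dataclass in place of the record type (names are illustrative):

```python
from dataclasses import dataclass

# The record type "point : record < x, y : real >" as a dataclass.
@dataclass
class Point:
    x: float
    y: float

def midpoint(p1: Point, p2: Point) -> Point:
    # x3 = (x1 + x2) / 2 and y3 = (y1 + y2) / 2
    return Point((p1.x + p2.x) / 2, (p1.y + p2.y) / 2)

p3 = midpoint(Point(1.0, 2.0), Point(5.0, 4.0))
print(p3.x, p3.y)  # 3.0 3.0
```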
Example – Converting Seconds to Hours

 4000 seconds = 1 hour + 6 minutes + 40 seconds

Algorithm Conversion
Declaration
  type hour : record < hh : integer
                       mm : integer
                       ss : integer >
  H : hour
  totalSeconds : integer
  spillage : integer  { temporary variable }
Description:
  read (totalSeconds)
  H.hh ← totalSeconds div 3600
  spillage ← totalSeconds mod 3600
  H.mm ← spillage div 60
  H.ss ← spillage mod 60
  Write (H.hh, H.mm, H.ss)
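The div and mod operations above map directly to Python's // and %; a sketch of Algorithm Conversion:

```python
def to_hms(total_seconds: int) -> tuple:
    hh = total_seconds // 3600        # H.hh <- totalSeconds div 3600
    spillage = total_seconds % 3600   # spillage <- totalSeconds mod 3600
    mm = spillage // 60               # H.mm <- spillage div 60
    ss = spillage % 60                # H.ss <- spillage mod 60
    return hh, mm, ss

print(to_hms(4000))  # (1, 6, 40) -> 1 hour + 6 minutes + 40 seconds
```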
SORTING
What Are Sorting Algorithms?

 A way to put items in sequence based on some value attached to each item
 Sorting the cards when playing poker
 Sorting the scores after an examination
Why Are Sorting Algorithms Important?

 Wide use in research in almost all disciplines
 Statistics – categorize samples
 Chemistry – handle experimental data
 Sociology – telephone questionnaires
 A building block of more complex algorithms
Sorting

 Two types: ascending, descending
Ascending : 3, 18, 25, 44, 99
Descending : 99, 44, 25, 18, 3
 Many methods, e.g.:
 Straight selection (already done)
 Bubble sort / exchange sort
 Straight insertion
 Quick sort
How To Evaluate Different Sorting Algorithms?

 Basic evaluation
 Time to complete sorting
 A function of the number of items to be sorted
(1) Straight Selection

 Recall previous discussion

Index:  1  2  3  4  5  6    (N = 6)
Data:  12 29 17 56 11 23

Order from the smallest value:
- Iterations: N−1 times
- Find the smallest
- Position exchange
(1) Straight Selection – Example

Pass #   Results
0        12 29 17 56 11 23
1        11 12 29 17 56 23
2        11 12 17 29 56 23
3        11 12 17 23 29 56
4        11 12 17 23 29 56
5        11 12 17 23 29 56
Straight Selection

Total cycles of 'comparison processing' (K):
K = (N² − N) / 2
where N = number of data items
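A sketch of straight selection in Python (0-based indexing; this exchange-based variant swaps the minimum into place, so its intermediate passes differ slightly from the shift-style table above, but the comparison count and final order are the same):

```python
def selection_sort(data):
    a = list(data)
    n = len(a)
    for i in range(n - 1):                 # N-1 iterations
        smallest = i
        for j in range(i + 1, n):          # find the smallest in the unsorted part
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]  # position exchange
    return a

print(selection_sort([12, 29, 17, 56, 11, 23]))  # [11, 12, 17, 23, 29, 56]
```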
(2) Bubble-Sort

 How to find the smallest number (min)?
 By comparing two adjacent numbers

Index:  1  2  3  4  5  6    (N = 6)
Data:  87 74 71 54 88 60

A TEMP variable is needed as well (for the exchange).
Bubble Sort

Pass #   Results
0        87 74 71 54 88 60
1        74 71 54 87 60 88
2        71 54 74 60 87 88
3        54 71 60 74 87 88
4        54 60 71 74 87 88
5        54 60 71 74 87 88

Total cycles of 'comparison processing' (K):
K = (N² − N) / 2
where N = number of data items
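The passes above can be reproduced with a short Python sketch; the tuple swap plays the role of TEMP:

```python
def bubble_sort(data):
    a = list(data)
    n = len(a)
    for i in range(n - 1):                       # N-1 passes
        for j in range(n - 1 - i):               # compare two adjacent numbers
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]  # exchange (TEMP is implicit)
    return a

print(bubble_sort([87, 74, 71, 54, 88, 60]))  # [54, 60, 71, 74, 87, 88]
```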


Homework

 I leave the flowchart for homework (bubble-sort method)
Bubble Sort – Time Complexity

 Number of items: n
 Worst case:
 (n − 1) + (n − 2) + … + 2 + 1 = n(n − 1) / 2
 Time complexity: O(n²)

(3) Straight Insertion

 Sorting a small number of elements

Index:  1  2  3  4  5  6  7  8    (N = 8)
Data:  10  5  7  9  2  1  8  6

Pass #   Results
0        10 5 7 9 2 1 8 6
1        5 10 7 9 2 1 8 6
2        5 7 10 9 2 1 8 6
3        5 7 9 10 2 1 8 6
4        2 5 7 9 10 1 8 6
5        1 2 5 7 9 10 8 6
6        1 2 5 7 8 9 10 6
7        1 2 5 6 7 8 9 10
Straight Insertion – Flow Chart

START
  Read N, X[1..N]
  FOR I = 2 TO N
    T = X[I]
    X[0] = T          { sentinel to stop the WHILE }
    J = I - 1
    WHILE T < X[J]    { YES: shift right }
      X[J+1] = X[J]
      J = J - 1
    { NO: }
    X[J+1] = T
  NEXT I
  PRINT RESULT
STOP
Straight Insertion – Time Complexity

 Number of items: n
 Worst case:
 1 + 2 + 3 + … + (n − 1) = n(n − 1) / 2
 Time complexity: O(n²)
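A sketch of straight insertion following the flowchart's logic, with a bounds check in place of the X[0] sentinel:

```python
def insertion_sort(data):
    a = list(data)
    for i in range(1, len(a)):        # insert a[i] into the sorted prefix a[0..i-1]
        t = a[i]                      # T = X[I]
        j = i - 1                     # J = I - 1
        while j >= 0 and t < a[j]:    # WHILE T < X[J]
            a[j + 1] = a[j]           # shift right
            j -= 1
        a[j + 1] = t                  # X[J+1] = T
    return a

print(insertion_sort([10, 5, 7, 9, 2, 1, 8, 6]))  # [1, 2, 5, 6, 7, 8, 9, 10]
```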


(4) Binary (Heap) Insertion

 More efficient than the straight insertion method

Index:  1  2  3  4  5  6  7  8
Data:  10  5  7  9  2  1  8  6

• Similar to straight insertion
• Different method of finding the insertion point

 Steps:
1. Define the location
2. Shift all ordered data
3. Insert the new data
(4) Binary (Heap) Insertion

(1) Define Location
- LB = 1 on the ordered part of the array
- UB = I − 1
- MID = (LB + UB) / 2

For example, after two passes the array is 5 7 10 | 9 2 1 8 6, and Q = 9 is
to be inserted into the ordered part (LB = 1, UB = 3, MID = 2).

Compare Q with the element at MID:
- assuming Q < X[MID], set UB = MID − 1 (otherwise set LB = MID + 1)
- repeat until MID = LB = UB; this is the insertion location
(4) Binary (Heap) Insertion

(2) Shift to the right  the ordered data on the right side of the location
(3) Insert

Pass #   Results
0        10 5 7 9 2 1 8 6
1        5 10 7 9 2 1 8 6
2        5 7 10 9 2 1 8 6
3        5 7 9 10 2 1 8 6
4        2 5 7 9 10 1 8 6
5        1 2 5 7 9 10 8 6
6        1 2 5 7 8 9 10 6
7        1 2 5 6 7 8 9 10
Iteration #  Counter  Data to insert  LB  UB  MID  Location for inserting
5            6        1               1   5   3    -
                      1               1   2   2    -
                      1               1   1   1    1
6            7        8               1   6   4    -
                      8               4   5   4    -
                      8               5   5   5    5
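The three steps (define location, shift, insert) can be sketched in Python; the standard library's bisect performs the binary search for the location:

```python
from bisect import bisect_left

def binary_insertion_sort(data):
    a = list(data)
    for i in range(1, len(a)):
        t = a.pop(i)                   # next element to place
        loc = bisect_left(a, t, 0, i)  # (1) define location via binary search
        a.insert(loc, t)               # (2) shift right and (3) insert
    return a

print(binary_insertion_sort([10, 5, 7, 9, 2, 1, 8, 6]))  # [1, 2, 5, 6, 7, 8, 9, 10]
```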

Summary of Sorting Algorithms

Algorithm       Time                  Notes
selection-sort  O(n²)                 slow (good for small inputs)
insertion-sort  O(n²)                 slow (good for small inputs)
quick-sort      O(n log n) expected   fastest (good for large inputs)
heap-sort       O(n log n)            fast (good for large inputs)
merge-sort      O(n log n)            fast (good for huge inputs)

(5) Quick Sort

 By: C.A.R. Hoare

Based on two operations:
• Splitting the array into two sub-arrays by placing the first array element
  in a middle position such that all the numbers to its right are greater
  than all the numbers to its left (DivideAndConquer).
• Repeating the previous step recursively, dividing each sub-array into two
  sub-arrays in the same manner, until an empty sub-array is reached.
(5) Quick Sort

Index:  1  2  3  4  5  6  7    (N = 7)
Data:  10  2 17  7 16  3  9

The pivot X ends up at location J, with the smaller values on its left and
the greater values on its right.

 Example data: 10, 2, 17, 7, 16, 3, 9
Quick-Sort

 Quick-sort is a randomized sorting algorithm based on the
 divide-and-conquer paradigm:
 – Divide: pick a random element x (called the pivot) and partition S into
    L: elements less than x
    E: elements equal to x
    G: elements greater than x
 – Recurse: sort L and G
 – Conquer: join L, E, and G
Partition

 We partition an input sequence as follows:
 We remove, in turn, each element y from S, and
 We insert y into L, E, or G, depending on the result of the comparison
 with the pivot x
 Each insertion and removal is at the beginning or at the end of a
 sequence, and hence takes O(1) time
 Thus, the partition step of quick-sort takes O(n) time
 The pivot is sometimes chosen as the first element of S (assuming S is
 unsorted), or as the median of the first, middle, and last elements of S

Algorithm partition(S, p)
  Input: sequence S, position p of the pivot
  Output: subsequences L, E, G of the elements of S less than, equal to,
          or greater than the pivot, respectively
  L, E, G ← empty sequences
  x ← S.remove(p)
  while ¬S.isEmpty()
    y ← S.remove(S.first())
    if y < x
      L.insertLast(y)
    else if y = x
      E.insertLast(y)
    else { y > x }
      G.insertLast(y)
  return L, E, G
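A sketch of the partition-based quick-sort above, using Python lists in place of the sequence ADT:

```python
import random

def quick_sort(s):
    if len(s) <= 1:                 # subsequences of size 0 or 1 are sorted
        return list(s)
    x = random.choice(s)            # pick a random pivot
    l = [y for y in s if y < x]     # L: elements less than x
    e = [y for y in s if y == x]    # E: elements equal to x
    g = [y for y in s if y > x]     # G: elements greater than x
    return quick_sort(l) + e + quick_sort(g)  # recurse on L and G, then join

print(quick_sort([7, 2, 9, 4, 3, 7, 6, 1]))  # [1, 2, 3, 4, 6, 7, 7, 9]
```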
Quick-Sort Tree

 An execution of quick-sort is depicted by a binary tree
 Each node represents a recursive call of quick-sort and stores:
 The unsorted sequence before the execution and its pivot
 The sorted sequence at the end of the execution
 The root is the initial call
 The leaves are calls on subsequences of size 0 or 1
 (i.e., recurse on L or G only if it has 2 or more members)

           7 4 9 6 2 → 2 4 6 7 9
     4 2 → 2 4            7 9 → 7 9
  2 → 2                9 → 9
Execution Example

The next several slides step through an execution of quick-sort on the
sequence 7 2 9 4 3 7 6 1 (final result 1 2 3 4 6 7 7 9), showing in turn:
 Pivot selection
 Partition, recursive call, pivot selection
 Partition, recursive call, base case
 Recursive call, …, base case, join
 Recursive call, pivot selection
 Partition, …, recursive call, base case
 Join, join

[The quick-sort tree diagrams for these steps are not reproduced here.]
Worst-case Running Time

 The worst case for quick-sort occurs when the pivot is the unique
 minimum or maximum element
 One of L and G has size n − 1 and the other has size 0
 The running time is proportional to the sum
 n + (n − 1) + … + 2 + 1
 Thus, the worst-case running time of quick-sort is O(n²)

depth   time
0       n
1       n − 1
…       …
n − 1   1
Expected Running Time

 Consider a recursive call of quick-sort on a sequence of size s
 Good call: the sizes of L and G are each less than 3s/4
 Bad call: one of L and G has size greater than 3s/4
 A call is good with probability 1/2
 1/2 of the possible pivots cause good calls:

   1 2 3 4 | 5 6 7 8 9 10 11 12 | 13 14 15 16
   Bad pivots | Good pivots | Bad pivots
Expected Running Time, Part 2

 Probabilistic fact: the expected number of coin tosses required in order
 to get k heads is 2k
 For a node of depth i, we expect:
 i/2 ancestors are good calls
 The size of the input sequence for the current call is at most (3/4)^(i/2) · n
 Therefore:
 For a node of depth 2·log_{4/3} n, the expected input size is one
 The expected height of the quick-sort tree is O(log n)
 The amount of work done at the nodes of the same depth is O(n)
 Thus, the expected running time of quick-sort is O(n log n)

(Expected height: O(log n); time per level: O(n); total expected time: O(n log n))

Notice the following points in the DivideAndConquer procedure:

 Two search operations start 'together', one from the left and one from
 the right.
 At the beginning of the search operations, the index Left is the same as
 First, and the index Right is the same as Last. During the operation,
 Right moves to the left, and Left moves to the right.
 The search operations continue as long as the two searches do not meet.
 They meet when Right = Left. At this point, the WHILE loop ends and the
 first element takes its new position, stored in the index Mid.
 The DivideAndConquer procedure does the main work, which is arranging
 the array passed to it. This procedure is called from the QuickSort
 procedure.
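The description above can be sketched as an in-place partition. This is a common textbook formulation rather than necessarily the exact procedure the slides assume: the first element is the pivot, and Left and Right scan toward each other until they meet:

```python
def divide_and_conquer(a, first, last):
    pivot = a[first]
    left, right = first, last             # Left starts at First, Right at Last
    while left < right:
        while a[left] <= pivot and left < last:
            left += 1                     # Left moves to the right
        while a[right] > pivot:
            right -= 1                    # Right moves to the left
        if left < right:
            a[left], a[right] = a[right], a[left]
    a[first], a[right] = a[right], pivot  # first element takes position Mid
    return right                          # Mid

def quicksort(a, first, last):
    if first < last:
        mid = divide_and_conquer(a, first, last)
        quicksort(a, first, mid - 1)      # sort the smaller side
        quicksort(a, mid + 1, last)       # sort the greater side

data = [10, 2, 17, 7, 16, 3, 9]
quicksort(data, 0, len(data) - 1)
print(data)  # [2, 3, 7, 9, 10, 16, 17]
```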
