Thursday, 21 December 2017

Algorithms at Khan Academy part 4

Quick Sort

Quicksort has a couple of other differences from merge sort. Quicksort works in place. And its worst-case running time is as bad as selection sort's and insertion sort's: Θ(n²). But its average-case running time is as good as merge sort's: Θ(n lg n). So why think about quicksort when merge sort is at least as good? That's because the constant factor hidden in the big-Θ notation for quicksort is quite good. In practice, quicksort outperforms merge sort, and it significantly outperforms selection sort and insertion sort.


var quickSort = function(array, p, r) {
    if (p < r) {
        var q = partition(array, p, r);
        quickSort(array, p, q - 1);
        quickSort(array, q + 1, r);
    }
};

Challenge: implement partition

// Swaps two items in an array, changing the original array
var swap = function(array, firstIndex, secondIndex) {
    var temp = array[firstIndex];
    array[firstIndex] = array[secondIndex];
    array[secondIndex] = temp;
};

var partition = function(array, p, r) {
    // Compare array[j] with array[r], for j = p, p+1,...r-1
    // maintaining that:
    //  array[p..q-1] are values known to be <= to array[r]
    //  array[q..j-1] are values known to be > array[r]
    //  array[j..r-1] haven't been compared with array[r]
    // If array[j] > array[r], just increment j.
    // If array[j] <= array[r], swap array[j] with array[q],
    //   increment q, and increment j.
    // Once all elements in array[p..r-1]
    //  have been compared with array[r],
    //  swap array[r] with array[q], and return q.
    var q = p;

    for (var j = p; j < r; j++) {
        if (array[j] <= array[r]) {
            swap(array, j, q);
            q++;
        }
    }

    swap(array, r, q);
    return q;
};
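Putting partition and quickSort together, a quick self-contained check (in plain JavaScript, with console.log standing in for the Khan Academy println) might look like this:

```javascript
// Swaps two items in an array, changing the original array
var swap = function(array, firstIndex, secondIndex) {
    var temp = array[firstIndex];
    array[firstIndex] = array[secondIndex];
    array[secondIndex] = temp;
};

// Lomuto partition: uses array[r] as the pivot and returns
// the pivot's final index q.
var partition = function(array, p, r) {
    var q = p;
    for (var j = p; j < r; j++) {
        if (array[j] <= array[r]) {
            swap(array, j, q);
            q++;
        }
    }
    swap(array, r, q);
    return q;
};

var quickSort = function(array, p, r) {
    if (p < r) {
        var q = partition(array, p, r);
        quickSort(array, p, q - 1);
        quickSort(array, q + 1, r);
    }
};

var numbers = [9, 7, 5, 11, 12, 2, 14, 3, 10, 6];
quickSort(numbers, 0, numbers.length - 1);
console.log(numbers); // [2, 3, 5, 6, 7, 9, 10, 11, 12, 14] -- sorted in place
```

Note that the array itself is mutated; nothing is returned, which is exactly what "works in place" means above.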

Graph Representation

(Figure: a directed graph on vertices 0 through 5, shown below as an edge list, an adjacency matrix, and an adjacency list.)
var edgeList = [ [0, 2], [1, 3], [2, 3], [2, 4], [3, 5], [4, 5] ];

var adjMatrix = [
    [0, 0, 1, 0, 0, 0],
    [0, 0, 0, 1, 0, 0],
    [0, 0, 0, 1, 1, 0],
    [0, 0, 0, 0, 0, 1],
    [0, 0, 0, 0, 0, 1],
    [0, 0, 0, 0, 0, 0]
];

var adjList = [
    [2],
    [3],
    [3, 4],
    [5],
    [5],
    []
];
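The three representations above describe the same graph. As a quick sketch (plain JavaScript, outside the Khan Academy environment), the adjacency list can be derived from the edge list:

```javascript
// Build an adjacency list from an edge list, assuming a directed
// graph whose vertices are numbered 0 through numVertices - 1.
var edgeListToAdjList = function(edgeList, numVertices) {
    var adjList = [];
    for (var v = 0; v < numVertices; v++) {
        adjList[v] = [];                       // start with no neighbors
    }
    for (var e = 0; e < edgeList.length; e++) {
        var from = edgeList[e][0];
        var to = edgeList[e][1];
        adjList[from].push(to);                // record the edge from -> to
    }
    return adjList;
};

var edgeList = [[0, 2], [1, 3], [2, 3], [2, 4], [3, 5], [4, 5]];
console.log(edgeListToAdjList(edgeList, 6));
// [[2], [3], [3, 4], [5], [5], []] -- matches adjList above
```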

Breadth-first search, also known as BFS, finds shortest paths from a given source vertex to all other vertices, in terms of the number of edges in the paths.

Breadth-first search assigns two values to each vertex v:
  • A distance, giving the minimum number of edges in any path from the source vertex to vertex v.
  • The predecessor vertex of v along some shortest path from the source vertex. The source vertex's predecessor is some special value, such as null, indicating that it has no predecessor.


We need to keep track of which vertices have already been visited but have not yet been visited from. We use a queue, which is a data structure that allows us to insert and remove items, where the item removed is always the one that has been in the queue the longest. We call this behavior first in, first out. A queue has three operations:
  • enqueue(obj) inserts an object into the queue.
  • dequeue() removes from the queue the object that has been in it the longest, returning this object.
  • isEmpty() returns true if the queue currently contains no objects, and false if the queue contains at least one object.


BFS is a simple strategy in which the root node is expanded first, then all the successors of the root node are expanded next, then their successors, and so on, until the best possible path has been found. Because this strategy for graph traversal has no additional information about states beyond that provided in the problem definition, breadth-first search is classed as an uninformed or blind search.

BFS uses a queue data structure, which is a 'first in, first out' or FIFO data structure. This queue stores all the nodes that we have to explore, and each time a node is explored it is added to our set of visited nodes.
If we were to conduct a breadth-first search on a binary tree whose nodes are numbered 1 through 7 level by level, it would do the following:
  1. Set Node 1 as the start Node
  2. Add this Node to the Queue
  3. Add this Node to the visited set
  4. If this node is our goal node then return true, else add Node 2 and Node 3 to our Queue
  5. Check Node 2 and, if it isn't our goal node, add both Node 4 and Node 5 to our Queue.
  6. Take the next node from our Queue which should be Node 3 and check that.
  7. If Node 3 isn’t our goal node add Node 6 and Node 7 to our Queue.
  8. Repeat until goal Node is found.
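The steps above can be sketched in plain JavaScript, using a plain array as the queue. The numbering rule here (node n has children 2n and 2n+1, nodes 1 through 7) is an assumption based on the order the steps visit the nodes:

```javascript
// BFS goal search over the complete binary tree from the walkthrough.
// Returns true if goalNode is reachable from startNode.
var bfsSearch = function(startNode, goalNode) {
    var queue = [startNode];
    var visited = {};
    visited[startNode] = true;
    while (queue.length > 0) {
        var node = queue.shift();              // dequeue the oldest node
        if (node === goalNode) { return true; }
        var children = [2 * node, 2 * node + 1];
        for (var i = 0; i < children.length; i++) {
            var child = children[i];
            if (child <= 7 && !visited[child]) {
                visited[child] = true;         // mark visited when enqueued
                queue.push(child);             // enqueue for later expansion
            }
        }
    }
    return false;
};

console.log(bfsSearch(1, 6)); // true
console.log(bfsSearch(1, 9)); // false: node 9 is not in the tree
```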

Challenge: Implement breadth-first search

/* A Queue object for queue-like functionality over JavaScript arrays. */
var Queue = function() {
    this.items = [];
};
Queue.prototype.enqueue = function(obj) {
    this.items.push(obj);
};
Queue.prototype.dequeue = function() {
    return this.items.shift();
};
Queue.prototype.isEmpty = function() {
    return this.items.length === 0;
};

/*
 * Performs a breadth-first search on a graph
 * @param {array} graph - Graph, represented as adjacency lists.
 * @param {number} source - The index of the source vertex.
 * @returns {array} Array of objects describing each vertex, like
 *     [{distance: _, predecessor: _ }]
 */
var doBFS = function(graph, source) {
    var bfsInfo = [];

    for (var i = 0; i < graph.length; i++) {
        bfsInfo[i] = {
            distance: null,
            predecessor: null };
    }

    bfsInfo[source].distance = 0;

    var queue = new Queue();
    queue.enqueue(source);

    // Traverse the graph
   
    // As long as the queue is not empty:
    //  Repeatedly dequeue a vertex u from the queue.
    // 
    //  For each neighbor v of u that has not been visited:
    //     Set distance to 1 greater than u's distance
    //     Set predecessor to u
    //     Enqueue v
    //
    //  Hint:
    //  use graph to get the neighbors,
    //  use bfsInfo for distances and predecessors
    while (!queue.isEmpty()) {
        var vertex = queue.dequeue();

        for (var j = 0; j < graph[vertex].length; j++) {
            var neighbour = graph[vertex][j];

            if (bfsInfo[neighbour].distance === null) {
                bfsInfo[neighbour].distance = bfsInfo[vertex].distance + 1;
                bfsInfo[neighbour].predecessor = vertex;
                queue.enqueue(neighbour);
            }
        }
    }

    return bfsInfo;
};


var adjList = [
    [1],
    [0, 4, 5],
    [3, 4, 5],
    [2, 6],
    [1, 2],
    [1, 2, 6],
    [3, 5],
    []
    ];
var bfsInfo = doBFS(adjList, 3);
for (var i = 0; i < adjList.length; i++) {
    println("vertex " + i + ": distance = " + bfsInfo[i].distance + ", predecessor = " + bfsInfo[i].predecessor);
}

A connected graph in which |E| = |V| - 1 is called a free tree.

Wednesday, 20 December 2017

Algorithms at Khan Academy part 3

We'll see two other sorting algorithms, merge sort and quicksort, whose running times are better. In particular, merge sort runs in Θ(n lg n) time in all cases, and quicksort runs in Θ(n lg n) time in the best case and on average, though its worst-case running time is Θ(n²).

During the Mergesort process the objects of the collection are divided into two collections. To split a collection, Mergesort will take the middle of the collection and split the collection into its left and its right part.

Divide-and-conquer algorithm:
  1. Divide the problem into a number of subproblems that are smaller instances of the same problem.
  2. Conquer the subproblems by solving them recursively. If they are small enough, solve the subproblems as base cases.
  3. Combine the solutions to the subproblems into the solution for the original problem.

Here's how merge sort uses divide-and-conquer:
  1. Divide by finding the number q of the position midway between p and r. Do this step the same way we found the midpoint in binary search: add p and r, divide by 2, and round down.
  2. Conquer by recursively sorting the subarrays in each of the two subproblems created by the divide step. That is, recursively sort the subarray array[p..q] and recursively sort the subarray array[q+1..r].
  3. Combine by merging the two sorted subarrays back into the single sorted subarray array[p..r].

Challenge: Implement merge sort

// Takes in an array and recursively merge sorts it
var mergeSort = function(array, p, r) {
    if (p < r) {
        var q = floor((p + r) / 2);
        mergeSort(array, p, q);
        mergeSort(array, q + 1, r);
        merge(array, p, q, r);
    }
};

Challenge: implement merge

var merge = function(array, p, q, r) {
    var lowHalf = [];
    var highHalf = [];

    var k = p;
    var i;
    var j;
    for (i = 0; k <= q; i++, k++) {
        lowHalf[i] = array[k];
    }
    for (j = 0; k <= r; j++, k++) {
        highHalf[j] = array[k];
    }

    k = p;
    i = 0;
    j = 0;
   
    // Repeatedly compare the lowest untaken element in
    //  lowHalf with the lowest untaken element in highHalf
    //  and copy the lower of the two back into array
    while (i <= (q - p) && j <= (r - q - 1)) {
        if (lowHalf[i] < highHalf[j]) {
            array[k] = lowHalf[i];
            i++;
            k++;
        } else {
            array[k] = highHalf[j];
            j++;
            k++;
        }
    }

    // Once one of lowHalf and highHalf has been fully copied
    //  back into array, copy the remaining elements from the
    //  other temporary array back into the array
    while (i <= (q - p)) {
        array[k] = lowHalf[i];
        i++;
        k++;
    }

    while (j <= (r - q - 1)) {
        array[k] = highHalf[j];
        j++;
        k++;
    }
     
};
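Putting merge and mergeSort together, a self-contained check in plain JavaScript (with Math.floor standing in for the Khan Academy floor global):

```javascript
// Merges the sorted subarrays array[p..q] and array[q+1..r].
var merge = function(array, p, q, r) {
    var lowHalf = [];
    var highHalf = [];
    var k = p;
    var i;
    var j;
    for (i = 0; k <= q; i++, k++) { lowHalf[i] = array[k]; }
    for (j = 0; k <= r; j++, k++) { highHalf[j] = array[k]; }

    k = p;
    i = 0;
    j = 0;
    // Repeatedly copy the smaller front element back into array
    while (i < lowHalf.length && j < highHalf.length) {
        if (lowHalf[i] < highHalf[j]) {
            array[k] = lowHalf[i];
            i++;
        } else {
            array[k] = highHalf[j];
            j++;
        }
        k++;
    }
    // Copy whatever remains in either temporary array
    while (i < lowHalf.length) { array[k] = lowHalf[i]; i++; k++; }
    while (j < highHalf.length) { array[k] = highHalf[j]; j++; k++; }
};

var mergeSort = function(array, p, r) {
    if (p < r) {
        var q = Math.floor((p + r) / 2);  // Math.floor replaces the
        mergeSort(array, p, q);           // Khan Academy floor global
        mergeSort(array, q + 1, r);
        merge(array, p, q, r);
    }
};

var numbers = [14, 7, 3, 12, 9, 11, 6, 2];
mergeSort(numbers, 0, numbers.length - 1);
console.log(numbers); // [2, 3, 6, 7, 9, 11, 12, 14]
```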

VS quick sort: quicksort can sort "in place" in an existing collection, i.e. it does not have to create a copy of the collection, while merge sort requires a copy.

One other thing about merge sort is worth noting. During merging, it makes a copy of the entire array being sorted, with one half in lowHalf and the other half in highHalf. Because it copies more than a constant number of elements at some time, we say that merge sort does not work in place. By contrast, both selection sort and insertion sort do work in place, since they never make a copy of more than a constant number of array elements at any one time. Computer scientists like to consider whether an algorithm works in place, because there are some systems where space is at a premium, and thus in-place algorithms are preferred.

If the amount of memory available to us is limited, it might be in our best interest to switch to another sorting algorithm like selection sort or insertion sort, since they require less memory (both of them require only constant memory, because they don't have to copy anything). Thus, it often boils down to a trade-off between memory and speed. If you're lacking in memory, you might have to use a slower or less desirable algorithm like selection sort. But if you have plenty of memory, then maybe you can afford a faster algorithm like merge sort.

Tuesday, 19 December 2017

Algorithms at Khan Academy part 2

Recursive algorithms
Recursion is a technique for solving a problem by solving a smaller instance of the same problem, unless the problem is so small that we can just solve it directly.

The factorial function is very useful when we're trying to count how many different orders there are for things, or how many different ways we can combine things.

Iterative factorial

var factorial = function(n) {
    var result = 1;
    for (var i = 1; i <= n; i++) {
        result *= i;
    }
    return result;
};

A recursive solution would involve the function calling itself.
Induction starts from the base case(s) and works up, while recursion starts from the top and works downwards until it hits a base case.

With induction we know we started on a solid foundation of the base cases, but with recursion we have to be careful when we design the algorithm to make sure that we eventually hit a base case.

Recursive factorial
var factorial = function(n) {
    // base case:
    if (n === 0) {
        return 1;
    }
    // recursive case:
    return n * factorial(n - 1);
};

println("The value of 0! is " + factorial(0) + ".");
println("The value of 5! is " + factorial(5) + ".");

Program.assertEqual(factorial(0), 1);
Program.assertEqual(factorial(5), 120);
Program.assertEqual(factorial(4), 24);
Program.assertEqual(factorial(3), 6);

We can distill the idea of recursion into two simple rules:
  1. Each recursive call should be on a smaller instance of the same problem, that is, a smaller subproblem.
  2. The recursive calls must eventually reach a base case, which is solved without further recursion.
A palindrome is a word that is spelled the same forward and backward. For example, rotor and redder.

Challenge: is a string a palindrome?

// Returns the first character of the string str
var firstCharacter = function(str) {
    return str.slice(0, 1);
};

// Returns the last character of a string str
var lastCharacter = function(str) {
    return str.slice(-1);
};

// Returns the string that results from removing the first
//  and last characters from str
var middleCharacters = function(str) {
    return str.slice(1, -1);
};

var isPalindrome = function(str) {
    // base case #1
    if(str.length <= 1){ return true;}
    // base case #2
    if(firstCharacter(str) !== lastCharacter(str)){ return false;}
    // recursive case
    return isPalindrome(middleCharacters(str));
};

var checkPalindrome = function(str) {
    println("Is this word a palindrome? " + str);
    println(isPalindrome(str));
};
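A quick self-contained check of the palindrome functions (with console.log standing in for the Khan Academy println):

```javascript
// Returns the first character of the string str
var firstCharacter = function(str) { return str.slice(0, 1); };
// Returns the last character of the string str
var lastCharacter = function(str) { return str.slice(-1); };
// Returns str with its first and last characters removed
var middleCharacters = function(str) { return str.slice(1, -1); };

var isPalindrome = function(str) {
    // base case #1: the empty string and 1-character strings are palindromes
    if (str.length <= 1) { return true; }
    // base case #2: mismatched ends mean it cannot be a palindrome
    if (firstCharacter(str) !== lastCharacter(str)) { return false; }
    // recursive case: check the smaller string in the middle
    return isPalindrome(middleCharacters(str));
};

console.log(isPalindrome("rotor"));  // true
console.log(isPalindrome("redder")); // true
console.log(isPalindrome("ladder")); // false
```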
 

Recursive algorithm for computing x^n:
  • The base case is when n = 0, and x^0 = 1.
  • If n is positive and even, recursively compute y = x^(n/2), and then x^n = y · y. Notice that you can get away with making just one recursive call in this case, computing x^(n/2) just once, and then you multiply the result of this recursive call by itself.
  • If n is positive and odd, recursively compute x^(n-1), so that the exponent either is 0 or is positive and even. Then, x^n = x^(n-1) · x.
  • If n is negative, recursively compute x^(-n), so that the exponent becomes positive. Then, x^n = 1 / x^(-n).

Challenge: Recursive powers

 var isEven = function(n) {
    return n % 2 === 0;
};

var isOdd = function(n) {
    return !isEven(n);
};

var power = function(x, n) {
    println("Computing " + x + " raised to power " + n + ".");
    // base case: x^0 = 1
    if (n === 0) {
        return 1;
    }
    // recursive case: n is negative
    if (n < 0) {
        return 1 / power(x, -n);
    }
    // recursive case: n is positive and even
    if (isEven(n)) {
        var y = power(x, n / 2);
        return y * y;
    }
    // recursive case: n is positive and odd
    return x * power(x, n - 1);
};

var displayPower = function(x, n) {
    println(x + " to the " + n + " is " + power(x, n));
};


displayPower(3, 0);
Program.assertEqual(power(3, 0), 1);
displayPower(3, 1);
Program.assertEqual(power(3, 1), 3);
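The asserts above only cover n = 0 and n = 1. Here is a self-contained sketch (plain JavaScript, console.log in place of println) that also exercises the even, odd, and negative cases:

```javascript
var isEven = function(n) { return n % 2 === 0; };

var power = function(x, n) {
    if (n === 0) { return 1; }               // base case: x^0 = 1
    if (n < 0) { return 1 / power(x, -n); }  // negative exponent
    if (isEven(n)) {
        var y = power(x, n / 2);             // one recursive call,
        return y * y;                        // then square the result
    }
    return x * power(x, n - 1);              // positive odd exponent
};

console.log(power(3, 4));  // 81  (even case, via 3^2 squared)
console.log(power(2, 10)); // 1024
console.log(power(2, -2)); // 0.25 (negative case: 1 / 2^2)
```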

Challenge: Solve Hanoi recursively

var solveHanoi = function(numDisks, fromPeg, toPeg) {
    // base case: no disks to move
    if (numDisks === 0) {
        return;
    }
    // recursive case:
    var sparePeg = hanoi.getSparePeg(fromPeg, toPeg);
    solveHanoi(numDisks - 1, fromPeg, sparePeg);
    hanoi.moveDisk(fromPeg, toPeg);
    solveHanoi(numDisks - 1, sparePeg, toPeg);
};

solveHanoi(5, "A", "B");
Program.assertEqual(hanoi.isSolved("B"),true);
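The hanoi object only exists inside the Khan Academy environment. Outside it, the same recursion can be sketched by recording each move; getSparePeg below is a hypothetical stand-in for hanoi.getSparePeg, and pushing onto moves stands in for hanoi.moveDisk:

```javascript
var moves = [];

// Stand-in for hanoi.getSparePeg: returns the peg, out of
// "A", "B", and "C", that is neither fromPeg nor toPeg.
var getSparePeg = function(fromPeg, toPeg) {
    var pegs = ["A", "B", "C"];
    for (var i = 0; i < pegs.length; i++) {
        if (pegs[i] !== fromPeg && pegs[i] !== toPeg) { return pegs[i]; }
    }
};

var solveHanoi = function(numDisks, fromPeg, toPeg) {
    if (numDisks === 0) { return; }  // base case: nothing to move
    var sparePeg = getSparePeg(fromPeg, toPeg);
    solveHanoi(numDisks - 1, fromPeg, sparePeg);  // clear the way
    moves.push(fromPeg + " -> " + toPeg);         // move the bottom disk
    solveHanoi(numDisks - 1, sparePeg, toPeg);    // stack the rest on top
};

solveHanoi(3, "A", "B");
console.log(moves.length);       // 7 moves: 2^3 - 1
console.log(moves.join(", "));   // A -> B, A -> C, B -> C, A -> B, C -> A, C -> B, A -> B
```

Moving n disks always takes 2^n − 1 moves, which is why the move count doubles (plus one) with each extra disk.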