Tuesday, March 24, 2009

The Pragmatic Programmer. Book authors: Andrew Hunt and David Thomas (notes)

Programmer Goals:
1. Learn at least one new language every year.
2. Read a technical book each quarter.
3. Read nontechnical books, too.
4. Take classes.
5. Participate in local user groups.
6. Experiment with different environments.
7. Stay current.
8. Get wired.

DRY - Don't Repeat Yourself.
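
A minimal Python sketch of DRY (the pricing rule and names are invented for illustration): every piece of knowledge - here, the discount rule - lives in exactly one place.

def discounted_price(price, rate=0.10):
    # Single authoritative definition of the discount rule.
    return price * (1 - rate)

def display_price(price):
    return f"${discounted_price(price):.2f}"

def charge_customer(price):
    return discounted_price(price)  # reuses the same rule instead of copy-pasting it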

The Law of Demeter for functions attempts to minimize coupling between modules in any given program.
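
For illustration, a hypothetical Python sketch of the Law of Demeter: an object talks only to its immediate collaborators instead of reaching through them (class and method names are made up).

class Wallet:
    def __init__(self, balance):
        self.balance = balance

    def deduct(self, amount):
        self.balance -= amount

class Customer:
    def __init__(self, wallet):
        self._wallet = wallet

    def pay(self, amount):
        # The customer mediates access to its own wallet.
        self._wallet.deduct(amount)

def checkout(customer, amount):
    # A Demeter-violating version would reach through the customer with
    # customer._wallet.deduct(amount); calling pay() keeps the coupling local.
    customer.pay(amount)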

Metadata is data about data. Put Abstractions in Code, Details in Metadata.
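
As a small sketch of that advice (file name, keys, and numbers are invented): the code expresses the abstraction "retry a failing operation", while the concrete retry count lives in metadata.

import json

def load_policy(path="policy.json"):
    # e.g. the file contains {"retries": 3}
    with open(path) as f:
        return json.load(f)

def fetch_with_retries(fetch, policy):
    # The code knows only the abstraction; the number of retries is metadata.
    for _ in range(policy["retries"]):
        try:
            return fetch()
        except IOError:
            continue
    raise IOError("all retries failed")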

MVC is typically taught in the context of GUI development, but it is really a general-purpose programming technique (a minimal sketch follows the list).
- Model. The abstract data representing the target object. The model has no direct knowledge of any views or controllers.
- View. A way to interpret the model. It subscribes to changes in the model and logical events from the controller.
- Controller. A way to control the view and provide the model with new data. It publishes events to both the model and the view.
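
A minimal, illustrative Python sketch of those three roles (the names and the print-based "view" are assumptions, not the book's code):

class Model:
    def __init__(self):
        self._value = None
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def set_value(self, value):
        self._value = value
        for notify in self._subscribers:   # no knowledge of concrete views
            notify(value)

class View:
    def __init__(self, model):
        model.subscribe(self.render)       # subscribes to model changes

    def render(self, value):
        print(f"view shows: {value}")

class Controller:
    def __init__(self, model):
        self._model = model

    def handle_input(self, raw):
        self._model.set_value(raw.strip()) # provides the model with new data

model = Model()
View(model)
Controller(model).handle_input("  hello  ")   # prints: view shows: hello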

The O() Notation
O() notation is a mathematical way of dealing with approximations. Some common O() notations (a quick numeric comparison follows the list):

O(1) - Constant
(access element in array, simple statements)
O(log(n)) - Logarithmic (binary search)
O(n) - Linear (sequential search)
O(n lg(n)) - Worse than linear, but not much worse
(average runtime of quicksort, heapsort)
O(n²) - Square law (selection and insertion sorts)
O(n³) - Cubic (multiplication of 2 n x n matrices)
O(Cⁿ) - Exponential (traveling salesman problem, set partitioning)
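
A quick, purely illustrative way to get a feel for these growth rates is to print how the step counts compare as n doubles:

import math

for n in (1_000, 2_000, 4_000):
    print(f"n={n:>5}  lg(n)={math.log2(n):5.1f}  "
          f"n*lg(n)={n * math.log2(n):10.0f}  n^2={n**2:12d}")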


Common Sense Estimation
Simple Loops:
If a simple loop runs from 1 to n, then the algorithm
is likely to be O(n) - time increases linearly with n.
Examples include exhaustive searches, finding the maximum value in an array, and generating checksums.
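
For example, a minimal Python version of finding the maximum value in an array (assumes a non-empty list):

def find_max(values):
    best = values[0]
    for v in values[1:]:     # touches each element once: time grows linearly with n
        if v > best:
            best = v
    return best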

Nested Loops:
If you nest a loop inside another, then your algorithm becomes
O(m x n), where m and n are the two loops' limits. This commonly
occurs in simple sorting algorithms, such as bubble sort,
where the outer loop scans each element in the array in turn,
and the inner loop works out where to place that element in the
sorted result. Such sorting algorithms tend to be O(n²).
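
A small bubble sort sketch in Python makes the two nested loops visible (a generic textbook version, not code from the book):

def bubble_sort(items):
    items = list(items)
    n = len(items)
    for i in range(n):                 # outer loop: one pass per element
        for j in range(n - 1 - i):     # inner loop: compare and swap neighbours
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items                       # roughly n * n comparisons: O(n²)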

Binary Chop:
If your algorithm halves the set of things it considers each time
around the loop, then it is likely to be logarithmic, O(lg(n)).
A binary search of a sorted list, traversing a binary tree, and
finding the first set bit in a machine word can all be O(lg(n)).
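
A standard binary search in Python, shown here as an illustration of the halving step:

def binary_search(sorted_items, target):
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:                    # each pass halves the remaining range
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1               # discard the lower half
        else:
            hi = mid - 1               # discard the upper half
    return -1                          # not found after about lg(n) steps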

Divide and Conquer:
Algorithms that partition their input, work on the two halves
independently, and then combine the results can be O(n lg(n)). The
classic example is quicksort, which works by partitioning the
data into two halves and recursively sorting each. Although
quicksort is technically O(n²), because its behavior degrades when
it is fed sorted input, its average runtime is O(n lg(n)).
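
A short quicksort sketch in Python (a simple list-copying version with a middle pivot, chosen for clarity rather than the in-place algorithm the complexity discussion usually assumes):

def quicksort(items):
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]
    smaller = [x for x in items if x < pivot]      # partition around the pivot...
    equal   = [x for x in items if x == pivot]
    larger  = [x for x in items if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)   # ...and recurse on each half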

Combinatoric:
Whenever algorithms start looking at the permutations of things, their
running times may get out of hand. This is because permutations involve
factorials (there are 5! = 5 x 4 x 3 x 2 x 1 = 120 permutations of the
digits from 1 to 5). Time a combinatoric algorithm for five elements: it will
take six times longer to run it for six, and 42 times longer for seven. Examples
include algorithms for many of the acknowledged hard problems - the traveling salesman problem, optimally packing things into a container, partitioning a set of numbers so that each set has the same total, and so on. Often, heuristics are used to reduce the running times of these types of algorithms in particular problem domains.
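
To make the factorial blow-up concrete, here is a hypothetical brute-force traveling salesman sketch in Python; with n cities it examines all n! orderings (the distance function is supplied by the caller):

from itertools import permutations

def shortest_tour(cities, distance):
    best_order, best_cost = None, float("inf")
    for order in permutations(cities):              # n! candidate tours
        cost = sum(distance(order[i], order[i + 1])
                   for i in range(len(order) - 1))
        cost += distance(order[-1], order[0])       # return to the starting city
        if cost < best_cost:
            best_order, best_cost = order, cost
    return best_order, best_cost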
