Friday, 30 November 2012

3-1-4: Turing Machines

Background:
In the early 20th century, mathematicians wondered whether there existed a process that could determine, in a finite number of operations, whether any mathematical statement was true or false. This challenge intrigued Alan Turing, and in the mid-1930s he set about devising an abstract model of computation with which he hoped to decide whether an effective procedure could be found to give a yes or no answer to the Entscheidungsproblem (the 'decision problem' posed by David Hilbert). Essentially, the effective procedure would be an algorithm to solve this problem.

Solvable and Unsolvable Problems:

  • In 1930, Kurt Gödel showed that there are mathematical statements that are impossible to prove or disprove.
  • In 1936, Alan Turing wrote a paper called 'On Computable Numbers'. This paper introduced the concept of the Turing Machine - a universal machine capable of solving any problem that can be represented algorithmically.
In his paper Turing modelled a human 'computer': computing is normally done by writing symbols on paper, and Turing imagined this paper divided into squares, like a child's arithmetic book.

How it Relates to Computers:

The work done by Turing formed the basis of the first computers.

The Turing Machine consists of the following (see the sketch after this list):
  • an input alphabet
  • a tape that is infinitely long
  • an output alphabet (that can be printed on the tape)
  • a tape-head that is used to read/write to the cells
  • a finite set of states - including a start state
  • a program (set of instructions)
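As a rough sketch of how these parts fit together, here is a tiny simulator in Python. The machine itself is hypothetical - its states, symbols and program are invented purely for illustration (it just flips 0s and 1s and halts on a blank); only the overall structure of tape, tape-head, states and program reflects the list above.

```python
# A minimal Turing machine simulator (illustrative sketch only).
# The machine below is hypothetical: it flips every 0 to 1 and every 1 to 0,
# then halts when it reads a blank cell.

BLANK = ' '

# Program: (current state, symbol read) -> (symbol to write, move, next state)
program = {
    ('start', '0'): ('1', 'R', 'start'),
    ('start', '1'): ('0', 'R', 'start'),
    ('start', BLANK): (BLANK, 'R', 'halt'),
}

def run(tape_input, start_state='start', halt_state='halt'):
    tape = dict(enumerate(tape_input))   # the "infinite" tape, stored sparsely
    head = 0                             # tape-head position
    state = start_state
    while state != halt_state:
        symbol = tape.get(head, BLANK)
        write, move, state = program[(state, symbol)]
        tape[head] = write
        head += 1 if move == 'R' else -1
    return ''.join(tape[i] for i in sorted(tape)).strip()

print(run('0110'))   # prints 1001
```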

Friday, 23 November 2012

3-1-3: Finite State Machines

A Finite State Machine (FSM) accepts an input and produces an output. FSMs are used extensively in computer science applications such as:
  • Traffic light control systems
  • Specifying a computer language
  • Programs for spell checking
  • Networking protocols
State Transition Diagrams: A state transition diagram is used to show an FSM as a picture.

Finite State Automata: A type of FSM that has a finishing state and produces only a YES or NO output, based on the final state.
e.g. State 2 = YES, any other state = NO

(in the example: input 0101 gives output NO; input 110 gives output YES)
FSM With Outputs:

Mealy Machines - the outputs are on the transitions/arrows (L = Lines).

Moore Machines - the outputs are on the circles/states (O = Circles).



Mealy Machine example: a transition labelled 0|A means the input is 0 and the output is A.
(input = 110, output = BBA)

Moore Machine example: C is the output on state 1.
(input = 110, output = CBCA)
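A rough Python sketch of the Moore machine example. It assumes the transitions match the transition table further down the page, and the state outputs (S1 = C, S2 = A, S3 = B) are inferred so that input 110 gives CBCA as above:

```python
# Moore machine sketch: outputs belong to the states, not the transitions.
# Transitions are assumed to match the transition table below; the state
# outputs (S1 -> C, S2 -> A, S3 -> B) are inferred from the CBCA example.

transitions = {
    ('S1', '0'): 'S2', ('S1', '1'): 'S3',
    ('S2', '0'): 'S3', ('S2', '1'): 'S3',
    ('S3', '0'): 'S1', ('S3', '1'): 'S1',
}
state_output = {'S1': 'C', 'S2': 'A', 'S3': 'B'}

def run_moore(inputs, state='S1'):
    output = state_output[state]      # a Moore machine emits an output in every state
    for symbol in inputs:
        state = transitions[(state, symbol)]
        output += state_output[state]
    return output

print(run_moore('110'))   # prints CBCA
```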






Transition Table:

Shows the states and transitions in an FSM in a table format:

Current State | S1 | S1 | S2 | S2 | S3 | S3
Input         |  0 |  1 |  0 |  1 |  0 |  1
Next State    | S2 | S3 | S3 | S3 | S1 | S1
Output        |  A |  B |  C |  B |  C |  B



Transition Function:
Shows the states and transitions as a series of functions.
(input symbol, current state) -> (output symbol, next state)

(0, S1) -> (A, S2)
(1, S1) -> (B, S3)
etc.
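The transition function translates almost directly into a dictionary lookup. A minimal Python sketch (assuming S1 is the start state) that runs the table above and reproduces the Mealy machine example, where input 110 gives output BBA:

```python
# Transition function of the table above as a dictionary:
# (input symbol, current state) -> (output symbol, next state)
delta = {
    ('0', 'S1'): ('A', 'S2'), ('1', 'S1'): ('B', 'S3'),
    ('0', 'S2'): ('C', 'S3'), ('1', 'S2'): ('B', 'S3'),
    ('0', 'S3'): ('C', 'S1'), ('1', 'S3'): ('B', 'S1'),
}

def run_mealy(inputs, state='S1'):
    output = ''
    for symbol in inputs:
        out, state = delta[(symbol, state)]   # the output is produced on the transition itself
        output += out
    return output

print(run_mealy('110'))   # prints BBA
```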


Deterministic and Non-Deterministic FSA

Deterministic - exactly one transition arrow for each input from each state.

Non-deterministic - can have multiple transitions for the same input from a state, so there is more than one possible route around the state transition diagram (see the sketch below).
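One common way to simulate a non-deterministic FSA is to keep track of every state the machine could currently be in. The machine below is hypothetical (its states and transitions are invented purely to show the idea):

```python
# Simulating a non-deterministic FSA by tracking the set of possible states.
# Hypothetical machine: from state A, input 1 can lead to either A or B.
nfa = {
    ('A', '0'): {'A'},
    ('A', '1'): {'A', 'B'},   # more than one transition for the same input
    ('B', '1'): {'C'},
}
accepting = {'C'}             # finishing state: end here and the answer is YES

def accepts(inputs, start='A'):
    current = {start}         # every state the machine could be in so far
    for symbol in inputs:
        current = set().union(*(nfa.get((s, symbol), set()) for s in current))
    return bool(current & accepting)

print(accepts('011'))   # True  - one possible route is A -> A -> B -> C
print(accepts('010'))   # False - no route ends in the accepting state
```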


TERMINOLOGY:

Halting State - a state with no outgoing transitions.

Trigger - an input which triggers an action.

Thursday, 15 November 2012

3-1-2: Comparing Algorithms

Constant Complexity O(1) - takes the same amount of time no matter how big the problem is.
e.g. sorting the first ten records in a file - no matter how many records the file contains, it takes the same amount of time to sort the first ten.

Linear Complexity O(n) - as the size of the problem increases, the time taken to solve the problem increases at the same rate.
e.g. mowing the lawn - a 2m strip of lawn takes half the time of a 4m strip.

Logarithmic Complexity O(log n) - as the size of the problem grows, the time taken to solve it also increases, but at a slower rate.
e.g. a binary search - you start by looking at the middle item of a sorted list. If this is not the item being searched for, you look at the middle item of the half of the list in which the item would have to be (if it is in the list at all). You keep doing this until the item is found, or you discover that it is not in the list. The size of the list being searched halves with each comparison, as the sketch below shows.
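A minimal Python sketch of that binary search, which also counts the comparisons it makes (the repeated halving is what gives the O(log n) behaviour):

```python
# Binary search over a sorted list, counting comparisons.
def binary_search(items, target):
    low, high = 0, len(items) - 1
    comparisons = 0
    while low <= high:
        mid = (low + high) // 2      # look at the middle item
        comparisons += 1
        if items[mid] == target:
            return mid, comparisons
        elif items[mid] < target:
            low = mid + 1            # the item can only be in the upper half
        else:
            high = mid - 1           # the item can only be in the lower half
    return None, comparisons         # the item is not in the list

items = list(range(0, 2000, 2))      # 1,000 sorted even numbers
print(binary_search(items, 1234))    # found in about 10 comparisons (log2 of 1000)
print(binary_search(items, 1235))    # (None, ...) - an odd number is not in the list
```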

Polynomial Complexity O(n^k) - as the size of the problem grows, the time taken to solve it grows faster, but the problem can still be solved in a reasonable amount of time.
e.g. shaking hands with everyone in the room - with two people in the room, only 1 handshake is needed. With 3 people, 3 handshakes are needed. With 4 people, 6 handshakes, and so on: n people need n(n-1)/2 handshakes, which grows as O(n^2).

Exponential Complexity O(k^n) - as the size of the problem grows, the time taken to solve it grows by larger and larger amounts. Only small versions of these problems can be solved in a reasonable amount of time.
e.g. the Emperor's Chess Board problem.

Factorial Complexity O(n!) - even worse than exponential; cannot be solved in a reasonable time.
e.g. the Travelling Salesman Problem.
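To see why the last two classes become impractical so quickly, here is a short sketch that prints a rough step count for each class at a few problem sizes (taking k = 2 for the polynomial and exponential cases):

```python
import math

# Rough number of steps for each complexity class (using k = 2 where a k is needed).
growth = {
    'O(1)':     lambda n: 1,
    'O(log n)': lambda n: math.log2(n),
    'O(n)':     lambda n: n,
    'O(n^2)':   lambda n: n ** 2,
    'O(2^n)':   lambda n: 2 ** n,
    'O(n!)':    lambda n: math.factorial(n),
}

for n in (5, 10, 20):
    print(f'n = {n}')
    for name, steps in growth.items():
        print(f'  {name:<8} {steps(n):,.0f}')
```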

Monday, 12 November 2012

3-1-1: Information Hiding and Abstraction

Information Hiding:
Hiding the design complexity of a system to simplify it for the user.
  • Real world example of information hiding: driving a car.
  • Computing example of information hiding: Windows

Abstraction:
The process of removing unnecessary details (information hiding) in the representation of a system.
  • Real world example of abstraction: map of an Underground system.
Abstraction in Computing= generalising.

Representation - used to reduce the complexity of a problem to a simple design.


Levels of Abstraction:
  • Application: hides very complex processes behind an icon.
  • High Level Programming Language: hides a number of low-level commands behind a high-level command.
  • Machine Code: hides the complexity of the underlying electronics behind a machine code command.
  • Electronics: hides the physical processes of atoms and electrons behind basic electronic components.