Test 1 Flashcards Preview


Flashcards in Test 1 Deck (45)

What is the CPU?

Central Processing Unit, or the "brain" of the computer. Controls all other computer functions. Also known as the microprocessor or simply the processor.


What is the BUS?

Computer components are connected by a bus: a group of parallel wires that carries signals and data between components.


What is main memory?

Main memory is made up of capacitors. If a capacitor is charged, its state is 1 (ON, or set). If not, its state is 0 (OFF, reset, or cleared).


How is memory divided?

Memory is divided into cells, where each cell contains 8 bits (each a 1 or 0), a group called a byte. Each cell is uniquely numbered, and that number is called its address.


What kind of storage is Main Memory?

Volatile, meaning if power is lost the information in main memory is lost.


How do other components access information in Main Memory?

Other components can retrieve the information at a particular address in main memory, known as a READ; this does not alter the contents. Other components can also store information at a particular address, known as a WRITE, which does alter the contents of main memory.
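A minimal sketch of these two operations, using a Python bytearray to stand in for main memory (the addresses and values here are made up for illustration):

```python
# Model main memory as an array of byte-sized cells, indexed by address.
memory = bytearray(256)  # 256 cells, all initially 0 (cleared)

def write(address, value):
    """WRITE: store a value at an address; this alters the contents of memory."""
    memory[address] = value

def read(address):
    """READ: fetch the value at an address; the contents are left unchanged."""
    return memory[address]

write(0x10, 42)
print(read(0x10))  # the cell at address 0x10 now holds 42
print(read(0x10))  # reading again returns the same value: READ does not alter memory
```
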


What kind of access is MM, what does this mean?

Random access (or direct access) means any memory address can be reached in the same amount of time, because we can go directly to the address we want without starting from address 0 and reading everything until we reach the one we want (sequential access).
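The contrast can be sketched in Python (an illustrative model, not how hardware is actually implemented): direct access jumps straight to a cell in one step, while sequential access must walk every cell before it.

```python
cells = list(range(100, 200))  # pretend memory: cell i holds the value 100 + i

def direct_read(address):
    """Random/direct access: one step, regardless of the address."""
    return cells[address]

def sequential_read(address):
    """Sequential access: start at cell 0 and step forward until we arrive."""
    position = 0
    for value in cells:
        if position == address:
            return value
        position += 1

# Both return the same value, but sequential_read's work grows with the address.
print(direct_read(73), sequential_read(73))
```
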


What are some kinds of Secondary Storage Media?

Disks (floppy, hard, removable)
This kind of storage is called persistent (permanent) because it is nonvolatile.


What are some I/O Devices?

Also known as peripheral devices.
Keyboard, mouse, monitor, disk drive, CD or DVD drive, printer, scanner.


What are the kinds of languages?

Machine languages: machine dependent; a set of binary strings, each a fixed-length sequence of 0s and 1s.

Assembly languages: English-like abbreviations that represent the computer's operations.

Example: ADD A Reg1

High-level languages (HLLs): assembly languages are still too cumbersome even for simple operations like C = A + B. An HLL has structures easily recognized by humans.
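Python's `dis` module makes this contrast concrete: the single HLL statement `c = a + b` compiles into several simpler instructions, much as a compiler would emit a load/add/store sequence in assembly. (The exact opcode names printed vary between Python versions.)

```python
import dis

def add(a, b):
    c = a + b   # one HLL statement...
    return c

# ...disassembles into several lower-level instructions (loads, an add, a store):
dis.dis(add)
print(add(10, 20))
```
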


What are some high level languages?

Fortran: IBM, 1950s. Scientific/engineering computation.

COBOL (Common Business Oriented Language): late 1950s. Databases.

C: 1970s, at Bell Labs. A systems language; many modern OSs are written in C.

C++: 1980s.

Java: early 1990s, Sun Microsystems. Written using C++.


What is the process of running a normal HLL?

The HLL program is turned into assembly language by the compiler, the assembler then turns it into machine code, and the loader loads it into memory to run.


How does Java run programs?

The compiler turns the program into bytecode, which is then loaded into and executed by the Java Virtual Machine, which in turn runs on the real machine.


What does readability mean?

How easy is it to read and understand programs written in the programming language?

Arguably the most important criterion!

Factors affecting readability include:

Simplicity: too many features are bad

Orthogonality: small set of primitive constructs combinable in a small number of ways to build the language’s control and data structures

Makes the language easy to learn and read

Meaning is context independent

Control statements

Data type and structures

Syntax considerations
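Orthogonality's "context-independent meaning" can be illustrated with a small Python sketch (an illustrative example, not from the original card): the same expression construct composes identically in every context it appears in.

```python
# In an orthogonal design, a construct means the same thing in every context.
# Here the expression `x + 1` behaves identically as a plain value, a function
# argument, a list element, and a subscript:
x = 4
value = x + 1            # as a plain value
as_arg = abs(x + 1)      # as a function argument
in_list = [x + 1, 0]     # as a list element
data = list(range(10))
indexed = data[x + 1]    # as a subscript
print(value, as_arg, in_list[0], indexed)  # same meaning of `x + 1` throughout
```
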


What does writability mean?

How easy is it to write programs in the language?

Factors affecting writability:

Simplicity and orthogonality

Support for abstraction


Fit for the domain and problem


What are the factors in reliability?

Type checking

Exception handling

Readability and writability

What are the categories of cost?

Programmer training

Software creation

Compiler cost

Poor reliability



What other concepts are important in determining the usefulness of a PL?

Good fit for hardware (e.g., cell) or environment (e.g., Web)



What are von Neumann Machines?

John von Neumann is generally considered to be the inventor of the "stored program" machine, the class to which most of today's computers belong.
One CPU + one memory system that contains both program and data
Focus on moving data and program instructions between registers in CPU to memory locations
Fundamentally sequential


What is the von Neumann Diagram?

[Diagram: bidirectional arrows connect Memory to the Control and Arithmetic units; Control directs the Arithmetic unit, which feeds Output.]



What were some PL Methodologies through the decades?

50s and early 60s: Simple applications; worry about machine efficiency
Late 60s: people efficiency became important; readability, better control structures, maintainability
Late 70s: Data abstraction
Middle 80s: Object-oriented programming
90s: distributed programs, Internet
00s: Web, user interfaces, graphics, mobile, services
10s: parallel computing, cloud computing?, pervasive computing?, semantic computing?, virtual machines?


What are the big 4 PL Paradigms? What are some others?

The big four PL paradigms:
Imperative or procedural (e.g., Fortran, C)
Object-oriented (e.g. Smalltalk, Java)
Functional (e.g., Lisp, ML)
Rule-based (e.g., Prolog, Jess)

Scripting (e.g., Python, Perl, PHP, Ruby)
Constraint (e.g., ECLiPSe)
Concurrent (e.g., Occam)


What are some language design tradeoffs?

Reliability versus cost of execution
Ada, unlike C, checks all array indices to ensure proper range but has very expensive compilation

Writability versus readability
(2=+⌿0=T∘.|T)/T←⍳N

An APL one-liner producing the prime numbers from 1 to N; obscure to all but the author

Flexibility versus safety
C, unlike Java, allows one to do arithmetic on pointers; see "worse is better"


What are some implementation methods?

Direct execution by hardware
e.g., native machine language

Compilation to another language
e.g., C compiled to Intel Pentium 4 native machine language

Interpretation: direct execution by software
e.g., csh, Lisp, Python, JavaScript

Hybrid: compilation then interpretation
Compilation to another language (aka bytecode), then interpreted by a ‘virtual machine’, e.g., Java, Perl

Just-in-time compilation
Dynamically compile some bytecode to native code (e.g., V8 JavaScript engine)


What are some implementation issues?

1. Complexity of compiler/interpreter (interpreted best)
2. Translation speed (interpreted best)
3. Execution speed (compiled best)
4. Code portability (interpreted best)
5. Code compactness (interpreted best)
6. Debugging ease (interpreted best)


Who are some of the first programmers?

Jacquard loom of early 1800s
Translated card patterns into cloth designs
Charles Babbage’s analytical engine (1830s & 40s)
Programs were cards with data and operations. Steam powered!
Ada Lovelace – first programmer


Who was Konrad Zuse?

Konrad Zuse began work on Plankalkül ("plan calculus"), the first algorithmic programming language, with the aim of creating the theoretical preconditions for formulating problems of a general nature.
Seven years earlier, Zuse had developed and built the world's first binary digital computer, the Z1. He completed the first fully functional program-controlled electromechanical digital computer, the Z3, in 1941.
Only the Z4, the most sophisticated of his creations, survived World War II.


Who was Von Neumann?

Von Neumann led a team that built computers with stored programs and a central processor
ENIAC was programmed with patch cards


What were machine codes? What was wrong with them?

Initial computers (1940s) were programmed in raw machine codes.
These were entirely numeric.
What was wrong with using machine code? Everything!
Poor readability
Poor modifiability
Expression coding was tedious
Inherited the deficiencies of the hardware, e.g., no indexing or floating-point numbers


What were some Pseudocodes?

Short Code or SHORTCODE - John Mauchly, 1949.
Pseudocode interpreter for math problems, on Eckert and Mauchly’s BINAC and later on UNIVAC I and II.
Possibly the first attempt at a higher level language.
Expressions were coded, left to right, e.g.:
X0 = sqrt(abs(Y0))
00 X0 03 20 06 Y0
Some operations:
01 – 06 abs 1n (n+2)nd power
02 ) 07 + 2n (n+2)nd root
03 = 08 pause 4n if <= n
04 / 09 ( 58 print & tab
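A toy Python sketch of how such a coded expression might be decoded, using only the operation codes shown above. The treatment of 00 as padding and the inside-out decoding scheme are assumptions for illustration, not a faithful Short Code implementation:

```python
import math

# A small subset of the operation codes from the table above.
OPS = {"06": "abs", "20": "sqrt"}  # 20 = "2n" with n=0: the 2nd root

def decode(words, env):
    """Decode a coded statement like '00 X0 03 20 06 Y0' into an assignment."""
    tokens = [w for w in words.split() if w != "00"]  # assume 00 is padding
    target, eq, *rest = tokens
    assert eq == "03"                  # 03 means '='
    value = env[rest[-1]]              # innermost operand is the variable
    for code in reversed(rest[:-1]):   # apply the operations inside-out
        value = abs(value) if OPS[code] == "abs" else math.sqrt(value)
    env[target] = value
    return value

env = {"Y0": -16.0}
decode("00 X0 03 20 06 Y0", env)       # i.e., X0 = sqrt(abs(Y0))
print(env["X0"])                       # sqrt(abs(-16.0)) = 4.0
```
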

More Pseudocodes
Speedcoding, 1953-54
A pseudocode interpreter for math on the IBM 701 and IBM 650.
Developed by John Backus
Pseudo-ops for arithmetic and math functions
Conditional and unconditional branching
Auto-increment registers for array access
Slow, but execution time was still dominated by the slowness of software math
The interpreter left only 700 words of memory for the user program
Laning and Zierler System - 1953
Implemented on the MIT Whirlwind computer
First "algebraic" compiler system
Subscripted variables, function calls, expression translation
Never ported to any other machine