CSC201: Introduction to Computer Programming I
What is a Computer?
A computer is a tool or machine used for processing data to produce required
information. It is capable of:
a. taking input data through the keyboard (input unit)
b. storing the input data in a diskette, hard disk or other medium
c. processing the data in the central processing unit (CPU), and
d. giving out the result (output) on the screen or the Visual Display Unit (VDU).
Generations of Computers
(i.) 1st Generation
• 1946-1959.
• Started with the use of vacuum tubes
• Tubes, like electric bulbs, produced a lot of heat
• very expensive; could be afforded only by very large organizations,
• expensive to power,
• huge size,
• supported machine language only, not portable,
• often unreliable, slow, limited internal storage, and punched cards were used to enter data
• batch processing operating systems were used.
• Examples of computers of this generation were: ENIAC, EDVAC, UNIVAC, IBM-701, IBM-650, etc.
(iv.) 4th Generation
• 1971-1980.
• Use of Very Large Scale Integration (VLSI) circuits
• More powerful, compact, reliable, and affordable. As a result, it gave rise to the personal
computer (PC) revolution.
• Examples: DEC 10, STAR 1000, PDP 11, CRAY-1 (Super Computer), CRAY-X-MP (Super Computer), etc.
(v.) 5th Generation
• 1980-till date.
• production of microprocessor chips
• more user friendly interfaces with multimedia features,
• availability of very powerful and compact computers at cheaper rates.
• Some computer types of this generation are: Desktop, Laptop, NoteBook, UltraBook, and
ChromeBook.
Hardware & Software
Hardware refers to the physical and tangible components of the computer, while software refers to the non-physical components: the programs that run on the hardware.
(i.) System Software
• Collection of programs designed to operate and control the computer
• Typically written in low-level, machine-oriented languages
Examples
• Operating System
• Compilers
• Interpreters
• Assemblers
What is Programming?
A computer program is a series of instructions given to the computer at a particular time to solve a
problem. It is a set of codes that instructs the computer to carry out some processes. It is usually written
in a particular programming language, and named according to the language used in writing it.
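As a minimal sketch (Python is used here, and in the examples that follow, purely for illustration; any programming language would do), here is a complete three-step program that takes input, processes it, and gives output:

    name = input("Enter your name: ")    # input (keyboard)
    greeting = "Hello, " + name + "!"    # processing (CPU)
    print(greeting)                      # output (screen/VDU)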
Programming Languages
Programming languages are languages through which we can instruct the computer to carry out some
processes or tasks; they are the medium through which we pass instructions to the system.
They are also designed to communicate ideas about algorithms between human beings and computers.
Programming languages can be used to express a wide range of algorithms; that is, the same task
can be accomplished through more than one procedure of execution, as the sketch below illustrates.
Programming is the process of writing programs.
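For instance, the sketch below shows two different procedures, written in Python, that solve the same problem of summing the integers from 1 to n (the function names are invented for this example):

    def sum_by_loop(n):
        # Procedure 1: repeated addition, one integer at a time.
        total = 0
        for i in range(1, n + 1):
            total += i
        return total

    def sum_by_formula(n):
        # Procedure 2: the closed-form formula n(n + 1)/2.
        return n * (n + 1) // 2

    print(sum_by_loop(100))     # 5050
    print(sum_by_formula(100))  # 5050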
Machine Language:
Machine language is a set of binary-coded instructions consisting of zeros
(0) and ones (1). Machine language is peculiar to each type of computer: the first generation of
computers was coded in machine language specific to each model of computer.
Some of the shortcomings of the machine language were:
1. Coding in machine language was a very tedious and boring job
2. Machine language was not user-friendly; the user had to remember a long list of codes,
numbers or operation codes and know where instructions were stored in computer memory.
3. Debugging any set of codes is a very difficult task since it requires going through the program
instructions from the beginning to the end.
The major advantage of machine language is that it requires no translation since it is already in machine
language and is therefore faster to execute.
The major advantage of assembly language is that programs written in it are easier to read and more
user-friendly than those written in machine language, especially when comments are inserted in the
code.
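Python bytecode is not machine language, but its built-in disassembler gives a feel for how a single high-level statement expands into several low-level, mnemonic-style instructions of the kind assembly programmers write by hand (the exact listing varies between Python versions):

    import dis

    # Show the low-level instructions behind one high-level statement.
    dis.dis("total = price * quantity")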
Stages of Programming
The preparation of a computer program involves a set of procedures. These can be classified into
eight major stages, viz:
(i) Problem Definition
(ii) Devising the method of solution
(iii) Developing the method using suitable aids, e.g. pseudo code or flowchart.
(iv) Writing the instructions in a programming language
(v) Transcribing the instructions into “machine sensible” form
(vi) Debugging the program
(vii) Testing the program
(viii) Documenting all the work involved in producing the program.
1. Problem definition
The first stage requires a good understanding of the problem. The programmer (i.e. the
person writing the program) needs to understand thoroughly what is required of the
problem. A complete, precise and unambiguous statement of the problem to be
solved must be produced. This entails a detailed specification which lays down the input,
processes and output required.
2. Devising the method of solution
The second stage is spelling out the detailed algorithm. Using a
computer to solve a problem (be it a scientific or a business data-processing problem)
requires that a procedure or an algorithm be developed for the computer to follow in solving the
problem.
3. Developing the method of solution
There are several methods for representing or developing methods used in solving a problem.
Examples of such methods are: algorithms, flowcharts, pseudo code, and decision tables.
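As a small illustration, here is a method developed first in pseudocode (in the style of Example 2 at the end of these notes) and then written out in Python:

    # Pseudocode:
    #   Step 1: Start
    #   Step 2: Input A, B
    #   Step 3: AVG = (A + B) / 2
    #   Step 4: Output AVG
    #   Step 5: Stop

    a = float(input("Enter the first number: "))
    b = float(input("Enter the second number: "))
    avg = (a + b) / 2
    print("AVG =", avg)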
4. Writing the instructions in a programming language
After outlining the method of solving the problem, a proper understanding of the syntax of the
programming language to be used is necessary in order to write the series of instructions required
to get the problem solved.
5. Transcribing the instructions into machine sensible form
After the program is coded, it is converted into machine-sensible form, or machine language.
Manufacturer-written programs that translate a user's program (the source program) into
machine language (the object code) are called translators. Compilers translate the whole source
program into instructions that the machine can execute at a go, while interpreters accept a
program and execute it line by line.
During translation, the translator carries out a syntax check on the source program to detect errors
that may arise from wrong use of the programming language, as the sketch below illustrates.
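Python is interpreted, but its built-in compile() function performs exactly this kind of syntax check; in the sketch below, a deliberately ill-formed source string is rejected instead of being turned into executable code:

    source = "total = (3 + 4"   # missing closing parenthesis

    try:
        compile(source, "<example>", "exec")
    except SyntaxError as err:
        print("Syntax error detected:", err)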
6. Program debugging
A program seldom executes successfully at the first attempt; it normally contains a few errors (bugs).
Violations of the rules of the programming language, or pitfalls of computer arithmetic, are referred
to as errors or program errors. Debugging is the process of locating and correcting errors. During
program execution, detected errors are listed and corrected by the programmer, and the program is
re-run. After corrections have been made, the program is again read into the computer and again
processed by the language translator. This is repeated over and over until the program is error-free.
There are three classes of errors.
(i.) Syntax errors: caused by mistakes in coding (illegal use of a feature of the programming
language), such as misspelled statements or instructions, or the wrong use or non-use of
punctuation marks (commas, colons, etc.) where necessary. When such errors are detected,
the programmer should examine the program for possible spelling errors or omitted
punctuation marks.
(ii.) Logic errors: Caused by faulty logic in the design of the program. The program will work
but not as intended.
(iii.) Execution errors: The program works as intended but illegal input or other
circumstances at run-time makes the program stop. E.g.
– Conversion Error: numbers are mostly stored in their binary form. The decimal number 0.5
is represented exactly by the binary number 0.1, but the number 1/10 is represented by
0.0001100110011……, which has no finite binary representation. Depending on the computer
word length, such errors become noticeable as the arithmetic operations accumulate (see the
sketch after this list).
– Round-off Error: this occurs when a fixed number of significant digits is used; for example, 1/3,
which is supposed to be 0.3333333….., could be stored as 0.333.
– Run-time Error: this error occurs when the translator comes across an illegal mathematical
operation during program execution. This could occur, for example, when a number is
divided by zero.
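The following Python sketch demonstrates all three execution errors described above:

    # Conversion error: 1/10 has no finite binary representation,
    # so 0.1 is stored only approximately.
    print(0.1 + 0.2)          # 0.30000000000000004, not 0.3

    # Round-off error: keeping a fixed number of significant digits.
    third = round(1 / 3, 3)   # stored as 0.333, not 0.33333...
    print(third)

    # Run-time error: an illegal operation detected during execution.
    try:
        result = 10 / 0
    except ZeroDivisionError as err:
        print("Run-time error:", err)   # division by zero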
7. Program testing
The purpose of testing is to determine whether a program consistently produces correct or expected
results. A program is normally tested by executing it with a given set of input data (called test
data), for which correct results are known.
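A minimal sketch of this idea in Python (the function and the test data are invented for illustration):

    def difference(n1, n2):
        return n1 - n2

    # Test data for which the correct results are known in advance.
    test_cases = [
        ((5, 3), 2),
        ((3, 5), -2),
        ((0, 0), 0),
    ]

    for (n1, n2), expected in test_cases:
        actual = difference(n1, n2)
        assert actual == expected, f"difference({n1}, {n2}) gave {actual}"

    print("All tests passed.")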
8. Program documentation:
Documentation of the program should be developed at every stage of the programming cycle. The
following are documentation tasks that should be carried out for each program.
(iv.) It provides information as to the use of the program to those unfamiliar with it.
(v.) It provides operating instructions to the computer operator.
Qualities of a Good Program
i. Program Correctness: a good program must solve the intended problem and produce relevant
results. The output (result) must be readily available for checking against assumed or calculated
results using real or dummy data.
ii. Documentation: every module or procedure must be preceded with comments giving a brief
explanation of the module in the program. This makes the program easy to read and understand by
other users who may want to modify or improve it. Complete documentation of the
whole program is also necessary to give details of the input, output, processing tasks and manual
guide.
iii. Robustness and Scalability: programs that can survive various unexpected events are said to
be robust, and those that can easily be upgraded are scalable. Robust programs are sometimes called
safe or defensive programs because of the way they are written, the choice of variable names,
surviving incorrect data, etc. (a sketch combining documentation and defensive checks follows this list).
iv. User Interface: the look and design of the medium of interaction between the user and the program
must be well taken care of so as to give a good user interface. This is the part of the program that
carries out the dialogue with the user, and it must be easy and friendly to use.
v. Program Style: the rules of the programming language are to be adhered to in writing programs.
You should not work against the conventions of the programming language in question.
vi. Use of Tools and Library Functions: programming languages have special tools and libraries
that can assist in developing computer programs. Some editors are also used to enhance the
development of programs.
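The sketch below (an invented example) combines two of these qualities: the procedure is preceded by documentation, and defensive checks let it survive incorrect data:

    def safe_average(values):
        """Return the arithmetic mean of a non-empty list of numbers.

        Input : values -- a non-empty list of ints or floats.
        Output: the mean, as a float.
        """
        # Defensive checks: report bad input clearly instead of
        # failing later with an unexplained error.
        if not values:
            raise ValueError("values must be a non-empty list")
        if not all(isinstance(v, (int, float)) for v in values):
            raise TypeError("every item in values must be a number")
        return sum(values) / len(values)

    print(safe_average([2, 4, 6]))   # 4.0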
Characteristics of Algorithms
The following are the major considerations in the design of algorithms (a sketch illustrating them follows the list):
• An algorithm must have a beginning and an end
• The non- ambiguity requirement for each step of an algorithm cannot be compromised.
• The range of inputs for which an algorithm works has to be specified carefully
• The same algorithm can be represented in several different ways
• Several algorithms for solving the same problem may exist
• Algorithms for the same problem can be based on very different ideas and can solve the problem
with dramatically different speeds
• It must terminate within a reasonable period of time.
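Euclid's algorithm for the greatest common divisor, sketched below in Python, shows these characteristics in miniature: a definite beginning and end, unambiguous steps, a carefully specified range of inputs, and guaranteed termination:

    def gcd(a, b):
        # Specified range of inputs: positive integers only.
        if a <= 0 or b <= 0:
            raise ValueError("a and b must be positive integers")
        # Each pass strictly shrinks b, so the loop must terminate.
        while b != 0:
            a, b = b, a % b
        return a

    print(gcd(48, 18))   # 6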
Flowcharts
When you draw a flowchart, you should use industry-standard shapes to represent each step in the
process. You usually draw the flow from top to bottom or from left to right. Arrows connect the shapes
to define the flow.
Flowchart Symbols
Flowcharts are drawn with the help of symbols. The following are the most commonly used flowchart
symbols and their functions:
• Terminal (oval): marks the start or end of the program
• Process (rectangle): a computation or processing step
• Input/Output (parallelogram): data entering or results leaving the program
• Decision (diamond): a point where the flow branches on a yes/no question
• Connector (small circle): links separate parts of the chart
• Flow lines (arrows): show the direction of flow
Example 2: Find the difference and the division of two numbers and display the results.
Algorithm:
Step 1: Start
Step 2: Input N1
Step 3: Input N2
Step 4: D = N1 – N2
Step 5: V = N1 / N2
Step 6: Output D
Step 7: Output V
Step 8: Stop
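One possible Python rendering of this algorithm (the check on N2 is an addition, guarding against the division-by-zero run-time error discussed earlier):

    n1 = float(input("Input N1: "))   # Step 2
    n2 = float(input("Input N2: "))   # Step 3

    d = n1 - n2                       # Step 4: difference
    print("D =", d)                   # Step 6

    if n2 != 0:
        v = n1 / n2                   # Step 5: division
        print("V =", v)               # Step 7
    else:
        print("V is undefined because N2 is zero.")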