
Module 1 (Part-1)

The document provides an overview of computers, their characteristics, advantages, and disadvantages, as well as the evolution of computing technology from early counting devices to modern AI and quantum computing. It details the generations of computers, highlighting key developments and pioneers such as Charles Babbage and Alan Turing. Additionally, it explains the architecture of computers, including the CPU, memory units, and the role of various components in processing data.


Fundamentals of Computers and Computational Thinking

COMPUTER
A computer is an electronic device that receives, processes, and stores data to produce the desired output.
The term 'computer' is derived from the word 'compute', which means to calculate.
1.2 CHARACTERISTICS OF COMPUTERS
• Speed: Perform millions of instructions per second, completing tasks far faster than humans.
• Accuracy: Deliver precise results with minimal errors if the input is correct.
• Automation: Execute programmed tasks automatically without human intervention.
• Storage: Store and quickly retrieve large amounts of data in various formats.
• Versatility: Handle a wide range of tasks, from simple to complex.
• Diligence: Work continuously without fatigue or loss of efficiency.
• Connectivity: Link to the internet and other systems for global communication and data sharing.
• Consistency: Follow instructions exactly, producing the same results every time.
• High Storage Capacity: Store vast amounts of information beyond human capability.
• Control: Operate under software instructions to manage tasks efficiently.
1.3 ADVANTAGES OF COMPUTER
• Multitasking: Perform multiple tasks and complex calculations in seconds.
• Speed: Complete tasks at incredible speeds, far faster than humans.
• Cost-Effective Storage: Store large amounts of data cheaply via centralized databases.
• Accuracy: Execute computations precisely; most errors come from incorrect user input.
• Data Security: Use protective measures to block malicious programs and threats.
• Productivity: Boost output by completing more tasks in less time.
• Remote Collaboration: Enable flexible work and teamwork via cloud and the internet.
• Communication: Support instant global interaction through email, video calls, and messaging.
• Information Access: Provide rapid access to vast online knowledge.
• Entertainment & Creativity: Offer tools for gaming, media, and digital content creation.
1.4 DISADVANTAGES OF COMPUTER
• Viruses & Hacking: Unauthorized access or malicious software spread via emails, websites, or
USBs.
• Cyber Crimes: Crimes like cyberstalking and fraud are committed using computers and
networks.
• Job Loss: Older generations with little computer knowledge faced reduced employment
opportunities.
• High Cost: Computers remain costly for many people.
• Distractions: High entertainment value can lead to wasted time online.
• Health Issues: Prolonged use causes eye strain, neck/back pain, and other hazards.
• Social Isolation: Excessive online activity can reduce face-to-face interactions.
• Environmental Impact: Production and disposal create e-waste, harming the environment.
2. THE EVOLUTION OF COMPUTERS
2.1 Early Counting Devices (Pre-Computer Era)
The Abacus (c. 4000 BCE)
• One of the earliest known computing devices; widely used in ancient China.
• Used beads on rods to perform basic arithmetic (addition, subtraction).
• Spread across Asia, becoming an essential calculation tool.
Napier’s Bones (1617)
• Created by John Napier to aid multiplication and division.
• Consisted of ivory rods engraved with numbers.
• Introduced the decimal point, simplifying calculations.
• Key role in the development of logarithms.
2.2 Mechanical Calculators (17th–19th Century)
Pascaline (1642–1644)
• Invented by Blaise Pascal; performed addition and subtraction using gears and wheels.
• Designed to assist his father, a tax collector.
Stepped Reckoner (1673)
• Developed by Gottfried Wilhelm Leibniz; improved Pascal’s design.
• Could perform addition, subtraction, multiplication, and division.
• Used fluted drums instead of gears.
Difference Engine (1820s)
• Designed by Charles Babbage (“Father of Modern Computing”).
• Intended to calculate polynomial functions automatically.
• Never completed during his lifetime but demonstrated potential for automated computation.
Analytical Engine (1830s)
• Advanced version of the Difference Engine; designed as a general-purpose mechanical
computer.
• Featured a control unit, memory, and punch card I/O system.
• Never built but anticipated modern computing principles.

2.3 The Rise of Electronic Computing (1890s–1940s)


Tabulating Machine (1890)
• Invented by Herman Hollerith; based on punch cards for data tabulation and sorting.
• Used by the U.S. Census in 1890.
• Led to the creation of IBM in 1924.
Differential Analyzer (1930s)
• Invented by Vannevar Bush; an early analog computer designed to solve differential equations.
• Could carry out about 25 calculations in a few minutes.
Mark I (1944)
• Built through an IBM–Harvard partnership; an early large-scale programmable electromechanical computer.
• Designed to perform large and complex calculations.

2.4 The Era of Transistors (1950s–1960s)


Transistor Computers (1950s)
• Replaced vacuum tubes with smaller, more reliable, and energy-efficient transistors.
• Made computers more compact and affordable.
UNIVAC I (1951)
• The first commercially successful computer by Eckert and Mauchly.
• Used for scientific and business applications.

2.5 The Rise of Integrated Circuits (1960s–1970s)


Integrated Circuits (1960s)
• Allowed multiple transistors on a single chip, reducing size and cost while boosting
performance.
IBM System/360 (1964)
• Mainframe family using ICs; set new standards for business, government, and academic computing.
• Offered compatibility across machines.
Minicomputers & Microcomputers
• Introduction of the microprocessor led to smaller, affordable systems (e.g., PDP-8, PDP-11).
• Paved the way for personal computers.
2.6 The Personal Computer Revolution (1970s–1980s)
Apple II (1977)
• Developed by Steve Jobs and Steve Wozniak.
• One of the first successful personal computers; ran basic applications.
IBM PC (1981)
• Standardized the PC market; easily upgradable and software-compatible.
Macintosh (1984)
• Introduced the graphical user interface (GUI), making computers more user-friendly.

2.7 The Internet and Networking Era (1990s–Present)


World Wide Web (1990s)
• Invented by Tim Berners-Lee; enabled global information access.
• Led to web browsers like Netscape Navigator and Internet Explorer.
Cloud Computing (2000s–Present)
• Allows remote data storage and access over the internet.
• Services like Google Drive, Dropbox, and AWS transformed data management.
2.8 Modern Day and Future of Computing
Artificial Intelligence (AI)
• Uses machine learning and deep learning to make decisions, recognize patterns, and understand language.
• Powers virtual assistants, autonomous vehicles, and more.
Quantum Computing (Emerging)
• Uses quantum mechanics to solve problems beyond classical computing.
• Promises breakthroughs in cryptography and material science.
Internet of Things (IoT)
• Connects devices to collect and share data (e.g., smart homes, wearables).
• Changing how humans interact with their environment.
3 GENERATIONS OF COMPUTERS

3.1. First Generation (1940s-1950s): Vacuum Tubes

• These computers were based on the vacuum tube technology.


• They were the fastest computing devices of their time (computation time was in milliseconds).
• These computers were very large and required a lot of space for installation.
• Since thousands of vacuum tubes were used, they generated a large amount of heat. Therefore, air conditioning was essential.
• They were non-portable, lacked versatility, and were slow by modern standards.
• They were very expensive to operate and used a large amount of electricity.
• These machines were unreliable and prone to frequent hardware failures. Hence,
constant maintenance was required.
• Since machine language was used, these computers were difficult to program and use.
• Each component had to be assembled manually. Hence, the commercial appeal of these
computers was poor.
• Examples: ENIAC (Electronic Numerical Integrator and Computer), UNIVAC I
(Universal Automatic Computer).
3.2. Second Generation (1950s-1960s): Transistors

• These machines were based on the transistor technology


• They were smaller as compared to first-generation computers.
• The computational time of these computers was reduced to microseconds from
milliseconds.
• They were more reliable and less prone to hardware failure. Hence, required less
frequent maintenance
• They had better portability and generated less amount of heat.
• Assembly language was used to program computers. Hence, programming became more
time-efficient and less cumbersome.
• Second-generation computers still required air conditioning.
• Manual assembly of individual components into a functioning unit was still required.
• Examples: IBM 7094, CDC 1604.

3.3. Third Generation (1960s-1970s): Integrated Circuits (ICs)

• These computers were based on integrated circuit (IC) technology.


• They were able to reduce computational time from microseconds to nanoseconds.
• They were easily portable and more reliable than the second generation.
• These devices consumed less power and generated less heat. In some cases, air
conditioning was still required.
• The size of these computers was smaller compared to previous computers.
• Since, failure of hardware occurred very rarely, its maintenance cost was very low.
• Extensive use of high-level languages became possible.
• Manual assembling of individual components was not required, so it reduced the large
requirement of labor and cost. However, highly sophisticated technologies were required
for the manufacture of IC chips.
• Commercial production became easier and cheaper.
• Examples: IBM System/360, DEC PDP-8.
3.4. Fourth Generation (1970s-Present): Microprocessors

• Fourth-generation computers are microprocessor-based systems.


• These computers are very small in size.
• Fourth-generation computers are the cheapest among all the other generations.
• They are portable and very reliable.
• These machines generate a negligible amount of heat, hence they do not require air conditioning.
• Hardware failure is negligible, so minimum maintenance is required.
• The production cost was very low. In addition, labour and cost involved at the assembly
stage are also minimal.
• Graphical user interface and pointing devices enable users to learn to use the computer
quickly.
• Interconnection of computers leads to better communication and resource sharing
• Examples: Intel 4004 (the first microprocessor), Apple Macintosh, IBM PC.
3.5. Fifth Generation (Present and Beyond): Artificial Intelligence (AI)

• This generation is still in development and is centered on the concept of artificial intelligence.
• It aims to create computers that can understand natural language, learn, and reason.
• Key technologies include parallel processing, quantum computing, nanotechnology, and
advancements in natural language processing and expert systems.
• The goal is to build computers that can make decisions and solve complex problems
without human intervention.
• Examples: AI-driven robotics, natural language processing applications (like virtual
assistants), self-driving cars, and advanced quantum computers.

4. PIONEERS AND CONTRIBUTORS OF THE COMPUTING SYSTEM

4.1. Charles Babbage and the Mechanical Computer


• Charles Babbage, considered the "Father of the Computer," was an English mathematician
and inventor who designed the first automatic digital computers.
• The Difference Engine: Babbage's first major design (1820s) was a mechanical calculator
intended to compute mathematical tables, eliminating human error automatically. While a
portion of it was built, the full machine was never completed in his lifetime due to technical
and funding challenges.
• The Analytical Engine: This was Babbage's most revolutionary design, conceived in the
1830s. It was a general-purpose, programmable machine that included key components of
modern computers: a "mill" (equivalent to a CPU), a "store" (memory), and input/output
devices. The machine was to be programmed using punched cards, a concept inspired by the
Jacquard loom. Although it was never built, the Analytical Engine's design laid the
conceptual foundation for all subsequent digital computers.
4.2 John Von Neumann and the Architecture of the Computer
• Von Neumann architecture was first published by John von Neumann in 1945.
• His computer architecture design consists of a Central Processing Unit (Control
Unit, Arithmetic and Logic Unit (ALU), Registers), Memory Unit, and Inputs/Outputs.
• Von Neumann architecture is based on the stored-program concept, where program instructions and data are stored in the same memory. This design is still used in
most computers produced today.

Central Processing Unit (CPU)

• The Central Processing Unit (CPU) is the electronic circuit responsible for executing the
instructions of a computer program.
• It is sometimes referred to as the microprocessor or processor.
• The CPU contains the ALU, CU, and a variety of registers.

Registers

• Registers are high-speed storage areas in the CPU. All data must be stored in a register before
it can be processed.
MAR (Memory Address Register): Holds the memory location of data that needs to be accessed

MDR (Memory Data Register): Holds data that is being transferred to or from memory

AC (Accumulator): Where intermediate arithmetic and logic results are stored

PC (Program Counter): Contains the address of the next instruction to be executed

CIR (Current Instruction Register): Contains the current instruction during processing
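These registers cooperate during the fetch-decode-execute cycle. The sketch below is a minimal, illustrative toy machine in Python: the tiny instruction set (LOAD/ADD/STORE/HALT) and the memory layout are invented for this example and do not correspond to any real processor.

```python
# Toy von Neumann machine: instructions and data share one memory,
# and the named registers (PC, MAR, MDR, CIR, AC) drive each cycle.
memory = {
    0: ("LOAD", 10),   # AC <- memory[10]
    1: ("ADD", 11),    # AC <- AC + memory[11]
    2: ("STORE", 12),  # memory[12] <- AC
    3: ("HALT", None),
    10: 7, 11: 5, 12: 0,
}

PC, AC = 0, 0
while True:
    MAR = PC                # MAR holds the address to fetch from
    MDR = memory[MAR]       # MDR receives the fetched contents
    CIR = MDR               # the instruction moves into CIR
    PC += 1                 # PC now points at the next instruction
    op, addr = CIR
    if op == "LOAD":
        MAR, MDR = addr, memory[addr]
        AC = MDR            # intermediate results live in AC
    elif op == "ADD":
        MAR, MDR = addr, memory[addr]
        AC += MDR
    elif op == "STORE":
        memory[addr] = AC
    elif op == "HALT":
        break

print(memory[12])  # 12 (7 + 5)
```

Note how the same memory dictionary holds both the program (addresses 0-3) and the data (addresses 10-12): that is the stored-program concept in miniature.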

Arithmetic and Logic Unit (ALU)

The ALU allows arithmetic (add, subtract, etc.) and logic (AND, OR, NOT, etc.) operations to be carried out.

Control Unit (CU)

• The control unit controls the operation of the computer’s ALU, memory and input/output
devices, telling them how to respond to the program instructions it has just read and
interpreted from the memory unit.
• The control unit also provides the timing and control signals required by other computer
components.

Buses

Buses are the pathways that carry data from one part of a computer to another, connecting all major internal components to the CPU and memory.

A standard CPU system bus is comprised of a control bus, data bus, and address bus.

Address Bus: Carries the addresses of data (but not the data itself) between the processor and memory

Data Bus: Carries data between the processor, the memory unit, and the input/output devices

Control Bus: Carries control signals/commands from the CPU (and status signals from other devices) in order to control and coordinate all the activities within the computer
Memory Unit

• The memory unit consists of RAM, sometimes referred to as primary memory. Unlike a hard
drive (secondary memory), this memory is fast and also directly accessible by the CPU.
• RAM is split into partitions. Each partition consists of an address and its contents (both
in binary form).
• The address will uniquely identify every location in the memory.
• Loading data from permanent memory (the hard drive) into the faster, directly accessible temporary memory (RAM) allows the CPU to operate much more quickly.
4.3. Alan Turing and the Turing Machine
• Alan Turing was a British mathematician, logician, and cryptanalyst who provided the
theoretical basis for computing before electronic computers were even a reality.
• The Turing Machine: In his 1936 paper, "On Computable Numbers," Turing described a
hypothetical device known as a "Turing machine."
• This was not a physical machine but an abstract model of a computing device that could
perform any task that could be reduced to a set of logical steps.
• This concept proved that a single machine could be designed to solve any computable
problem, making it the theoretical blueprint for the digital computer.
• Wartime Contributions: During World War II, Turing was a key figure at Bletchley Park,
where he played a crucial role in breaking the German Enigma code, a contribution that
significantly shortened the war.
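A Turing machine is simple enough to simulate in a few lines. The sketch below is an illustrative simulator: the rule table (a single state that flips every bit on the tape, then halts on a blank) is made up for this example, but the mechanism — a table mapping (state, symbol) to (write, move, next state) — is exactly Turing's abstraction.

```python
# Minimal Turing machine simulator (illustrative rules, not from Turing's paper).
# Rules map (state, symbol) -> (symbol_to_write, head_move, next_state).
rules = {
    ("flip", "0"): ("1", +1, "flip"),
    ("flip", "1"): ("0", +1, "flip"),
    ("flip", "_"): ("_", 0, "halt"),   # blank cell: nothing left to do
}

def run(tape_str):
    tape = dict(enumerate(tape_str))   # sparse tape: index -> symbol
    head, state = 0, "flip"
    while state != "halt":
        symbol = tape.get(head, "_")               # read under the head
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol                    # write
        head += move                               # move the head
    return "".join(tape[i] for i in sorted(tape) if tape[i] != "_")

print(run("1011"))  # 0100
```

Changing only the rule table changes what the machine computes, which is the point of Turing's result: one fixed mechanism, driven by a table of instructions, suffices for any computable task.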
4.4. J. Presper Eckert, John Mauchly, and the ENIAC
• J. Presper Eckert and John Mauchly were American engineers who designed and built the
first general-purpose electronic digital computer.
• The ENIAC: The Electronic Numerical Integrator and Computer (ENIAC) was built at the
University of Pennsylvania between 1943 and 1946. It was a massive machine, weighing
over 60,000 pounds and containing more than 17,000 vacuum tubes. The ENIAC was
designed to calculate artillery firing tables for the U.S. Army, but its first task was
performing calculations for the hydrogen bomb. Its speed—it could perform 5,000 additions
per second—was a thousand times faster than any existing electromechanical machine.
4.5. John McCarthy and the Rise of AI
• John McCarthy was an American computer scientist and cognitive scientist who is one of
the "founding fathers" of artificial intelligence.
• Coined the Term "Artificial Intelligence": In 1955, McCarthy coined the term "artificial
intelligence" in a proposal for the 1956 Dartmouth workshop, a conference that is widely
considered the birth of the AI field.
• LISP Programming Language: In 1958, he developed the LISP programming language,
which became the language of choice for AI applications for decades due to its ability to
process symbolic information.
• Time-Sharing and Other Innovations: McCarthy was also a pioneer in time-sharing systems,
which allowed multiple users to access a single computer simultaneously. His other
contributions include inventing garbage collection and co-founding the Stanford Artificial
Intelligence Laboratory.

5 ADVANCEMENT IN CONTEMPORARY COMPUTING

1. Single-Core, Dual-Core, and Multi-Core Processors


• Single-Core Processor: This is the traditional CPU design, containing only one processing
core. It can execute one instruction at a time. To handle multiple tasks (multitasking), the
operating system rapidly switches between them, creating the illusion of parallel execution.
• Dual-Core & Multi-Core Processors: A multi-core processor is a single chip containing
two or more independent processing cores. A dual-core processor has two cores, a quad-
core has four, and so on.

Advantage:

o Each core can execute instructions independently, allowing for true parallel
processing.
o This significantly improves performance for multi-threaded applications (e.g., video
editing, 3D rendering, modern games), allowing the computer to run multiple
background tasks without slowing down the primary application.
o Instead of one core handling all tasks, the workload is distributed across multiple
cores, increasing overall speed and efficiency.
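The workload-distribution idea above can be sketched with Python's standard-library process pool. The prime-counting function here is a made-up stand-in for any independent, compute-bound task; on a multi-core machine each chunk can run on its own core.

```python
# Sketch: spreading a CPU-bound job across cores with a process pool.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division (deliberately CPU-heavy)."""
    lo, hi = bounds
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    # Split one large range into four independent chunks.
    chunks = [(i, i + 25_000) for i in range(0, 100_000, 25_000)]
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)  # 9592 primes below 100,000
```

On a single-core machine the same code still works, but the operating system time-slices the processes instead of running them truly in parallel.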
2. Graphics Processing Unit (GPU)
• Unlike a CPU, which has a few powerful cores optimized for sequential tasks, a GPU
has thousands of smaller, more efficient cores.
• Originally designed to accelerate the rendering of graphics for video games, GPUs are
now used for a wide range of tasks that involve repeating the same calculations on large
sets of data.

Key Applications:

o High-end Gaming: Rendering complex 3D graphics in real-time.


o Artificial Intelligence (AI) and Machine Learning: Training neural networks,
which involves massive, parallel computations.
o Scientific Computing: Running simulations for fields like physics and climate
modeling.
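The GPU programming style can be sketched in plain Python. In GPU frameworks such as CUDA, a small "kernel" function is written once and conceptually launched for every data element at the same time; the loop below runs those launches sequentially, but because each call is independent, real GPU hardware can map each one to its own core. The kernel name and scaling task are invented for illustration.

```python
# Sketch of the GPU (data-parallel) programming model in plain Python:
# one small kernel function runs once per element, with no dependencies
# between elements, so all invocations could run simultaneously.

def scale_kernel(i, data, out, factor):
    # Each (virtual) GPU core executes this body for one index i.
    out[i] = data[i] * factor

data = list(range(8))
out = [0.0] * len(data)
for i in range(len(data)):   # on a GPU, these iterations run in parallel
    scale_kernel(i, data, out, 2.0)

print(out)  # [0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0]
```

This is why GPUs excel at graphics and neural-network training: both reduce to applying the same operation to millions of independent elements.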
3. Accelerated Processing Unit (APU)
• An APU is a single chip that combines a CPU and a GPU. This integration is typically
seen in processors from AMD (Advanced Micro Devices).
• The goal of an APU is to provide a balance of general-purpose and graphics-processing
power in a single, cost-effective, and power-efficient package.
• Advantages:
o Cost and Space-Saving: It eliminates the need for a separate, dedicated graphics
card, making it ideal for laptops and budget-friendly desktop computers.
o Faster Communication: Because the CPU and GPU are on the same chip, they
can communicate with each other much faster than a separate CPU and GPU.
• Limitations: While a significant improvement over basic integrated graphics, an APU's
GPU component is generally not as powerful as a high-end, dedicated graphics card.
4. Quantum Processing Unit (QPU)
The QPU is a revolutionary leap in computing that operates on the principles of quantum
mechanics.
• Fundamental Difference from Classical Processors:
o Classical Computers (CPU/GPU/APU): Use bits that can represent a value of
either 0 or 1.
o Quantum Computers (QPU): Use qubits that can represent 0, 1, or a
superposition of both states simultaneously.
• By leveraging quantum phenomena like superposition and entanglement, a QPU can
perform certain calculations in parallel in a way that is impossible for a classical
computer.
• This allows it to explore a vast number of possibilities at once.
• QPUs are not designed to replace classical CPUs for everyday tasks. They are highly
specialized and still in the experimental phase.
• Key Applications:
o Breaking modern encryption algorithms.
o Developing new materials and drugs.
o Optimizing complex systems in finance and logistics.
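A single qubit's state can be simulated classically, which makes the idea of superposition concrete. The sketch below (pure Python, illustrative only — a real QPU manipulates physical qubits, not number pairs) represents a qubit as two amplitudes; the probability of measuring 0 or 1 is the square of the corresponding amplitude.

```python
# Sketch: a one-qubit state vector as a pair of amplitudes (amp0, amp1).
import math

def hadamard(state):
    """Hadamard gate: sends |0> into an equal superposition of 0 and 1."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

zero = (1.0, 0.0)          # a classical bit: definitely 0
plus = hadamard(zero)      # superposition: both outcomes held at once
probs = (plus[0] ** 2, plus[1] ** 2)
print(probs)               # ~ (0.5, 0.5): 50/50 chance of measuring 0 or 1

back = hadamard(plus)      # a second H interferes the state back to |0>
print(back)                # ~ (1.0, 0.0)
```

Simulating n qubits this way requires 2^n amplitudes, which is exactly why classical machines cannot keep up and why physical QPUs are interesting.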
