CENG 103 – 105 – 107
Computer Programming I
Week 2
Dr. Yucel Tekin
1.9 Typical C Program-Development Environment
• C systems generally consist of several parts
– a program-development environment
– the language
– the C standard library
1.9 Typical C Program-Development Environment
• C programs typically go through six phases to be executed
– Edit
– Preprocess
– Compile
– Link
– Load
– Execute
1.9.1 Phase 1: Creating a Program
• Consists of editing a file in an editor program
1.9.2 Phases 2 and 3: Preprocessing and Compiling a C
Program (1 of 3)
• You give the command to compile the program
• Compiler translates it into machine-language code
• The compilation command invokes a preprocessor first
– Performs text manipulations on a program’s source-code files
– Inserting the contents of other files
– Text replacements
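• For example, a minimal sketch showing both manipulations (the macro name SIZE is illustrative):

  #include <stdio.h> /* the preprocessor inserts this header's contents */
  #define SIZE 10    /* the preprocessor replaces each SIZE with 10 */

  int main(void) {
      /* after preprocessing, the next line reads: printf("%d elements\n", 10); */
      printf("%d elements\n", SIZE);
      return 0;
  }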
1.9.2 Phases 2 and 3: Preprocessing and Compiling a C
Program (3 of 3)
• A syntax error occurs when the compiler cannot recognize a statement
because it violates the language rules
– The compiler issues an error message to help you locate and fix the
incorrect statement.
• The C standard does not specify the wording for error messages
• Syntax errors = compile errors = compile-time errors.
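• For example, omitting a required semicolon violates C's syntax rules (a minimal sketch; the exact message wording varies by compiler):

  #include <stdio.h>

  int main(void) {
      printf("Hello\n")  /* missing semicolon: a syntax (compile-time) error */
      return 0;
  }

  /* gcc reports something like: error: expected ';' before 'return' */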
1.9.3 Phase 4: Linking
• C programs use functions defined elsewhere
– standard libraries, open-source libraries or private libraries of a particular
project.
• The object code produced by the C compiler typically contains “holes”
• Linker links a program’s object code with the code for the missing functions to
produce an executable image (with no missing pieces)
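• A sketch of these phases with the GNU toolchain (assuming gcc and a source file named main.c; the file names are illustrative):

  gcc -c main.c       # Phases 2-3: preprocess and compile; produces main.o, which may contain "holes"
  gcc main.o -o main  # Phase 4: link the object code with library code; produces the executable main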
1.9.4 Phase 5: Loading (1 of 2)
• Before a program can execute, the operating system must load it into
memory
• Loader transfers the executable image from disk to memory
• Additional components from shared libraries that support the program also
are loaded
1.9.5 Phase 6: Execution
• Finally, the computer, under the control of its CPU,
executes the program one instruction at a time
1.9.6 Problems That May Occur at Execution Time
• Errors that occur as programs run are called runtime errors or execution-time errors
• Fatal errors cause a program to terminate immediately without successfully
performing its job
• Nonfatal errors allow programs to run to completion, often producing
incorrect results
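• For example, integer division by zero is a classic fatal runtime error (a minimal sketch; it compiles cleanly but fails while running):

  #include <stdio.h>

  int main(void) {
      int total = 100;
      int count = 0;
      /* compiles without errors, but dividing by zero at run time is a
         fatal error; the program typically terminates here */
      printf("Average: %d\n", total / count);
      return 0;
  }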
1.10 Test-Driving a C Application in Windows, Linux
and macOS (1 of 2)
• In this section, you’ll compile, run and interact with your first C application
• The guess-the-number game picks a random number from 1 to 1000 and
prompts you to guess it
1.10 Test-Driving a C Application in Windows, Linux
and macOS (1 of 2)
• If you guess correctly, the game ends
• If you guess incorrectly, the application indicates whether your guess is higher or
lower than the correct number
• There’s no limit to your number of guesses, but you should be able to guess a
number from 1 to 1000 correctly in 10 or fewer tries, because each guess can
halve the remaining range (2^10 = 1024 > 1000)
1.10 Test-Driving a C Application in Windows, Linux
and macOS (2 of 2)
• You’ll create this application in Chapter 5’s exercises
• Usually, this application randomly selects the correct answer
• We disabled random selection for the test drives
• The application uses the same correct answer every time you run it so you
can use the same guesses we use and see the same results
• This answer may vary by compiler
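• A minimal sketch of the game's logic (not the book's Chapter 5 version; rand() is left unseeded here, so the answer is the same on every run, mirroring the disabled random selection):

  #include <stdio.h>
  #include <stdlib.h>

  int main(void) {
      int answer = 1 + rand() % 1000; /* unseeded: same "random" answer each run */
      int guess = 0;

      printf("I have a number between 1 and 1000. Guess it.\n");
      while (guess != answer) {
          if (scanf("%d", &guess) != 1) {
              return 1; /* non-numeric input: quit the sketch */
          }
          if (guess < answer) {
              printf("Too low. Try again.\n");
          } else if (guess > answer) {
              printf("Too high. Try again.\n");
          }
      }
      printf("Congratulations. You guessed the number!\n");
      return 0;
  }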
1.10.1 Compiling and Running a C Application with Visual Studio
Code
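• A sketch of the workflow, assuming gcc is installed and the game's source is saved as guess.c (the file name is illustrative); in Visual Studio Code you would run these commands in the integrated terminal:

  gcc guess.c -o guess  # compile and link the application
  ./guess               # load and run it (guess.exe on Windows)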
1.11 Internet, World Wide Web, the Cloud and IoT (1 of 2)
• Late 1960s, ARPA—the Advanced Research Projects Agency of the United States Department of
Defense—rolled out plans for networking the main computer systems of approximately a dozen ARPA-
funded universities and research institutions
1.11 Internet, World Wide Web, the Cloud and IoT (1 of 2)
• Became known as the ARPANET, the precursor to today’s Internet
• Today’s fastest Internet speeds are on the order of billions of bits per second, with trillion-bits-per-second
(terabit) speeds already being tested
– In 2020, Australian researchers successfully tested a 44.2-terabit-per-second Internet connection
1.11 Internet, World Wide Web, the Cloud and IoT (1 of 2)
• ARPANET’s main benefit proved to be the capability for quick and easy communication via what came to
be known as electronic mail (e-mail)
• Billions of people worldwide now use the Internet to communicate quickly and easily
1.11 Internet, World Wide Web, the Cloud and IoT (2 of 2)
• The protocol (set of rules) for communicating over the ARPANET became known as
the Transmission Control Protocol (TCP)
– Ensured that messages, consisting of sequentially numbered pieces called
packets, were properly delivered from sender to receiver, arrived intact and were
assembled in the correct order
1.11.1 The Internet: A Network of
Networks
• In parallel with the early evolution of the Internet, organizations
worldwide were implementing their own networks
• One challenge was to enable these different networks to
communicate with each other
• ARPA accomplished this by developing the Internet Protocol
(IP), which created a true “network of networks,” the Internet’s
current architecture
• The combined set of protocols is now called TCP/IP
• Each Internet-connected device has an IP address
1.11.2 The World Wide Web: Making the Internet User-
Friendly
• The World Wide Web (simply called “the web”) is a
collection of hardware and software associated with
the Internet that allows computer users to locate and
view documents on almost any subject
• 1989, Tim Berners-Lee of CERN (the European
Organization for Nuclear Research) began developing
HyperText Markup Language (HTML)—the
technology for sharing information via “hyperlinked”
text documents
(Photo: Sir Tim Berners-Lee, inventor of the Web)
1.11.2 The World Wide Web: Making the Internet User-
Friendly
• Berners-Lee also wrote communication protocols such as HyperText Transfer Protocol (HTTP) to
form the backbone of his new hypertext information system
• In 1994, Berners-Lee founded the World Wide Web Consortium
1.11.3 The Cloud
• More and more computing today is done “in the cloud”—that is, using software and data
distributed across the Internet worldwide
1.11.3 The Cloud
• Cloud computing allows you to increase or
decrease computing resources to meet your
needs at any given time
– Shifts to the service provider the burden
of managing these apps (such as
installing and upgrading the software,
security, backups and disaster recovery).
1.11.3 The Cloud
• The apps you use daily are heavily dependent on various cloud-based services.
• These services use massive clusters of computing resources (computers, processors, memory,
disk drives, etc.) and databases that communicate over the Internet with each other and the
apps you use.
1.11.3 The Cloud
• A service that provides access to itself over the Internet is known as a web service.
1.11.3 The Cloud—Software as a Service
• Cloud vendors focus on service-oriented architecture (SOA) technology
• They provide “as-a-Service” capabilities that applications connect to and use in the cloud.
• Common services provided by cloud vendors include:
– Big data as a Service (BDaaS)
– Hadoop as a Service (HaaS)
– Infrastructure as a Service (IaaS)
– Platform as a Service (PaaS)
– Software as a Service (SaaS)
– Storage as a Service (SaaS)
1.11.3 The Cloud—Mashups
• Mashups (web application hybrids) enable you to rapidly develop powerful
software applications by combining (often free) complementary web
services and other forms of information feeds
• ProgrammableWeb (https://programmableweb.com/) provides
a directory of nearly 24,000 web services and almost 8,000 mashups.
• They also provide how-to guides and sample code for working with
web services and creating your own mashups
• Some of the most widely used web services are Google Maps and
others provided by Facebook, Twitter and YouTube
1.11.4 The Internet of Things (1 of 2)
• A thing is any object with an IP address and the ability to send, and in some cases receive, data
automatically over the Internet
• a car with a transponder for paying tolls,
• monitors for parking-space availability in a garage,
• a heart monitor implanted in a human,
• water-quality monitors,
• a smart meter that reports energy usage,
• radiation detectors,
• item trackers in a warehouse,
• mobile apps that can track your movement and
location,
• smart thermostats that adjust room temperatures
based on weather forecasts and activity in the home,
and
• intelligent home appliances.
1.11.4 The Internet of Things (2 of 2)
• According to statista.com, there are already over 19 billion IoT
devices in use today, and there could be over 30 billion IoT devices in 2025
1.12 Software Technologies
• Refactoring
– Reworking programs to make them clearer and easier to maintain while preserving their
correctness and functionality.
– Many IDEs contain built-in refactoring tools
1.12 Software Technologies
• Design patterns
– Proven architectures for constructing flexible and maintainable object-oriented software
– The field of design patterns tries to enumerate those recurring patterns, encouraging software
designers to reuse them to develop better-quality software using less time, money and effort.
1.12 Software Technologies
• Software Development Kits (SDKs)
– The tools and documentation that developers use to program applications.
– SDKs let developers easily and quickly build the standard components of their apps and add
functionality to them.
– SDKs are usually all-in-one products that don't need to be integrated with other
components, sparing developers integration work that can slow down development.
1.13 How Big Is Big Data?
• Data is now as crucial to application development as writing programs
• According to IBM, approximately 2.5 quintillion bytes (2.5 exabytes) of data
are created daily, and 90% of the world’s data was created in the last two
years
1.13 How Big Is Big Data?
• According to IDC, the global data supply will reach 181 zettabytes (equal to
181 trillion gigabytes or 181 billion terabytes) annually by 2025
1.13 How Big Is Big Data? Megabytes (MB)
• One megabyte is about one million (actually 2^20 = 1,048,576) bytes
• Many of the files we use daily require one or more MBs of storage
– MP3 audio files—High-quality MP3s range from 1 to 2.4 MB per
minute.
– Photos—JPEG format photos taken on a digital camera can require
about 8 to 10 MB per photo.
– Video—Each minute of video can require many megabytes of storage.
For example, on one of our iPhones, the Camera settings app reports
that 1080p video at 30 frames-per-second (FPS) requires 130 MB/minute
and 4K video at 30 FPS requires 350 MB/minute.
1.13 How Big Is Big Data? Gigabytes (GB)
• One gigabyte is about 1000 megabytes (actually 2^30 bytes)
• A dual-layer DVD can store up to 8.5 GB, which translates to:
– as much as 141 hours of MP3 audio,
– approximately 1000 photos from a 16-megapixel camera,
– approximately 7.7 minutes of 1080p video at 30 FPS, or
– approximately 2.85 minutes of 4K video at 30 FPS.
• The current highest-capacity Ultra HD Blu-ray discs can store up to
100 GB of video
• Streaming a 4K movie can use between 7 and 10 GB per hour (highly
compressed).
1.13 How Big Is Big Data? Terabytes (TB)
• One terabyte is about 1000 gigabytes (actually 2^40 bytes)
• Recent disk drives for desktop computers come in sizes up to 20 TB,
which is equivalent to:
– approximately 28 years of MP3 audio,
– approximately 1.68 million photos from a 16-megapixel camera,
– approximately 226 hours of 1080p video at 30 FPS, or
– approximately 84 hours of 4K video at 30 FPS.
• Nimbus Data now has the largest solid-state drive (SSD) at 100 TB, which
can store five times as much audio, photo and video content as the 20-TB
examples listed above
1.13 How Big Is Big Data? Petabytes, Exabytes and
Zettabytes (1 of 2)
• There are over four billion people online, creating about 2.5 quintillion bytes
of data each day
– 2500 petabytes (each petabyte is about 1000 terabytes) or 2.5 exabytes
(each exabyte is about 1000 petabytes).
• A March 2016 Analytics-Week article stated that by 2021 there would be
over 50 billion devices connected to the Internet and, by 2020, there would
be 1.7 megabytes of new data produced per second for every person on
the planet
1.13 How Big Is Big Data? Petabytes, Exabytes and
Zettabytes (2 of 2)
• That’s about
– 13 petabytes of new data per second,
– 780 petabytes per minute,
– 46,800 petabytes (46.8 exabytes) per hour, or
– 1,123 exabytes per day—1.123 zettabytes (ZB) per day (each zettabyte is
about 1000 exabytes)
• Equivalent to over 5.5 million hours (over 600 years) of 4K video every day or
approximately 116 billion photos every day!
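• A quick back-of-the-envelope check of these figures in C (a sketch assuming roughly 7.7 billion people, a figure the projection implies but the slide doesn't state; small differences from the numbers above are rounding):

  #include <stdio.h>

  int main(void) {
      double mb_per_person_per_sec = 1.7;  /* from the 2016 projection */
      double people = 7.7e9;               /* assumed world population */
      double pb_per_sec = mb_per_person_per_sec * people / 1e9; /* MB -> PB */

      printf("%.1f PB/s\n", pb_per_sec);                  /* ~13.1  */
      printf("%.0f PB/min\n", pb_per_sec * 60);           /* ~785   */
      printf("%.0f PB/h\n", pb_per_sec * 3600);           /* ~47000 */
      printf("%.3f ZB/day\n", pb_per_sec * 86400 / 1e6);  /* ~1.131 */
      return 0;
  }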
1.13 How Big Is Big Data? Computing Power Over the
Years (1 of 2)
• Today’s processor performance is often measured in terms of FLOPS (floating-
point operations per second)
• Currently, the fastest supercomputer—Fujitsu’s Fugaku—is capable of 442
petaflops.
• Distributed computing can link thousands of personal computers via the Internet
to produce even more FLOPS
• Companies like IBM are now working toward supercomputers
capable of exaflops (10^18 FLOPS)
1.13 How Big Is Big Data? Computing Power Over the
Years (2 of 2)
• The quantum computers now under development theoretically could operate at
18,000,000,000,000,000,000 times the speed of today’s “conventional computers”
– In one second, a quantum computer theoretically could do staggeringly more
calculations than the total that have been done by all computers since the
world’s first computer appeared
– Could wreak havoc with blockchain-based cryptocurrencies like Bitcoin
– Engineers are already rethinking blockchain to prepare for such massive
increases in computing power.
1.13 How Big Is Big Data? Processing the World’s Data
Requires Lots of Electricity
• Data from the world’s Internet-connected devices is exploding
• Processing that data requires tremendous amounts of energy
• According to a recent article, energy use for processing data in 2015 was growing at 20% per year and
consuming approximately three to five percent of the world’s power
– Total data-processing power consumption could reach 20% by 2025
• Another enormous electricity consumer is the blockchain-based cryptocurrency Bitcoin
• Processing just one Bitcoin transaction uses approximately the same amount of energy as powering the
average American home for a week
• According to some estimates, a year of Bitcoin transactions consumes more energy
than many countries consume in the same period
1.13 How Big Is Big Data? Big-Data Opportunities
• The big-data explosion is likely to continue exponentially for years to come.
• It’s crucial for businesses, governments, the military, and even individuals to get a handle on all this data
• Big data’s appeal to big business is undeniable, given the rapidly accelerating accomplishments
• Many companies are making significant investments and getting valuable results through technologies
like big data, machine learning and natural-language processing
• This is forcing competitors to invest as well, rapidly increasing the need for computing professionals with
computer-science and data-science experience
• This growth is likely to continue for many years