10/22/2013
Human Factors Engineering Dr. Osama Al Meanazel
Lecture 11 (Information Theory) October 22, 2013
Overview
Laura was running late for an appointment in a large, unfamiliar city and relied on her new navigation device to guide her. She had read its somewhat confusing instructions and realized the importance of the voice display mode, which would let her hear directions to her destination without taking her eyes off the road. She had reminded herself to activate it before she got into heavy traffic, but the traffic suddenly increased, and she realized she had forgotten to do so. Being late, however, she did not pull over but tried to remember the sequence of mode switches needed to activate the voice mode. She couldn't get it right, but she managed to activate the electronic map. However, transposing its north-up representation to accommodate her southbound direction of travel was too confusing. Finally lost, she pulled out her cellular phone to call her destination, glanced at the number she had written down, 303-462-8553, and dialed 303-462-8533. Getting no response, she became frustrated. She looked down to check the number and dialed it carefully. Unfortunately, she did not see the car rapidly converging along the entrance ramp to her right, and only at the last moment did the sound of a horn alert her that the car was not yielding. Slamming on the brakes, heart beating fast, she pulled off to the side to carefully check her location, read the instructions, and place the phone call in the relative safety of the roadside.
Information Theory
Information theory defines information as the reduction of uncertainty. Hence the occurrence of a highly certain event conveys little information.
Information theory measures information in units of bits (symbolized by H)
A bit is the amount of information required to decide between two equally likely alternatives
Information Theory
If N is the number of equally likely alternatives, the amount of information transmitted, in bits, is: H = log2(N)
Examples:
- Tossing an unbiased coin: N = 2, so H = log2(2) = 1 bit
- Tossing a die: N = 6, so H = log2(6) ≈ 2.585 bits

N    log2(N)
2    1
3    1.584963
4    2
5    2.321928
6    2.584963
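The H = log2(N) relationship above can be checked with a few lines of Python (a minimal sketch; the function name is illustrative, not from the lecture):

```python
import math

def information_bits(n_alternatives: int) -> float:
    """Amount of information (in bits) conveyed by a choice among
    N equally likely alternatives: H = log2(N)."""
    return math.log2(n_alternatives)

# Tossing an unbiased coin: 2 equally likely outcomes -> 1 bit
print(information_bits(2))
# Tossing a fair die: 6 equally likely outcomes -> about 2.585 bits
print(round(information_bits(6), 6))
```

Note that information grows with the logarithm of the number of alternatives, not with the number itself: doubling N always adds exactly one bit.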
Information Theory
As the alternatives become unequally likely, the amount of information falls below this maximum. For alternatives with probabilities p_i, the average information is H = Σ p_i log2(1/p_i), which reaches log2(N) only when all p_i are equal.
[Figure: amount of information (bits, 0 to 1.2) plotted as the two alternatives become increasingly unequal, showing the decline from the 1-bit maximum reached when they are equally likely]
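The decline from the maximum for unequally likely alternatives follows Shannon's average-information formula, H = Σ p·log2(1/p). A minimal sketch (the function name is illustrative):

```python
import math

def average_information(probabilities):
    """Average information H = sum(p * log2(1/p)) over the alternatives.
    Maximal (log2 N) when all N alternatives are equally likely."""
    return sum(p * math.log2(1 / p) for p in probabilities if p > 0)

# Two equally likely alternatives: the 1-bit maximum
print(average_information([0.5, 0.5]))
# Unequal likelihoods reduce the information below that maximum
print(round(average_information([0.9, 0.1]), 3))
```

The second call illustrates the slide's point: a 90/10 split between two alternatives carries well under half a bit, because the more likely event is nearly certain.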
Information Theory: Transmitted Info
Consider the following stimulus-response matrix:

                    Response
                    Signal    Noise
Stimulus  Signal
          Noise
Example
Consider 4 signals: A, B, C, and D. Each signal is shown twice. Suppose all the responses are correct. Find the amount of information transmitted.
Example
Consider 4 signals: A, B, C, and D. Each signal is shown twice. Suppose all the responses are incorrect. Find the amount of information transmitted.
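Both example questions can be worked by computing transmitted information from the stimulus-response count matrix, using H_T = H(S) + H(R) - H(S,R). A sketch under the standard information-theoretic definitions (the function names and the particular "all incorrect" matrix below are illustrative; for the all-incorrect case the answer depends on whether the errors are systematic or scattered):

```python
import math

def entropy(probs):
    """H = -sum(p * log2(p)) over nonzero probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def transmitted_information(counts):
    """Transmitted information H_T = H(S) + H(R) - H(S,R),
    from a stimulus-by-response count matrix."""
    total = sum(sum(row) for row in counts)
    h_s = entropy([sum(row) / total for row in counts])            # stimulus entropy
    h_r = entropy([sum(col) / total for col in zip(*counts)])      # response entropy
    h_sr = entropy([c / total for row in counts for c in row])     # joint entropy
    return h_s + h_r - h_sr

# 4 signals (A-D), each shown twice, every response correct:
perfect = [[2, 0, 0, 0],
           [0, 2, 0, 0],
           [0, 0, 2, 0],
           [0, 0, 0, 2]]
print(transmitted_information(perfect))   # full 2 bits transmitted

# Every response incorrect but systematic (A->B, B->C, C->D, D->A):
shifted = [[0, 2, 0, 0],
           [0, 0, 2, 0],
           [0, 0, 0, 2],
           [2, 0, 0, 0]]
print(transmitted_information(shifted))   # still 2 bits
```

The second matrix highlights a subtlety: if every wrong response is consistently wrong, the stimulus is still perfectly predictable from the response, so the transmitted information equals the full 2 bits; only inconsistent (scattered) errors drive H_T toward zero.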
Disadvantages of Information Theory
Most of its concepts are descriptive rather than explanatory
It offers only the most primitive clues about the underlying psychological mechanisms of information processing
An Application of Information Theory
Choice reaction time experiment: the subject must make discrete and separate responses to different stimuli. Example: a person is required to push one of four buttons depending on which of four lights comes on.
Hick (1952): reaction time increases as the number of equally likely alternatives increases; the relationship is linear in the amount of information, log2(N).
Hyman (1953): the same experiment, but with different probabilities of occurrence. Hick-Hyman Law: choice reaction time is a linear function of the stimulus information, RT = a + b·H.
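The Hick-Hyman prediction RT = a + b·H can be sketched in a few lines. The intercept a and slope b below are illustrative placeholder values, not constants from the lecture or from Hick's or Hyman's data:

```python
import math

def hick_hyman_rt(n_alternatives, a=0.2, b=0.15):
    """Predicted choice reaction time (seconds) as a linear function
    of stimulus information: RT = a + b * log2(N).
    a (base time) and b (seconds per bit) are assumed example values."""
    return a + b * math.log2(n_alternatives)

# RT grows linearly in bits, so each doubling of N adds the same increment
for n in (2, 4, 8):
    print(n, round(hick_hyman_rt(n), 2))
```

Because H grows logarithmically in N, going from 2 to 4 alternatives costs the same extra reaction time as going from 4 to 8, which is the signature linear-in-information pattern both experiments observed.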
Displaying Information
Information comes to us through stimuli.
This information can come:
Directly through observation (direct sensing)
Example: watching a plane in the sky
Indirectly through some intervening medium (indirect sensing)
Example: watching the radar for a blip which refers to a plane
Human factors aspect of design comes into the picture when we use indirect sensing.
Types of Information?
Classification of Information
Quantitative Information: reflects the quantitative value of some variable
Example: speed of a car on a speedometer
Qualitative Information: displays that show a trend, rate of change, or direction of change
Example: engine temperature indicated by a qualitative dial
Status Information: displays the condition of a system (on/off indications, indications of independent conditions)
Example: channel indicator on the TV
Warning and Signal Information: indicates emergency or unsafe conditions
Example: lighthouse beacons
Classification of Information
Representational Information: pictorial or graphic representations of objects, areas, or other configurations
Example: heartbeats on an oscilloscope
Identification Information: a method to display a static condition, situation, or object
Example: traffic lanes
Alphanumeric and Symbolic Information: displaying information using signs, labels, instructions, braille, music notes
Time-phased Information: information displayed at definite time periods (cycles)
Example: blinker lights