EP1410129A2 - Computer security system identifying suspect behavior - Google Patents
Computer security system identifying suspect behavior
- Publication number
- EP1410129A2 (application EP01912701A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- request
- data processing
- processing system
- approach
- prohibited operation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/554—Detecting local intrusion or implementing counter-measures involving event detection and direct action
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/56—Computer malware detection or handling, e.g. anti-virus arrangements
- G06F21/566—Dynamic detection, i.e. detection performed at run-time, e.g. emulation, suspicious activities
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/57—Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
Definitions
- This invention relates to a system and method for providing security for a computer system. Specifically, this invention relates to protecting a data processing system from operations that degrade the data processing system.
- Existing computer security systems generally require prior knowledge of a potentially degrading virus (or other harmful computer program, command or instruction) in order to protect against the virus.
- Typically, this prior knowledge takes the form of a signature that identifies the program, command or instruction as a harmful behavior.
- To identify and destroy the harmful behavior, the security system is substantially constantly updated with the newest signatures. Thereafter, the security system monitors all files, including incoming files and resident files, for any identified signatures, and when the system identifies such files, it destroys them.
- the data processing system includes an operating system.
- the method preferably requires substantially continually monitoring the operating system for any request from any processor subsystem, determining whether the request involves a prohibited operation, and, if the request involves a prohibited operation, adhering to a predetermined reaction protocol to protect the data processing system from the request. If the request is determined not to involve a prohibited operation, the method then determines whether the request indicates an approach to a prohibited operation — i.e., the request is not consistent with the current chain of events, or is inappropriate in view of the current chain of events, and the request may potentially degrade the data processing system. If the request indicates an approach to a prohibited operation, the method preferably requires assessing on a scale whether the approach potentially degrades the data processing system. If the approach is assessed as sufficiently potentially degrading to the data processing system, then the method requires adhering to a predetermined reaction protocol to protect the data processing system from the request.
- FIG. 1 is a block diagram of a preferred embodiment of a system according to the invention.
- FIG. 2 is a block diagram of the behavioral subsystem of the system of FIG. 1 according to the invention.
- FIG. 2A is a block diagram of a preferred embodiment of the adaptive heuristic algorithm within the behavioral subsystem of FIG. 2 according to the invention.
- FIG. 3 is a block diagram of a preferred embodiment of the environmental integrity subsystem of the system of FIG. 1 according to the invention.
- FIG. 4 is a block diagram of a preferred embodiment of the access control subsystem of the system of FIG. 1 according to the invention.
- FIG. 5 is a block diagram of a preferred embodiment of the external connectivity subsystem of the system of FIG. 1 according to the invention.
- FIG. 6 is a block diagram of a preferred embodiment of the man-machine interface (MMI) subsystem of the system of FIG. 1 according to the invention.
- FIG. 7 is a block diagram of a preferred embodiment of the file-activity request subsystem of the system of FIG. 1 according to the invention.
- MMI man-machine interface
- A system and method according to the invention is provided that protects the computer system from known and unknown viruses as well as other unauthorized intrusions and disruptions.
- the system utilizes a behavioral subsystem that operates by safeguarding against a certain baseline of known prohibited behavior.
- This baseline, hereinafter referred to as the behavior set, is preprogrammed into the behavioral subsystem prior to installation.
- the behavioral subsystem also has an ability to modify its database of prohibited behavior by experience in order to adapt to new or unusual circumstances, thereby safeguarding against heretofore unrecognized, yet potentially degrading behavior, while continuing to safeguard against the baseline of known prohibited behavior.
- Any new learned behavior is added to the then current behavior view to yield an advanced behavior view, hereafter referred to as a world view.
- the behavioral subsystem employs an adaptive heuristic algorithm.
- the system according to the invention operates as follows: First, the requests that are received by a system according to the invention are referred to individual subsystems.
- the specific subsystem to which a request is referred depends on the origination of the request -- e.g., a request for a file operation is referred to the file-activity subsystem.
- the individual subsystem preferably screens the incoming requests to determine whether the requests are directed toward a protected resource. If they are directed toward a protected resource, the subsystems pass the request to the behavioral subsystem for analysis.
- the behavioral subsystem uses the adaptive heuristic algorithm to monitor and analyze potentially system-degrading input requests.
- the adaptive heuristic algorithm weights the request according to the operation requested — e.g., write, read, modify, create etc. — and analyzes those requests in order to determine whether the requests represent an approach to prohibited behavior.
- the algorithm preferably uses a modality of the behavioral subsystem -- i.e., a determination of the current circumstances which provides a threshold of what activity is permitted — together with the weight of the request to determine whether the request should be allowed to pass to the operating system, or be terminated.
- the modality of the system is set by monitoring the most recent user actions to determine whether the present request is appropriate in light of the current user status.
- an unrecognized request may preferably be terminated when it is either unsolicited by the user, or otherwise inconsistent with the current pattern of events.
- FIG. 1 shows a block diagram of a preferred embodiment of a data-processing system 100 according to the invention.
- Each subsystem (environmental integrity subsystem 300, access control subsystem 400, external connectivity subsystem 500, man-machine subsystem 600 and file-activity subsystem 700) provides requests to the operating system 900.
- each of the subsystems may also provide selected requests to the behavioral subsystem 200.
- the requests are passed to behavioral subsystem 200 when the subsystem determines that the requests are directed to a protected resource — e.g., the win.ini file, the autoexec.bat file, and similarly other files that are required for configuring or maintaining the system.
- Behavioral subsystem 200 continually monitors and analyzes any request that it receives using adaptive heuristic algorithm 250.
- FIG. 2 shows the process of analysis used by behavioral subsystem 200 to determine the nature of request 202.
- behavioral subsystem 200 analyzes each request 202 to determine whether it is a request that exists in the behavior set 203.
- This set contains known prohibited operations — i.e., operations that contradict basic system requirements for either sustainability or integrity — e.g., operations that degrade or destroy a protected resource, as described above. Such operations represent a danger to data-processing system 100 that is constant over time and is considered static in nature.
- If request 202 is determined to be for a known prohibited operation, then behavioral subsystem 200 requires adherence to a predetermined reaction protocol 205 to protect data-processing system 100 from request 202.
- Examples of reactions in such a reaction protocol preferably include terminating any files with a parent-child relationship to the requesting file — i.e., terminating any files associated with producing such a request, and, preferably, also, where applicable, any files that communicated with the requesting file.
- behavioral subsystem 200 analyzes request 202 to determine whether it is for an operation that is part of the world model 204.
- The world model includes a program, process or series of processes that is determined during the present user session to be a prohibited operation.
- behavioral subsystem 200 analyzes whether there is an indication in the request of an approach to a prohibited operation 206. If there is such an indication, behavioral subsystem 200 then determines whether request 202 potentially degrades data processing system 100. This process will be explained in greater detail with respect to FIG. 2A in which adaptive heuristic algorithm 250 used by behavioral subsystem 200 is described.
- behavioral subsystem 200 adheres to a predefined behavioral reaction 208 as in the case of a known prohibited behavior, and preferably terminates the request and deletes the file that requested it. Thereafter, behavioral subsystem 200 preferably updates the world model to include the terminated request as representative of a prohibited behavior 210.
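The world-model update described above can be sketched as a session-local set maintained alongside the static behavior set; all names and request strings below are hypothetical, since the patent specifies behavior rather than an implementation.

```python
# Hypothetical sketch: the static behavior set is preprogrammed, while the
# world model is learned during the present user session. Once a request is
# terminated, it is added to the world model so that the same behavior is
# recognized as prohibited on later occurrences.

behavior_set = {"delete_win_ini"}   # static, preprogrammed prohibitions
world_model = set()                 # learned during the present session

def classify(request):
    """Return 'prohibited' if the request is in either prohibition set."""
    if request in behavior_set or request in world_model:
        return "prohibited"
    return "unknown"

def terminate_and_learn(request):
    """After terminating a degrading request, update the world model."""
    world_model.add(request)

terminate_and_learn("rewrite_template_macro")
print(classify("rewrite_template_macro"))  # prohibited
print(classify("open_document"))           # unknown
```

On the next occurrence the learned behavior is caught at the world-model check rather than requiring a fresh heuristic assessment.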
- behavioral subsystem 200 allows the activity to continue, preferably under a heightened monitoring status.
- One example of such a command would be the execution of a word processing program which initiates a call to open a template.
- the template may be considered a protected resource and, therefore, executing the word processing program affects a protected resource, but does not necessarily approach a prohibited operation.
- the heightened monitoring status requires tracking the request to the protected resource. Thus, if necessary, changes or modifications to the protected resource can be reversed.
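The tracking that supports this heightened monitoring status could be sketched as a simple change journal; the journal structure and resource names are assumptions, as the patent only requires that changes to a protected resource be reversible.

```python
# Hypothetical sketch of heightened monitoring: record a protected
# resource's state before a tracked request modifies it, so the change
# can later be reversed if it turns out to be undesirable.

journal = {}  # resource name -> content before the first tracked change

def track_change(resources, name, new_content):
    journal.setdefault(name, resources[name])  # keep earliest snapshot
    resources[name] = new_content

def reverse_change(resources, name):
    """Undo tracked modifications, restoring the original content."""
    if name in journal:
        resources[name] = journal.pop(name)

resources = {"normal.dot": "original template"}
track_change(resources, "normal.dot", "macro-infected template")
reverse_change(resources, "normal.dot")
print(resources["normal.dot"])  # original template
```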
- FIG. 2A shows in greater detail the operation of the adaptive heuristic algorithm 250 within behavioral subsystem 200.
- Modality 201 provides header information for incoming system request 202.
- the header information contained in modality 201 provides the threshold information for determination of whether there is an indication of an approach to a known prohibited operation in request 202.
- By analogy to a burglar alarm, the modality preferably exists in three different states: home, stay and away.
- Stay mode, for example, may permit internal movement but prohibit external-to-internal movement.
- The final mode, which provides the highest level of awareness, is away. In this mode, all movements are preferably prohibited, whether external-to-internal or solely internal.
- Home mode defines a level of awareness of behavioral subsystem 200 that indicates a software modification or software installation process. While in this state, behavioral subsystem 200 allows the user to install or update system software. Behavioral subsystem 200 continues to safeguard system resources that should not be affected by a standard install/update process. Behavioral subsystem 200 can be informed of this state by a direct user command or by some other suitable process.
- The state of heightened awareness may be implemented by tracking any changes made to a template by a user. If the user destroys the template or modifies it in some undesirable fashion, or, in the alternative, if a macro virus destroys a template, the tracking function of the heightened awareness may be used to reverse the effects of the undesirable behavior.
- Stay mode defines a level of awareness of behavioral subsystem 200 that may preferably indicate standard user activity. Specifically, this level of awareness is designed to protect the system when the user is connected to an external resource -- e.g., the Internet. During this state, a heightened awareness preferably surrounds all critical system resources as well as user-specified files. In this state, behavioral subsystem 200 pays particular attention to behavior that is considered inappropriate or inconsistent.
- Away mode defines a level of awareness of behavioral subsystem 200 that may preferably indicate a condition of non-user activity. This mode represents behavioral subsystem 200's highest state of awareness. While in this mode, behavioral subsystem 200 lowers the threshold of determination of prohibited behavior in order to prevent system modifications that would lead to a degradation of system integrity.
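The three modalities described above can be sketched as awareness levels with associated detection thresholds; the numeric values are purely illustrative assumptions, chosen so that away mode, with the lowest threshold, judges the most requests as suspect.

```python
# Hypothetical mapping of the three modalities to detection thresholds.
# Lowering the threshold (as in away mode) makes the behavioral subsystem
# stricter: more requests are judged to approach prohibited behavior.
from enum import Enum

class Modality(Enum):
    HOME = 0.8   # install/update activity tolerated
    STAY = 0.5   # standard user activity, external connections possible
    AWAY = 0.2   # no user present; highest state of awareness

def is_suspect(score, modality):
    """A request's assessed score meets the modality's threshold."""
    return score >= modality.value

print(is_suspect(0.6, Modality.HOME))  # False: tolerated during install
print(is_suspect(0.6, Modality.AWAY))  # True: stopped when user is away
```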
- FIG. 2A shows a flow chart depicting the operation of the adaptive heuristic algorithm 250 within behavioral subsystem 200.
- the algorithm is used by behavioral subsystem 200 to determine whether a request indicates an approach to prohibited behavior.
- This algorithm operates to classify incoming requests based on the subsystem from which the request came, the weight — i.e., the particular nature — of the request 253, and the present modality 254.
- The algorithm then employs delimiters of significance, as known to those skilled in the art, or other suitable processes, such as the S-curve, to determine whether the request represents an approach to prohibited behavior.
- The modality sets the threshold for determining at what point on the S-curve an approach to prohibited behavior is determined to have occurred 256.
- During home mode, for example, certain requests — e.g., certain registry requests required by a software install/update — will be permitted, while during away mode those same requests will be terminated. If an approach to a prohibited operation is indicated, the request is terminated and the world model is preferably updated 257. If the request is determined not to approach a prohibited operation, the activity is permitted 258.
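The weighted S-curve assessment might be sketched as follows; the logistic parameters, operation weights and per-modality thresholds are all assumptions, since the patent names the S-curve but leaves its calibration open.

```python
# Hypothetical sketch of the S-curve assessment: a request's weight is
# mapped through a logistic (S-shaped) curve and compared against a
# modality-dependent threshold. All numeric values are illustrative.
import math

WEIGHTS = {"open": 1, "read": 1, "write": 3, "modify": 4, "create": 2}
THRESHOLDS = {"home": 0.9, "stay": 0.7, "away": 0.4}  # away is strictest

def s_curve(weight, midpoint=2.5, steepness=1.5):
    """Logistic curve mapping a request weight to a score in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-steepness * (weight - midpoint)))

def approaches_prohibited(operation, modality):
    return s_curve(WEIGHTS[operation]) >= THRESHOLDS[modality]

print(approaches_prohibited("modify", "away"))  # True: terminated
print(approaches_prohibited("read", "stay"))    # False: permitted
```

Note how the same operation can fall on opposite sides of the decision depending on the modality, mirroring the install/update example above.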
- FIG. 3 shows the environmental integrity subsystem 300. This subsystem monitors registry action request 301 and passes the request to the behavioral subsystem 200 preferably only if the request is to perform an action that affects a protected resource 302. Otherwise, the request is passed through environmental integrity subsystem 300 and the activity is continued 305.
- When environmental integrity subsystem 300 passes a request to behavioral subsystem 200, the behavioral subsystem performs the analysis described above with reference to FIGS. 2 and 2A. A portion of the analysis performed by behavioral subsystem 200 includes assigning a weight to the requested operation based on the nature of the operation — e.g., open, read, write, modify and create. This weight is used by behavioral subsystem 200 as one of the criteria in the assessment of whether the request approaches a prohibited behavior.
- the environmental integrity subsystem monitors registry action requests 301 and passes the information to the behavioral subsystem as required.
- Behavioral subsystem 200 receives the registry action request along with the modality 304 preferably only if the action request affects a protected resource 302. Behavioral subsystem 200 then assigns an associated "weight" to the requested activity on the registry, including: open, read, write, modify and create. If the action request does not affect a protected resource, then the action is permitted to continue 303.
- FIG. 4 shows the access control subsystem 400.
- Access control subsystem 400 monitors access control action requests 402 — e.g., a request to change modalities or a request to change configuration settings — and passes the information to behavioral subsystem 200 (along with the modality 201) preferably only if the action request affects a protected resource 403. The behavioral subsystem then assigns a weight to the requested activity including: open, read, write, modify and create. If the request does not affect a protected resource, then the action request is permitted to continue 404.
- FIG. 5 shows the external connectivity subsystem 500.
- External connectivity subsystem 500 monitors external connectivity action requests 502 -- e.g., an example of an internal to external request is a user request to access a web-page; and an example of an external to internal request is a web-page trying to access the "cookie" or other information from the user system — and assesses whether the action request is internally or externally generated 503. If the subsystem determines the action request to be externally generated, the subsystem preferably then assesses whether the external request is an answer to an internal connectivity action request 504. If the external connectivity subsystem determines the external request is not in response to an internal request, the subsystem then preferably updates the world model to include the behavior as a prohibited operation and terminates the activity 506. If the external request is in response to an internal request, the activity is continued.
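The solicited-response check for externally generated requests can be sketched with a set of outstanding internal requests; the matching key is an assumption, as the patent does not say how an external answer is correlated with the internal request that solicited it.

```python
# Hypothetical sketch of the external-connectivity rule: an externally
# generated request is admitted only if it answers an outstanding
# internally generated request; otherwise it is treated as unsolicited.

pending_internal = set()

def open_internal_request(key):
    """Record an internal request, e.g. the user navigating to a page."""
    pending_internal.add(key)

def admit_external(key):
    """Admit an external request only as an answer to an internal one."""
    if key in pending_internal:
        pending_internal.discard(key)
        return True    # solicited: the activity is continued
    return False       # unsolicited: terminate and update the world model

open_internal_request("example.org")
print(admit_external("example.org"))   # True: answers the user's request
print(admit_external("intruder.net"))  # False: unsolicited external request
```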
- If the subsystem determines the action request to be internally generated, it passes the information to behavioral subsystem 200, which assigns a weight to the requested socket communications, including: socket listen, set local socket, remote socket connect open/close requests and local socket connect open/close requests, and allows the activity to continue.
- FIG. 6 shows the Man-Machine Interface (MMI) subsystem 600.
- MMI subsystem 600 preferably is configured to receive input from a Keyboard 601, Mouse 602, removable media 603, microphone 604, stylus or touch-screen 605 or other input device — e.g., Joystick -- 615, or any combination of these devices.
- MMI subsystem 600 monitors the current user input in order to determine user activity/inactivity via a timing mechanism 607. If MMI subsystem 600 determines that there has been no recent user activity 609, it adjusts the modality of behavioral subsystem 200 to away 610.
- In away mode, when MMI subsystem 600 detects external stimuli 611, it preferably demands user authentication 612. If the authentication fails 613, the system remains in away mode. If the demand is answered correctly 614, behavioral subsystem 200's modality is then shifted to stay 606 and standard user activity is permitted.
- When subsystem timing mechanism 607 senses external stimuli 608, it preferably maintains the modality at stay 606. It then passes the information to behavioral subsystem 200, which monitors user activity.
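The timing mechanism's modality transitions might be sketched as a small state function; the timeout value is an assumption, as the patent does not specify how long the subsystem waits before declaring the user inactive.

```python
# Hypothetical sketch of the MMI timing mechanism: with no recent input
# the modality shifts to "away"; external stimuli in away mode demand
# user authentication before the modality returns to "stay".

INACTIVITY_TIMEOUT = 300.0  # seconds; illustrative value only

def next_modality(modality, seconds_since_input, authenticated=False):
    if modality == "stay" and seconds_since_input >= INACTIVITY_TIMEOUT:
        return "away"                        # no recent user activity
    if modality == "away":
        # External stimuli demand authentication; failure keeps away mode.
        return "stay" if authenticated else "away"
    return modality

print(next_modality("stay", 600))                     # away
print(next_modality("away", 0, authenticated=False))  # away
print(next_modality("away", 0, authenticated=True))   # stay
```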
- FIG. 7 shows the file-activity request subsystem 700.
- File-activity request subsystem 700 receives a properly gated request in the form of a file resource request 702 — e.g., for opening, modifying, creating or deleting a file — and passes the information to behavioral subsystem 200 as required 706.
- Behavioral subsystem 200 receives the file resource request along with modality 201 and, when appropriate, allows the file resource to execute, thus becoming an active process 704.
- An example of an inappropriate file resource request is erasing win.ini and replacing it with a new win.ini.
- Behavioral subsystem 200 watches the currently running, spawning — i.e., the creating of files by the currently running file — and terminating processes and assigns a weight to the request. The possible weights may include spawn and terminate process. If the additional resource request is assessed to be potentially degrading by behavioral subsystem 200, the request is denied and the parent process (including children) is terminated 711. If behavioral subsystem 200 determines that no potential system degradation exists in the request, it allows continued activity 710, including the spawning and termination of processes.
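The parent-child termination used in this reaction can be sketched over a toy process table; the process names are hypothetical and the table stands in for real operating-system process handling.

```python
# Hypothetical sketch: terminate a degrading parent process together
# with everything it spawned, following the parent-child relationship.

children = {
    "installer.exe": ["dropper.exe", "payload.exe"],
    "dropper.exe": [],
    "payload.exe": ["worm.exe"],
    "worm.exe": [],
}

def terminate_tree(name, terminated=None):
    """Terminate a process and, recursively, all of its children."""
    if terminated is None:
        terminated = []
    terminated.append(name)
    for child in children.get(name, []):
        terminate_tree(child, terminated)
    return terminated

print(terminate_tree("installer.exe"))
# ['installer.exe', 'dropper.exe', 'payload.exe', 'worm.exe']
```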
Landscapes
- Engineering & Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Software Systems (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Virology (AREA)
- Storage Device Security (AREA)
Abstract
A behaviorally-based computer security system is provided. The system protects a data-processing system against viruses or other harmful computer programs, requests or instructions without requiring constant updates. The system also protects against newly-created viruses.
Description
BEHAVIORALLY-BASED COMPUTER SECURITY SYSTEM
Background of the Invention
This invention relates to a system and method for providing security for a computer system. Specifically, this invention relates to protecting a data processing system from operations that degrade the data processing system.
Existing computer security systems generally require prior knowledge of a potentially degrading virus (or other harmful computer program, command or instruction) in order to protect against the virus. Typically, this prior knowledge takes the form of a signature that identifies the program, command or instruction as a harmful behavior. To identify and destroy the harmful behavior, the security system is substantially constantly updated with the newest signatures. Thereafter, the security system monitors all files, including incoming files and resident files, for any identified signatures, and when the system identifies such files, it destroys them.
However, existing security systems require constant updates to identify new harmful behavior, and, also, do not guard against harmful behavior of which the security systems are unaware — i.e., newly-created harmful behavior.
It would be desirable to provide a computer security system that protects against harmful behaviors without requiring constant updates.
It would also be desirable to provide a computer security system that protects against newly-created harmful behavior.
Summary of the Invention
Therefore, it is an object of this invention to provide a computer security system that protects against harmful behaviors without requiring constant updates.
It is also an object of this invention to provide a computer security system that protects against newly-created harmful behaviors.
A method for protecting a data processing system from a prohibited operation is provided. The data processing system includes an operating system.
The method preferably requires substantially continually monitoring the operating system for any request from any processor subsystem, determining whether the request involves a prohibited operation, and, if the request involves a prohibited operation, adhering to a predetermined reaction protocol to protect the data processing system from the request.
If the request is determined not to involve a prohibited operation, the method then determines whether the request indicates an approach to a prohibited operation — i.e., the request is not consistent with the current chain of events, or is inappropriate in view of the current chain of events, and the request may potentially degrade the data processing system. If the request indicates an approach to a prohibited operation, the method preferably requires assessing on a scale whether the approach potentially degrades the data processing system. If the approach is assessed as sufficiently potentially degrading to the data processing system, then the method requires adhering to a predetermined reaction protocol to protect the data processing system from the request.
Finally, if the approach is assessed as one that would not degrade the data processing system, the request is passed on and the requested activity is allowed to proceed.
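The decision sequence summarized above can be sketched as a small dispatcher; the operation names, the threshold value and the notion of a single "approach score" are assumptions, since the patent specifies the method's behavior rather than an implementation.

```python
# Illustrative sketch of the claimed decision sequence: known prohibited
# operations trigger the reaction protocol immediately; otherwise an
# approach to a prohibited operation is assessed on a scale.

PROHIBITED = {"overwrite_boot_sector"}   # hypothetical known prohibition
DEGRADATION_THRESHOLD = 0.5              # assumed cutoff on the scale

def handle_request(operation, approach_score):
    """Return 'react' to invoke the reaction protocol, else 'allow'."""
    if operation in PROHIBITED:
        return "react"                   # known prohibited operation
    if approach_score is None:
        return "allow"                   # no approach indicated
    if approach_score >= DEGRADATION_THRESHOLD:
        return "react"                   # sufficiently potentially degrading
    return "allow"                       # request passed on to proceed

print(handle_request("overwrite_boot_sector", None))  # react
print(handle_request("read_file", 0.9))               # react
print(handle_request("read_file", None))              # allow
```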
Brief Description of the Drawings
The above and other objects and advantages of the invention will be apparent upon consideration of the following detailed description, taken in conjunction with the accompanying drawings, in which like reference characters refer to like parts throughout:
FIG. 1 is a block diagram of a preferred embodiment of a system according to the invention;
FIG. 2 is a block diagram of the behavioral subsystem of the system of FIG. 1 according to the invention;
FIG. 2A is a block diagram of a preferred embodiment of the adaptive heuristic algorithm within the behavioral subsystem of FIG. 2 according to the invention;
FIG. 3 is a block diagram of a preferred embodiment of the environmental integrity subsystem of the system of FIG. 1 according to the invention;
FIG. 4 is a block diagram of a preferred embodiment of the access control subsystem of the system of FIG. 1 according to the invention;
FIG. 5 is a block diagram of a preferred embodiment of the external connectivity subsystem of the system of FIG. 1 according to the invention;
FIG. 6 is a block diagram of a preferred embodiment of the man-machine interface (MMI) subsystem of the system of FIG. 1 according to the invention; and
FIG. 7 is a block diagram of a preferred embodiment of the file-activity request subsystem of the system of FIG. 1 according to the invention.
Detailed Description of the Invention
A system and method according to the invention is provided that protects the computer system from known and unknown viruses as well as other unauthorized intrusions and disruptions. To accomplish these tasks, the system utilizes a behavioral subsystem that operates by safeguarding against a certain baseline of known prohibited behavior. This baseline, hereinafter referred to as the behavior set, is preprogrammed into the behavioral subsystem prior to installation. In addition, the behavioral subsystem also has an ability to modify its database of prohibited behavior by experience in order to adapt to new or unusual circumstances, thereby safeguarding against heretofore unrecognized, yet potentially degrading behavior, while continuing to safeguard against the baseline of known prohibited behavior. Any new learned behavior is added to the then current behavior view to yield an advanced behavior view, hereafter referred to as a world view. To determine whether new circumstances represent danger to the data-processing system, and are therefore part of the world view, the behavioral subsystem employs an adaptive heuristic algorithm.
The system according to the invention operates as follows: First, the requests that are received by a system according to the invention are referred to individual subsystems. The specific subsystem to which a request is referred depends on the origination of the request -- e.g., a request for a file operation is referred to the file-activity subsystem. The individual subsystem preferably screens the incoming requests to determine whether the requests are directed toward a protected resource. If they are directed toward a protected resource, the subsystems pass the request to the behavioral subsystem for analysis.
The behavioral subsystem uses the adaptive heuristic algorithm to monitor and analyze potentially system-degrading input requests. The adaptive heuristic algorithm weights the request according to the operation requested — e.g., write, read, modify, create etc. — and analyzes those requests in order to determine whether the requests represent an approach to prohibited behavior.
The algorithm preferably uses a modality of the behavioral subsystem -- i.e., a determination of the current circumstances which provides a threshold of what activity is permitted — together with the weight of the request to determine whether the request should be allowed to pass to the operating system, or be terminated. The modality of the system is set by monitoring the most recent user actions to determine whether the present request is appropriate in light of the current user status. Thus, an unrecognized request may preferably be terminated when it is either unsolicited by the user, or otherwise inconsistent with the current pattern of events.
FIG. 1 shows a block diagram of a preferred embodiment of a data-processing system 100 according to the invention. Each subsystem (environmental integrity subsystem 300, access control subsystem 400, external connectivity subsystem 500, man-machine subsystem 600 and file-activity subsystem 700) provides requests to the operating system 900. In a system according to the invention, each of the subsystems may also provide selected requests to the behavioral subsystem 200. The requests are passed to behavioral subsystem 200 when the subsystem determines that the requests are directed to a protected resource — e.g., the win.ini file, the autoexec.bat file, and similarly other files that are required for configuring or maintaining the system.
Behavioral subsystem 200 continually monitors and analyzes any request that it receives using adaptive heuristic algorithm 250.
FIG. 2 shows the process of analysis used by behavioral subsystem 200 to determine the nature of request 202. First, behavioral subsystem 200 analyzes each request 202 to determine whether it is a request that exists in the behavior set 203.
This set contains known prohibited operations — i.e., operations that contradict basic system requirements for either sustainability or integrity — e.g., operations that degrade or destroy a protected resource, as described above. Such operations represent a danger to data-processing system 100 that is constant over time and is considered static in nature. If request 202 is determined to be for a known prohibited operation, then behavioral subsystem 200 requires adherence to a predetermined reaction protocol 205 to protect data-processing system 100 from request 202. Examples of reactions in such a reaction protocol preferably include terminating any files with a parent-child relationship to the requesting file — i.e., terminating any files associated with producing such a request, and, preferably, also, where applicable, any files that communicated with the requesting file.
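The parent-child termination example in the reaction protocol can be sketched as a traversal of a spawning relationship. The relationship map here is a hypothetical data structure standing in for whatever bookkeeping the system maintains:

```python
# Illustrative sketch of the reaction protocol's parent-child rule: given the
# file that issued a known-prohibited request, collect that file plus every
# file related to it by a parent-child (spawning) relationship so that all of
# them can be terminated. The `children` map is an assumed data structure.

def related_files(origin: str, children: dict[str, list[str]]) -> set[str]:
    """Collect the origin file and all of its descendants."""
    to_terminate, stack = set(), [origin]
    while stack:
        f = stack.pop()
        if f not in to_terminate:
            to_terminate.add(f)
            stack.extend(children.get(f, []))
    return to_terminate
```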
If request 202 is not for a known prohibited operation, behavioral subsystem 200 analyzes request 202 to determine whether it is for an operation that is part of the world model 204. (The world model includes a program, process or series of
processes that is determined during the present user session to be a prohibited operation).
If request 202 is neither for a known prohibited operation nor for an operation that is part of the world model (or is for a known permitted operation, which is always permitted to be fulfilled), behavioral subsystem 200 analyzes whether there is an indication in the request of an approach to a prohibited operation 206. If there is such an indication, behavioral subsystem 200 then determines whether request 202 potentially degrades data processing system 100. This process will be explained in greater detail with respect to FIG. 2A in which adaptive heuristic algorithm 250 used by behavioral subsystem 200 is described.
If request 202 is determined by behavioral subsystem 200 to potentially degrade data-processing system 100, behavioral subsystem 200 adheres to a predefined behavioral reaction 208 as in the case of a known prohibited behavior, and preferably terminates the request and deletes the file that requested it. Thereafter, behavioral subsystem 200 preferably updates the world model to include the terminated request as representative of a prohibited behavior 210.
If request 202 is not for a known prohibited operation, nor for an operation defined as prohibited in the world view, nor indicative of an approach to prohibited behavior, as shown in 207 and 209, behavioral subsystem 200 allows the activity to continue, preferably under a heightened monitoring status. One example of such a command would be the execution of a word processing program which initiates a call to open a template. The template may be considered a protected resource and, therefore, executing the word processing program affects a protected resource, but does not necessarily approach a prohibited operation. The heightened monitoring status requires tracking the request to the protected resource. Thus, if necessary, changes or modifications to the protected resource can be reversed.
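One way to make changes to a protected resource reversible, as the heightened monitoring status requires, is to snapshot the resource before a tracked request touches it. This is a minimal sketch under that assumption; the class and its storage are hypothetical:

```python
# Minimal sketch of reversible change tracking under heightened monitoring:
# record the pre-change content of a protected resource (e.g., a template)
# so an undesirable modification can later be rolled back.

class ChangeTracker:
    def __init__(self) -> None:
        self._snapshots: dict[str, str] = {}  # resource name -> original content

    def track(self, resource: str, current_content: str) -> None:
        # Keep only the first snapshot, so rollback restores the original state.
        self._snapshots.setdefault(resource, current_content)

    def rollback(self, resource: str) -> str:
        """Return the pre-change content so the modification can be undone."""
        return self._snapshots[resource]
```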
FIG. 2A shows in greater detail the operation of the adaptive heuristic algorithm 250 within behavioral subsystem 200. Before describing the adaptive heuristic algorithm 250, however, it is important to describe the function of modality 201 in behavioral subsystem 200. (The determination of the mode of behavioral subsystem 200 is described in more detail below with reference to FIG. 6.)
Modality 201, as shown in FIG. 2, provides header information for incoming system request 202. The header information contained in modality 201 provides the threshold information for determination of whether there is an indication of an approach to a known prohibited operation in request 202. The modality preferably exists in three different states: home, stay and away.
These modes are analogous to the modes of a burglar alarm system, each of which indicates a particular level of alarm awareness. In a burglar alarm, home indicates that the alarm should be at its lowest level of awareness because the user is "home". Stay indicates a higher level of awareness. This level is typically associated with a circumstance where the user is home, but still desires protection from intruders.
One implementation of Stay mode may be to permit internal movement, but to prohibit external-to-internal movement. The final burglar alarm mode which provides the highest level of awareness is Away. In this mode, all movements are preferably prohibited, whether external-to-internal or solely internal. The analogy will be further understood in light of the following explanation of the operation of the modalities of a computer security system according to the invention.
Home mode defines a level of awareness of behavioral subsystem 200 that indicates a software modification or software installation process. While in this state, behavioral subsystem 200 allows the user to install or update system software. Behavioral subsystem 200 continues to safeguard system resources that should not be affected by a standard install/update process. Behavioral subsystem 200 can be informed of this state by a direct user command or by some other suitable process. In home mode, the state of heightened awareness may be implemented by tracking any changes made to a template by a user. If the user destroys the template, or modifies it in some undesirable fashion, or in the alternative, if a macro virus destroyed a template, the tracking function of the heightened awareness may be used to reverse the effects of the undesirable behavior.
Stay mode defines a level of awareness of behavioral subsystem 200 that may preferably indicate standard user activity. Specifically, this level of awareness is designed to protect the system when the user is connected to an external resource -- e.g., the Internet. During this state, a heightened awareness preferably surrounds all critical system resources as well as user-specified files. In this state, behavioral subsystem 200 pays particular attention to behavior that is considered inappropriate or inconsistent.
Away mode defines a level of awareness of behavioral subsystem 200 that may preferably indicate a condition of non-user activity. This mode represents behavioral subsystem 200's highest state of awareness. While in this mode, behavioral subsystem 200 lowers the threshold of determination of prohibited behavior in order to prevent system modifications that would lead to a degradation of system integrity.
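The three modality states described above can be sketched as an ordered enumeration, where a higher value corresponds to a higher level of awareness. The ordering values are assumptions chosen to mirror the home/stay/away progression:

```python
# Sketch of the three modality states and their relative awareness levels.
# The numeric ordering is an assumption: home is the lowest awareness
# (install/update), away is the highest (no user present).

from enum import IntEnum

class Modality(IntEnum):
    HOME = 1  # software install/update by the user
    STAY = 2  # standard user activity, e.g., connected to the Internet
    AWAY = 3  # non-user activity: highest awareness, strictest screening

def stricter(a: Modality, b: Modality) -> Modality:
    """Pick the more restrictive of two modalities."""
    return max(a, b)
```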
FIG. 2A shows a flow chart depicting the operation of the adaptive heuristic algorithm 250 within behavioral subsystem 200. The algorithm is used by behavioral subsystem 200 to determine whether a request indicates an approach to prohibited behavior. This algorithm operates to classify incoming requests based on the subsystem from which the request came, the weight -- i.e., the particular nature -- of the request 253 and the present modality 254. The algorithm then employs delimiters of significance, as known to those skilled in the art, or other suitable processes, such as the S-curve, to determine whether the request represents an approach to prohibited behavior. The modality sets the threshold for determining at what point on the S-curve an approach to prohibited behavior is determined to have occurred 256. For example, during home mode, certain requests -- e.g., certain registry requests required by a software install/update -- will be permitted, while during away mode those same requests will be terminated. If an approach to a prohibited operation is indicated, the request is terminated and the world model is preferably updated 257. If the request is determined not to approach a prohibited operation, the activity is permitted 258.
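One plausible reading of the S-curve step is a logistic function mapping the request's weight to a score in (0, 1), compared against a modality-dependent threshold. The midpoint and steepness values below are illustrative assumptions, not parameters disclosed by the invention:

```python
# Sketch of an S-curve (logistic) classification: low weights score near 0,
# high weights near 1. The modality supplies the threshold at which an
# approach to prohibited behavior is deemed to have occurred; a lower
# threshold is stricter. Midpoint and steepness are assumed values.

import math

def s_curve(weight: float, midpoint: float = 5.0, steepness: float = 1.0) -> float:
    """Logistic score of a request weight."""
    return 1.0 / (1.0 + math.exp(-steepness * (weight - midpoint)))

def approaches_prohibited(weight: float, threshold: float) -> bool:
    """True if the scored weight crosses the modality's threshold."""
    return s_curve(weight) >= threshold
```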
FIG. 3 shows the environmental integrity subsystem 300. This subsystem monitors registry action request 301 and passes the request to the behavioral subsystem 200 preferably only if the request is to perform an action that affects a protected resource 302. Otherwise, the request is passed through environmental integrity subsystem 300 and the activity is continued 305.
When environmental integrity subsystem 300 passes a request to the behavioral subsystem 200, the behavioral subsystem 200 performs the analysis described above with reference to FIGS. 2 and 2A. A portion of the analysis performed by behavioral subsystem 200 includes assigning a weight to the requested operation based on the nature of the operation — e.g., open, read, write, modify and create. This weight is used by behavioral subsystem 200 as one of the criteria in the assessment of whether the request approaches a prohibited behavior.
The environmental integrity subsystem monitors registry action requests 301 and passes the information to the behavioral subsystem as required. Behavioral subsystem 200 receives the registry action request along with the modality 304 preferably only if the action request affects a protected resource 302. Behavioral subsystem 200 then assigns an associated "weight" to the requested activity on the registry, including: open, read, write, modify and create. If the action request does not affect a protected resource, then the action is permitted to continue 303.
FIG. 4 shows the access control subsystem 400. Access control subsystem 400 monitors access control action requests 402 — e.g., a request to change modalities or a request to change configuration settings — and passes the information to behavioral subsystem 200 (along with the modality 201) preferably only if the action request affects a protected resource 403. The behavioral subsystem then assigns a weight to the requested activity including: open, read, write, modify and create. If the request does not affect a protected resource, then the action request is permitted to continue 404.
FIG. 5 shows the external connectivity subsystem 500. External connectivity subsystem 500 monitors external connectivity action requests 502 -- e.g., an example of an internal to external request is a user request to access a web-page; and an example of an external to internal request is a web-page trying to access the "cookie" or other information from the user system — and assesses whether the action request is internally or externally generated 503. If the subsystem determines the action request to be externally generated, the subsystem preferably then assesses whether the external request is an answer to an internal connectivity action request 504. If the external connectivity subsystem determines the external request is not in response to an internal request, the subsystem then preferably updates the world model to include the behavior as a prohibited operation and terminates the activity 506. If the external request is in response to an internal request, the activity is continued.
If the external connectivity action request 502 is determined to be an internal request, the external connectivity subsystem passes the information to behavioral subsystem 200, which assigns a weight to the requested socket communications, including: socket listen, set local socket, remote socket connect open/close requests, and local socket connect open/close requests, and allows the activity to continue.
If the external connectivity subsystem determines the external action request is an answer to an internal request, it permits continued activity 508 under a state of heightened awareness.
FIG. 6 shows the Man-Machine Interface (MMI) subsystem 600. MMI subsystem 600 preferably is configured to receive input from a Keyboard 601, Mouse 602, removable media 603, microphone 604, stylus or touch-screen 605 or other input device — e.g., Joystick — 615, or any combination of these devices. MMI subsystem 600 monitors the current user input in order to determine user activity/inactivity via a timing mechanism 607. If MMI subsystem 600 determines that there has been no recent user activity 609, it adjusts the modality of behavioral subsystem 200 to away 610. In away mode, when MMI subsystem 600 detects external stimuli 611, it preferably demands user authentication 612. If the authentication fails 613, the system remains in away mode. If the demand is answered correctly 614, behavioral subsystem 200's modality is then shifted to stay 606 and standard user activity is permitted.
As long as the subsystem timing mechanism 607 senses external stimuli 608, it preferably maintains the modality at stay 606. It then passes the information to behavioral subsystem 200, which monitors user activity.
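The timing-driven transitions between stay and away modes described above can be sketched as a small state machine. The idle timeout is an assumed value, and the function names are hypothetical:

```python
# Hypothetical sketch of the MMI timing/authentication transitions: sustained
# inactivity moves the system to away mode; in away mode, external stimuli
# demand user authentication before the modality returns to stay.

IDLE_TIMEOUT = 300  # seconds of inactivity before switching to away (assumed)

def next_modality(current: str, idle_seconds: int,
                  stimulus: bool, authenticated: bool) -> str:
    if current == "stay" and idle_seconds >= IDLE_TIMEOUT:
        return "away"
    if current == "away" and stimulus:
        # External stimuli in away mode trigger an authentication demand.
        return "stay" if authenticated else "away"
    return current
```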
Home mode 617 is preferably activated by user command. It may be implemented when the user is in stay mode, as shown by user challenge authentication 616. If the user fails to correctly respond to user authentication 616, the system may preferably be shifted to away mode along path 613.

FIG. 7 shows the file-activity request subsystem 700. File-activity request subsystem 700 receives a properly gated request in the form of a file resource request 702 -- e.g., for opening, modifying, creating or deleting a file -- and passes the information to behavioral subsystem 200 as required 706. Behavioral subsystem 200 receives the file resource request along with modality 201 and, when appropriate, allows the file resource to execute, thus becoming an active process 704. An example of an inappropriate file resource request is erasing win.ini and replacing it with a new win.ini.
Upon an additional resource request 705 from the active process 704, the request is again passed to behavioral subsystem 200. Behavioral subsystem 200 watches the currently running, spawning -- i.e., the creating of files by the currently running file -- and terminating processes and assigns a weight to the request. Possible weights include spawn process and terminate process. If the additional resource request is assessed to be potentially degrading by behavioral subsystem 200, the request is denied and the parent process (including its children) is terminated 711.
If behavioral subsystem 200 determines that no potential system degradation exists in the request, it allows continued activity 710, including spawning and termination of processes.
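The spawn-watching step above can be sketched by recording parent-child relationships as they arise, so that a degrading request can be answered by terminating the parent together with its children. The class and its process table are hypothetical stand-ins for the subsystem's bookkeeping:

```python
# Sketch of watching spawning processes: record each parent-child
# relationship as it occurs, and, when a request is assessed as potentially
# degrading, list the parent and all descendants for termination.

class SpawnWatcher:
    def __init__(self) -> None:
        self.children: dict[str, list[str]] = {}

    def on_spawn(self, parent: str, child: str) -> None:
        self.children.setdefault(parent, []).append(child)

    def terminate_tree(self, parent: str) -> list[str]:
        """List the parent and all descendants to terminate, parent first."""
        order, queue = [], [parent]
        while queue:
            p = queue.pop(0)
            order.append(p)
            queue.extend(self.children.get(p, []))
        return order
```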
Thus, a behaviorally based computer security system is provided. Persons skilled in the art will appreciate that the present invention can be practiced by other than the described embodiments, which are presented for purposes of illustration rather than of limitation, and the present invention is limited only by the claims which follow.
Claims
1. A method for protecting a data processing system, said data processing system having an operating system, said method comprising: continually monitoring said operating system for any request from a processor subsystem; determining whether said request involves a prohibited operation; if said request is determined to involve a prohibited operation, adhering to a predetermined reaction to protect said data processing system from said request; if said request is determined not to involve a prohibited operation, determining whether there is an indication in said request of an approach to a prohibited operation; if there is an indication in said request of an approach to a prohibited operation, assessing said approach on a scale to determine whether said approach potentially degrades said data processing system; if said approach is assessed as potentially degrading to said data processing system, adhering to said predetermined reaction to protect said operating system from said request; and if said approach is assessed as one that would not degrade said data processing system, allowing said request to be fulfilled.
2. The method of claim 1, wherein said assessing further comprises determining whether said request comprises an inconsistent request to said operating system.
3. The method of claim 1, wherein said assessing further comprises determining whether said request comprises a user-unsolicited request to said operating system which potentially degrades said operating system.
4. The method of claim 1, wherein said monitoring further comprises determining a security mode of a user interface.
5. The method of claim 4, wherein said determining said security mode comprises determining that said user interface is one of a group of modes consisting of a program install/update mode, a standard activity mode and a non-user activity mode.
6. The method of claim 1, if said approach is assessed as potentially degrading to said data processing system further comprising classifying said request as a prohibited operation.
7. The method of claim 1, wherein the allowing said request to be fulfilled comprises allowing said request to be fulfilled under heightened-awareness status.
8. The method of claim 1, wherein the determining whether there is an indication in said request of an approach to a prohibited operation comprises determining whether there is an indication in said request and in events immediately preceding said request of an approach to a prohibited operation.
9. A security system for protecting a data processing system, said data processing system having an operating system, said security system comprising: a monitor that continuously intercepts and analyzes any request to said operating system to determine whether said request involves a prohibited operation; a first adherence mechanism which invokes a predetermined operational reaction to protect said data processing system from said request when said request is determined to involve a prohibited operation; an assessment mechanism which assesses said request on a scale to determine whether said request potentially degrades said data processing system when a request is not a prohibited operation; a second adherence mechanism which invokes said predetermined operational reaction to protect said data processing system from said request when it is assessed as one that potentially degrades said data processing system; and a transmission mechanism, which transmits said request to said operating system when said request is not assessed as potentially degrading to said data processing system.
10. The security system of claim 9, wherein said assessment mechanism further assesses said request to determine whether said request comprises an inconsistent request to said operating system.
11. The security system of claim 9, wherein said assessment mechanism further assesses said request to determine whether said request comprises a user- unsolicited request to said operating system.
12. The security system of claim 9, wherein said system functions in one security mode of a group of modes consisting of program install/update mode, standard activity mode and non-user activity mode.
13. The security system of claim 9, wherein said system further comprises a security mode mechanism for determining said security mode of said system.
14. The security system of claim 9, wherein said system further comprises a classification mechanism that classifies as a bad operation any request assessed by said assessment mechanism as a request that potentially degrades said data processing system.
15. The security system of claim 9, wherein said transmission mechanism monitors said request under a heightened awareness after it transmits said request.
16. The security system of claim 9, wherein said assessment mechanism assesses said request and the events immediately preceding said request to determine whether said request potentially degrades said data processing system.
17. A method for protecting a data processing system, said data processing system having an operating system, said method comprising: continually monitoring said operating system for any request; determining whether said request involves a prohibited operation; if said request is determined to involve a known prohibited operation, adhering to a predetermined reaction to protect said data processing system from said request; if said request is determined not to involve a known prohibited operation, determining whether said request involves a known permitted operation; if said request is determined to involve said known permitted operation, allowing said request to be fulfilled; if said request does not involve said known prohibited operation or said known permitted operation, determining whether there is an indication in said request of an approach to said known prohibited operation; if there is an indication in said request of an approach to said known prohibited operation, assessing said approach on a scale to determine whether said approach potentially degrades said data processing system; if said approach is assessed as potentially degrading to said data processing system, adhering to said predetermined reaction to protect said operating system from said request; and if said approach is assessed as one that would not degrade said data processing system, allowing said request to be fulfilled.
18. The method of claim 17, wherein said assessing further comprises determining whether said request comprises an inconsistent request to said operating system.
19. The method of claim 17, wherein said assessing further comprises determining whether said request comprises a user-unsolicited request to said operating system which potentially degrades said operating system.
20. The method of claim 17, wherein said monitoring further comprises determining a security mode of a user interface.
21. The method of claim 20, wherein said determining said security mode comprises determining that said user interface is one of a group consisting of program install/update mode, standard activity mode and non-user activity mode.
22. The method of claim 17, if said approach is assessed as potentially degrading to said data processing system, classifying said request as said known prohibited operation.
23. The method of claim 17, if said approach is assessed as not potentially degrading to said data processing system, classifying said request as said known permitted operation.
24. The method of claim 17 wherein the allowing said request to be fulfilled comprises allowing said request to be fulfilled under heightened-awareness status.
25. The method of claim 17, wherein the determining whether there is an indication in said request of an approach to a prohibited operation comprises determining whether there is an indication in said request and in events immediately preceding said request of an approach to a prohibited operation.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US499494 | 1983-05-31 | ||
US49949400A | 2000-02-07 | 2000-02-07 | |
PCT/US2001/003842 WO2001057629A2 (en) | 2000-02-07 | 2001-02-07 | Computer security system indentifying suspect behaviour |
Publications (1)
Publication Number | Publication Date |
---|---|
EP1410129A2 true EP1410129A2 (en) | 2004-04-21 |
Family
ID=23985471
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP01912701A Withdrawn EP1410129A2 (en) | 2000-02-07 | 2001-02-07 | Computer security system identifying suspect behavior |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP1410129A2 (en) |
AU (1) | AU2001241454A1 (en) |
WO (1) | WO2001057629A2 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7552473B2 (en) * | 2003-08-12 | 2009-06-23 | Symantec Corporation | Detecting and blocking drive sharing worms |
KR100897849B1 (en) | 2007-09-07 | 2009-05-15 | 한국전자통신연구원 | Abnormal process detection method and device |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5748964A (en) * | 1994-12-20 | 1998-05-05 | Sun Microsystems, Inc. | Bytecode program interpreter apparatus and method with pre-verification of data type restrictions |
AU6279896A (en) * | 1995-06-15 | 1997-01-15 | Fraudetect, L.L.C. | Process and apparatus for detecting fraud |
IL120632A0 (en) * | 1997-04-08 | 1997-08-14 | Zuta Marc | Multiprocessor system and method |
2001
- 2001-02-07 AU AU2001241454A patent/AU2001241454A1/en not_active Abandoned
- 2001-02-07 EP EP01912701A patent/EP1410129A2/en not_active Withdrawn
- 2001-02-07 WO PCT/US2001/003842 patent/WO2001057629A2/en not_active Application Discontinuation
Non-Patent Citations (1)
Title |
---|
See references of WO0157629A3 * |
Also Published As
Publication number | Publication date |
---|---|
WO2001057629A3 (en) | 2002-03-21 |
WO2001057629A9 (en) | 2002-10-31 |
WO2001057629A2 (en) | 2001-08-09 |
AU2001241454A1 (en) | 2001-08-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7516477B2 (en) | Method and system for ensuring that computer programs are trustworthy | |
US7565549B2 (en) | System and method for the managed security control of processes on a computer system | |
US8239947B1 (en) | Method using kernel mode assistance for the detection and removal of threats which are actively preventing detection and removal from a running system | |
US20110239306A1 (en) | Data leak protection application | |
US8286254B2 (en) | Behavioral learning for interactive user security | |
EP0325776A2 (en) | A trusted path mechanism for an operating system | |
MX2011000019A (en) | A system and method of data cognition incorporating autonomous security protection. | |
MXPA06001211A (en) | End user data activation. | |
KR20040101490A (en) | Detecting and countering malicious code in enterprise networks | |
CN102208004B (en) | Method for controlling software behavior based on least privilege principle | |
US7665139B1 (en) | Method and apparatus to detect and prevent malicious changes to tokens | |
CN113946825B (en) | Memory horse processing method and system | |
KR20040056998A (en) | Method and Apparatus for Detecting Malicious Executable Code using Behavior Risk Point | |
Huang et al. | A11y and Privacy don't have to be mutually exclusive: Constraining Accessibility Service Misuse on Android | |
KR20090026846A (en) | Internal and external network separation device and control method through dual independent environment | |
US20220191224A1 (en) | Method of threat detection in a threat detection network and threat detection network | |
CN109791588A (en) | Alleviate malicious action associated with graphical user-interface element | |
EP1410129A2 (en) | Computer security system identifying suspect behavior | |
US7721281B1 (en) | Methods and apparatus for securing local application execution | |
US8788845B1 (en) | Data access security | |
US8230116B2 (en) | Resumption of execution of a requested function command | |
KR102004505B1 (en) | System for real-time protection of computer storage devices using user behavior analysis and control method thereof | |
US8615805B1 (en) | Systems and methods for determining if a process is a malicious process | |
Filman et al. | SafeBots: a paradigm for software security controls | |
CN115292693A (en) | Method for enhancing node |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
17P | Request for examination filed |
Effective date: 20030203 |
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
R17P | Request for examination filed (corrected) |
Effective date: 20030127 |
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
18D | Application deemed to be withdrawn |
Effective date: 20050901 |