US20150235027A1 - Malicious code detection - Google Patents
Malicious code detection
- Publication number
- US20150235027A1 (application US14/695,789)
- Authority
- US
- United States
- Prior art keywords
- malicious code
- data
- pipeline
- bytes
- row
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/56—Computer malware detection or handling, e.g. anti-virus arrangements
- G06F21/562—Static detection
- G06F21/564—Static detection by virus signature recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/56—Computer malware detection or handling, e.g. anti-virus arrangements
- G06F21/566—Dynamic detection, i.e. detection performed at run-time, e.g. emulation, suspicious activities
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/56—Computer malware detection or handling, e.g. anti-virus arrangements
- G06F21/567—Computer malware detection or handling, e.g. anti-virus arrangements using dedicated hardware
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L63/00—Network architectures or network communication protocols for network security
- H04L63/14—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
- H04L63/1408—Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
- H04L63/1416—Event detection, e.g. attack signature detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/03—Indexing scheme relating to G06F21/50, monitoring users, programs or devices to maintain the integrity of platforms
- G06F2221/033—Test or assess software
Definitions
- malware: Malicious computer code
- Malicious code may generally be considered as software that is designed to infiltrate a computing device without the informed consent of the device's owner or administrator.
- Malware in particular is a general term used by computer professionals to mean a variety of forms of hostile, intrusive, annoying, and/or unwanted software or program code.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Security & Cryptography (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Software Systems (AREA)
- Virology (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Data Exchanges In Wide-Area Networks (AREA)
Abstract
Description
- This application claims priority to and is a Continuation Application of U.S. patent application Ser. No. 13/259,307, filed on Sep. 23, 2011, entitled “MALICIOUS CODE DETECTION,” the disclosure of which is hereby incorporated by reference in its entirety.
- With the advent of networking technologies and the Internet, computing devices worldwide have been able to intercommunicate with one another. While this has provided numerous benefits, there have been some problems. One problem is that malicious computer code, such as computer viruses, Trojans, worms, and even spam, among other types of malicious computer code, can more easily and undesirably spread over a large number of computing devices. Malicious computer code can also be referred to in shortened form as malicious code or malware. Malicious code may generally be considered as software that is designed to infiltrate a computing device without the informed consent of the device's owner or administrator. Malware in particular is a general term used by computer professionals to mean a variety of forms of hostile, intrusive, annoying, and/or unwanted software or program code.
- FIG. 1 is a diagram of a device in which malicious code detection is performed, according to an embodiment of the present disclosure.
- FIG. 2 is a diagram of the processing pipeline of the device of FIG. 1 in detail, according to an embodiment of the present disclosure.
- FIG. 3 is a diagram of a malicious code detector of the device of FIG. 1 in detail, according to an embodiment of the present disclosure.
- FIG. 4 is a flowchart of a method for malicious code detection, according to an embodiment of the present disclosure.
- As noted in the background section, the spread of malicious code has become problematic with the increasing interconnectedness of computing devices over the Internet and other networks. As a result, users and network administrators are often concerned with detecting such malicious code before the code can be installed on or otherwise infect a given computing device or computing devices. For large organizations, it can be beneficial to attempt malicious code detection at the points of entry of their networks to the outside world (e.g., the Internet), in addition to or in lieu of trying to detect malicious code individually at each computing device within the organizations.
- Existing techniques to detect malicious code suffer from some disadvantages, however. In one prior art approach, an existing processor of a networking device such as a switch or a router, a computing device like a general purpose computer, or another type of device, is also burdened with having to detect malicious code. As such, the other tasks of the processor may be completed more slowly. Overall performance of such a device, in other words, is lowered.
- In another prior art approach, a dedicated processor is added to a network device, a computing device, or another type of device for the primary if not sole purpose of detecting malicious code. However, adding dedicated processors is expensive, and typically results in higher energy consumption by the devices. Furthermore, adding a dedicated processor may still not alleviate the concern of reducing the performance of a device in completing other tasks.
- For example, incoming data to a device may be temporarily stored within a queue. Before this data can be processed by the device in accordance with its existing tasks, the data is first scanned for malicious code. The data cannot be processed in accordance with the existing tasks of the device until this malicious code detection has occurred. Therefore, even if a dedicated processor is employed to perform malicious code detection, overall performance of the device may suffer.
- FIG. 1 shows a representative device 100, according to an embodiment of the disclosure, which overcomes these shortcomings. The device 100 may be a networking device, such as a switch, router, or other type of networking device. The device 100 may alternatively or additionally be a computing device, like a general purpose computer such as a server computing device, a client computing device, a desktop computer, and/or a laptop computer, among other types of computing devices.
- The device 100 includes a processing pipeline 102 and a malicious code detector 104. Both the pipeline 102 and the detector 104 are implemented at least in hardware. In one embodiment, the pipeline 102 and the detector 104 are both implemented solely in hardware, such as by using appropriate application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), and other types of hardware-only components. In another embodiment, the pipeline 102 and the detector 104 may be implemented at least in hardware in so far as they are also software that is executed by a processor (which is hardware) to perform their respective functionalities.
- To process data within the device 100, the data is moved through the pipeline 102, as indicated by the arrow 106. This processing is unrelated to the detection of malicious code. That is, the purpose of moving the data through the pipeline 102 to perform processing on the data is unrelated to the detection of malicious code. The processing is performed on the data as it is moved through the pipeline 102 in that the data is altered by a processing agent executed within the pipeline 102, which may control the rate at which the data moves through the pipeline 102.
- For example, where the device 100 is a networking device like a switch or router, the data may be incoming data packets received from outside a network to which the device 100 is a gatekeeper. The pipeline 102 may be used to modify the header information of these data packets so that the packets are transmitted to the proper devices within the network. For instance, data packets relating to world-wide web (WWW) requests may be transmitted to a WWW server device on the network, data packets relating to file transfer protocol (FTP) requests may be transmitted to an FTP server device on the network, and so on.
- External devices on the network can thus view the network as having a single networking address, whereas in actuality the network is made up of a number of devices having corresponding (internal) network addresses. The pipeline 102 is therefore used in this example to alter the networking addresses of incoming data packets to the internal network addresses of the devices on the network that are to handle the data packets. The modification of the networking addresses of incoming data packets to the internal network addresses is one type of processing that can be performed on these data packets within the pipeline 102.
- However, in parallel with the processing of the data as the data is moved through the pipeline 102, the detector 104 detects any malicious code within the data, as indicated by the dotted line 108. The detector 104 is able to detect malicious code within the data as the data is moved through the pipeline 102, without delaying the movement of the data into, through, and out of the pipeline 102. The data processing that is performed in the pipeline 102 is independent of the malicious code detection performed by the detector 104. Data enters, moves through, and exits the pipeline 102 without waiting for the detector 104 to perform its detection.
- In this respect, the embodiment of FIG. 1 is able to detect malicious code without reducing the overall performance of a device like the device 100. Furthermore, the embodiment of FIG. 1 does not require potentially expensive and power-hungry dedicated processors for malicious code detection. Rather, the detector 104 can be implemented in hardware via much lower cost hardware components that consume much less power, as compared to dedicated processors.
- An additional benefit of the embodiment of FIG. 1 is that in at least some situations, all data that enters the device 100 is moved through the pipeline 102 for processing, such that the detector 104 detects malicious code within all this data. In many types of conventional techniques, by comparison, data is spot checked (i.e., randomly or selectively sampled) for the presence of malicious code. While such data sampling can be sufficiently sophisticated to more than likely catch all malicious code present within the data, it can never guarantee that all malicious code will be detected, since not all the data entering the device 100 is examined.
- FIG. 2 shows the processing pipeline 102 in more detail, according to an embodiment of the disclosure. The pipeline 102 includes a number of rows 202A, 202B, 202C, . . ., 202N, collectively referred to as the rows 202, which may also be referred to as the stages of the pipeline 102. The row 202A is the first row of the pipeline 102, and the row 202N is the last row of the pipeline 102. Each row 202 of the pipeline 102 stores the same number of bytes. For exemplary purposes, each row 202 stores eight bytes. However, each row 202 may store a different number of bytes, such as sixteen bytes, thirty-two bytes, and so on.
- A number of bytes of the data equal to the number of bytes that each row 202 can store enters the pipeline 102 at the first row 202A, and proceeds through the pipeline 102 on a row-by-row basis until the data exits the last row 202N, as indicated by the arrow 106. For example, the first eight bytes of data enter the pipeline 102 at the first row 202A. These first eight bytes of data then cascade down to the second row 202B, and at the same time the second eight bytes of data enter the pipeline 102 at the first row 202A. Next, the first eight bytes of data move down to the third row 202C, the second eight bytes move down to the second row 202B, and the third eight bytes of data enter the pipeline at the first row 202A. This process continues, until the first eight bytes of data enter and then exit the last row 202N of the pipeline 102, followed by the second eight bytes entering and then exiting the last row 202N, and so on.
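- The row-by-row movement just described can be pictured with a short software model. The following is a minimal sketch for illustration only, not part of the patent disclosure: the class name, the four-row depth, and the zero-filled initial rows are assumptions, while the eight-byte row width follows the FIG. 2 example.

```python
from collections import deque

ROW_WIDTH = 8   # bytes per row, matching the FIG. 2 example
NUM_ROWS = 4    # stand-in for rows 202A..202N

class PipelineModel:
    """Toy model of the processing pipeline 102: data advances one row per step."""

    def __init__(self, num_rows=NUM_ROWS, row_width=ROW_WIDTH):
        self.row_width = row_width
        # Each row holds exactly row_width bytes; start empty (zero-filled).
        self.rows = deque([bytes(row_width)] * num_rows, maxlen=num_rows)

    def step(self, incoming: bytes) -> bytes:
        """Shift every row down by one position and load `incoming` into the first row.

        Returns the bytes that fall out of the last row, mimicking data exiting
        the pipeline. Processing on the rows in between (e.g., header rewriting)
        is omitted here.
        """
        assert len(incoming) == self.row_width
        exiting = self.rows[-1]
        self.rows.pop()                 # last row exits the pipeline
        self.rows.appendleft(incoming)  # new bytes enter at the first row
        return exiting

# Example: push 32 bytes through the model in 8-byte chunks.
if __name__ == "__main__":
    pipe = PipelineModel()
    data = bytes(range(32))
    for i in range(0, len(data), ROW_WIDTH):
        out = pipe.step(data[i:i + ROW_WIDTH])
        print(list(pipe.rows)[0].hex(), "->", out.hex())
```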
- At any given row 202 of the pipeline 102, the data may be altered, or processed. For example, as noted above, the header information of a data packet may be altered where the processing pipeline 102 is part of a gatekeeper networking device 100. For instance, the networking address A.B.C.D may be replaced with the networking address E.F.G.H. The networking address A.B.C.D specifies the external networking address of the network as a whole of which the device 100 is a part. The networking address E.F.G.H specifies the internal networking address of the device within the network that is to handle the data packet in question.
- Two particular rows 204 and 206 are called out in FIG. 2, in relation to which the detection of malicious code by the detector 104 will be described. The row 204 includes bytes 208A, 208B, 208C, 208D, 208E, 208F, 208G, and 208H, starting with the byte 208A and ending with the byte 208H. The row 206 includes bytes 208I, 208J, 208K, 208L, 208M, 208N, 208O, and 208P, starting with the byte 208I and ending with the byte 208P. In a rudimentary example, a data packet 210 is said to be made up of twelve bytes 208C-208N, which is indicated in FIG. 2 by shading. It is noted that, in actuality, a data packet is more likely to be made up of a larger number of bytes in at least some situations.
- The explicit calling out of the rows 204 and 206 and of the data packet 210 in FIG. 2 illustrates two aspects of data packets vis-à-vis the rows 202 of the pipeline 102. First, a data packet can span more than one row. The exemplary data packet 210, for instance, spans the rows 204 and 206. Second, a data packet does not have to start at the first byte of a row, nor end at the last byte of a row. The exemplary data packet 210, for instance, starts at the third byte 208C of the row 204, and ends at the sixth byte 208N of the row 206. The second byte 208B of the row 204 may be the ending byte of the previous data packet, and the seventh byte 208O of the row 206 may be the starting byte of the next data packet.
- FIG. 3 shows the malicious code detector 104 in more detail, according to an embodiment of the disclosure. Furthermore, how the detector 104 can representatively detect malicious code in the data packet 210 spanning the rows 204 and 206 of the processing pipeline 102 is described in relation to FIG. 2. The detector 104 includes a storage 302 and correlators 304A, 304B, 304C, 304D, 304E, 304F, 304G, and 304H, which are collectively referred to as the correlators 304.
- The storage 302 stores a signature 306 having a length of bytes 308A, 308B, 308C, 308D, 308E, 308F, 308G, 308H, and 308I, which are collectively referred to as the bytes 308. The number of bytes 308 of the signature 306 is independent of the number of bytes in each row 202 of the pipeline 102. There are nine bytes 308 in the example of FIG. 3, but in actuality there can be more or fewer such bytes 308. The signature 306 corresponds to one or more malicious code portions. A malicious code portion is a portion of malicious code that is sufficient to identify this malicious code with a predetermined degree of confidence. For example, where the signature 306 corresponds to one malicious code portion having the bytes 308, if all the bytes 308 are found within the data in the same sequence and with the same values, then this means that the data contains the malicious code having this malicious code portion with the predetermined degree of confidence.
- The correlators 304 are equal in number to the number of bytes in each row 202 of the pipeline 102. Therefore, in the example of FIG. 3, there are eight correlators 304, because there are eight bytes in each row 202 of the pipeline 102. The correlators 304 each detect whether the malicious code portions of the signature 306 are present within the data beginning in the row 204, but at different starting byte positions within the row 204. That is, the correlators 304A through 304H have starting byte positions corresponding to the positions of the bytes 208A through 208H within the row, and thus have unique offsets of zero through seven, respectively.
- For example, the correlator 304A has an offset of zero and thus a starting byte position corresponding to the byte 208A of the row 204. Therefore, the correlator 304A detects whether the bytes 308A through 308I of the signature 306 match the bytes 208A through 208I of the rows 204 and 206, where the bytes 208A through 208H are in the row 204 and the byte 208I is in the row 206. That is, the correlator 304A detects whether the byte 308A matches the byte 208A, whether the byte 308B matches the byte 208B, and so on, through whether the byte 308I matches the byte 208I.
- By comparison, the correlator 304B has an offset of one and thus a starting byte position corresponding to the byte 208B of the row 204. Therefore, the correlator 304B detects whether the bytes 308A through 308I of the signature 306 match the bytes 208B through 208J. That is, the correlator 304B detects whether the byte 308A matches the byte 208B, whether the byte 308B matches the byte 208C, and so on, through whether the byte 308I matches the byte 208J. As such, whereas the correlator 304A determines whether the bytes 308 of the signature 306 are present within the rows 204 and 206 starting at the byte 208A, the correlator 304B determines whether the bytes 308 are present within the rows 204 and 206 starting at the byte 208B. Similarly, the correlator 304C has an offset of two and so determines whether the bytes 308 are present starting at the byte 208C, the correlator 304D has an offset of three and determines whether the bytes 308 are present starting at the byte 208D, and so on.
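- The eight correlators 304 can be pictured as eight byte-wise comparisons of the same signature against the concatenated contents of the rows 204 and 206, each shifted by one starting byte position. The sketch below is a software illustration only, assuming a pure byte-equality comparison; the function name and example byte values are invented here and are not taken from the disclosure.

```python
def correlate(rows: bytes, signature: bytes, row_width: int = 8):
    """Return, for each starting byte position in the first row, how many
    signature bytes match the data at that offset.

    `rows` is the concatenation of the row under inspection and enough
    following rows to cover the signature length (e.g., rows 204 and 206).
    There is one result per byte position in the first row, mirroring the
    one-correlator-per-byte arrangement of FIG. 3.
    """
    results = []
    for offset in range(row_width):          # correlators 304A..304H
        window = rows[offset:offset + len(signature)]
        matched = sum(1 for s, d in zip(signature, window) if s == d)
        results.append(matched)
    return results

# Example with an 8-byte row pair and a 9-byte signature, as in FIG. 3.
row_204 = bytes.fromhex("0011aabbccddeeff")
row_206 = bytes.fromhex("0102030405060708")
signature = bytes.fromhex("aabbccddeeff010203")

counts = correlate(row_204 + row_206, signature)
print(counts)                       # the correlator with offset 2 matches all nine bytes
print(counts.index(max(counts)))    # -> 2, i.e., the sequence starts at byte 208C
```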
- Because the number of the correlators 304 is equal to the number of bytes in each row 202 of the pipeline 102, the correlators 304 can detect whether the bytes 308 of the signature 306 are present in a corresponding sequence of bytes in the rows 204 and 206 (i.e., in the same order and with the same values), regardless of where the sequence starts within the row 204. If the sequence of the bytes 308 starts at the byte 208A in the row 204, then the correlator 304A detects the signature 306, and if the sequence of the bytes 308 starts at the byte 208B in the row 204, then the correlator 304B detects the signature 306. Similarly, if the sequence of the bytes 308 starts at a given byte 208C through 208H in the row 204, then the one of the correlators 304C through 304H having a starting byte position corresponding to this byte in the row 204 detects the signature 306.
- In the example of FIG. 3, each correlator 304 detects whether the malicious code portions of the signature 306 are present within the data as that data spans both the rows 204 and 206 of the pipeline 102. However, in general, depending on the number of bytes 308 within the signature 306 and the number of bytes within each row 202 of the pipeline 102, there may be no correlator that spans more than one row 202 of the pipeline 102. Alternatively, there may be one or more correlators that span two rows 202, or more than two rows 202, of the pipeline 102.
- Each correlator 304 provides the number of bytes of the data that it has matched to the signature 306, and the detector 104 indicates that malicious code has been detected within the data based on this number of bytes of the data that have been matched to the signature 306. For example, in one embodiment, only if a given correlator 304 matches all the bytes of the signature 306 to corresponding bytes of the data does the detector 104 indicate that malicious code has been detected. As other examples, and in other embodiments, a programmable threshold number of bytes, or a threshold percentage of the number of bytes matched in relation to the total number of bytes 308 within the signature 306, may be employed to decide whether to indicate that malicious code has been detected.
- As noted above, the signature 306 may correspond to one or more malicious code portions. Each code portion may correspond to a different type of malicious code, however. For example, the bytes 308A through 308D may correspond to a first type of malicious code, and the bytes 308E through 308I may correspond to a second type of malicious code. The correlators 304 can thus simultaneously detect whether either or both types of malicious code are present within the data. For example, the correlator 304E may detect that the bytes 308A through 308D of the signature 306 match the bytes 208E through 208H of the data, but that the bytes 308E through 308I do not match the bytes 208I through 208M. In such an instance, the detector 104 concludes that the first type of malicious code having the malicious code portion of the bytes 308A through 308D is present within the data, but that the second type of malicious code having the malicious code portion of the bytes 308E through 308I is not.
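- A signature that carries two malicious code portions can be checked portion by portion at a single correlator's offset, with either the exact-match rule or one of the threshold rules described above deciding each verdict. The sketch below is illustrative only; the portion boundaries, labels, and threshold parameter are assumptions layered on the FIG. 3 example.

```python
def check_portions(window: bytes, signature: bytes, portions, threshold=1.0):
    """Check each malicious code portion of `signature` against `window`,
    the data bytes seen by one correlator (same length as the signature).

    `portions` maps a label to a (start, end) slice of the signature, e.g.
    the bytes 308A-308D and 308E-308I of FIG. 3. A portion is reported as
    detected when the fraction of matching bytes reaches `threshold`
    (1.0 reproduces the exact-match rule; lower values act as a
    programmable threshold).
    """
    verdicts = {}
    for label, (start, end) in portions.items():
        sig = signature[start:end]
        data = window[start:end]
        matched = sum(1 for s, d in zip(sig, data) if s == d)
        verdicts[label] = matched / len(sig) >= threshold
    return verdicts

signature = bytes.fromhex("aabbccdd" + "1122334455")   # portion 1 + portion 2
window    = bytes.fromhex("aabbccdd" + "99aabbccdd")   # only portion 1 present
portions  = {"type-1": (0, 4), "type-2": (4, 9)}

print(check_portions(window, signature, portions))
# -> {'type-1': True, 'type-2': False}, mirroring the correlator 304E example
```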
- Different malicious code portions may also be stored in the storage 302 as the signature 306 at different times. As such, the correlators 304 can detect different malicious code portions within the data moving through the pipeline 102 at different times. This may be useful where there is a large number of malicious code portions against which to test the data, and where, if one packet of data is infected with a given type of malicious code, it is likely that other packets of data are infected with the same type of malicious code. However, if it is known a priori that a given type of malicious code is more dangerous or more prevalent at any given time, the malicious code portion corresponding to this type of malicious code may be retained within a portion of the signature 306, while at the same time other malicious code portions corresponding to other types of malicious code are rotated through other portions of the signature 306.
- Each byte 308 may correspond to one of two different types of bytes. The first type of byte is an actual malicious code byte having a specific value to be detected within the data by the correlators 304 in relation to the other bytes 308. The second type of byte, however, is a do-not-care byte. The identity of the corresponding byte within the data does not matter for a do-not-care byte, and is unimportant for the detection of malicious code within the data.
- For example, it may be known that a given type of malicious code has a malicious code section including a byte of particular value A, followed two bytes down by a byte of particular value B. The byte in between the bytes having the values A and B is of no consequence, however, and is unimportant to the detection of this type of malicious code. Therefore, the corresponding byte 308 of the signature 306 is set as a do-not-care byte, to indicate to the correlators 304 that this byte is not actually used to detect the malicious code within the data. A do-not-care byte, in other words, is a placeholder byte to specify the separation of other bytes that are important in detecting malicious code within the data.
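- Do-not-care bytes can be modeled in software as wildcard positions alongside the literal signature bytes. The sketch below is illustrative only; encoding a do-not-care position as None is an assumption of this model, not the hardware representation.

```python
# Signature with a do-not-care byte: value A, one ignored byte, then value B.
A, B = 0x4D, 0x5A
signature = [A, None, B]        # None marks a do-not-care (placeholder) byte

def matches(window, signature):
    """Byte-wise comparison that skips do-not-care positions."""
    return all(s is None or s == d for s, d in zip(signature, window))

print(matches(bytes([0x4D, 0x00, 0x5A]), signature))  # True, middle byte ignored
print(matches(bytes([0x4D, 0xFF, 0x5A]), signature))  # True, middle byte ignored
print(matches(bytes([0x4D, 0x00, 0x00]), signature))  # False, B not found
```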
- In one embodiment, the bytes 308 for each malicious code portion within the signature 306 may have to be present within the same data packet of the data in the processing pipeline 102. In the example of FIG. 3, if the bytes 308A through 308D correspond to a given malicious code portion, then just the correlators 304C through 304H are able to detect this malicious code portion in such a situation, and not the correlators 304A and 304B. This is because the data bytes that the correlators 304A and 304B compare against the four bytes 308A through 308D span two data packets, and not just the data packet 210. By comparison, the correlators 304C through 304H span just the data packet 210.
- The particular row 202 of the pipeline 102 in relation to which the correlators 304 detect malicious code (such as the row 204 in the example of FIG. 3) is relatively unimportant, with at least one caveat. This caveat is that the row 202 at which the correlators 304 perform their detection has to be sufficiently far down the pipeline 102 so that there is a corresponding number of bytes within that row and within any preceding rows equal to or greater than the number of bytes 308 within the signature 306. For example, the first row 202A could not have been selected in the example of FIG. 3 instead of the row 204, because there are nine bytes 308 within the signature 306, whereas there are just eight bytes in the first row 202A, and there is no row preceding the first row 202A.
- The comparison performed by the correlators 304 is relatively fast, because each correlator 304 just has to compare the data beginning at a corresponding starting byte position within the row 204 to the bytes 308 of the signature 306. For example, one implementation of the correlators 304 may be a number of comparators equal to the total number of bits (as opposed to bytes) of the signature 306. Therefore, although the detector 104 does not have the ability to delay movement of data through the pipeline 102 down the rows 202, this is not a hindrance to implementation, because the comparisons can be made quickly. Comparators are also relatively inexpensive hardware components to implement, particularly as compared to dedicated processors.
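- As a rough sense of scale for the comparator-based implementation mentioned above, and assuming one single-bit comparator per signature bit per correlator (an interpretation, not a figure stated in the disclosure), the FIG. 3 example works out as follows.

```python
# Back-of-the-envelope comparator count for the FIG. 3 example.
SIGNATURE_BYTES = 9                 # bytes 308A through 308I
ROW_BYTES = 8                       # one correlator per starting byte position
BITS_PER_BYTE = 8

bits_per_correlator = SIGNATURE_BYTES * BITS_PER_BYTE      # 72 bit comparisons
total_bit_comparators = bits_per_correlator * ROW_BYTES    # 576 across all eight
print(bits_per_correlator, total_bit_comparators)
```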
- In conclusion, FIG. 4 shows a method 400 for performing malicious code detection consistent with the description provided in relation to FIGS. 1-3, according to an embodiment of the disclosure. Data is moved through the processing pipeline 102 to perform processing of the data (402), where such processing is unrelated to the detection of malicious code. However, the malicious code detector 104, which is a hardware component of the device 100, detects malicious code within the data as the data is moving through the pipeline 102 (404). As noted above, such detection is performed in parallel with the processing of the data, and does not delay the movement of the data into, through, and out of the pipeline 102.
- Malicious code detection is performed as follows. The method 400 simultaneously compares the signature 306, which corresponds to one or more malicious code portions, to each of a number of overlapping portions of the data beginning with a byte of the data at a starting byte position within a given row 202 of the pipeline 102 (406). In the example of FIGS. 2 and 3, for instance, the given row 202 is the row 204, and the starting byte positions correspond to the bytes 208A through 208H of the row 204. Because there are nine bytes within the signature 306 in this example, the overlapping portions of the data are the bytes 208A through 208I of the data, the bytes 208B through 208J, the bytes 208C through 208K, and so on, where the last overlapping portion of the data includes the bytes 208H through 208P.
- Next, the method 400 indicates that malicious code has been detected within the data, based on the number of bytes of each overlapping portion of the data that have been matched to the signature (408). For instance, in a rudimentary example, if the signature 306 perfectly matches any of the overlapping portions of the data within the pipeline 102 beginning at a starting byte position within the row 204 (viz., all the bytes 308 match corresponding bytes within the data in the same sequence as the bytes 308), then the method 400 indicates that malicious code has been detected. However, if the signature 306 does not perfectly match any of the overlapping portions of the data within the pipeline 102 beginning at a starting byte position within the row 204, then in this example the method 400 does not indicate that malicious code has been detected.
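- Putting blocks 402 through 408 together, the overlapping-window comparison can be exercised end to end in a few lines of software. This sketch is illustrative only: the function name, row walk, and example payload are assumptions, and data always advances regardless of what the comparison finds, mirroring the point that detection does not delay the pipeline.

```python
def method_400(stream: bytes, signature: bytes, row_width: int = 8):
    """Software illustration of blocks 402-408: as each row's worth of data
    arrives, compare the signature against every overlapping window that
    starts at a byte position of the row under inspection.

    Data always advances (block 402); the comparison (blocks 406-408) is a
    side effect that never blocks the advance. Names and structure are
    illustrative, not taken from the claims.
    """
    detections = []
    # Walk the stream one row at a time, mimicking rows arriving in sequence.
    for row_start in range(0, len(stream), row_width):
        for offset in range(row_width):                      # starting byte positions
            start = row_start + offset
            window = stream[start:start + len(signature)]
            if len(window) == len(signature) and window == signature:
                detections.append(start)                     # block 408: report a hit
    return detections

payload = bytes(100)                       # 100 zero bytes of benign "traffic"
sig = bytes([0xDE, 0xAD, 0xBE, 0xEF])
infected = payload[:37] + sig + payload[37:]

print(method_400(infected, sig))           # -> [37]: signature found mid-stream
```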
Claims (16)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/695,789 US20150235027A1 (en) | 2009-10-31 | 2015-04-24 | Malicious code detection |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2009/062899 WO2011053324A1 (en) | 2009-10-31 | 2009-10-31 | Malicious code detection |
US201113259307A | 2011-09-23 | 2011-09-23 | |
US14/695,789 US20150235027A1 (en) | 2009-10-31 | 2015-04-24 | Malicious code detection |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2009/062899 Continuation WO2011053324A1 (en) | 2009-10-31 | 2009-10-31 | Malicious code detection |
US13/259,307 Continuation US9032517B2 (en) | 2009-10-31 | 2009-10-31 | Malicious code detection |
Publications (1)
Publication Number | Publication Date |
---|---|
US20150235027A1 true US20150235027A1 (en) | 2015-08-20 |
Family
ID=43922413
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/259,307 Active 2030-07-23 US9032517B2 (en) | 2009-10-31 | 2009-10-31 | Malicious code detection |
US14/695,789 Abandoned US20150235027A1 (en) | 2009-10-31 | 2015-04-24 | Malicious code detection |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/259,307 Active 2030-07-23 US9032517B2 (en) | 2009-10-31 | 2009-10-31 | Malicious code detection |
Country Status (4)
Country | Link |
---|---|
US (2) | US9032517B2 (en) |
EP (1) | EP2494484A4 (en) |
CN (1) | CN102576392B (en) |
WO (1) | WO2011053324A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9342709B2 (en) | 2010-10-27 | 2016-05-17 | Hewlett-Packard Enterprise Development LP | Pattern detection |
KR20140025113A (en) * | 2012-08-21 | 2014-03-04 | 한국전자통신연구원 | High speed decision apparatus and method for objectionable contents |
JP7087773B2 (en) * | 2018-07-24 | 2022-06-21 | コニカミノルタ株式会社 | Image forming device and virus check method |
RU2747464C2 (en) * | 2019-07-17 | 2021-05-05 | Акционерное общество "Лаборатория Касперского" | Method for detecting malicious files based on file fragments |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060026685A1 (en) * | 2003-02-26 | 2006-02-02 | Secure Ware Inc. | Malicious-process-determining method, data processing apparatus and recording medium |
US20080192523A1 (en) * | 2007-02-12 | 2008-08-14 | Maxim Mondaeev | Apparatus and method to detect patterns in data |
US7802303B1 (en) * | 2006-06-30 | 2010-09-21 | Trend Micro Incorporated | Real-time in-line detection of malicious code in data streams |
US20130239213A1 (en) * | 2011-03-08 | 2013-09-12 | Hewlett-Packard Development Company, L.P. | Methods and systems for full pattern matching in hardware |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2501771B2 (en) * | 1993-01-19 | 1996-05-29 | インターナショナル・ビジネス・マシーンズ・コーポレイション | Method and apparatus for obtaining multiple valid signatures of an unwanted software entity |
US5864683A (en) | 1994-10-12 | 1999-01-26 | Secure Computing Corporartion | System for providing secure internetwork by connecting type enforcing secure computers to external network for limiting access to data based on user and process access rights |
IL120632A0 (en) * | 1997-04-08 | 1997-08-14 | Zuta Marc | Multiprocessor system and method |
US7013482B1 (en) * | 2000-07-07 | 2006-03-14 | 802 Systems Llc | Methods for packet filtering including packet invalidation if packet validity determination not timely made |
US7096498B2 (en) * | 2002-03-08 | 2006-08-22 | Cipher Trust, Inc. | Systems and methods for message threat management |
JP3794491B2 (en) * | 2002-08-20 | 2006-07-05 | 日本電気株式会社 | Attack defense system and attack defense method |
US7134143B2 (en) * | 2003-02-04 | 2006-11-07 | Stellenberg Gerald S | Method and apparatus for data packet pattern matching |
WO2004075056A1 (en) * | 2003-02-21 | 2004-09-02 | National Institute Of Advanced Industrial Science And Technology | Virus check device and system |
WO2004077294A1 (en) * | 2003-02-26 | 2004-09-10 | Secure Ware Inc. | Unauthorized processing judgment method, data processing device, computer program, and recording medium |
US7367057B2 (en) | 2003-06-30 | 2008-04-29 | Intel Corporation | Processor based system and method for virus detection |
US7444515B2 (en) * | 2003-08-14 | 2008-10-28 | Washington University | Method and apparatus for detecting predefined signatures in packet payload using Bloom filters |
AU2004303220B2 (en) * | 2003-09-11 | 2008-05-22 | Bae Systems Plc | Real-time network monitoring and security |
CN101401090B (en) * | 2004-04-19 | 2010-08-25 | 加利福尼亚大学董事会 | Deep packet filtering device and deep packet filtering method |
CA2577891A1 (en) * | 2004-08-24 | 2006-03-02 | Washington University | Methods and systems for content detection in a reconfigurable hardware |
CN100461091C (en) | 2004-08-24 | 2009-02-11 | 华盛顿大学 | Methods and systems for content detection in a reconfigurable hardware |
US7602780B2 (en) * | 2004-11-09 | 2009-10-13 | Cisco Technology, Inc. | Scalably detecting and blocking signatures at high speeds |
US20090217369A1 (en) * | 2005-05-04 | 2009-08-27 | Telecom Italia S.P.A. | Method and system for processing packet flows, and computer program product therefor |
US7937756B2 (en) * | 2005-08-19 | 2011-05-03 | Cpacket Networks, Inc. | Apparatus and method for facilitating network security |
US20080034350A1 (en) | 2006-04-05 | 2008-02-07 | Conti Gregory R | System and Method for Checking the Integrity of Computer Program Code |
US20080134333A1 (en) * | 2006-12-04 | 2008-06-05 | Messagelabs Limited | Detecting exploits in electronic objects |
CN101266682B (en) | 2008-04-22 | 2010-09-22 | 浙江大学 | A Hardware Detector for Embedding Hidden Information in Image LSB |
US9032503B2 (en) * | 2008-05-20 | 2015-05-12 | Shakeel Mustafa | Diversity string based pattern matching |
US8281395B2 (en) * | 2009-01-07 | 2012-10-02 | Micron Technology, Inc. | Pattern-recognition processor with matching-data reporting module |
US8214672B2 (en) * | 2009-01-07 | 2012-07-03 | Micron Technology, Inc. | Method and systems for power consumption management of a pattern-recognition processor |
-
2009
- 2009-10-31 WO PCT/US2009/062899 patent/WO2011053324A1/en active Application Filing
- 2009-10-31 CN CN200980162264.4A patent/CN102576392B/en not_active Expired - Fee Related
- 2009-10-31 US US13/259,307 patent/US9032517B2/en active Active
- 2009-10-31 EP EP09850988.8A patent/EP2494484A4/en not_active Withdrawn
-
2015
- 2015-04-24 US US14/695,789 patent/US20150235027A1/en not_active Abandoned
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060026685A1 (en) * | 2003-02-26 | 2006-02-02 | Secure Ware Inc. | Malicious-process-determining method, data processing apparatus and recording medium |
US7802303B1 (en) * | 2006-06-30 | 2010-09-21 | Trend Micro Incorporated | Real-time in-line detection of malicious code in data streams |
US20080192523A1 (en) * | 2007-02-12 | 2008-08-14 | Maxim Mondaeev | Apparatus and method to detect patterns in data |
US20130239213A1 (en) * | 2011-03-08 | 2013-09-12 | Hewlett-Packard Development Company, L.P. | Methods and systems for full pattern matching in hardware |
Also Published As
Publication number | Publication date |
---|---|
EP2494484A4 (en) | 2016-05-18 |
CN102576392A (en) | 2012-07-11 |
US9032517B2 (en) | 2015-05-12 |
EP2494484A1 (en) | 2012-09-05 |
CN102576392B (en) | 2014-12-17 |
WO2011053324A1 (en) | 2011-05-05 |
US20120023578A1 (en) | 2012-01-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9342709B2 (en) | Pattern detection | |
US11930036B2 (en) | Detecting attacks and quarantining malware infected devices | |
Liu et al. | A fast string-matching algorithm for network processor-based intrusion detection system | |
US20100077482A1 (en) | Method and system for scanning electronic data for predetermined data patterns | |
US9288220B2 (en) | Methods and systems for malware detection | |
JP4490994B2 (en) | Packet classification in network security devices | |
US20080060074A1 (en) | Intrusion detection system, intrusion detection method, and communication apparatus using the same | |
US7472418B1 (en) | Detection and blocking of malicious code | |
CN107968791B (en) | Attack message detection method and device | |
US10049210B2 (en) | System and method for detection of omnientrant code segments to identify potential malicious code | |
US20120174221A1 (en) | Apparatus and method for blocking zombie behavior process | |
US20150235027A1 (en) | Malicious code detection | |
Akritidis et al. | Efficient content-based detection of zero-day worms | |
US20100014432A1 (en) | Method for identifying undesirable features among computing nodes | |
Ahmed et al. | A novel sliding window based change detection algorithm for asymmetric traffic | |
CN107612890A (en) | A kind of network monitoring method and system | |
KR100770357B1 (en) | High Performance Intrusion Prevention System and Method Using Signature Hash to Reduce Signature Matching Count | |
US20140075537A1 (en) | Method and apparatus for controlling blocking of service attack by using access control list | |
KR102014741B1 (en) | Matching method of high speed snort rule and yara rule based on fpga | |
US10721148B2 (en) | System and method for botnet identification | |
JP6272258B2 (en) | Optimization device, optimization method, and optimization program | |
US9270686B1 (en) | Zero copy packet buffering using shadow sends | |
US20050147037A1 (en) | Scan detection | |
JP4391455B2 (en) | Unauthorized access detection system and program for DDoS attack | |
US11888893B2 (en) | Characterization of HTTP flood DDoS attacks |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:WARREN, DAVID A.;REEL/FRAME:035672/0867 Effective date: 20091023 |
|
AS | Assignment |
Owner name: HEWLETT PACKARD ENTERPRISE DEVELOPMENT LP, TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P.;REEL/FRAME:037079/0001 Effective date: 20151027 |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |