WO2016053282A1 - String property labels for static analysis - Google Patents
String property labels for static analysis
- Publication number
- WO2016053282A1 (application PCT/US2014/058224)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- string
- label
- result data
- program code
- data
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/57—Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
- G06F21/577—Assessing vulnerabilities and evaluating computer system security
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Prevention of errors by analysis, debugging or testing of software
- G06F11/3604—Analysis of software for verifying properties of programs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2221/00—Indexing scheme relating to security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F2221/03—Indexing scheme relating to G06F21/50, monitoring users, programs or devices to maintain the integrity of platforms
- G06F2221/033—Test or assess software
Definitions
- Static analysis is a technique to study a program by analyzing program code (e.g., source code and/or object code) without executing the program.
- Static analysis is commonly performed by an automated static analysis tool to analyze the program code using a mathematical technique and/or program simulation technique.
- a static analysis tool can simulate code execution paths based on program simulations and/or mathematical functions.
- a static analysis tool can commonly perform functions to identify coding errors and/or mathematically prove properties about the program code.
- static analysis can be used to verify properties of a program and locate a potential vulnerability to a malicious attack.
- Figures 1 and 2 are block diagrams depicting example static analysis systems.
- Figure 3 depicts an example environment in which various static analysis systems can be implemented.
- Figure 4 depicts example modules consistent with example static analysis systems.
- Figures 5 and 6 are flow diagrams depicting example methods for static analysis of program code.
- a benefit of static analysis is being able to find vulnerabilities in program code (i.e., set of executable instructions) without executing the program code.
- because code execution paths are simulated, real execution paths may differ from the simulated paths. This can lead the results of the static analysis tool to include false positives or false negatives.
- Some static analysis vulnerability categories suffer from a relatively high false results rate that impacts the accuracy of the static analysis.
- One technique of static analysis is taint analysis. Taint analysis is a technique that emulates a program execution where data entering through various user-controlled sources is propagated through the application until the data reaches a consumption point or sink as discussed herein. For example, data can be tainted if the data is controlled by a user.
- a static analyzer can perform taint analysis based on a set of static analysis rules.
- a static analysis rule is a data structure that describes a condition and a result based on the condition to produce a model of the dataflow (e.g., propagation flow of data) through program code.
- a static analysis rule can cause a static analysis tool to parse a line of code, identify fields and structure of the line of code, and perform a function (such as adding a taint label) based on structure and/or entries of the fields parsed from the line of code.
- Static analysis rules can be organized into security rules and infrastructure rules. Security rules are static analysis rules related to security based on operation of the program code and known vulnerabilities.
- An infrastructure rule is a static analysis rule associated with how the program code interacts with other program code.
- Static analysis rules can utilize taint labels to make vulnerability determinations at a sink.
- Example taint labels include security taint labels (e.g., a label particular to a security vulnerability), generic taint labels (e.g., a label indicating the source of data), neutral taint labels (e.g., a label associated with how data propagates), and the like.
- An example data structure of a rule can include a rule identification value; a function identifier having a namespace, a class name, a function name, and the like; an input argument, and an output argument.
- the example rule above can, for example, identify that the code line example of "String a = text.getData()" describes that if "text" is tainted, then "a" will also be tainted.
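The rule structure described above can be sketched as a minimal data record together with a taint-propagation step. The field names and the `propagate` helper below are illustrative assumptions, not the patent's actual schema.

```python
from dataclasses import dataclass

# Hypothetical encoding of the rule fields named above (rule identification
# value, function identifier, input argument, output argument).
@dataclass
class StaticAnalysisRule:
    rule_id: str
    namespace: str
    class_name: str
    function_name: str
    input_arg: str    # the argument whose taint is checked
    output_arg: str   # the value that becomes tainted as a result

# A rule modeling "String a = text.getData()": if the receiver is tainted,
# the returned string is tainted too.
rule = StaticAnalysisRule("R1", "example", "String", "getData", "this", "return")

def propagate(tainted: set, rule: StaticAnalysisRule, receiver: str, target: str) -> set:
    """Return the tainted set extended with `target` when the receiver is tainted."""
    if receiver in tainted:
        return tainted | {target}
    return tainted

print(sorted(propagate({"text"}, rule, receiver="text", target="a")))  # ['a', 'text']
```

Taint that does not reach the rule's input argument is left untouched, so the tainted set only grows along modeled dataflow edges.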
- a variable "url" can represent a uniform resource locator ("URL") such as "anexamplesite.com" and the code segment of "Redirect(url + req
- the priority of the vulnerability to untrusted data for these two statements of code should be different even though both utilize untrusted data.
- Various examples described below relate to incorporating a string property label associated with operations that manipulate a string into static analysis to decrease the number of false results provided by a static analyzer.
- By attaching labels to data as the data is modified, a sink can identify whether to report a vulnerability based on the labels attached to the data.
- the URL retrieved from the request can be flagged as untrusted and vulnerable, but the code statement having a static concatenation to the beginning of the URL can be flagged with a lower-rated vulnerability because the redirect is restricted to particular locations which can decrease the vulnerability of the program to cross-site scripting attacks.
- the vulnerability issue of the untrusted data can be given an appropriate priority level in a vulnerability report, or the report can even be provided without the vulnerability issue, depending on the characterization of the code based on the structure and modification of the untrusted data as described by the string property label(s).
- Figures 1 and 2 are block diagrams depicting example static analysis systems 100 and 200.
- the example static analysis system 100 of figure 1 generally includes a data store 102, an operator engine 104, a label engine 106, and a sink engine 108.
- the label engine 106 can label data based on a modification operation to the data identified by the operator engine 104, and the label can be used by the sink engine 108 to identify an analysis message associated with the label.
- the example static analysis system 100 can include a structural analyzer engine 110 to facilitate comparison of the program code to a static analysis rule.
- the operator engine 104 represents any circuitry or combination of circuitry and executable instructions to identify a modification operation on a string during an analysis session based on a structural comparison of program code to a static analysis rule.
- a modification operation can be an operation that manipulates a string.
- Example string operations include concatenations and string replacements, such as a sub-string replacement.
- the modification operation can be identified by an operator, such as a plus sign, or by a known library function.
- the modification operation can operate on untrusted data based on the source of the string.
- the string can represent a variable capable of receiving untrusted data from a user. If the string is vulnerable to malicious code, a modification operation on the string can preserve or decrease the vulnerability based on the category of the modification operation. For example, a redirect using untrusted data can be restricted from full exploitation by concatenating a static string to the beginning (e.g., the left-hand side) of the untrusted data.
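A minimal sketch of why concatenating a static string to the left-hand side of untrusted data restricts a redirect; the host name and helper function are hypothetical, not from the patent.

```python
# Hypothetical trusted prefix; the redirect destination is pinned to this host.
TRUSTED_HOST = "https://anexamplesite.com/"

def redirect_target(user_path: str) -> str:
    # Static string concatenated to the beginning (left-hand side) of the
    # untrusted data: the attacker can only choose a path on TRUSTED_HOST.
    return TRUSTED_HOST + user_path

# Even a full URL supplied by the attacker becomes a path on the trusted host.
print(redirect_target("https://evil.example/phish").startswith(TRUSTED_HOST))  # True
```

Without the static prefix, the attacker-controlled string would be the whole redirect target, which is the higher-priority case discussed below.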
- a static analysis rule can include a condition for applying the result of the rule.
- the condition can be based on a comparison of the structure of program code being analyzed with the static analysis rule.
- the structure of program code can include the characters that identify a particular operation, such as a plus sign used to denote the concatenation operator.
- the structure of the program code can denote fields, such as arguments to the operation, by finding particular characters, such as commas, to delineate separation of values and/or variables.
- the structural comparison can be performed by the structural analyzer engine 110 discussed in more detail below.
- the location of the string in the result data of the modification operation can be identified based on the structural comparison.
- the operator engine 104 can identify a concatenation operation on a string based on a structural comparison of the program code to known concatenation operation forms.
- the operator engine 104 can identify a location of the string in the result data of a concatenation operation based on the structure of program code having the string on the left side or right side of the concatenation operator.
- the label engine 106 represents any circuitry or combination of circuitry and executable instructions to maintain a label with the string based on a static analysis rule.
- the label can be any number, character, string, category, or other value capable of representing a property of a string.
- Example labels regarding the context described herein include a concatenation label, a replacement label, a prefix label, and a suffix label.
- Other examples of labels include security-specific and vulnerability-specific labels and generic labels, such as labels about the origin of the data.
- the label identifies how the string has been modified by the modification operation.
- the label can describe a category of the modification operation (e.g., a concatenation operation or replacement operation) and a location of the modification operation (e.g., whether the string is concatenated to the left side or right side of a variable) based on the untrusted data.
- a category of the modification operation e.g., a concatenation operation or replacement operation
- a location of the modification operation e.g., is the string concatenated to the left side or right side of a vahabie
- any appropriate data of the program code can be associated with a label. Associating a label with a set of data is referred to herein as "flagging" or an appropriate variation thereof.
- result data of a concatenation operation can be flagged with a concatenation label.
- result data can be flagged based on a location of the untrusted string in the result data, such as with one of a first label to denote an untrusted prefix when the location of the string is on a left-hand side of the result data and a second label to denote an untrusted suffix when the location of the string is on a right-hand side of the result data.
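The flagging described above can be sketched as follows; the label names are illustrative stand-ins for the first and second labels, not the patent's actual values.

```python
def label_concatenation(left_tainted: bool, right_tainted: bool) -> set:
    """Flag result data of a concatenation by where the untrusted string lands."""
    labels = {"CONCATENATION"}
    if left_tainted:
        labels.add("UNTRUSTED_PREFIX")  # untrusted string on the left-hand side
    if right_tainted:
        labels.add("UNTRUSTED_SUFFIX")  # untrusted string on the right-hand side
    return labels

# trusted + userInput: the untrusted string is the suffix of the result data
print(sorted(label_concatenation(left_tainted=False, right_tainted=True)))
# ['CONCATENATION', 'UNTRUSTED_SUFFIX']
```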
- the sink engine 108 represents any circuitry or combination of circuitry and executable instructions to identify a string property based on the label and provide an analysis message associated with the string property based on the label.
- the sink engine 108 can identify a concatenated string from the label and cause a static analyzer to provide a message (e.g., report a finding) of a tainted concatenated string.
- the sink engine 108 can identify that the label denotes the string property based on a rule and/or a map that associates the label with the string property.
- the sink engine 108 can provide an appropriate message based on the string property identified based on the label. For example, a message stating the vulnerability issue associated with the string property can be caused to be presented to a user of a static analysis tool.
- the sink engine 108 can determine the set of data is not vulnerable based on the label, even when the static analyzer would otherwise determine the data to be vulnerable without the string property label.
- the sink engine 108 can identify the string was concatenated with a prefix string based on the prefix label and then provide an analysis message that the code utilizing the string is lacking a particular vulnerability due to the prefixed string property.
- an API can be vulnerable or not depending on the implementation of the API (e.g., how the API utilizes the data, such as a string).
- a string property can be associated with a security issue, such as a vulnerability, of a code statement and the sink engine 108 can report an issue associated with a set of data when the set of data is flagged with a label associated with the string property.
- the sink engine 108 can provide a message, such as a report, based on the sink sensitivity to the data modifications modeled by labels affixed to the data, such as the string property label.
- the sensitivity of the sink refers to the ability of the sink to evaluate program code based on any string properties of the data arriving at the sink, the source of the data of the program code, whether the data is validated, and the like. In this manner, the presence of labels that are neutral to security can provide insight on whether to report an issue to which the sink is sensitive, or not report an issue when the sink is not sensitive to the issue.
- the sink engine 108 can identify a priority level of a vulnerability based on the label associated with the string (e.g., affixed to the string) and report the priority level of a set of data.
- a priority level can determine the level of urgency and/or level of importance of an issue in the report. In this manner, the report can organize issues based on the level of priority.
- the string property label can be used to determine a level of priority. For example, a concatenation operation on an untrusted string used for a URL redirect can be given a high level of priority relative to a concatenation operation that has concatenated a sandbox URL to the untrusted string to make code less vulnerable to exploitation such as a redirect from cross-site scripting.
- the priority level can be determined based on a plurality of labels associated with a set of data.
- the plurality of labels can include a neutral label (e.g., a string property label) as well as a generic label or a security label.
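One way a sink might combine a generic taint label with neutral string-property labels into a priority level, as in the URL-redirect example above. The label names and levels below are assumptions for illustration, not the patent's actual scheme.

```python
def priority_level(labels: set) -> str:
    """Map the set of labels affixed to result data to a report priority."""
    if "UNTRUSTED_SOURCE" not in labels:
        return "none"   # no generic taint label: nothing to report
    if "UNTRUSTED_SUFFIX" in labels and "UNTRUSTED_PREFIX" not in labels:
        return "low"    # e.g., a sandbox URL concatenated ahead of the taint
    return "high"       # untrusted data reaches the sink unrestricted

print(priority_level({"UNTRUSTED_SOURCE", "CONCATENATION", "UNTRUSTED_SUFFIX"}))  # low
```

A report generator can then order issues by this level, so the restricted redirect sinks below the fully attacker-controlled one.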
- the structural analyzer engine 110 represents any circuitry or combination of circuitry and executable instructions to translate the program code into an intermediate model.
- An intermediate model can be any appropriate form of representing common constructs (e.g., program language constructs such as branches, functions, declaration, and the like) based on arrangements of data (e.g., the structure of the program code).
- the intermediate model can be a representation of program code based on common syntax constructs of a programming language.
- an intermediate model can comprise meta-data of source code and a data structure to represent language constructs, such as a tree that branches conditions and result bodies of the code for each operation and/or functional characters of the program language.
- the structural analyzer engine 110 can translate program code to identify structure of the program code.
- the program code can be translated by a parser as part of translation to an intermediate model.
- the structural analyzer engine 110 can identify a modification operation construct in the program code and a field associated with the string of the modification operation based on the intermediate model.
- the intermediate model can utilize and/or include a mapping of known operations and fields to recognize the structure of the program code.
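As a stand-in for such an intermediate model, the sketch below uses Python's own `ast` module as the tree representation to recognize a concatenation construct and the position of its operands; the patent does not prescribe a particular parser, so this is an assumption for illustration.

```python
import ast

def find_concatenations(source: str):
    """Yield (left, right) operand names for each `+` between names in `source`."""
    tree = ast.parse(source)  # the tree acts as the intermediate model
    for node in ast.walk(tree):
        if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
            left = getattr(node.left, "id", "<expr>")
            right = getattr(node.right, "id", "<expr>")
            yield left, right

# The left/right positions tell the labeler where the untrusted string lands.
print(list(find_concatenations("redirect = base + user_input")))
# [('base', 'user_input')]
```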
- the program language being used may contain a set of designated characters that represent particular operations, such as "if," "while," "return," and the like.
- the operator engine 104 can receive translated information from the structural analyzer engine 110 to identify operations and variables that can be labeled, such as a concatenation operator or a sub-string replacement function. With the operations identified, the variables of the program code can be compared to a mapping of known operations that should be labeled with neutral taint labels based on a static analysis rule. For example, the program code structure identified by the structural analyzer engine 110 can be compared to the conditions of a security rule, such as a security rule that applies a concatenation label on a
- the data store 102 can contain information utilized by the engines 104, 106, 108, and 110. For example, the data store 102 can store program code, a label, a string, a map, an intermediate model, a static analysis rule, and the like.
- Figure 2 depicts the example system 200, which can comprise a memory resource 220 operatively coupled to a processor resource 222. The processor resource 222 can be operatively coupled to a data store 202.
- the data store 202 can be the same as the data store 102 of figure 1.
- the memory resource 220 can contain a set of instructions that are executable by the processor resource 222.
- the set of instructions are operable to cause the processor resource 222 to perform operations of the system 200 when the set of instructions are executed by the processor resource 222.
- the set of instructions stored on the memory resource 220 can be represented as an operator module 204, a label module 206, a sink module 208, and a structural analyzer module 210.
- the operator module 204, the label module 206, the sink module 208, and the structural analyzer module 210 represent program instructions that when executed function as the operator engine 104, the label engine 106, the sink engine 108, and the structural analyzer engine 110 of figure 1, respectively.
- the processor resource 222 can carry out a set of instructions to execute the modules 204, 206, 208, and 210, and/or any other appropriate operations among and/or associated with the modules of the system 200.
- the processor resource 222 can carry out a set of instructions to perform a comparison of a structure of program code to a security rule via an
- the processor resource 222 can carry out a set of instructions to cause a vulnerability report to include the vulnerability issue of the result data when the result data is flagged with a first label, or cause a vulnerability report to lack the vulnerability issue of the result data when the result data is flagged with a second label and determine the result data is not a vulnerability when the result data includes the second label.
- the determination of the state of the vulnerability can be based on the presence or absence of a neutral taint label indicating a string property.
- while the modules illustrated in figure 2 and discussed in other example implementations perform specific functionalities in the examples discussed herein, these and other functionalities can be accomplished, implemented, or realized at different modules or at combinations of modules.
- two or more modules illustrated and/or discussed as separate can be combined into a module that performs the functionalities discussed in relation to the two modules.
- functionalities performed at one module as discussed in relation to these examples can be performed at a different module or different modules.
- Figure 4 depicts yet another example of how functionality can be organized into modules.
- the processor resource 222 can be any appropriate circuitry capable of processing (e.g., computing) instructions, such as one or multiple processing elements capable of retrieving instructions from the memory resource 220 and executing those instructions.
- the processor resource 222 can be a central processing unit ("CPU") that enables static analysis of program code by fetching, decoding, and executing modules 204, 206, 208, and 210.
- Example processor resources 222 include at least one CPU, a semiconductor-based microprocessor, an application specific integrated circuit ("ASIC"), a field-programmable gate array ("FPGA"), and the like.
- the processor resource 222 can include multiple processing elements that are integrated in a single device or distributed across devices.
- the processor resource 222 can process the instructions serially, concurrently, or in partial concurrence.
- the memory resource 220 and the data store 202 represent a medium to store data utilized and/or produced by the system 200.
- the medium can be any non-transitory medium or combination of non-transitory mediums able to electronically store data, such as modules of the system 200 and/or data used by the system 200.
- the medium can be a storage medium, which is distinct from a transitory transmission medium, such as a signal.
- the medium can be machine-readable, such as computer-readable.
- the medium can be an electronic, magnetic, optical, or other physical storage device that is capable of containing (i.e., storing) executable instructions.
- the memory resource 220 can be said to store program instructions that when executed by the processor resource 222 cause the processor resource 222 to implement functionality of the system 200 of figure 2.
- the memory resource 220 can be integrated in the same device as the processor resource 222 or it can be separate but accessible to that device and the processor resource 222.
- the memory resource 220 can be distributed across devices.
- the memory resource 220 and the data store 202 can represent the same physical medium or separate physical mediums.
- the data of the data store 202 can include representations of data and/or information mentioned herein.
- the engines 104, 106, 108, and 110 of figure 1 and the modules 204, 206, 208, and 210 of figure 2 have been described as circuitry or a combination of circuitry and executable instructions. Such components can be implemented in a number of fashions.
- the executable instructions can be processor-executable instructions, such as program instructions, stored on the memory resource 220, which is a tangible, non-transitory computer-readable storage medium, and the circuitry can be electronic circuitry, such as the processor resource 222, for executing those instructions.
- the instructions residing on the memory resource 220 can comprise any set of instructions to be executed directly (such as machine code) or indirectly (such as a script) by the processor resource 222.
- the executable instructions can be part of an installation package that, when installed, can be executed by the processor resource 222 to perform operations of the system 200, such as the methods described with regard to figures 4-6.
- the memory resource 220 can be a portable medium such as a compact disc, a digital video disc, a flash drive, or memory maintained by a computer device, such as a service device 334 of figure 3, from which the installation package can be downloaded and installed. In another example, the executable instructions can be part of an application or applications already installed.
- the memory resource 220 can be a non-volatile memory resource such as read only memory (“ROM”), a volatile memory resource such as random access memory (“RAM”), a storage device, or a combination thereof.
- Example forms of a memory resource 220 include static RAM ("SRAM"), dynamic RAM ("DRAM"), electrically erasable programmable ROM ("EEPROM"), and the like.
- the memory resource 220 can include integrated memory such as a hard drive ("HD"), a solid state drive ("SSD"), or an optical drive.
- Figure 3 depicts example environments in which various example static analysis systems can be implemented.
- the example environment 390 is shown to include an example system 300 for static analysis of program code.
- the system 300 (described herein with respect to figures 1 and 2) can represent generally any circuitry or combination of circuitry and executable instructions to statically analyze program code.
- the system 300 can include an operator engine 304, a label engine 306, a sink engine 308, and a structural analyzer engine 310 that are the same as the operator engine 104, the label engine 106, the sink engine 108, and the structural analyzer engine 110 of figure 1, respectively, and the associated descriptions are not repeated for brevity.
- the engines 304, 306, 308, and 310 can be integrated into a compute device, such as a service device 334.
- the engines 304, 306, 308, and 310 can be integrated via circuitry or as installed instructions into a memory resource of the compute device.
- the example environment 390 can include compute devices, such as developer devices 332, service devices 334, and user devices 336.
- a first set of instructions, such as program code 340, can be developed and/or modified on a developer device 332.
- an application can be developed and modified on a developer device 332 and stored onto a web server, such as a service device 334.
- the service devices 334 represent generally any compute devices to respond to a network request received from a user device 336, whether virtual or real.
- the service device 334 can operate a combination of circuitry and executable instructions to provide a network packet in response to a request for a page or functionality of an application.
- the service device 334 can host a static analyzer 342 that utilizes a rule source 344 of rules to analyze program code 340.
- the user devices 336 represent generally any compute devices to communicate a network request and receive and/or process the corresponding responses.
- a browser application may be installed on the user device 336 to receive the network packet from the service device 334 and utilize the payload of the packet to display an element of a page via the browser application.
- the compute devices can be located on separate networks 330 or part of the same network 330.
- the example environment 390 can include any appropriate number of networks 330 and any number of the networks 330 can include a cloud compute environment.
- a cloud compute environment may include a virtual shared pool of compute resources.
- networks 330 can be distributed networks comprising virtual computing resources.
- Any appropriate combination of the system 300 and compute devices can be a virtual instance of a resource of a virtual shared pool of resources.
- the engines and/or modules of the system 300 herein can reside and/or execute "on the cloud" (e.g., reside and/or execute on a virtual shared pool of compute resources).
- a link 338 generally represents one or a combination of a cable, wireless connection, fiber optic connection, or remote connections via a telecommunications link, an infrared link, a radio frequency link, or any other connectors of systems that provide electronic communication.
- the link 338 can include, at least in part, an intranet, the Internet, or a combination of both.
- the link 338 can also include intermediate proxies, routers, switches, load balancers, and the like.
- the engines 104, 106, 108, and 110 of figure 1 and/or the modules 204, 206, 208, and 210 of figure 2 can be distributed across devices 332, 334, 336, or a combination thereof.
- the engine and/or modules can complete or assist completion of operations performed in describing another engine and/or module.
- the label engine 306 of figure 3 can request, complete, or perform the methods or operations described with the label engine 106 of figure 1 as well as the operator engine 104, the sink engine 108, and the structural analyzer engine 110 of figure 1.
- although the various engines and modules are shown as separate engines in figures 1 and 2, in other implementations the functionality of multiple engines and/or modules may be implemented in a single engine and/or module or divided among a variety of engines and/or modules.
- the engines of the system 300 can perform example methods described in connection with figures 4-6.
- Figure 4 depicts example modules used to implement example static analysis systems.
- the example modules of figure 4 generally include an operator module 404, a label module 406, and a sink module 408.
- the operator module 404, the label module 406, and the sink module 408 can be the same as the operator module 204, the label module 206, and the sink module 208 of figure 2.
- the example modules of figure 4 can be implemented on an example compute device, such as a service device 334.
- a processor resource executing the operator module 404 can receive an analysis request 458 and cause program code 460 to be analyzed based on the intermediate model 462, such as a tree data structure representing an intermediate model provided by a processor resource executing a structural analyzer module (not shown) that when executed performs the function of the structural analyzer engine 110 of figure 1.
- the operator module 404 can include program instructions, such as an identification module 440 and a comparison module 442, to facilitate identification of a modification operation in the program code 460.
- the identification module 440 represents program instructions that when executed by a processor resource cause the processor resource to receive the operations, fields, and arguments of the program code based on the intermediate model 462.
- the intermediate model 462 represents a model of a code segment of the program code 460 translated to an intermediate form of program language constructs (e.g., operations, fields, arguments, and the like).
- the comparison module 442 represents program instructions that when executed by a processor resource cause the processor resource to compare the identified operations to a static analysis rule condition of the rules 476.
- a processor resource can execute the label module 406 to cause the processor resource to receive the operation information 464 identified by a processor resource executing the operator module 404 and flag the program code 460 with an appropriate label 468 based on the operation information 464.
- the label module 406 can include program instructions, such as a category module 444 and a location module 446.
- the category module 444 represents program instructions that when executed by a processor resource cause the processor resource to identify a category (e.g., type) of a modification operation (such as a concatenation or string replacement), on a string containing untrusted data based on the identified operations of the operation information 464 via the intermediate model 462 and known operations that should be associated with a label based on the modification effect of the operation.
- the location module 446 represents program instructions that when executed by a processor resource cause the processor resource to identify the location of the untrusted data based on the program structure identified via the intermediate model 462 and provided in the operation information 464.
- the location can be the location of the untrusted data in the result string and/or the location of the untrusted data in the input arguments to the modification operation.
- the label module 406 can flag the program code (e.g., maintain a label with the program code) based on the category of the modification operation and the location of untrusted data of a string.
- a processor resource executing the sink module 408 can receive the data at a sink and determine an issue (e.g., a vulnerability) based on rules at the sink and the label 468 associated with the data.
- the sink module 408 can include program instructions, such as a KB module 448, a priority module 450, and a report module 452.
- the KB module 448 represents program instructions that when executed cause the processor resource to identify a string property 466 based on the label 468. For example, the processor resource can retrieve a map of the label to a string property or a combination of labels to a string property 466. For another example, the processor resource executing the KB module 448 can utilize the analysis rules as a knowledge base to identify the string property of the data arriving at the sink with the label 468.
- the priority module 450 represents program instructions that when executed by a processor resource cause the processor resource to determine a priority level 470 of the string property 466.
- the priority level 470 of the string property 466 can be based on the label 468 and/or a plurality of labels associated with the data, as applied to analysis rules at the sink. For example, a first neutral label associated with the data can decrease the priority level 470 compared to a second neutral label.
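The priority determination described above can be sketched as follows. This is an illustrative model only: the label names, the numeric weights, and the idea of summing a base priority with neutral-label adjustments are assumptions for demonstration and are not specified by this disclosure.

```python
# Hypothetical sketch of a sink computing a priority level from the set of
# labels attached to incoming data. Label names and weights are illustrative.
BASE_PRIORITY = {"untrusted_source": 10}   # security/generic labels raise priority
NEUTRAL_ADJUSTMENT = {                     # neutral string-property labels adjust it
    "untrusted_prefix": 0,    # untrusted data leads the result: no reduction
    "untrusted_suffix": -5,   # a static prefix restricts exploitation: lower priority
    "substring_replace": -3,
}

def priority_level(labels):
    """Return a priority level for data arriving at a sink based on its labels."""
    level = sum(BASE_PRIORITY.get(label, 0) for label in labels)
    level += sum(NEUTRAL_ADJUSTMENT.get(label, 0) for label in labels)
    return max(level, 0)
```

With these illustrative weights, a first neutral label (here `untrusted_suffix`) decreases the priority level relative to a second neutral label (`untrusted_prefix`), as in the text.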
- the report module 452 represents program instructions that when executed by a processor resource cause the processor resource to determine whether an issue associated with the string property 466 is to be reported and how the issue is to be reported (if the issue is to be reported) based on the priority level 470.
- a processor resource executing the report module 452 can generate a report which can include an issue having a prefix concatenation label with a high priority level with a distinguishing color, leave out an issue associated with a suffix concatenation label, and place an issue with a sub-string replacement label at the end of the report based on the low priority level of the sub-string replacement property.
- a processor resource executing the sink module 408 can provide a message 474 of the analysis performed by the sink module 408.
- Example messages 474 can include a network communication having a payload of the analysis, a full analysis report of a program code in the form of an electronic file, an email containing the discovered issues, or a communication to produce a web page that displays a dashboard of vulnerability issues related to the program code.
- the message form can be based on the static analyzer used to perform the static analysis.
- Figures 5 and 6 are flow diagrams depicting example methods for static analysis of program code.
- example methods for static analysis of program code can generally comprise identifying a string manipulation operation within program code, flagging result data of the string manipulation operation with a first label, and setting a priority level of the result data.
- a string manipulation operation is identified within program code that operates on a first string based on a structural comparison of the program code.
- a string manipulation operation is a modification operation performed on a string to manipulate the string in some fashion.
- the first string can be untrusted based on the source of the data of the first string, such as requests for data from a user or other manipulated variable.
- the result data of the string manipulation operation can be flagged with a first label based on a classification of the string manipulation operation.
- the classification can describe how the result data is built based on the first string and the first label can describe the string property of the string based on the string manipulation operation.
- the priority level of the result data is set at block 506 based on the first label.
- the priority level of the result data can be determined at a sink during an analysis session performed by a static analyzer that is compatible with taint labels for taint analysis.
- Figure 6 includes blocks similar to blocks of figure 5 and provides additional blocks and details.
- figure 6 depicts additional blocks and details generally regarding executing a static analyzer, identifying a location of untrusted data in the result data, passing the first label to a variable, determining a vulnerability issue, and causing a message associated with the result data to be presented.
- Blocks 604, 608, and 612 are similar to blocks 502, 504, and 506 of figure 5 and, for brevity, their respective descriptions are not repeated.
- a static analyzer is executed to cause a static analysis to be performed on a set of program code.
- a location of untrusted data in the result data of the string manipulation operation is identified.
- the label associated with the result data at block 608 can be based on the location identified at block 606.
- a label to flag the result data can be identified based on one of the string of untrusted data being located on a left side of the result data, the string of untrusted data being located on a right side of the result data, and having a pattern replacement performed on the string of untrusted data.
- the string manipulation operation can be identified as a concatenation operation and, based on the structure of the concatenation operation and a location of the string in the structure, a prefix label can be selected when the string is located at a first end of the result data or a suffix label can be selected when the first string is located at a second end of the result data.
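The prefix/suffix label selection described in the bullet above can be sketched as follows. The function signature and label names are hypothetical stand-ins; a real analyzer would operate on the intermediate model rather than on raw operand names.

```python
def select_concat_label(left_operand, right_operand, untrusted_vars):
    """For a concatenation `result = left_operand + right_operand`, choose a
    string-property label based on where the untrusted string sits in the result."""
    if left_operand in untrusted_vars:
        # Untrusted data at the first (left) end of the result data.
        return "untrusted_prefix"
    if right_operand in untrusted_vars:
        # Untrusted data at the second (right) end of the result data.
        return "untrusted_suffix"
    return None  # no untrusted operand: no string-property label needed
```

For example, concatenating a static `"/"` in front of an untrusted `url` variable would yield the suffix label, while `url + "/path"` would yield the prefix label.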
- the first label associated with the result data is passed to a variable during an assignment operation.
- a first label can be attached to a second string based on an assignment of the first string to a second string.
- the labels associated with input arguments to an operation pass to the result data of the operation and any assignments of the result data to a variable also retain the labels of the result data.
- Resulting assignments can be analyzed based on a plurality of labels (including the neutral taint label associated with the string property) that are passed at each assignment operation and/or subsequent assignment to retain the string property label at each resulting variable of the assignments.
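The label-propagation rule in the bullets above (input labels pass to result data, and assignments retain labels) can be sketched with a small model. The class and function names are illustrative, not part of the disclosure.

```python
class LabeledValue:
    """A value paired with the set of taint/string-property labels attached to it."""
    def __init__(self, value, labels=()):
        self.value = value
        self.labels = set(labels)

def concat(left, right):
    """Model of a concatenation: the result data carries the labels of both
    input arguments, per the propagation rule described in the text."""
    return LabeledValue(left.value + right.value, left.labels | right.labels)

def assign(source):
    """Model of an assignment: the target variable retains the source's labels."""
    return LabeledValue(source.value, set(source.labels))
```

So a string labeled `untrusted_suffix` keeps that label through a concatenation and a subsequent assignment, allowing the sink to see it later.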
- the priority level of the result data can be identified when the result data arrives at a payload-sensitive sink (e.g., a sink programmed to evaluate generic and neutral taint labels such as string property labels).
- the priority level of the result data (and/or variable to which the result data is assigned) can be evaluated and set based on the labels associated with the data based on the sensitivity of the sink to payload.
- a first combination of labels may result in a first priority level and a second combination of labels can result in a second priority level.
- a priority level can be associated with a particular generic taint label or security taint label but based on the neutral taint label being a prefix label (e.g., a label indicating a prefix concatenation of the string to produce the result data), the priority level of the result data can be decreased.
- a vulnerability issue can be determined based on the label.
- the vulnerability issue can be based on the string property associated with the label.
- a message associated with the result data can be caused to be presented based on the priority level.
- the vulnerability issue identified at block 614 can be reported via a message from the static analyzer, and the vulnerability issue can be caused to be presented based on the priority level such as in a particular font size or color.
- a report state (e.g., a state indicating whether the result data should be reported or unreported) can be associated with the priority level.
- the sink can determine the vulnerability issue is to be unreported when the priority level achieves a minimum threshold.
- the determination of whether to report and how to report issues of data when arriving at the sink can be based on a range of thresholds associated with the priority level of an issue.
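The range-of-thresholds reporting decision in the bullet above can be sketched as follows. The threshold values and the three outcome names are assumptions for illustration; the document only requires that report inclusion and presentation vary with priority level.

```python
def report_decision(priority, suppress_below=3, highlight_above=8):
    """Decide whether and how to report an issue based on its priority level.
    Thresholds are illustrative; a real analyzer would make them configurable."""
    if priority < suppress_below:
        return "unreported"    # low-priority issue left out of the report
    if priority > highlight_above:
        return "highlighted"   # e.g., shown in a distinguishing color
    return "listed"            # e.g., placed toward the end of the report
```

For instance, an issue carrying only a priority-lowering neutral label might fall below the suppression threshold and be omitted entirely.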
- a determination of a possible vulnerability (and associated priority level) of the program code can be based on the sensitivity of the sink and a plurality of labels associated with the data at the sink where the plurality of labels associated with the data being evaluated can include string property labels as well as security labels, generic labels, and/or neutral labels.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Computer Security & Cryptography (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Quality & Reliability (AREA)
- Computing Systems (AREA)
- Stored Programmes (AREA)
Abstract
In one implementation, a static analysis system can include an operator engine to identify a modification operation on a string based on a structural comparison of program code to a static analysis rule, a label engine to maintain a label with the string based on the static analysis rule, and a sink engine to identify that the label denotes a string property and provide an analysis message associated with the string property based on the label.
Description
String Property Labels for Static Analysis
BACKGROUND
[0001] Static analysis is a technique to study a program by analyzing program code (e.g., source code and/or object code) without executing the program. Static analysis is commonly performed by an automated static analysis tool to analyze the program code using a mathematical technique and/or program simulation technique. For example, a static analysis tool can simulate code execution paths based on program simulations and/or mathematical functions. A static analysis tool can commonly perform functions to identify coding errors and/or mathematically prove properties about the program code. For example, static analysis can be used to verify properties of a program and locate a potential vulnerability to a malicious attack.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] Figures 1 and 2 are block diagrams depicting example static analysis systems.
[0003] Figure 3 depicts an example environment in which various static analysis systems can be implemented.
[0004] Figure 4 depicts example modules consistent with example static analysis systems.
[0005] Figures 5 and 6 are flow diagrams depicting example methods for static analysis of program code.
DETAILED DESCRIPTION
[0006] In the following description and figures, some example implementations of static analysis systems and/or methods are described. A benefit of static analysis is being able to find vulnerabilities in program code (i.e., set of executable instructions)
without executing the program code. However, because code execution paths are simulated, real execution paths may differ from the simulated paths. This can lead the results of the static analysis tool to include false positives or false negatives. Some static analysis vulnerability categories suffer from a relatively high false results rate that impacts the accuracy of the static analysis. One technique of static analysis is taint analysis. Taint analysis is a technique that emulates a program execution where data entering through various user-controlled sources is propagated through the application until the data reaches a consumption point or sink as discussed herein. For example, data can be tainted if the data is controlled by a user.
[0007] A static analyzer can perform taint analysis based on a set of static analysis rules. A static analysis rule is a data structure that describes a condition and a result based on the condition to produce a model of the dataflow (e.g., propagation flow of data) through program code. For example, a static analysis rule can cause a static analysis tool to parse a line of code, identify fields and structure of the line of code, and perform a function (such as add a taint label) based on structure and/or entries of the fields parsed from the line of code. Static analysis rules can be organized into security rules and infrastructure rules. Security rules are static analysis rules related to security based on operation of the program code and known vulnerabilities. An infrastructure rule is a static analysis rule associated with how the program code interacts with other program code. Static analysis rules can utilize taint labels to make vulnerability determinations at a sink. Example taint labels include security taint labels (e.g., a label particular to security vulnerability), generic taint labels (e.g., a label indicating the source of data), neutral taint labels (e.g., a label associated with how data propagates), and the like.
[0008] An example data structure of a rule can include a rule identification value; a function identifier having a namespace, a class name, a function name, and the like; an input argument; and an output argument. Using that example data structure, the example rule above can, for example, identify that the code line example of "String a = text.getData()" describes that if "text" is tainted, then "a" will also be tainted. For another example, a variable "url" can represent a uniform resource locator ("URL") such as "anexamplesite.com," and the code segments of "Redirect("/" + req.getParameter("url"))" and "Redirect(req.getParameter("url"))" both utilize untrusted data, but the first code statement redirects the user to a different page on the same site rather than the second code statement, which allows the retrieved parameter to include a different site, such as a malicious site. The priority of the vulnerability to untrusted data for these two statements of code should be different even though both utilize untrusted data.
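The contrast between the two redirect statements can be illustrated with a short sketch. Python is used here for brevity, and the request dictionary and function names are hypothetical stand-ins for the servlet-style calls in the original example.

```python
def get_parameter(request, name):
    """Stand-in for a request-parameter lookup; the returned value is
    user-controlled and therefore untrusted."""
    return request[name]

def redirect_target_unrestricted(request):
    # The target is fully attacker-controlled: the parameter may name a
    # different (possibly malicious) site.
    return get_parameter(request, "url")

def redirect_target_same_site(request):
    # Concatenating a static prefix restricts the redirect to paths on the
    # current site, lowering the priority of the untrusted-data issue.
    return "/" + get_parameter(request, "url")
```

A string-property label on the second function's result (untrusted data at the suffix position) is what lets a sink rank these two statements differently.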
[0009] Various examples described below relate to incorporating a string property label associated with operations that manipulate a string into static analysis to decrease the number of false results provided by a static analyzer. By attaching labels to data as the data is modified, a sink can identify whether to report a vulnerability based on the labels attached to the data. In the previous example URL parameter retrieval code statements, the URL retrieved from the request can be flagged as untrusted and vulnerable, but the code statement having a static concatenation to the beginning of the URL can be flagged with a lower-rated vulnerability because the redirect is restricted to particular locations, which can decrease the vulnerability of the program to cross-site scripting attacks. Based on the severity of the vulnerability and the structure of code operations, the vulnerability issue of the untrusted data can be given an appropriate priority level in a vulnerability report, or the report can even be provided without the vulnerability issue, depending on the characterization of the code based on the structure and modification of the untrusted data as described by the string property label(s).
[0010] The terms "include," "have," and variations thereof, as used herein, mean the same as the term "comprise" or appropriate variation thereof. Furthermore, the term "based on," as used herein, means "based at least in part on." Thus, a feature that is described as based on some stimulus can be based only on the stimulus or a
combination of stimuli including the stimulus. Furthermore, the term "maintain" (and variations thereof) as used herein means "to create, delete, add, remove, access, update, associate, attach, affix, and/or modify."
[0011] Figures 1 and 2 are block diagrams depicting example static analysis systems 100 and 200. Referring to figure 1, the example static analysis system 100 of figure 1 generally includes a data store 102, an operator engine 104, a label engine 106, and a sink engine 108. In general, the label engine 106 can label data based on a
modification operation to the data identified by the operator engine 104, and the label can be used by the sink engine 108 to identify an analysis message associated with the label. The example static analysis system 100 can include a structural analyzer engine 110 to facilitate comparison of the program code to a static analysis rule.
[0012] The operator engine 104 represents any circuitry or combination of circuitry and executable instructions to identify a modification operation on a string during an analysis session based on a structural comparison of program code to a static analysis rule. A modification operation can be an operation that manipulates a string. Example string operations include concatenations and string replacements, such as a sub-string replacement. The modification operation can be identified by an operator, such as a plus sign, or a known function, such as a known library function for concatenation.
[0013] The modification operation can operate on untrusted data based on the source of the string. The string can represent a variable capable of receiving untrusted data from a user. If the string is vulnerable to malicious code, a modification operation on the string can preserve or decrease the vulnerability based on the category of the modification operation. For example, a redirect using untrusted data can be restricted from full exploitation by concatenating a static string to the beginning (e.g., the left-hand side) of the untrusted data.
[0014] A static analysis rule can include a condition for applying the result of the rule. The condition can be based on a comparison of the structure of program code being analyzed with the static analysis rule. For example, the structure of program code can include the characters that identify a particular operation, such as a plus sign used to denote the concatenation operator. For another example, the structure of the program code can denote fields, such as arguments to the operation, by finding particular characters, such as commas, to delineate separation of values and/or variables. The structural comparison can be performed by the structural analyzer engine 110 discussed in more detail below.
[0015] The location of the string in the result data of the modification operation (i.e., the data resulting from the modification operation) can be identified based on the structural comparison. For example, the operator engine 104 can identify a
concatenation operation on a string based on a structural comparison of the program code to known concatenation operation forms. For another example, the operator engine 104 can identify a location of the string in the result data of a concatenation operation based on the structure of program code having the string on the left side or right side of the concatenation operator.
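As a loose analogy for the structural identification described above, Python's `ast` module can serve as a stand-in "intermediate model": parsing produces a tree of language constructs that can be walked to find a concatenation operator and the side on which a given variable appears. This is a sketch of the idea only, not the disclosed implementation.

```python
import ast

def find_concat_on(var_name, source):
    """Parse `source` into a tree (a stand-in intermediate model) and report
    whether `var_name` appears on the left or right of a `+` operation."""
    tree = ast.parse(source)
    hits = []
    for node in ast.walk(tree):
        if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
            if isinstance(node.left, ast.Name) and node.left.id == var_name:
                hits.append("left")
            if isinstance(node.right, ast.Name) and node.right.id == var_name:
                hits.append("right")
    return hits
```

For example, `find_concat_on("url", "target = '/' + url")` reports the untrusted variable on the right side, which would drive selection of a suffix-style label.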
[0016] The label engine 106 represents any circuitry or combination of circuitry and executable instructions to maintain a label with the string based on a static analysis rule. The label can be any number, character, string, category, or other value capable of representing a property of a string. Example labels regarding the context described herein include a concatenation label, a replacement label, a prefix label, and a suffix label. Other examples of labels include security-specific and vulnerability-specific labels and generic labels, such as labels about the origin of the data. The label identifies how the string has been modified by the modification operation. For example, the label can describe a category of the modification operation (e.g., a concatenation operation or replacement operation) and a location of the modification operation (e.g., is the string concatenated to the left side or right side of a variable) based on the untrusted data.
[0017] Any appropriate data of the program code can be associated with a label. Associating a label with a set of data is referred to herein as "flagging" or appropriate variation thereof. For example, result data of a concatenation operation can be flagged with a concatenation label. For another example, result data can be flagged based on a location of the untrusted string in the result data, such as with one of a first label to denote an untrusted prefix when the location of the string is on a left-hand side of the result data and a second label to denote an untrusted suffix when the location of the string is on a right-hand side of the result data.
[0018] The sink engine 108 represents any circuitry or combination of circuitry and executable instructions to identify a string property based on the label and provide an analysis message associated with the string property based on the label. For example, the sink engine 108 can identify a concatenated string from the label and cause a static analyzer to provide a message (e.g., report a finding) of a tainted concatenated string. For another example, the sink engine 108 can identify that the label denotes the string property based on a rule and/or a map that associates the label
with the string property. The sink engine 108 can provide an appropriate message based on the string property identified based on the label. For example, a message stating the vulnerability issue associated with the string property can be caused to present to a user of a static analysis tool. The sink engine 108 can determine the set of data is not vulnerable based on the label when the static analyzer would otherwise determine so without the string property label. For example, the sink engine 108 can identify the string was concatenated with a prefix string based on the prefix label and then provide an analysis message that the code utilizing the string is lacking a particular vulnerability due to the prefixed string property. For another example, an API can be vulnerable or not depending on the implementation of the API (e.g., how the API utilizes the data, such as a string). A string property can be associated with a security issue, such as a vulnerability, of a code statement, and the sink engine 108 can report an issue associated with a set of data when the set of data is flagged with a label associated with the string property. The sink engine 108 can provide a message, such as a report, based on the sink sensitivity to the data modifications modeled by labels affixed to the data, such as the string property label. The sensitivity of the sink refers to the ability of the sink to evaluate program code based on any string properties of the data arriving at the sink, the source of the data of the program code, whether the data is validated, and the like. In this manner, the presence of labels that are neutral to security can provide insight on whether to report an issue to which the sink is sensitive or not report an issue when the sink is not sensitive to the issue.
[0019] The sink engine 108 can identify a priority level of a vulnerability based on the label associated with the string (e.g., affixed to the string) and report the priority level of a set of data. A priority level can determine the level of urgency and/or level of importance of an issue in the report. In this manner, the report can organize issues based on the level of priority. The string property label can be used to determine a level of priority. For example, a concatenation operation on an untrusted string used for a URL redirect can be given a high level of priority relative to a concatenation operation that has concatenated a sandbox URL to the untrusted string to make code less vulnerable to exploitation such as a redirect from cross-site scripting. The priority level can be determined based on a plurality of labels associated with a set of data. For
example, the plurality of labels can include a neutral label (e.g., a string property label) as well as a generic label or a security label.
[0020] The structural analyzer engine 110 represents any circuitry or combination of circuitry and executable instructions to translate the program code into an intermediate model. An intermediate model can be any appropriate form of representing common constructs (e.g., program language constructs such as branches, functions, declarations, and the like) based on arrangements of data (e.g., the structure of the program code). For example, the intermediate model can be a representation of program code based on common syntax constructs of a programming language. For another example, an intermediate model can comprise meta-data of source code and a data structure to represent language constructs, such as a tree that branches conditions and result bodies of the code for each operation and/or functional characters of the program language. The structural analyzer engine 110 can translate program code to identify structure of the program code. For example, the program code can be translated by a parser as part of translation to an intermediate model. The structural analyzer engine 110 can identify a modification operation construct in the program code and a field associated with the string of the modification operation based on the intermediate model. The intermediate model can utilize and/or include a mapping of known operations and fields to recognize the structure of the program code. For example, the program language being used may contain a set of designated characters that represent particular operations, such as "if," "while," "return," and the like, represented by a map. In this manner, the operator engine 104 can receive translated information from the structural analyzer engine 110 to identify operations and variables that can be labeled, such as a concatenation operator or a sub-string replacement function. With the operations identified, the variables of the program code can be compared to a mapping of known operations that should be labeled with neutral taint labels based on a static analysis rule. For example, the program code structure identified by the structural analyzer engine 110 can be compared to the conditions of a security rule, such as a security rule that applies a concatenation label on a concatenation operator, and flagged with the label when the condition is satisfied.
[0021] The data store 102 can contain information utilized by the engines 104, 106, 108, and 110. For example, the data store 102 can store program code, a label, a string, a map, an intermediate model, a static analysis rule, and the like.
[0022] Figure 2 depicts the example system 200, which can comprise a memory resource 220 operatively coupled to a processor resource 222. The processor resource 222 can be operatively coupled to a data store 202. The data store 202 can be the same as the data store 102 of figure 1.
[0023] Referring to figure 2, the memory resource 220 can contain a set of instructions that are executable by the processor resource 222. The set of instructions are operable to cause the processor resource 222 to perform operations of the system 200 when the set of instructions are executed by the processor resource 222. The set of instructions stored on the memory resource 220 can be represented as an operator module 204, a label module 206, a sink module 208, and a structural analyzer module 210. The operator module 204, the label module 206, the sink module 208, and the structural analyzer module 210 represent program instructions that when executed function as the operator engine 104, the label engine 106, the sink engine 108, and the structural analyzer engine 110 of figure 1, respectively. The processor resource 222 can carry out a set of instructions to execute the modules 204, 206, 208, 210, and/or any other appropriate operations among and/or associated with the modules of the system 200. For example, the processor resource 222 can carry out a set of instructions to perform a comparison of a structure of program code to a security rule via an
intermediate model, identify a concatenation operation on a string of the program code based on the comparison, identify a location of the string in a result of the concatenation operation based on the structure of the program code, flag the result with a first label to denote one of an untrusted prefix or an untrusted suffix based on whether the location of the string is on the left-hand side of the result data or the right-hand side of the result data, and report an issue associated with the result based on the label flagged with the result. For another example, the processor resource 222 can carry out a set of instructions to cause a vulnerability report to include the vulnerability issue of the result data when the result data is flagged with a first label or cause a vulnerability report to lack the vulnerability issue of the result data when the result data is flagged with a
second label and determine the result data is not a vulnerability when the result data includes the second label. In that example, the determination of the state of the vulnerability can be based on the presence or absence of a neutral taint label indicating a string property.
[0024] Although these particular modules and various other modules are illustrated and discussed in relation to figure 2 and other example implementations, other combinations or subcombinations of modules can be included within other implementations. Said differently, although the modules illustrated in figure 2 and discussed in other example implementations perform specific functionalities in the examples discussed herein, these and other functionalities can be accomplished, implemented, or realized at different modules or at combinations of modules. For example, two or more modules illustrated and/or discussed as separate can be combined into a module that performs the functionalities discussed in relation to the two modules. As another example, functionalities performed at one module as discussed in relation to these examples can be performed at a different module or different modules. Figure 4 depicts yet another example of how functionality can be organized into modules.
[0025] The processor resource 222 can be any appropriate circuitry capable of processing (e.g., computing) instructions, such as one or multiple processing elements capable of retrieving instructions from the memory resource 220 and executing those instructions. For example, the processor resource 222 can be a central processing unit ("CPU") that enables static analysis of program code by fetching, decoding, and executing modules 204, 206, 208, and 210. Example processor resources 222 include at least one CPU, a semiconductor-based microprocessor, an application specific integrated circuit ("ASIC"), a field-programmable gate array ("FPGA"), and the like. The processor resource 222 can include multiple processing elements that are integrated in a single device or distributed across devices. The processor resource 222 can process the instructions serially, concurrently, or in partial concurrence.
[0026] The memory resource 220 and the data store 202 represent a medium to store data utilized and/or produced by the system 200. The medium can be any non- transitory medium or combination of non-transitory mediums able to electronically store
data, such as modules of the system 200 and/or data used by the system 200. For example, the medium can be a storage medium, which is distinct from a transitory transmission medium, such as a signal. The medium can be machine-readable, such as computer-readable. The medium can be an electronic, magnetic, optical, or other physical storage device that is capable of containing (i.e. storing) executable
instructions. The memory resource 220 can be said to store program instructions that when executed by the processor resource 222 cause the processor resource 222 to implement functionality of the system 200 of figure 2. The memory resource 220 can be integrated in the same device as the processor resource 222 or it can be separate but accessible to that device and the processor resource 222. The memory resource 220 can be distributed across devices. The memory resource 220 and the data store 202 can represent the same physical medium or separate physical mediums. The data of the data store 202 can include representations of data and/or information mentioned herein.
[0027] In the discussion herein, the engines 104, 106, 108, and 110 of figure 1 and the modules 204, 206, 208, and 210 of figure 2 have been described as circuitry or a combination of circuitry and executable instructions. Such components can be implemented in a number of fashions. Looking at figure 2, the executable instructions can be processor-executable instructions, such as program instructions, stored on the memory resource 220, which is a tangible, non-transitory computer-readable storage medium, and the circuitry can be electronic circuitry, such as processor resource 222, for executing those instructions. The instructions residing on the memory resource 220 can comprise any set of instructions to be executed directly (such as machine code) or indirectly (such as a script) by the processor resource 222.
[0028] In some examples, the executable instructions can be part of an installation package that when installed can be executed by the processor resource 222 to perform operations of the system 200, such as methods described with regards to figures 4-6. In that example, the memory resource 220 can be a portable medium such as a compact disc, a digital video disc, a flash drive, or memory maintained by a computer device, such as a service device 334 of figure 3, from which the installation package can be downloaded and installed. In another example, the executable instructions can be part of an application or applications already installed. The memory resource 220 can be a non-volatile memory resource such as read only memory ("ROM"), a volatile memory resource such as random access memory ("RAM"), a storage device, or a combination thereof. Example forms of a memory resource 220 include static RAM ("SRAM"), dynamic RAM ("DRAM"), electrically erasable programmable ROM ("EEPROM"), flash memory, or the like. The memory resource 220 can include integrated memory such as a hard drive ("HD"), a solid state drive ("SSD"), or an optical drive.
[0029] Figure 3 depicts example environments in which various example static analysis systems can be implemented. The example environment 390 is shown to include an example system 300 for static analysis of program code. The system 300 (described herein with respect to figures 1 and 2) can represent generally any circuitry or combination of circuitry and executable instructions to statically analyze program code. The system 300 can include an operator engine 304, a label engine 306, a sink engine 308, and a structural analyzer engine 310 that are the same as the operator engine 104, the label engine 106, the sink engine 108, and the structural analyzer engine 110 of figure 1, respectively, and the associated descriptions are not repeated for brevity. As shown in figure 3, the engines 304, 306, 308, and 310 can be integrated into a compute device, such as a service device 334. The engines 304, 306, 308, and 310 can be integrated via circuitry or as installed instructions into a memory resource of the compute device.
[0030] The example environment 390 can include compute devices, such as developer devices 332, service devices 334, and user devices 336. A first set of instructions, such as program code 340, can be developed and/or modified on a developer device 332. For example, an application can be developed and modified on a developer device 332 and stored onto a web server, such as a service device 334. The service devices 334 represent generally any compute devices to respond to a network request received from a user device 336, whether virtual or real. For example, the service device 334 can operate a combination of circuitry and executable instructions to provide a network packet in response to a request for a page or functionality of an application. For another example, the service device 334 can host a static analyzer 342
that utilizes a rule source 344 of rules to analyze program code 340. The user devices 336 represent generally any compute devices to communicate a network request and receive and/or process the corresponding responses. For example, a browser application may be installed on the user device 336 to receive the network packet from the service device 334 and utilize the payload of the packet to display an element of a page via the browser application.
[0031] The compute devices can be located on separate networks 330 or part of the same network 330. The example environment 390 can include any appropriate number of networks 330 and any number of the networks 330 can include a cloud compute environment. A cloud compute environment may include a virtual shared pool of compute resources. For example, networks 330 can be distributed networks comprising virtual computing resources. Any appropriate combination of the system 300 and compute devices can be a virtual instance of a resource of a virtual shared pool of resources. The engines and/or modules of the system 300 herein can reside and/or execute "on the cloud" (e.g. reside and/or execute on a virtual shared pool of resources).
[0032] A link 338 generally represents one or a combination of a cable, wireless connection, fiber optic connection, or remote connections via a telecommunications link, an infrared link, a radio frequency link, or any other connectors of systems that provide electronic communication. The link 338 can include, at least in part, an intranet, the internet, or a combination of both. The link 338 can also include intermediate proxies, routers, switches, load balancers, and the like.
[0033] Referring to figures 1-3, the engines 104, 106, 108, and 110 of figure 1 and/or the modules 204, 206, 208, and 210 of figure 2 can be distributed across devices 332, 334, 336, or a combination thereof. The engines and/or modules can complete or assist completion of operations performed in describing another engine and/or module. For example, the label engine 306 of figure 3 can request, complete, or perform the methods or operations described with the label engine 106 of figure 1 as well as the operator engine 104, the sink engine 108, and the structural analyzer engine 110 of figure 1. Thus, although the various engines and modules are shown as separate engines in figures 1 and 2, in other implementations, the functionality of multiple engines and/or modules may be implemented in a single engine and/or module or divided in a variety of engines and/or modules. In some examples, the engines of the system 300 can perform example methods described in connection with figures 4-6.
[0034] Figure 4 depicts example modules used to implement example static analysis systems. Referring to figure 4, the example modules of figure 4 generally include an operator module 404, a label module 406, and a sink module 408. The operator module 404, the label module 406, and the sink module 408 can be the same as the operator module 204, the label module 206, and the sink module 208 of figure 2. The example modules of figure 4 can be implemented on an example compute device, such as a service device 334.
[0035] A processor resource executing the operator module 404 can receive an analysis request 458 and cause program code 460 to be analyzed based on the intermediate model 462, such as a tree data structure representing an intermediate model provided by a processor resource executing a structural analyzer module (not shown) that when executed performs the function of the structural analyzer engine 110 of figure 1. The operator module 404 can include program instructions, such as an identification module 440 and a comparison module 442, to facilitate identification of a modification operation in the program code 460. The identification module 440 represents program instructions that when executed by a processor resource cause the processor resource to receive the operations, fields, and arguments of the program code based on the intermediate model 462. The intermediate model 462 represents a model of a code segment of the program code 460 translated to an intermediate form of program language constructs (e.g., operations, fields, arguments, and the like). The comparison module 442 represents program instructions that when executed by a processor resource cause the processor resource to compare the identified operations to a static analysis rule condition of the rules 476.
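As a concrete illustration of this identification step, the sketch below walks a parse tree and matches concatenations involving an untrusted operand. It uses Python's `ast` module as a stand-in for the intermediate model 462; the rule condition, the function name, and the variable name treated as untrusted are invented for illustration and are not part of the described system.

```python
import ast

# Hypothetical rule condition: match BinOp additions (string concatenation in
# Python source) whose operands include a variable assumed to carry untrusted data.
UNTRUSTED_SOURCES = {"request_param"}  # assumed name of a tainted variable

def find_concat_operations(source):
    """Walk an intermediate model (here, Python's AST) and collect
    concatenations that involve an untrusted operand."""
    tree = ast.parse(source)
    matches = []
    for node in ast.walk(tree):
        if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Add):
            names = {n.id for n in ast.walk(node) if isinstance(n, ast.Name)}
            if names & UNTRUSTED_SOURCES:
                matches.append((node.lineno, sorted(names & UNTRUSTED_SOURCES)))
    return matches

code = "query = 'SELECT * FROM t WHERE id=' + request_param\nsafe = a + b\n"
print(find_concat_operations(code))  # -> [(1, ['request_param'])]
```

A production analyzer would consult its rules 476 for the set of operations and tainted sources rather than a hard-coded constant, but the comparison of identified operations to a rule condition follows the same shape.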
[0036] A processor resource can execute the label module 406 to cause the processor resource to receive the operation information 464 identified by a processor resource executing the operator module 404 and flag the program code 460 with an appropriate label 468 based on the operation information 464. The label module 406 can include program instructions, such as a category module 444 and a location module 446, to facilitate determination of an appropriate label based on the rules 476 and the operation information 464. The category module 444 represents program instructions that when executed by a processor resource cause the processor resource to identify a category (e.g., type) of a modification operation (such as a concatenation or string replacement) on a string containing untrusted data based on the identified operations of the operation information 464 via the intermediate model 462 and known operations that should be associated with a label based on the modification effect of the operation. Known operations can be provided via the rules 476. The location module 446 represents program instructions that when executed by a processor resource cause the processor resource to identify the location of the untrusted data based on the program structure identified via the intermediate model 462 and provided in the operation information 464. For example, the location can be the location of the untrusted data in the result string and/or the location of the untrusted data in the input arguments to the modification operation. The label module 406 can flag the program code (e.g., maintain a label with the program code) based on the category of the modification operation and the location of untrusted data of a string.
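A minimal sketch of how the category and location determinations can combine into a label follows; the category names, label strings, and position encoding are assumptions made for illustration rather than the labels of any particular analyzer.

```python
def label_for_operation(category, untrusted_position, arg_count):
    """Derive a string-property label from the operation category and the
    location of the untrusted argument among the operation's inputs."""
    if category == "concat":
        if untrusted_position == 0:
            return "UNTRUSTED_PREFIX"   # untrusted data starts the result string
        if untrusted_position == arg_count - 1:
            return "UNTRUSTED_SUFFIX"   # untrusted data ends the result string
        return "UNTRUSTED_INFIX"        # untrusted data lands in the middle
    if category == "replace":
        return "SUBSTRING_REPLACEMENT"  # untrusted data substituted into a host string
    return "GENERIC_TAINT"              # fall back to a generic taint label

# e.g. "SELECT ..." + user_input: the untrusted argument is the last of two
print(label_for_operation("concat", 1, 2))  # -> UNTRUSTED_SUFFIX
```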
[0037] A processor resource executing the sink module 408 can receive the data at a sink and determine an issue (e.g., a vulnerability) based on rules at the sink and the label 468 associated with the data. The sink module 408 can include program
instructions to facilitate the analysis of the data at the sink, such as a knowledge base ("KB") module 448, a priority module 450, and a report module 452. The KB module 448 represents program instructions that when executed cause the processor resource to identify a string property 466 based on the label 468. For example, the processor resource can retrieve a map of the label to a string property or a combination of labels to a string property 466. For another example, the processor resource executing the KB module 448 can utilize the analysis rules as a knowledge base to identify the string property of the data arriving at the sink with the label 468. The priority module 450 represents program instructions that when executed by a processor resource cause the processor resource to determine a priority level 470 of the string property 466. The priority level 470 of the string property 466 can be based on the label 468 and/or a plurality of labels associated with the data, as applied to analysis rules at the sink. For
example, a first neutral label associated with the data can decrease the priority level 470 compared to a second neutral label. The report module 452 represents program instructions that when executed by a processor resource cause the processor resource to determine whether an issue associated with the string property 466 is to be reported and how the issue is to be reported (if the issue is to be reported) based on the priority level 470. For example, a processor resource executing the report module 452 can generate a report that presents an issue having a prefix concatenation label and a high priority level in a distinguishing color, leave out an issue associated with a suffix concatenation label, and place an issue with a sub-string replacement label at the end of the report based on the low priority level of the sub-string replacement property. A processor resource executing the sink module 408 can provide a message 474 of the analysis performed by the sink module 408. Example messages 474 can include a network communication having a payload of the analysis, a full analysis report of a program code in the form of an electronic file, an email containing the discovered issues, or a communication to produce a web page that displays a dashboard of vulnerability issues related to the program code. The message form can be based on the static analyzer used to perform the static analysis.
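The report-shaping behavior described above can be sketched as a small priority policy. The label names, numeric levels, and the decision to drop level-zero issues are illustrative assumptions that mirror the example, not fixed values from the description.

```python
# Invented priority levels echoing the example: prefix concatenation is high
# priority, sub-string replacement is low, suffix concatenation is omitted.
PRIORITY = {"UNTRUSTED_PREFIX": 3, "SUBSTRING_REPLACEMENT": 1, "UNTRUSTED_SUFFIX": 0}

def build_report(issues):
    """Order issues by priority; level-0 issues are left out of the report."""
    kept = [(issue, PRIORITY.get(label, 2)) for issue, label in issues
            if PRIORITY.get(label, 2) > 0]
    # highest priority first; low-priority issues fall to the end of the report
    return [issue for issue, level in sorted(kept, key=lambda t: -t[1])]

issues = [("sqli-1", "SUBSTRING_REPLACEMENT"),
          ("xss-2", "UNTRUSTED_SUFFIX"),
          ("sqli-3", "UNTRUSTED_PREFIX")]
print(build_report(issues))  # -> ['sqli-3', 'sqli-1']
```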
[0038] Figures 5 and 6 are flow diagrams depicting example methods for static analysis of program code. Referring to figure 5, example methods for static analysis of program code can generally comprise identifying a string manipulation operation within program code, flagging result data of the string manipulation operation with a first label, and setting a priority level of the result data.
[0039] At block 502, a string manipulation operation is identified within program code that operates on a first string based on a structural comparison of the program code. A string manipulation operation is a modification operation performed on a string to manipulate the string in some fashion. The first string can be untrusted based on the source of the data of the first string, such as requests for data from a user or other manipulated variable. At block 504, the result data of the string manipulation operation can be flagged with a first label based on a classification of the string manipulation operation. The classification can describe how the result data is built based on the first string and the first label can describe the string property of the string based on the string
manipulation operation. The priority level of the result data is set at block 506 based on the first label. The priority level of the result data can be determined at a sink during an analysis session performed by a static analyzer that is compatible with taint labels for taint analysis.
[0040] Figure 6 includes blocks similar to blocks of figure 5 and provides additional blocks and details. In particular, figure 6 depicts additional blocks and details generally regarding executing a static analyzer, identifying a location of untrusted data in the result data, passing the first label to a variable, determining a vulnerability issue, and causing a message associated with the result data to be presented. Blocks 604, 608, and 612 are similar to blocks 502, 504, and 506 of figure 5 and, for brevity, their respective descriptions are not repeated.
[0041] At block 602, a static analyzer is executed to cause a static analysis to be performed on a set of program code. During the analysis session, the string
manipulation operation is identified and, at block 606, a location of untrusted data in the result data of the string manipulation operation is identified. The label associated with the result data at block 608 can be based on the location identified at block 606. For example, a label to flag the result data can be identified based on one of the string of untrusted data being located on a left side of the result data, the string of untrusted data being located on a right side of the result data, and having a pattern replacement performed on the string of untrusted data. For another example, the string manipulation operation can be identified as a concatenation operation and, based on the structure of the concatenation operation and a location of the string in the structure, a prefix label can be selected when the string is located at a first end of the result data or a suffix label can be selected when the first string is located at a second end of the result data.
[0042] At block 610, the first label associated with the result data is passed to a variable during an assignment operation. For example, a first label can be attached to a second string based on an assignment of the first string to a second string. For another example, the labels associated with input arguments to an operation pass to the result data of the operation and any assignments of the result data to a variable also retain the labels of the result data. Resulting assignments can be analyzed based on a plurality of labels (including the neutral taint label associated with the string property) that are
passed at each assignment operation and/or subsequent assignment to retain the string property label at each resulting variable of the assignments.
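The propagation rule in this block — labels on input arguments flow to the result, and assignments carry the result's labels onward — can be sketched as follows. The dictionary representation and label names are assumptions for illustration.

```python
labels = {}  # variable name -> set of labels attached to its current value

def propagate(target, *inputs, added=()):
    """Union the labels of the input variables (plus any labels newly added
    by the operation itself) and attach them to the assignment target."""
    merged = set(added)
    for var in inputs:
        merged |= labels.get(var, set())
    labels[target] = merged

labels["user_input"] = {"TAINTED"}
# e.g. query = user_input + "...": the concatenation adds a string property label
propagate("query", "user_input", added={"UNTRUSTED_PREFIX"})
# a plain assignment (alias = query) retains the string property label
propagate("alias", "query")
print(labels["alias"])  # both labels survive the second assignment
```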
[0043] At block 612, the priority level of the result data can be identified when the result data arrives at a payload sensitive sink (e.g., a sink programmed to evaluate generic and neutral taint labels such as string property labels). The priority level of the result data (and/or variable to which the result data is assigned) can be evaluated and set based on the labels associated with the data based on the sensitivity of the sink to payload. For example, a first combination of labels may result in a first priority level and a second combination of labels can result in a second priority level. For another example, a priority level can be associated with a particular generic taint label or security taint label but based on the neutral taint label being a prefix label (e.g., a label indicating a prefix concatenation of the string to produce the result data), the priority level of the result data can be decreased.
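One way to read this block as code: the combination of labels arriving at the sink maps to a priority level, with a neutral prefix label lowering the level at this particular sink. The label names, base level, and adjustment amount below are all invented for illustration.

```python
def sink_priority(label_set, base=5):
    """Start from the base priority implied by a generic/security taint label
    and adjust it according to neutral string-property labels (invented rule:
    a prefix-concatenation label lowers the level at this sink)."""
    level = base if "TAINTED" in label_set else 0
    if "UNTRUSTED_PREFIX" in label_set:
        level -= 2   # this sink treats prefix concatenation as less risky
    return max(level, 0)

print(sink_priority({"TAINTED"}))                      # -> 5
print(sink_priority({"TAINTED", "UNTRUSTED_PREFIX"}))  # -> 3
```

A different payload-sensitive sink could apply the opposite adjustment — the point is that the same label combination can yield different priority levels depending on the sink's sensitivity.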
[0044] At block 614, a vulnerability issue can be determined based on the label. The vulnerability issue can be based on the string property associated with the label. At block 616, a message associated with the result data can be caused to be presented based on the priority level. For example, the vulnerability issue identified at block 614 can be reported via a message from the static analyzer, and the vulnerability issue can be caused to be presented based on the priority level, such as in a particular font size or color. A report state (e.g., a state indicating whether the result data should be reported or unreported) can be evaluated at the payload sensitive sink. For example, the sink can determine the vulnerability issue is to be unreported when the priority level achieves a minimum threshold. The determination of whether to report and how to report issues of data when arriving at the sink can be based on a range of thresholds associated with the priority level of an issue. In this manner, a determination of a possible vulnerability (and associated priority level) of the program code can be based on the sensitivity of the sink and a plurality of labels associated with the data at the sink, where the plurality of labels associated with the data being evaluated can include string property labels as well as security labels, generic labels, and/or neutral labels.
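The range-of-thresholds reporting decision can be sketched as below; the band boundaries and state names are assumptions for illustration, not values from the description.

```python
def report_state(priority, floor=2, highlight=5):
    """Map a priority level to a report state using a range of thresholds."""
    if priority < floor:
        return "unreported"      # issue suppressed at the sink
    if priority >= highlight:
        return "highlighted"     # e.g. distinguishing font size or color
    return "reported"

for level in (1, 3, 7):
    print(level, report_state(level))
```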
[0045] Although the flow diagrams of figures 4-6 illustrate specific orders of execution, the order of execution may differ from that which is illustrated. For example, the order of execution of the blocks may be scrambled relative to the order shown. Also, the blocks shown in succession may be executed concurrently or with partial concurrence. All such variations are within the scope of the present description.
[0046] The present description has been shown and described with reference to the foregoing examples. It is understood, however, that other forms, details, and examples may be made without departing from the spirit and scope of the following claims. The words "first," "second," and related terms in the claims are not used to limit the claim elements to an order or location, but are merely used to distinguish separate claim elements.
Claims
What is claimed is: 1. A static analysis system comprising:
an operator engine to identify a modification operation on a string during an analysis session based on a structural comparison of program code to a static analysis rule, the modification operation to operate on untrusted data;
a label engine to maintain a label with the string based on the static analysis rule that describes a category of the modification operation and a location of the modification operation based on the untrusted data, the label to denote a string property; and
a sink engine to identify that the label denotes the string property and provide an analysis message associated with the string property based on the label.
2. The system of claim 1, wherein the modification operation is one of a string
concatenation and a string replacement.
3. The system of claim 1, wherein the structural comparison identifies the location of the string in result data of the modification operation and the label identifies how the string was modified by the modification operation.
4. The system of claim 1, comprising:
a structural analyzer engine to translate the program code into an intermediate model to identify the modification operation in the program code and a field associated with the string, the intermediate model to utilize a mapping of known operations and fields to recognize the structure of the program code.
5. The system of claim 4, wherein the static analysis rule is a security rule and the sink engine is further to identify a priority level of a vulnerability based on the label affixed to the string.
6. A non-transitory computer readable storage medium comprising a set of instructions executable by a processor resource to:
perform a comparison of a structure of program code to a condition of a static analysis rule via an intermediate model, the structure of program code to include a string representing a variable capable of receiving untrusted data from a user;
identify a concatenation operation on the string based on the comparison;
identify a location of the string in a result data of the concatenation operation based on the structure of the program code;
flag the result data of the concatenation operation with, based on the location of the string in the result data, one of:
a first label to denote an untrusted prefix when the location of the string is on a left-hand side of the result data; and
a second label to denote an untrusted suffix when the location of the string is on a right-hand side of the result data; and
report an issue associated with the result data when the result data is flagged with the first label, the issue associated with a string property.
7. The medium of claim 6, wherein the set of instructions is executable by the
processor resource to:
parse the program code to identify the structure of the program code; and translate the structure of the program code to the intermediate model.
8. The medium of claim 6, wherein the set of instructions is executable by the
processor resource to:
report a priority level of the result data based on a plurality of labels associated with the result data, the plurality of labels to include one of the first label and the second label.
9. The medium of claim 6, wherein the set of instructions is executable by the
processor resource to:
determine the result data is not a vulnerability based on the second label.
10. The medium of claim 6, wherein the set of instructions is executable by the processor resource to:
cause a vulnerability report to include the result data when the result data is flagged with the first label; and
cause the vulnerability report to lack the result data when the result data is flagged with the second label. 11. A method for static analysis of program code comprising:
identifying a string manipulation operation within the program code that operates on a first string based on a structural comparison of the program code;
flagging result data of the string manipulation operation with a first label based on a classification of the string manipulation operation, the classification to describe how the result data is built based on the first string;
setting a priority level to the result data based on the first label; and
causing a message associated with the result data to be presented based on the priority level. 12. The method of claim 11, comprising:
attaching the first label to a second string based on an assignment of the first string to the second string;
identifying the first label to flag the result data based on one of the first string being located on a left side of the result data, the first string being located on a right side of the result data, and having a pattern replacement performed on the first string;
analyzing the second string based on the first label; and
identifying the priority level of the result data based on the first label when the result data arrives at a sink. 13. The method of claim 12, comprising:
identifying the string manipulation operation as a concatenation operation; and, based on a structure of the concatenation operation and a location of the first string, one of:
selecting a prefix label when the first string is located at a first end of the result data; and
selecting a suffix label when the first string is located at a second end of the result data. 14. The method of claim 13, comprising:
decreasing the priority level of the result data at a sink when the first label represents the prefix label, the prefix label to indicate a prefix concatenation of the first string to produce the result data; and
determining a vulnerability issue to be unreported when the priority level achieves a minimum threshold. 15. The method of claim 11, comprising:
passing a plurality of labels to a variable when an assignment operation is made of the first string to the variable, the plurality of labels to include the first label; and evaluating the priority level and report state of the result data at a payload sensitive sink, the report state to indicate whether the result data is reported or unreported.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/500,531 US10546132B2 (en) | 2014-09-30 | 2014-09-30 | String property labels for static analysis |
EP14903315.1A EP3201780A4 (en) | 2014-09-30 | 2014-09-30 | String property labels for static analysis |
PCT/US2014/058224 WO2016053282A1 (en) | 2014-09-30 | 2014-09-30 | String property labels for static analysis |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2014/058224 WO2016053282A1 (en) | 2014-09-30 | 2014-09-30 | String property labels for static analysis |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016053282A1 true WO2016053282A1 (en) | 2016-04-07 |
Family
ID=55631138
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2014/058224 WO2016053282A1 (en) | 2014-09-30 | 2014-09-30 | String property labels for static analysis |
Country Status (3)
Country | Link |
---|---|
US (1) | US10546132B2 (en) |
EP (1) | EP3201780A4 (en) |
WO (1) | WO2016053282A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111651773A (en) * | 2020-08-05 | 2020-09-11 | 成都无糖信息技术有限公司 | An automatic mining method for binary security vulnerabilities |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10235176B2 (en) | 2015-12-17 | 2019-03-19 | The Charles Stark Draper Laboratory, Inc. | Techniques for metadata processing |
US10936713B2 (en) * | 2015-12-17 | 2021-03-02 | The Charles Stark Draper Laboratory, Inc. | Techniques for metadata processing |
US10235218B2 (en) | 2016-05-03 | 2019-03-19 | International Business Machines Corporation | Automatic correction of cryptographic application program interfaces |
US10733080B2 (en) * | 2016-06-27 | 2020-08-04 | International Business Machines Corporation | Automatically establishing significance of static analysis results |
TW201935306A (en) | 2018-02-02 | 2019-09-01 | 美商多佛微系統公司 | Systems and methods for policy linking and/or loading for secure initialization |
TWI794405B (en) | 2018-02-02 | 2023-03-01 | 美商查爾斯塔克德拉普實驗室公司 | Systems and methods for policy execution processing |
EP3788488A1 (en) | 2018-04-30 | 2021-03-10 | Dover Microsystems, Inc. | Systems and methods for checking safety properties |
US11087003B2 (en) * | 2018-08-24 | 2021-08-10 | Oracle International Corporation | Scalable pre-analysis of dynamic applications |
EP3877874A1 (en) | 2018-11-06 | 2021-09-15 | Dover Microsystems, Inc. | Systems and methods for stalling host processor |
EP3881190A1 (en) | 2018-11-12 | 2021-09-22 | Dover Microsystems, Inc. | Systems and methods for metadata encoding |
WO2020132012A1 (en) | 2018-12-18 | 2020-06-25 | Dover Microsystems, Inc. | Systems and methods for data lifecycle protection |
US11636211B2 (en) * | 2019-06-27 | 2023-04-25 | Blackberry Limited | Binary static analysis of application structure to identify vulnerabilities |
WO2021076871A1 (en) | 2019-10-18 | 2021-04-22 | Dover Microsystems, Inc. | Systems and methods for updating metadata |
US11218491B2 (en) * | 2019-12-12 | 2022-01-04 | At&T Intellectual Property I, L.P. | Security de-escalation for data access |
US20230075290A1 (en) * | 2020-02-14 | 2023-03-09 | Debricked Ab | Method for linking a cve with at least one synthetic cpe |
US12124576B2 (en) | 2020-12-23 | 2024-10-22 | Dover Microsystems, Inc. | Systems and methods for policy violation processing |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110088023A1 (en) * | 2009-10-08 | 2011-04-14 | International Business Machines Corporation | System and method for static detection and categorization of information-flow downgraders |
US20120023486A1 (en) * | 2010-07-26 | 2012-01-26 | International Business Machines Corporation | Verification of Information-Flow Downgraders |
US20120216177A1 (en) * | 2011-02-23 | 2012-08-23 | International Business Machines Corporation | Generating Sound and Minimal Security Reports Based on Static Analysis of a Program |
US20140040855A1 (en) * | 2011-07-28 | 2014-02-06 | National Instruments Corporation | Optimization of a Data Flow Program Based on Access Pattern Information |
US20140115564A1 (en) * | 2012-10-19 | 2014-04-24 | International Business Machines Corporation | Differential static program analysis |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5862382A (en) * | 1995-05-08 | 1999-01-19 | Kabushiki Kaisha Toshiba | Program analysis system and program analysis method |
US8210903B2 (en) * | 2006-09-29 | 2012-07-03 | Hoya Corporation | Method of manufacturing glass substrate for magnetic disk, method of manufacturing magnetic disk, and polishing apparatus of glass substrate for magnetic disk |
CA2686638A1 (en) * | 2007-05-07 | 2008-11-13 | Tetralogic Pharmaceuticals Corp. | Tnf.alpha. gene expression as a biomarker of sensitivity to antagonists of inhibitor of apoptosis proteins |
US8302080B2 (en) | 2007-11-08 | 2012-10-30 | Ntt Docomo, Inc. | Automated test input generation for web applications |
US8572747B2 (en) * | 2010-11-19 | 2013-10-29 | International Business Machines Corporation | Policy-driven detection and verification of methods such as sanitizers and validators |
US8656496B2 (en) | 2010-11-22 | 2014-02-18 | International Business Machines Corporation | Global variable security analysis |
US8930391B2 (en) * | 2010-12-29 | 2015-01-06 | Microsoft Corporation | Progressive spatial searching using augmented structures |
US8898188B2 (en) | 2011-06-07 | 2014-11-25 | International Business Machines Corporation | String analysis based on three-valued logic |
US9383358B2 (en) * | 2012-06-28 | 2016-07-05 | Cellprint Ip Holding, Llc | Method to assess patterns of molecular expression |
US20150309813A1 (en) * | 2012-08-31 | 2015-10-29 | iAppSecure Solutions Pvt. Ltd | A System for analyzing applications in order to find security and quality issues |
US9424423B2 (en) | 2012-09-12 | 2016-08-23 | International Business Machines Corporation | Static security analysis using a hybrid representation of string values |
US9141807B2 (en) * | 2012-09-28 | 2015-09-22 | Synopsys, Inc. | Security remediation |
US9170908B2 (en) | 2012-12-14 | 2015-10-27 | Salesforce.Com, Inc. | System and method for dynamic analysis bytecode injection for application dataflow |
US9239889B2 (en) * | 2013-03-15 | 2016-01-19 | Sugarcrm Inc. | Adaptive search and navigation through semantically aware searching |
US9582402B2 (en) * | 2013-05-01 | 2017-02-28 | Advanced Micro Devices, Inc. | Remote task queuing by networked computing devices |
US9158922B2 (en) * | 2013-05-29 | 2015-10-13 | Lucent Sky Corporation | Method, system, and computer-readable medium for automatically mitigating vulnerabilities in source code |
US9171168B2 (en) * | 2013-09-30 | 2015-10-27 | Hewlett-Packard Development Company, L.P. | Determine anomalies in web application code based on authorization checks |
US9262296B1 (en) * | 2014-01-31 | 2016-02-16 | Cylance Inc. | Static feature extraction from structured files |
US10282550B1 (en) * | 2015-03-12 | 2019-05-07 | Whitehat Security, Inc. | Auto-remediation workflow for computer security testing |
US9792443B1 (en) * | 2015-03-12 | 2017-10-17 | Whitehat Security, Inc. | Position analysis of source code vulnerabilities |
WO2017027031A1 (en) * | 2015-08-12 | 2017-02-16 | Hewlett Packard Enterprise Development Lp | Assigning classifiers to classify security scan issues |
2014
- 2014-09-30 EP EP14903315.1A patent/EP3201780A4/en not_active Withdrawn
- 2014-09-30 WO PCT/US2014/058224 patent/WO2016053282A1/en active Application Filing
- 2014-09-30 US US15/500,531 patent/US10546132B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP3201780A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111651773A (en) * | 2020-08-05 | 2020-09-11 | 成都无糖信息技术有限公司 | An automatic mining method for binary security vulnerabilities |
Also Published As
Publication number | Publication date |
---|---|
EP3201780A1 (en) | 2017-08-09 |
US10546132B2 (en) | 2020-01-28 |
US20170220806A1 (en) | 2017-08-03 |
EP3201780A4 (en) | 2018-06-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10546132B2 (en) | String property labels for static analysis | |
US20210382949A1 (en) | Systems and methods for web content inspection | |
US9069574B2 (en) | Code analysis for simulation efficiency improvement | |
US9747187B2 (en) | Simulating black box test results using information from white box testing | |
CN109347882B (en) | Webpage Trojan horse monitoring method, device, equipment and storage medium | |
US20150128272A1 (en) | System and method for finding phishing website | |
US11019096B2 (en) | Combining apparatus, combining method, and combining program | |
CN109376534B (en) | Method and apparatus for detecting applications | |
CN103647678A (en) | Method and device for online verification of website vulnerabilities | |
US10310962B2 (en) | Infrastructure rule generation | |
CN114491560A (en) | Vulnerability detection method and device, storage medium and electronic equipment | |
US10291492B2 (en) | Systems and methods for discovering sources of online content | |
Wi et al. | Diffcsp: Finding browser bugs in content security policy enforcement through differential testing | |
US20120185943A1 (en) | Classification of code constructs using string analysis | |
US11392663B2 (en) | Response based on browser engine | |
JP2015026182A (en) | Security service effect display system, security service effect display method, and security service effect display program | |
WO2016114748A1 (en) | Data comparison | |
CN104375935A (en) | Method and device for testing SQL injection attack | |
CN110874475A (en) | Vulnerability mining method, vulnerability mining platform and computer readable storage medium | |
US9398041B2 (en) | Identifying stored vulnerabilities in a web service | |
CN107977225B (en) | A unified description method and description system for security vulnerabilities | |
CN111125714A (en) | Safety detection method and device and electronic equipment | |
CN109684844B (en) | Webshell detection method and device, computing equipment and computer-readable storage medium | |
US20120215757A1 (en) | Web crawling using static analysis | |
JP6666475B2 (en) | Analysis apparatus, analysis method and analysis program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the EPO has been informed by WIPO that EP was designated in this application |
Ref document number: 14903315 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15500531 Country of ref document: US |
|
REEP | Request for entry into the european phase |
Ref document number: 2014903315 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014903315 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |