Kwon et al., 2013 - Google Patents
Mantis: Automatic performance prediction for smartphone applications
- Document ID
- 12849557791784600904
- Author
- Kwon Y
- Lee S
- Yi H
- Kwon D
- Yang S
- Chun B
- Huang L
- Maniatis P
- Naik M
- Paek Y
- Publication year
- 2013
- Publication venue
- 2013 USENIX Annual Technical Conference (USENIX ATC 13)
Snippet
We present Mantis, a framework for predicting the performance of Android applications on given inputs automatically, accurately, and efficiently. A key insight underlying Mantis is that program execution runs often contain features that correlate with performance and are …
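A minimal sketch of the idea described in the snippet: assuming features such as loop counts, branch counts, and input sizes can be extracted from instrumented runs, a sparse regression model can be fit to measured execution times and then used to predict performance on new inputs. The feature names, training data, and the LASSO model below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): cheaply computable features of
# a program run (e.g., loop trip counts, branch counts, input size) correlate
# with execution time, so a sparse regression model fit on profiled runs can
# predict run time on new inputs. Feature names and data are hypothetical.
import numpy as np
from sklearn.linear_model import Lasso

# Hypothetical training data: each row holds features extracted from one
# instrumented run; the target is the measured execution time in milliseconds.
X_train = np.array([
    [1_000,  12,  3],    # [loop_count, branches_taken, input_kb]
    [5_000,  40,  15],
    [20_000, 90,  60],
    [80_000, 300, 240],
])
y_train = np.array([35.0, 120.0, 410.0, 1650.0])

# Sparse regression keeps only the features that actually explain run time,
# mirroring the feature-selection step described in the snippet.
model = Lasso(alpha=0.5)
model.fit(X_train, y_train)

# Predict the run time for a new, unseen input from its extracted features.
X_new = np.array([[40_000, 150, 120]])
print("predicted execution time (ms):", model.predict(X_new)[0])
```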
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3604—Software analysis for verifying properties of programs
- G06F11/3612—Software analysis for verifying properties of programs by runtime analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/3668—Software testing
- G06F11/3672—Test management
- G06F11/3676—Test management for coverage analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3409—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment for performance assessment
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3466—Performance evaluation by tracing or monitoring
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Preventing errors by testing or debugging software
- G06F11/362—Software debugging
- G06F11/3636—Software debugging by tracing the execution of the program
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for programme control, e.g. control unit
- G06F9/06—Arrangements for programme control, e.g. control unit using stored programme, i.e. using internal store of processing equipment to receive and retain programme
- G06F9/44—Arrangements for executing specific programmes
- G06F9/455—Emulation; Software simulation, i.e. virtualisation or emulation of application or operating system execution engines
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/40—Transformations of program code
- G06F8/41—Compilation
- G06F8/44—Encoding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for programme control, e.g. control unit
- G06F9/06—Arrangements for programme control, e.g. control unit using stored programme, i.e. using internal store of processing equipment to receive and retain programme
- G06F9/46—Multiprogramming arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/50—Computer-aided design
- G06F17/5009—Computer-aided design using simulation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/57—Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
- G06F21/577—Assessing vulnerabilities and evaluating computer system security
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F8/00—Arrangements for software engineering
- G06F8/70—Software maintenance or management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F2201/00—Indexing scheme relating to error detection, to error correction, and to monitoring
Similar Documents
Publication | Title | Publication Date |
---|---|---|
Kwon et al. | Mantis: Automatic performance prediction for smartphone applications | |
US10691419B2 (en) | Reconstructing a high level compilable program from an instruction trace | |
Blackburn et al. | Wake up and smell the coffee: Evaluation methodology for the 21st century | |
Arzt et al. | Using targeted symbolic execution for reducing false-positives in dataflow analysis | |
Laaber et al. | Dynamically reconfiguring software microbenchmarks: Reducing execution time without sacrificing result quality | |
Kwon et al. | Mantis: Efficient predictions of execution time, energy usage, memory usage and network usage on smart mobile devices | |
CN106529304B (en) | An Android application concurrency vulnerability detection system | |
Laaber et al. | Applying test case prioritization to software microbenchmarks | |
CN110419031B (en) | Code coverage tracking for microcontroller programs | |
Izsó et al. | MONDO-SAM: A Framework to Systematically Assess MDE Scalability. | |
Zhao et al. | H-fuzzing: A new heuristic method for fuzzing data generation | |
Jääskelä | Genetic algorithm in code coverage guided fuzz testing | |
Simon et al. | Mode-Aware Concolic Testing for PLC Software: Special Session “Formal Methods for the Design and Analysis of Automated Production Systems” | |
Guo et al. | Graphspy: Fused program semantic embedding through graph neural networks for memory efficiency | |
Kashif et al. | INSTEP: A static instrumentation framework for preserving extra-functional properties | |
US20200272555A1 (en) | Automatic software behavior identification using execution record | |
Laaber et al. | Performance testing in the cloud. How bad is it really? | |
Garbervetsky et al. | Quantitative dynamic‐memory analysis for Java | |
Arzt et al. | Automatic low-overhead load-imbalance detection in MPI applications | |
Badri et al. | On the effect of aspect-oriented refactoring on testability of classes: A case study | |
Chun et al. | Mantis: Predicting system performance through program analysis and modeling | |
Wienke et al. | Continuous regression testing for component resource utilization | |
Gibbs et al. | Operation Mango: Scalable Discovery of Taint-Style Vulnerabilities in Binary Firmware Services | |
Bán et al. | Prediction models for performance, power, and energy efficiency of software executed on heterogeneous hardware | |
Castañé et al. | Machine learning applied to accelerate energy consumption models in computing simulators |