Practical Test Dependency Detection (2018)
Text: icst18-pradet.pdf (Submitted Version, 268kB)
Abstract
Regression tests should consistently produce the same outcome when executed against the same version of the system under test. Recent studies, however, show a different picture: in many cases, simply changing the order in which tests execute is enough to produce different test outcomes. These studies also identify the presence of dependencies between tests as one likely cause of this behavior. Test dependencies affect the quality of tests and of related development activities, such as regression test selection, prioritization, and parallelization, which assume that tests are independent. Therefore, developers must promptly identify and resolve problematic test dependencies. This paper presents PRADET, a novel approach for detecting problematic dependencies that is both effective and efficient. PRADET uses a systematic, data-driven process to detect problematic test dependencies significantly faster and more precisely than prior work. PRADET scales to large projects with thousands of tests that existing tools cannot analyze in a reasonable amount of time, and it found 27 previously unknown dependencies.
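To make the abstract's claim concrete, here is a minimal, hypothetical JUnit 4 sketch (not taken from the paper or its evaluation subjects) of the kind of problematic test dependency PRADET targets: two tests coupled through shared static state, so that merely reordering them changes their outcomes.

```java
import static org.junit.Assert.assertEquals;

import java.util.HashMap;
import java.util.Map;

import org.junit.Test;

// Hypothetical example: the two tests communicate through shared
// static state, so their outcomes depend on execution order.
public class CacheTest {
    private static final Map<String, String> cache = new HashMap<>();

    @Test
    public void testWrite() {
        cache.put("key", "value"); // pollutes state shared with testRead
        assertEquals(1, cache.size());
    }

    @Test
    public void testRead() {
        // Passes only if testWrite already ran: an order dependency.
        assertEquals("value", cache.get("key"));
    }
}
```

Executed in declaration order both tests pass, but running testRead first (or in isolation) fails it: exactly the reordering-changes-outcome behavior the abstract describes, and the reason activities like test selection, prioritization, and parallelization break when they assume independence.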
Item Type: | Conference or Workshop Item (Paper)
---|---
Additional Information: | Acceptance rate: 25%.
Conference: | ICST - International Conference on Software Testing, Verification and Validation
Depositing User: | Ben Stock
Date Deposited: | 14 Feb 2018 16:18
Last Modified: | 17 Oct 2022 10:13
Primary Research Area: | NRA5: Empirical & Behavioral Security
URI: | https://publications.cispa.saarland/id/eprint/1495