Pex

Pex: White-Box Test Generation for .NET. Pex automatically produces a small test suite with high code coverage for a .NET program. To this end, Pex performs a systematic program analysis (using dynamic symbolic execution, similar to path-bounded model checking) to determine test inputs for Parameterized Unit Tests. Pex learns the program's behavior by monitoring execution traces and uses a constraint solver to produce new test inputs that exercise different program behavior; the result is a small, automatically generated test suite that often achieves high code coverage. In one case study, we applied Pex to a core component of the .NET runtime that had already been extensively tested over several years. Pex found errors, including a serious issue.
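To illustrate the workflow, the sketch below shows what a Parameterized Unit Test for Pex typically looks like in C#. It is a minimal sketch, assuming the attribute and helper names from the Microsoft.Pex.Framework namespace (PexClass, PexMethod, PexAssume) together with standard MSTest assertions; the StringUtils class and its Capitalize method are hypothetical code under test, included only to keep the example self-contained. Pex explores the parameterized test by running it on concrete inputs, recording the branch conditions observed along each execution trace, and asking its constraint solver for new inputs that flip those conditions.

    using Microsoft.Pex.Framework;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    // Hypothetical code under test (not part of Pex).
    public static class StringUtils
    {
        public static string Capitalize(string s)
        {
            if (s.Length == 0) return s;                 // one branch Pex must cover
            return char.ToUpper(s[0]) + s.Substring(1);  // the other branch
        }
    }

    [PexClass(typeof(StringUtils))]
    [TestClass]
    public partial class StringUtilsTest
    {
        // A Parameterized Unit Test: Pex chooses concrete values for 's' automatically.
        [PexMethod]
        public void CapitalizeKeepsLength(string s)
        {
            PexAssume.IsNotNull(s);                    // restrict the explored input space
            string result = StringUtils.Capitalize(s);
            Assert.IsNotNull(result);                  // properties that must hold on every path
            Assert.AreEqual(s.Length, result.Length);
        }
    }

From such a parameterized test, Pex would emit conventional unit tests, roughly one per distinct execution path it explores (for example, an empty string and a non-empty string for Capitalize), which together form the small, high-coverage suite described above.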


References in zbMATH (referenced in 32 articles)

Showing results 1 to 20 of 32, sorted by year (citations).


  1. Godefroid, Patrice; Sen, Koushik: Combining model checking and testing (2018)
  2. Ahrendt, Wolfgang; Chimento, Jesús Mauricio; Pace, Gordon J.; Schneider, Gerardo: Verifying data- and control-oriented properties combining static and runtime verification: theory and tools (2017)
  3. Lucanu, Dorel; Rusu, Vlad; Arusoaie, Andrei: A generic framework for symbolic execution: a coinductive approach (2017)
  4. Rusu, Vlad; Arusoaie, Andrei: Executing and verifying higher-order functional-imperative programs in Maude (2017)
  5. Liang, Tianyi; Reynolds, Andrew; Tsiskaridze, Nestan; Tinelli, Cesare; Barrett, Clark; Deters, Morgan: An efficient SMT solver for string constraints (2016)
  6. Arusoaie, Andrei; Lucanu, Dorel; Rusu, Vlad: Symbolic execution based on language transformation (2015)
  7. Cavalcanti, Ana; Gaudel, Marie-Claude: Test selection for traces refinement (2015)
  8. Liang, Tianyi; Tsiskaridze, Nestan; Reynolds, Andrew; Tinelli, Cesare; Barrett, Clark: A decision procedure for regular membership and length constraints over unbounded strings (2015)
  9. Albert, Elvira; Arenas, Puri; Gómez-Zamalloa, Miguel; Rojas, Jose Miguel: Test case generation by symbolic execution: basic concepts, a CLP-based instance, and actor-based concurrency (2014)
  10. Xie, Tao; Zhang, Lu; Xiao, Xusheng; Xiong, Ying-Fei; Hao, Dan: Cooperative software testing and analysis: advances and challenges (2014)
  11. Brucker, Achim D.; Wolff, Burkhart: On theorem prover-based testing (2013)
  12. Amato, Gianluca; Parton, Maurizio; Scozzari, Francesca: Discovering invariants via simple component analysis (2012)
  13. Carlier, Matthieu; Dubois, Catherine; Gotlieb, Arnaud: A first step in the design of a formally verified constraint-based testing tool: FocalTest (2012)
  14. Christakis, Maria; Müller, Peter; Wüstholz, Valentin: Collaborative verification and testing with explicit assumptions (2012)
  15. Janičić, Predrag: URSA: a system for uniform reduction to SAT (2012)
  16. Kosmatov, Nikolai; Williams, Nicky; Botella, Bernard; Roger, Muriel; Chebaro, Omar: A lesson on structural testing with PathCrawler-online.com (2012)
  17. Vanoverberghe, Dries; de Halleux, Jonathan; Tillmann, Nikolai; Piessens, Frank: State coverage: Software validation metrics beyond code coverage (2012)
  18. Yang, Guowei; Khurshid, Sarfraz; Kim, Miryung: Specification-based test repair using a lightweight formal method (2012)
  19. Fähndrich, Manuel; Logozzo, Francesco: Static contract checking with abstract interpretation (2011)
  20. Giannakopoulou, Dimitra; Bushnell, David H.; Schumann, Johann; Erzberger, Heinz; Heere, Karen: Formal testing for separation assurance (2011)