Fundamentals of Software Testing

Spring 2008

Course Number: CMSC737.

Meeting Times: Tue. Thu. - 9:30AM - 10:45AM (CSIC 2120)

Office Hours: Tue. Thu. - 10:45AM - 12:00PM (4115 A. V. Williams Building)

Catalog Course Description: This course will examine fundamental software testing and related program analysis techniques. In particular, the important phases of testing will be reviewed, emphasizing the significance of each phase when testing different types of software. The course will also include concepts such as test generation, test oracles, test coverage, regression testing, mutation testing, program analysis (e.g., program-flow and data-flow analysis), and test prioritization.

Course Summary: This course will examine fundamental software testing and program analysis techniques. In particular, the important phases of testing will be reviewed, emphasizing the significance of each phase when testing different types of software. Students will learn the state of the art in testing technology for object-oriented, component-based, concurrent, distributed, graphical-user interface, and web software. In addition, closely related concepts such as mutation testing and program analysis (e.g., program-flow and data-flow analysis) will also be studied. Emerging concepts such as test-case prioritization and their impact on testing will be examined. Students will gain hands-on testing/analysis experience via a multi-phase course project. By the end of this course, students should be familiar with the state-of-the-art in software testing. Students should also be aware of the major open research problems in testing.

Student Presentations: All students are strongly encouraged to present a topic related to software testing. You must prepare slides and select a date for your presentation. Group presentations are encouraged if the selected topic is broad enough. Topics include, but are not limited to:

• Combining and Comparing Testing Techniques
• Defect and Failure Estimation and Analysis
• Testing Embedded Software
• Fault Injection
• Load Testing
• Testing for Security
• Software Architectures and Testing
• Test Case Prioritization
• Testing Concurrent Programs
• Testing Database Applications
• Testing Distributed Systems
• Testing Evolving Software
• Testing Interactive Systems
• Testing Object-Oriented Software
• Testing Spreadsheets
• Testing vs. Other Quality Assurance Techniques
• Usability Testing
• Web Testing

The course grade will be determined as follows: 25% midterm exam, 25% final exam, 50% project.

Credits: 3

Prerequisites: Software engineering CMSC435 or equivalent.

Status with respect to graduate program: MS qualifying course (Midterm+Final exam), PhD core (Software Engineering).

Syllabus: The following topics will be discussed.

  1. Introduction to software testing
    • [Jan. 29, 31, Feb 5, 7] Contents: The need for testing; testing as an integral part of software engineering; software engineering processes and testing.
    • Slides: 1.pdf, 2.pdf
    • Reading List
      1. Testing: a roadmap, Mary Jean Harrold, Proceedings of the conference on the future of Software engineering May 2000.
      2. Introduction to special section on software testing, R. Hamlet, Communications of the ACM June 1988, Volume 31 Issue 6.
      3. Testing: principles and practice, Stephen R. Schach, ACM Computing Surveys, (CSUR) March 1996, Volume 28 Issue 1.
      4. Software safety: why, what, and how, Nancy G. Leveson, ACM Computing Surveys (CSUR) June 1986, Volume 18 Issue 2.
      5. Validation, Verification, and Testing of Computer Software, W. Richards Adrion, Martha A. Branstad, John C. Cherniavsky, ACM Computing Surveys (CSUR) June 1982, Volume 14 Issue 2.
  2. Test-case Generation
    • Contents: Black-box testing; sampling the program's input space
    • Slides: 3.pdf, 5.pdf
    • Reading List
      1. [Feb 12, 14, 19] The category-partition method for specifying and generating functional tests, T. J. Ostrand, M. J. Balcer, Communications of the ACM June 1988, Volume 31 Issue 6.
      2. [Feb 21] A test generation strategy for pair-wise testing, Kuo-Chung Tai; Yu Lei, Software Engineering, IEEE Transactions on, Volume: 28 Issue: 1, Jan. 2002, Page(s): 109 -111.
    • Slides: 7.pdf, 8.pdf
    • Contents: White-box testing; sampling the program's input space; path testing; branch and predicate testing.
    • Reading List
      1.  [Feb. 26] Predicate-based test generation for computer programs, Kuo-Chung Tai, Software Engineering, 1993. Proceedings of the 15th International Conference on, 1993, Page(s): 267 -276.
      2. [Feb. 26]  A heuristic approach for test case generation, Kai-Hsiung Chang, W. Homer Carlisle, James H. Cross, II, David B. Brown, Proceedings of the 19th annual conference on Computer Science, 1991, Page(s): 174 – 180.
  3. GUI Testing
    • Reading List
      1. [Mar. 25] Generating test cases for GUI responsibilities using complete interaction sequences, White, L.; Almezen, H., Software Reliability Engineering, 2000. ISSRE 2000. Proceedings. 11th International Symposium on, 8-11 Oct. 2000, Page(s): 110-121.

  4. Regression testing [Apr. 3]
    • Contents: Test selection.
    • Slides: 15.pdf
    • Reading List
      1. An empirical study of regression test selection techniques, Todd L. Graves, Mary Jean Harrold, Jung-Min Kim, Adam Porter, Gregg Rothermel, ACM Transactions on Software Engineering and Methodology (TOSEM) April 2001, Volume 10 Issue 2.
  5. Special Topics: presented by students [Apr. 8-May 8]

Apr. 8
  • Test Oracle Generation. Read:
      [1] http://citeseer.ist.psu.edu/peters95generating.html
      [2] http://citeseer.ist.psu.edu/181183.html
      [3] http://rupak.info/article/33/test-oracles
      [4] http://www.cse.ogi.edu/PacSoft/conf/jvw02/slides/jmlunit.pdf
    (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Alford, Ronald Wayne.
  • Object-Oriented Software Testing. Read: http://portal.acm.org/citation.cfm?id=1044835&dl=GUIDE (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Bahety, Anand Baldeodas & Bucatanschi, Dan George.

Apr. 10
  • Protocol Testing. Paper: “ASPIRE: Automated Systematic Protocol Implementation Robustness Evaluation” by Arunchandar Vasan and Atif M. Memon. (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Chandra, Deepti Jagdish.
  • Testing Concurrent Software. Paper: Reachability testing of concurrent programs, Lei, Y.; Carver, R. H., IEEE Transactions on Software Engineering, Volume 32, Issue 6, June 2006, Page(s): 382-403. (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Huynh, Thuan Quang.

Apr. 15
  • Using Execution Feedback in Test Case Generation. Paper: Using GUI Run-Time State as Feedback to Generate Test Cases. (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Nguyen, Bao Ngoc.
  • Testing Embedded Software. Paper: Pao-Ann Hsiung, Shang-Wei Lin, Chih-Hao Tseng, Trong-Yen Lee, Jih-Ming Fu, Win-Bin See, "VERTAF: An Application Framework for the Design and Verification of Embedded Real-Time Software," IEEE Transactions on Software Engineering, vol. 30, no. 10, pp. 656-674, Oct. 2004. (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Konda, Shravya Reddy.

Apr. 17
  • Combining Static and Dynamic Reasoning for Bug Detection. (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Reisner, Elnatan Benjamin.
  • Database Testing. Paper: A framework for testing database applications (http://portal.acm.org/citation.cfm?id=348954&dl=GUIDE&dl=ACM). (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Liu, Liping.

Apr. 22
  • Web Testing. Paper: VeriWeb: Automatically testing dynamic web sites, by Michael Benedikt, Juliana Freire, Patrice Godefroid, in Proceedings of the 11th International World Wide Web Conference (WWW2002). (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Wongsuphasawat, Krist.
  • A Theory of Fault-Based Testing. (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Lee, Joonghoon.

Apr. 24
  • Pip: Detecting the Unexpected in Distributed Systems. (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Schulman, Aaron David.
  • Machine Learning Approaches for Statistical Software Testing. Paper: Saraph, P., Last, M., Kandel, A., "Test case generation and reduction by automated input-output analysis," Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, 2003. (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Sharara, Hossam Samy Elsai.

Apr. 29
  • PhD Proposal (https://forum.cs.umd.edu/calendar.php?do=getinfo&day=2008-4-29&c=5). Presented by Jaymie Strecker.

May 1
  • Using Formal Software Specifications for Testing. Paper: Boyapati, C., Khurshid, S., and Marinov, D. 2002. Korat: automated testing based on Java predicates. In Proceedings of the 2002 ACM SIGSOFT International Symposium on Software Testing and Analysis (Roma, Italy, July 22-24, 2002), ISSTA '02, ACM, New York, NY, 123-133. http://sdg.csail.mit.edu/pubs/2002/korat.pdf (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Stuckman, Jeff.
  • Fault Injection. Paper: “Software Fault Injection for Survivability" by Jeffrey M. Voas & Anup K. Ghosh. http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=821531 (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Teoh, Alison Lui Koon.

May 6
  • Web Testing. Paper: Alessandro Marchetto, Paolo Tonella, and Filippo Ricca, State-Based Testing of Ajax Web Applications, First International Conference on Software Testing, Verification and Validation, Lillehammer, Norway. (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Thakor, Shashvat Advait.
  • Network Security Testing Techniques; source is this report. (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Vador, Sachin Shashikant.

May 8
  • Security Testing. Paper: Bypass testing of web applications, J. Offutt, Y. Wu, X. Du, H. Huang, Proc. of the IEEE International Symposium on Software Reliability Engineering, 2004. http://cs.gmu.edu/~offutt/rsrch/papers/bypass-issre.pdf (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Donlon, Eileen Merle.
  • Usability Testing. Reading list: (1) Nielsen, Molich: Heuristic evaluation of user interfaces (8 pages): http://portal.acm.org/citation.cfm?id=97243.97281; (2) Spool, Schroeder: Testing web sites: five users is nowhere near enough (2 pages): http://portal.acm.org/citation.cfm?id=634067.634236. (Questions; Slides; Slides in pdf format) Status: Solutions to three questions posted. Presented by Zazworka, Nico.

 

Course Project

Phase 1

Goal: Black-box test-case generation.

Procedure: Take at least three subject applications from the TerpOffice website; select five methods, each with at least five parameters (you may select methods with three or four parameters only if they are reasonably complex, with at least two branches); create JUnit test cases for these methods using the category-partition method. Reduce the number of test cases using pair-wise testing. Compare the statement coverage of the original and reduced suites.
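The category-partition and pair-wise steps above can be sketched as follows. This is a minimal, self-contained illustration: the categories and choices are hypothetical stand-ins, not taken from any TerpOffice method, and the reduction is a simple greedy pass over the full suite rather than a minimal covering array.

```java
import java.util.*;

// Sketch: category-partition enumeration followed by a greedy pair-wise
// reduction. Category names and choices are invented for illustration.
public class PairwiseSketch {
    static final Map<String, List<String>> CATEGORIES = new LinkedHashMap<>();
    static {
        CATEGORIES.put("length",  List.of("empty", "one", "many"));
        CATEGORIES.put("content", List.of("digits", "letters", "mixed"));
        CATEGORIES.put("locale",  List.of("en", "de"));
    }

    // Full category-partition suite: every combination of one choice per
    // category (before any constraints are applied).
    static List<List<String>> fullSuite() {
        List<List<String>> suite = new ArrayList<>();
        suite.add(new ArrayList<>());
        for (List<String> choices : CATEGORIES.values()) {
            List<List<String>> next = new ArrayList<>();
            for (List<String> partial : suite)
                for (String c : choices) {
                    List<String> row = new ArrayList<>(partial);
                    row.add(c);
                    next.add(row);
                }
            suite = next;
        }
        return suite;
    }

    // Greedy pair-wise reduction: keep a test only if it covers some pair
    // (category i: choice, category j: choice) not covered so far. This is
    // not minimal, but every pair of choices remains covered.
    static List<List<String>> pairwiseReduce(List<List<String>> suite) {
        Set<String> seen = new HashSet<>();
        List<List<String>> reduced = new ArrayList<>();
        for (List<String> test : suite) {
            boolean addsPair = false;
            for (int i = 0; i < test.size(); i++)
                for (int j = i + 1; j < test.size(); j++)
                    if (seen.add(i + ":" + test.get(i) + "|" + j + ":" + test.get(j)))
                        addsPair = true;
            if (addsPair) reduced.add(test);
        }
        return reduced;
    }

    public static void main(String[] args) {
        List<List<String>> full = fullSuite();
        List<List<String>> reduced = pairwiseReduce(full);
        System.out.println("full=" + full.size() + " reduced=" + reduced.size());
    }
}
```

Each row of the reduced suite would then become one JUnit test case; the statement-coverage comparison is done by running both suites under a coverage tool.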

Deliverables: Source code of the five methods and all the JUnit test cases; a document describing the categories/choices and constraints; at least two sets of test cases, each of which is sufficient to satisfy the pair-wise testing criterion; and coverage reports. Also include a one-page document describing the difficulties you faced in using the subject applications (downloading, installing, etc.) and how you handled them.

Due on Feb. 26 in class. [Late submission policy – you lose 20% (of the maximum points) per day]

Phase 2

Goal: Make each JUnit test fail by seeding artificial faults in the method source code.

Procedure: Examine each JUnit test case and the method source code. Obtain a set of source-code changes (from this paper) that will cause each test case to fail. Insert, in the method, a comment /*FAULT## FAILURE INDUCING CODE */ at line N. Simple string replacement of line N with “FAILURE INDUCING CODE” should then cause the JUnit test case to fail. If a change requires modifying multiple lines, replace ## with an integer and use the same integer for all lines related to one failure. Write a script to perform the string replacement automatically, one fault at a time, to avoid fault interaction. Use the classification of the faults from this paper.
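The fault-activation step of the script can be approximated as below. This is a sketch under the phase's conventions: the seeded method and fault tags are invented examples, and a real script would read and rewrite the actual .java files rather than an in-memory string.

```java
import java.util.*;
import java.util.regex.*;

// Sketch of phase-2 fault activation. A seeded line has the form
//   <original code> /*FAULTk <failure-inducing code> */
// and turning fault k on replaces that whole line with the code inside the
// comment, leaving every other fault dormant (so faults never interact).
public class FaultSeeder {
    static final Pattern FAULT =
        Pattern.compile("^(.*?)/\\*FAULT(\\d+)\\s+(.*?)\\*/\\s*$");

    // Return a copy of `source` with only fault `id` activated.
    static String activate(String source, int id) {
        StringBuilder out = new StringBuilder();
        for (String line : source.split("\n", -1)) {
            Matcher m = FAULT.matcher(line);
            if (m.matches() && Integer.parseInt(m.group(2)) == id)
                out.append(m.group(3).trim()); // string replacement of line N
            else
                out.append(line);              // all other lines unchanged
            out.append('\n');
        }
        return out.toString();
    }

    public static void main(String[] args) {
        // Invented example method with two seeded faults.
        String seeded =
            "int abs(int x) {\n" +
            "  if (x < 0) return -x; /*FAULT1 if (x <= 0) return -x; */\n" +
            "  return x; /*FAULT2 return x + 1; */\n" +
            "}";
        System.out.println(activate(seeded, 2));
    }
}
```

Running this over every fault id in turn, recompiling, and re-running the suite gives the one-fault-at-a-time behavior the phase asks for.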

Deliverables: The modified source of the methods, the script, and the JUnit test cases. A standalone executable that demonstrates the entire process automatically.

Due on Mar. 28. [Late submission policy – you lose 20% (of the maximum points) per day]

Phase 3

Goal: Run the pair-wise test cases on the fault-seeded code. Also, run the application (via the GUI) to detect the seeded faults.

Procedure: Execute all your pair-wise test cases on the methods seeded with faults. Report the number of faults that you detected, and compare with the full suite. For the second part of this phase, compile and run the application (with one fault turned on at a time) and come up with an interaction (a sequence of GUI events) that reveals the fault, i.e., produces an outcome that differs from that of the original code.
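The comparison in the first part of this phase can be illustrated in miniature. Here the method under test and its fault-seeded variants are modeled as in-memory functions (the method, variants, and inputs are all hypothetical); a real run would execute the JUnit suites against the recompiled, fault-activated sources instead.

```java
import java.util.*;
import java.util.function.IntUnaryOperator;

// Sketch of fault-detection counting: a seeded fault is "detected" by a
// suite if some test input makes the faulty variant disagree with the
// original method (which acts as the oracle here).
public class DetectionSketch {
    static long detected(List<IntUnaryOperator> variants,
                         IntUnaryOperator original, int[] suite) {
        return variants.stream()
            .filter(v -> Arrays.stream(suite)
                .anyMatch(x -> v.applyAsInt(x) != original.applyAsInt(x)))
            .count();
    }

    public static void main(String[] args) {
        IntUnaryOperator original = x -> x < 0 ? -x : x;   // abs(x)
        List<IntUnaryOperator> faults = List.of(
            x -> x,                    // FAULT1: negation dropped
            x -> x < 0 ? -x : x + 1,   // FAULT2: off-by-one on positive branch
            x -> x <= 0 ? -x : x       // FAULT3: equivalent mutant, undetectable
        );
        int[] full    = {-5, -1, 0, 3};   // stand-in for the full suite
        int[] reduced = {0, 3};           // stand-in for the pair-wise suite
        System.out.println("full detects " + detected(faults, original, full)
            + "/3, reduced detects " + detected(faults, original, reduced) + "/3");
    }
}
```

Note the two expected outcomes this sketch makes visible: a reduced suite can miss faults the full suite catches, and an equivalent mutant is missed by every suite.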

Deliverables: The failure report and the event-sequences.

Due on Apr. 29 in class. [Late submission policy – you lose 20% (of the maximum points) per day]