Online Publications for M. V. Zelkowitz

The following documents are available via the WWW:

[pdf] Culture Conflicts in Software Engineering Technology Transfer, by M. Zelkowitz, D. Wallace and D. Binkley. (Submitted)

Although the need to transition new technology to improve the process of developing quality software products is well understood, the computer software industry has done a poor job of meeting that need. All too often new software technology is touted as the next "silver bullet" to be adopted, only to fail and disappear within a very short period. New technologies are often adopted without any convincing evidence that they will be effective, yet other technologies are ignored despite published data indicating their usefulness. In this paper we discuss a study conducted among a large group of computer software professionals in order to understand what techniques can be used to support the introduction of new technologies, and to understand the biases and opinions of those charged with researching, developing, or implementing those new technologies. This study indicates which evaluation techniques are viewed as most successful under various conditions. We show that the research and industrial communities do indeed have different perspectives, which leads to a conflict between the goals of the technology researchers and the needs of the technology users.

Experimental models for validating computer technology, by M. Zelkowitz and D. Wallace, IEEE Computer, May, 1998.

Experimentation is important within science for determining the effectiveness of proposed theories and methods. However, computer science has not developed a concise taxonomy of methods applicable for demonstrating the validity of a new technique. In this paper we discuss the methods generally employed to validate an experiment and propose a classification taxonomy consisting of 12 techniques that can be used to show that a new technology achieves its hypothesized goals. An evaluation of over 600 papers published from 1985 through 1995 shows that the 12 methods can be effectively applied to research papers, and we provide some observations on how well the research community validates its claims in these papers.

Incorporating User Defined Data Types into a Web-based Data by R. Tesoriero and M. V. Zelkowitz.

Many organizations have incorporated data collection into their software processes. However, interpreting the data is just as important as the collection of data. The Web Measurement Environment (WebME), a Web-based data visualization tool, is being developed to facilitate the understanding of collected data. The WebME system will extend and improve the functionality provided by an existing system, the Software Management Environment (SME), which was developed to execute on a single system and view data from a specific domain. SME was designed to graphically display software development data, such as staff hours used, errors found, or lines of code developed, from one particular development environment. WebME will permit the analysis of development data from multiple development domains. These enhancements and the capability of adding user defined data types to permit multiple views into the data are the focus of this paper. In particular, the data definition language for creating and using new data types is described. In addition, the issues of type compatibility and system architecture are discussed.

Experimental validation in software engineering by M. V. Zelkowitz and Dolores Wallace (NIST). (Presented at Empirical Assessment of Software Engineering, Keele University, UK, March, 1997; to appear in Information and Software Technology 39(11), November, 1997)

Although experimentation is an accepted approach toward scientific validation in most scientific disciplines, it has only recently gained acceptance within the software development community. In this paper we discuss a 12-model classification scheme for performing experimentation within the software development domain. We evaluate over 600 published papers in the computer science literature and over one hundred papers from other scientific disciplines in order to determine: (1) how well the computer science community is succeeding at validating its theories, and (2) how computer science compares to other scientific disciplines.

Models of Software Experimentation by M. V. Zelkowitz and D. Wallace (Presented at the 1996 ISERN meeting, Sydney, Australia, August, 1996)

Experimentation and data collection are becoming accepted practices within the software engineering community for determining the effectiveness of various software development practices. However, there is wide disagreement as to exactly what the term "experimentation" means in this domain. It is important that we be able to understand this concept and identify how we can best collect the data needed to validate software methods that seem to be effective. This understanding will provide a basis for improved technical exchange of information between scientists and engineers within the software community.

Assessing Software Engineering Technology Transfer Within NASA (Technical Report NASA-RPT-003-95, NASA/GSFC, January, 1995).

This represents a NASA-wide study of software engineering technology transfer across NASA. A version of this paper appeared in the IEEE Transactions on Engineering Management in August, 1996.

Reference Model for Frameworks of Software Engineering Environments. (Edition 3 of NIST Special Publication 500-211, August, 1993 and ECMA Technical Report TR/55.) Edited by M. Zelkowitz.

This report presents edition 3 of the environment framework for integrated software engineering environments. It has been called at various times the NIST-ECMA model, the ECMA-NIST model, and the Toaster model, due to the original graphic by George Tatge of HP. The report presents the set of services (i.e., functions) needed within the infrastructure of an environment. A summary and an example of using this model were presented at the ACM/IEEE 15th International Conference on Software Engineering, Baltimore, Maryland, (May, 1993) 348-357 under the title "Use of an environment classification model" by M. Zelkowitz.

Reference Model for Project Support Environments. (Edition 2 of NIST Special Publication 500-213, November, 1993 and CMU Software Engineering Institute Technical Report 93-TR-23.) Edited by A. Brown, D. Carney, P. Oberndorf, and M. Zelkowitz.

This presents work by the NGCR (U.S. Navy's Next Generation Computing Resources) PSESWG (Project Support Environment Standards Working Group) to extend the NIST-ECMA model to incorporate end-user services beyond framework capabilities. This includes most of the tasks needed to develop, manage, and operate software, such as development services, management services, and support services. A summary of this report was published as "Issues in the Definition of a Project Support Environment Reference Model" in Computer Standards and Interfaces, Volume 15, (1993) 431-443.

Process Enactment Within An Environment by Roseanne Tesoriero and Marvin Zelkowitz, NASA/GSFC Software Engineering Workshop, NASA/GSFC, November, 1995, Greenbelt, MD (A version of this paper was also presented at the 7th European Control and Metrics Conference, Wilmslow, UK, May, 1996)

Environment research has often centered either on the set of tools needed to support software development or on the set of process steps followed by personnel on a project as they complete their activities. In this paper, we address the effects that the environment has on the development process in completing a project. In particular, we are interested in how software process steps are actually performed using a typical programming environment. We then introduce a model for measuring a software engineering process in order to determine the relative tradeoffs between manual process steps and automated environment tools. Understanding process complexity is a potential result of this model. Data from the Flight Dynamics Division at NASA Goddard Space Flight Center are used to explore these issues.

An Information Model for Use in Software Management Estimation and Prediction by Rorry Li and Marvin Zelkowitz, Conference on Information and Knowledge Management, Washington, DC, 1993

This paper describes the use of cluster analysis for determining the information model within software engineering development data collected at the NASA/GSFC Software Engineering Laboratory. We describe the Software Management Environment tool, which allows managers to predict development attributes during the early phases of a software project, and the modifications we propose to allow it to develop dynamic models for better prediction of these attributes.


Prepared by: Marvin Zelkowitz

Last Change: June 30, 1998