+===================================================+
+======= Testing Techniques Newsletter (TTN) =======+
+======= ON-LINE EDITION =======+
+======= May 1995 =======+
+===================================================+
TESTING TECHNIQUES NEWSLETTER (TTN), On-Line Edition, is E-Mailed
monthly to support the Software Research, Inc. (SR) user community and
provide information of general use to the world software testing commun-
ity.
(c) Copyright 1995 by Software Research, Inc. Permission to copy and/or
re-distribute is granted to recipients of the TTN On-Line Edition pro-
vided that the entire document/file is kept intact and this copyright
notice appears with it.
TRADEMARKS: Software TestWorks, STW, STW/Regression, STW/Coverage,
STW/Advisor, X11 Virtual Display System, X11virtual and the SR logo are
trademarks of Software Research, Inc. All other systems are either
trademarks or registered trademarks of their respective companies.
========================================================================
INSIDE THIS ISSUE:
o CONFERENCE ANNOUNCEMENT
EIGHTH INTERNATIONAL SOFTWARE QUALITY WEEK (QW95)
o SPECIAL ISSUE REVIEW:
OBJECT-ORIENTED SOFTWARE TESTING (Part 3 of 3)
by Edward F. Miller, President, Software Research, Inc.
o AUTOMATED TOOL SUPPORT FOR ANSI/IEEE STD. 829-1983
SOFTWARE TEST DOCUMENTATION (Part 3 of 3)
by Harry M. Sneed, Germany
o CALENDAR OF EVENTS
o TTN SUBMITTAL POLICY
o TTN SUBSCRIPTION INFORMATION
========================================================================
************************************************************************
EIGHTH INTERNATIONAL SOFTWARE QUALITY WEEK (QW95)
************************************************************************
30 May 1995 -- 2 June 1995
Sheraton Palace Hotel, San Francisco, California
Conference Theme: The Client-Server Revolution
QW '95 is the premier technological conference of its kind, combining
the newest applications, technology, and management techniques. Software
Quality Week, now in its eighth year, focuses on advances in client/
server technologies, software test technology, quality control, software
test process, managing OO integration, software safety, and test
automation. Quality Week '95 offers an exchange of information between
academicians and practitioners that no other conference can provide.
The Client/Server Revolution is sweeping all of computing, changing the
way we think about organizing complex systems, how we develop and test
those systems, and changing our approach to quality control questions
for multi-user, multi-platform, heterogeneous environments. At the same
time, the Client/Server Revolution is forcing a closer look at critical
development strategies, at how we think about software testing, and at
the methods and approaches we use to get the job done. The Eighth Inter-
national Software Quality Week covers advances in software analysis and
review technologies, along with formal methods and empirical strategies
for large-scale as well as small-scale projects. Quality Week gives you
the competitive edge to dominate your industry.
PROGRAM DESCRIPTION
^^^^^^^^^^^^^^^^^^^
The Pre-Conference Tutorial Day offers expert insights on ten key topic
areas. The Keynote presentations give unique perspectives on trends in
the field and recent technical developments in the community, and offer
conclusions and recommendations to attendees.
The General Conference offers four-track presentations, mini-tutorials
and a debate:
Technical Track. Topics include:
Class testing
Deep Program Analysis
Test Oracles
Novel GUI Approaches, and more...
Applications Track. Topics include:
Real-world experiences
Novel tools
User-Level analysis, and more...
Management Track. Topics include:
Automatic tests
Process experience
Team approaches
Managing OO integration, and more...
Vendor Track: Selected vendors present their products and/or services to
guide the testing process. The vendor track is specifically reviewed for
technical content -- no high-pressure sales pitches are allowed; come to
learn, not to be sold!
A two-day Tools Expo brings together leading suppliers of testing solu-
tions.
Mini-Tutorial: Explore the pros and cons of outsourcing software test-
ing.
Debate: Examine one of today's hottest topics, Model-Checking and the
Verification of Concurrent Programs, and listen to the experience of
experts from Carnegie Mellon University in Pittsburgh, Pennsylvania,
Trinity College of Dublin, Ireland, Oxford University, Oxford, England,
and Universite de Liege, Belgium.
WHO SHOULD ATTEND
^^^^^^^^^^^^^^^^^
o Lead senior quality assurance managers looking for powerful mainte-
nance and testing techniques and an opportunity to evaluate today's
tools.
o All quality assurance and testing specialists, beginners and experts
alike, who need exposure to authoritative sources for improving soft-
ware test technology.
o Programmers and developers who want to learn more about producing
better quality code.
o Maintenance technicians looking for techniques that control product
degradation.
o Technologists who want to catch up on the state-of-the-art techniques
in software testing, quality assurance and quality control.
Quality Week '95 is sponsored by
Software Research, Inc.
San Francisco, California
REGISTRATION FOR QUALITY WEEK
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
REGISTRATION: Please pay by check or with your Company Purchase Order.
The entire Conference Fee is payable prior to the program. Make checks
payable to SR Institute, Inc. Registration is accepted up to the time of
the meeting; on-site registration begins at 7:00 a.m., subject to space
availability. No cancellation fee until 5 May 1995; a service charge of
$125 after 5 May 1995 applies. Call the registrar to obtain your cancel-
lation number.
FEES: Registration includes all material, Conference Lunches, Refresh-
ments and invitation to the Cocktail Party.
Registered & Paid     Before 28 April   After 28 April   Group Rates
Tutorial Day          $300              $350             no discount
3-Day Conference      $750              $850             10% discount
COMBINED              $950              $1050            10% discount
SAVE: Send your team of software testing specialists and benefit from
the reduced group rate. If you register two or more representatives at
one time, you may deduct 10% of the fee for each attendee from the
Conference or COMBINED price only.
CONFERENCE HOTEL: Quality Week will be held at the luxurious landmark
Sheraton Palace Hotel, San Francisco, CA, located in the very heart of
the downtown business district. The Sheraton Palace has welcomed vaca-
tioners and business persons with its famous hospitality. Enjoy the best
in facilities, restaurants, clubs, theaters, shops, and points of
interest.
Please complete and mail form together with your check or purchase order
to:
--------------------------cut here--------------------------------------
SR Institute
901 Minnesota Street
San Francisco, CA 94107 USA
Or request information through e-mail: qw@soft.com
Or FAX Your Registration: [+1] (415) 550-3030
Please Type or Print:
Name: __________________________________________________________________
Title: _________________________________________________________________
Company: _______________________________________________________________
Street: ________________________________________________________________
City: __________________________________________________________________
State or Province: _____________________________________________________
ZIP or Postal Code: ____________________________________________________
Country: _______________________________________________________________
Phone: _________________________________________________________________
FAX: ___________________________________________________________________
Note: Please copy this form for multiple registration.
Please Check One:
[ ] Tutorials
[ ] 3-Day Conference
[ ] Tutorials and Conference COMBINED
[ ] Check Enclosed [ ] P.O. Number Enclosed
========================================================================
AUTOMATED TOOL SUPPORT FOR ANSI/IEEE STD. 829-1983
SOFTWARE TEST DOCUMENTATION
by Harry M. Sneed, Germany
(Part 3 of 3)
(Editor's note: This is the final installment in a series of
articles appearing in the TTN/Online Edition.)
Introduction
The ANSI/IEEE Standard for Software Test Documentation calls for the
production of a series of documents which verify that the testing pro-
cess has been carried out properly and that the test objectives have
been met. Without automated tool support the costs of such test documen-
tation are prohibitive in all but the most trivial projects.
This paper describes a test system which provides such a service. It
begins with a test plan frame as a master class, from which the class
test design is then derived. From it various test procedure classes are
generated which serve to generate the individual objects - test cases
specified in the form of pre- and post-condition assertions to be
executed in test suites.
ANSI/IEEE Standard 829 "Software Test Documentation" calls for the pro-
duction of a set of documents to ensure the quality of software testing
(1). The ISO-9000 Standard refers to the ANSI Standards as a basis for
test documentation (2). Any organization seeking certification by the
American Software Engineering Institute (S.E.I.) must provide a minimum
subset of the documents specified (3). It has now become obvious that
the test documents required by Standard 829 will be a prerequisite to
almost any certification process for software producers.
Automated Test Documentation
The second way to reduce the effort involved in producing the required
test documents is through automation (6). Automation is appropriate for
the following document types:
o Test Case Specification,
o Test Procedure Specification,
o Test Item Transmittal Report, and
o Test Log.
Just as there are many ways to specify programs, there are also many
ways to specify test cases. Since the two tasks - program specification
and test specification - are so closely related, the best way is to link
the two. The test case specification should be automatically generated
out of the program specification (7).
In the case of the SOFSPEC system referred to here, programs are speci-
fied by the Jackson method of creating a structured control tree with
control and elementary nodes (8). The control nodes are of the type
o sequence
o selection and
o repetition.
The selection node has been enhanced in SOFSPEC to include
o case,
o chain and
o alternate
selections, while the repetition node has been extended to include
o WHILE loops,
o UNTIL loops and
o EXIT loops
thus covering all of the basic structures of structured program design.
The same technique applies also to the action diagrams in the Informa-
tion Engineering Method of James Martin (9).
In generating test cases, each elementary node (i.e., each sequence of
elementary nodes) is traced back to the top of the tree, picking up all
of the conditions along the path.
ENTRY-NODE
A-NODE IF (TRAN = 'UPDATE')
A1-NODE WHILE (I < N)
A112-NODE WHEN (ICASE = 2)
COMPUTE RATE = PART/SUM
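The tracing step above can be sketched in a few lines of Python. This is
an illustrative reconstruction only: the `Node` class, its fields, and
the `path_conditions` walk are assumptions, not SOFSPEC's actual data
structures, which the article does not publish.

```python
# Hypothetical sketch: each node keeps a reference to its parent and (for
# control nodes) a guarding condition, so the conditions along the path
# to an elementary operation can be collected by walking back to the root.

class Node:
    def __init__(self, name, condition=None, parent=None):
        self.name = name
        self.condition = condition      # e.g. "TRAN = 'UPDATE'" for an IF node
        self.parent = parent

def path_conditions(node):
    """Trace back to the top of the tree, picking up all conditions."""
    conditions = []
    while node is not None:
        if node.condition is not None:
            conditions.append(node.condition)
        node = node.parent
    return list(reversed(conditions))   # root-to-leaf order

# Rebuild the example tree from the text:
entry = Node("ENTRY-NODE")
a     = Node("A-NODE",    condition="TRAN = 'UPDATE'", parent=entry)
a1    = Node("A1-NODE",   condition="I < N",           parent=a)
a112  = Node("A112-NODE", condition="ICASE = 2",       parent=a1)
op    = Node("COMPUTE RATE = PART/SUM",                parent=a112)

print(path_conditions(op))
# -> ["TRAN = 'UPDATE'", 'I < N', 'ICASE = 2']
```

The list of collected conditions is exactly the path expression that the
next step inverts into a test case specification.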
The path expression derived from the control trees (JSP or action
diagram) for each elementary operation is then converted by inversion to
a test case specification expressed with an Assertion Language (10).
TESTCASE: COMPUTE RATE;
ASSERT PRE SUM IS RANGE (1 TO 100);
ASSERT PRE PART IS < SUM;
ASSERT POST RATE IS PART/SUM;
ASSERT PRE TRAN = 'UPDATE';
ASSERT PRE X > Y IF (TRAN = 'UPDATE');
ASSERT PRE I < N IF (X > Y);
ASSERT PRE ICASE = 2 IF (I < N);
END-TESTCASE;
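A test case in this form is directly executable: check the pre-condition
assertions on the inputs, run the elementary operation, then check the
post-condition assertions on the outputs. The following minimal Python
sketch shows the idea for the COMPUTE RATE case above; the execution
model and function names are assumptions for illustration, not
SOFSPEC's actual assertion processor.

```python
# Hypothetical execution of the COMPUTE RATE test case: pre-conditions
# are checked before the operation, post-conditions after it.

def run_testcase(sum_, part):
    # ASSERT PRE SUM IS RANGE (1 TO 100)
    assert 1 <= sum_ <= 100, "pre-condition violated: SUM out of range"
    # ASSERT PRE PART IS < SUM
    assert part < sum_, "pre-condition violated: PART >= SUM"

    rate = part / sum_              # the elementary operation under test

    # ASSERT POST RATE IS PART/SUM
    assert rate == part / sum_, "post-condition violated: RATE != PART/SUM"
    return rate

print(run_testcase(50, 25))   # -> 0.5
```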
The sum of all the test cases derived from the program specification is
then fed to a post processor which prepares the test case specification
giving the function description of the elementary operation taken from
the function repository, the inputs taken from the pre-conditions, the
outputs taken from the post-conditions and the path expression consist-
ing of the conditions leading to the function.
The test procedure is produced by merging the test path expressions with
the program design. The test path expressions give the required inputs
and the expected outputs for each elementary function to be tested. The
program design gives the location of the functions in the modules or
procedures to be tested and the location of the data files, databases
and panels to be generated or validated. This is a very important step
in allocating logical test cases to physical test objects.
The result of this merger is not only a test procedure specification but
also a test frame consisting of drivers and stubs, and test files con-
sisting of before and after images. These are generated on the PC
workstation and then transmitted to the test machine for execution.
The items to be tested - the modules, procedures, files, panels, etc. -
are stored either in a library or a repository. From there they are
extracted and transmitted to the test machine. Following the extraction,
but before copying, the list of items is submitted to a report generator
which prepares the test item transmittal report - a list of all objects
in the target system as well as all of the test support items - test
procedures, test files, drivers, stubs, etc.
The final automated report is prepared at test execution time by the
dynamic analyzer in the testbed (11).
The paths actually taken through the program from entry to exit are
recorded in terms of the branches traversed. In addition, the test cov-
erage is measured in terms of the ratio of branches traversed. The
expected outputs are compared with the actual outputs and the deviations
recorded. Finally any interruption is trapped and the path up to that
point reported. Thus, the test log is a complex report of
o Program Interruptions,
o Assertion Violations,
o Execution Path Traces and
o Test Coverage Measurement
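The path-trace and coverage bookkeeping described above can be sketched
as follows. This is a hedged illustration under stated assumptions: the
`TestLog` class, its `probe` hook, and the branch names are invented for
the example; the article does not describe the dynamic analyzer's actual
interfaces.

```python
# Hypothetical sketch of dynamic-analysis bookkeeping: instrumentation
# probes record each branch as it is traversed, and coverage is the
# ratio of distinct branches traversed to branches in the program.

class TestLog:
    def __init__(self, all_branches):
        self.all_branches = set(all_branches)
        self.trace = []                 # execution path, in traversal order

    def probe(self, branch):
        """Called by the instrumented program each time a branch is taken."""
        self.trace.append(branch)

    def coverage(self):
        """Ratio of branches traversed to total branches."""
        return len(set(self.trace) & self.all_branches) / len(self.all_branches)

log = TestLog(all_branches=["B1", "B2", "B3", "B4"])
for branch in ["B1", "B2", "B2", "B4"]:   # path actually taken
    log.probe(branch)

print(log.trace)        # execution path trace for the test log
print(log.coverage())   # -> 0.75
```

As the text notes, the point is that both reports fall out of the same
instrumentation as a byproduct of running the tests.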
The good part about it is that all of this information is collected
automatically as a byproduct of the test execution process by means of
program instrumentation and result comparison.
The automation of the four documents outlined above is absolutely
necessary to capture the mass of detailed information required to make
the documentation meaningful. In light of the fact that a typical com-
mercial application requires several thousand test cases to test, there
is no alternative to automation (12).
REFERENCES
(1) ANSI/IEEE Std. 829-1983: Software Test Documentation, IEEE
Press, Institute of Electrical and Electronics Engineers,
Inc., New York, 1987
(2) ISO Std. 9001 (ES 20991): Quality Assurance Systems, Model
for the Implementation of Quality Assurance in design,
construction and installation, International Standards
Organization, Brussels, 1990
(3) S.E.I.: Certification Program, Software Engineering
Institute, Pittsburgh, PA, 1990
(4) Shu, Nan C.: "FORMAL-A - a Forms-Oriented, Visual-Directed
Development System", IEEE Computer, August, 1985
(5) Bassett, P.:"Frame-Based Software Engineering", IEEE
Software, July, 1987
(6) Posten, R./Sexton, M.: "Evaluating and Selecting Testing
Tools" IEEE Software, May 1992
(7) McMullin, P./Gannon, J.: "Combining Testing with Formal
Specifications - A Case Study", IEEE Trans. on S.E., Vol. 9,
No. 3, May, 1983
(8) Jackson, M.A.: Principles of Program Design, Academic Press,
London, 1975
(9) Martin, J./McClure, C.: Structured Techniques- The Basis of
CASE, Prentice Hall, Englewood Cliffs, N.J. 1988
(10) Taylor, R.N.: "Assertion in Programming Languages", Boeing
Computer Services, Seattle, Wash. 1979
(11) Taylor, R.N./Levine, D.L./Kelly, C.D.: "Structural Testing of
Concurrent Programs", IEEE Trans. on S.E., Vol. 18, No. 3,
March 1992
(12) Basili, V./Selby, R.: "Comparing the Effectiveness of
Software Testing Strategies", IEEE Trans. on S.E., Vol. 13,
No. 12, Dec. 1987
Summary
The techniques of reuse and automation in producing test
documents have been built into a test system under development
by Software Engineering Services in Budapest for testing AS/400
COBOL programs. The system itself is PC-based using MS-Windows
as a user interface and Windows-Word as an editor. Besides
storing and manipulating documentation frames, the system has
its own program instrumentor, assertion processor, file
generator, file comparator and file auditor. On the AS/400 there
is a dynamic analyzer to monitor the test execution and to
generate the test log. Experience with this system in an ongoing
project shows that this may be the only way to fulfill the
ANSI/IEEE standard for test documentation within the usual cost
constraints imposed by a commercial project.
========================================================================
SPECIAL ISSUE REVIEW:
OBJECT-ORIENTED SOFTWARE TESTING
Part 3 of 3
Note: These are reviews and commentary on a special section of the Com-
munications of the ACM devoted to Object-Oriented Software Testing (C.
ACM, Vol. 37, No. 9, September 1994, p. 30ff).
The September Edition of the ACM magazine, COMMUNICATIONS OF THE ACM,
was devoted to Object-Oriented Software Testing. The six articles were:
"Object Oriented Integration Testing" by Paul C. Jorgensen and Carl
Erickson; "Experiences with Cluster and Class Testing" by Gail C. Mur-
phy, Paul Townsend, and Pok Sze Wong; "Automated Testing from Object
Models" by Robert M. Poston; "Integrating Object-Oriented Testing and
Development Processes" by John D. McGregor and Timothy D. Korson; "Test-
ing `In A Perfect World'" by Thomas R. Arnold and William A. Fuson; and
"Design for Testability in Object-Oriented Systems" by Robert V.
Binder.
o o o o o o o
"Testing `In A Perfect World',"
by Thomas R. Arnold and William A. Fuson
(C. ACM, Vol. 37, No. 9, September 1994, p. 78ff).
``In the `real world' of large projects, involving legacy systems, non-
OO interfaces, noninfinite resources, non-infinitely applicable tools,
competing and clashing features, and noninfinite delivery windows,
testing is thrust back into the tradeoff-world...'' (p. 78) seems to
summarize the problem pretty well.
But the authors of this paper explain how, in spite of all the real-
world difficulties, their OO-based software (built in C++) actually
allowed them to apply the inherent advantages of OO to simplify and
standardize an already-built, ongoing test-suite system. Carefully
thought-out changes, based on small experiments to determine their
validity, and implemented by a highly experienced test team for whom
the budget, while certainly an issue, was clearly substantial, yielded
good results.
Hats off for persistence! It's good to know that someone, somehow, wins
the battle and is happy about it. Now, whether their success had to do
with their essential strong character or with OO technology -- that's
for the reader to guess.
"Design for Testability in Object-Oriented Systems,"
by Robert V. Binder
(C. ACM, Vol. 37, No. 9, September 1994, p. 87ff).
``Testability is the relative ease and expense of revealing software
defects...'', and famed OO writer/speaker Binder focuses his process-
oriented paper on reliability-driven vs. resource-limited processes,
two ends of the spectrum of processes he sees as competing for
adoption.
Fishbone charts are used to illustrate the main points and the wise
technologist would do well to ponder these relationships before deciding
on a new or changed internal process.
Binder's analysis is comprehensive -- all of the right topics and
options are there, too many perhaps to make the choices simple. This is
a fine survey, a good compendium, and a very good "motherhood and apple
pie" piece. While it is short on definitive conclusions and simple
recommendations it is long on references and pretty much covers all of
the available options.
End Part 3
========================================================================
---------------------->>> CALENDAR OF EVENTS <<<----------------------
========================================================================
The following is a partial list of upcoming events of interest. ("o"
indicates Software Research will participate in these events.)
+ May 22 - 24:
2nd Int'l Workshop on Automated and Algorithmic Debugging
(AADEBUG '95)
St Malo, France
Contact: Mireille Ducasse
fax: 33-99-28-64-58
email: ducasse@irisa.fr
+ May 22 - 25: Software Engineering Process Group Conference
Boston, MA
contact: Rhonda Green
tel: 412-268-6467
fax: 412-268-5758
email: rrg@sei.cmu.edu
o May 23 - 25: SunWorld '95
Moscone Center
San Francisco
Show Manager: Ms. Jacqueline E. Murphy
contact: Mr. Chip Zaborowski
National Account Manager
or: Mr. William Bernardi
Exhibit Services Coordinator
or: SunWorld staff
All @ tel: 508-879-6700
fax: 508-872-8237
o May 30 - June 2: Eighth International Software Quality Week (QW95)
Sheraton Palace Hotel, San Francisco, CA, USA
Contact: Rita Bral
tel: [+1] (415) 550-3020
fax: [+1] (415) 550-3030
email: qw@soft.com
o June 12 - 15: USPDI Software Testing Conference
Crystal Gateway Marriott
Washington, D.C.
Contact: Genevieve (Ginger) Houston-Ludlam
tel: 301-445-4400
fax: 301-445-5722
========================================================================
------------>>> TTN SUBMITTAL POLICY <<<------------
========================================================================
The TTN On-Line Edition is forwarded on the 15th of each month to sub-
scribers via InterNet. To have your event listed in an upcoming issue,
please e-mail a description of your event or Call for Papers or Partici-
pation to "ttn@soft.com". The TTN On-Line submittal policy is as fol-
lows:
o Submission deadlines indicated in "Calls for Papers" should provide
at least a 1-month lead time from the TTN On-Line issue date. For
example, submission deadlines for "Calls for Papers" in the January
issue of TTN On-Line would be for February and beyond.
o Length of submitted items should not exceed 68 lines (one page).
o Publication of submitted items is determined by Software Research,
Inc., and may be edited as necessary.
========================================================================
----------------->>> TTN SUBSCRIPTION INFORMATION <<<-----------------
========================================================================
To request a FREE subscription or submit articles, please send E-mail to
"ttn@soft.com". For subscriptions, please use the keywords "Request-
TTN" or "subscribe" in the Subject line of your E-mail header. To have
your name added to the subscription list for the biannual hard-copy
version of the TTN -- which contains additional information beyond the
monthly electronic version -- include your name, company, and postal
address.
To cancel your subscription, include the phrase "unsubscribe" or
"UNrequest-TTN" in the Subject line.
Note: To order back copies of the TTN On-Line (August 1993 onward),
please specify the month and year when E-mailing requests to
"ttn@soft.com".
TESTING TECHNIQUES NEWSLETTER
Software Research, Inc.
901 Minnesota Street
San Francisco, CA 94107 USA
Phone: (415) 550-3020
Toll Free: (800) 942-SOFT
FAX: (415) 550-3030
E-mail: ttn@soft.com
## End ##