Just Enough Software Test Automation
Previous price: €50.49
Special order - availability uncertain
Description
For students in any course on software quality, software testing, software test automation, or software project management.
Automated testing is a crucial element of any strategy for improving software quality and reducing time-to-market. Just Enough Software Test Automation is a practical, hands-on guide to software test automation from the perspective of test developers and users. It offers real-world do's and don'ts for designing and implementing test automation infrastructure, combined with pragmatic advice on what today's most popular approaches to automated testing can and cannot accomplish.
Table of Contents
Preface.
What Is Just Enough Test Automation? No New Models, Please! A Life Cycle Is Not a Process. A Tool Is Not a Process. How Much Automation Is Enough? Testing Process Spheres. Test Planning. Test Design. Test Implementation. Support Activities. Testing Is a Team Effort. A Test Automation Group's Scope and Objectives. The Scope. Assumptions, Constraints, and Critical Success Factors for an Automated Testing Framework. Test Automation Framework Deliverables. An Automation Plan. Categories of Testing Tools. Conclusion. References.
Knowing When and What to Automate. In General. When to Automate System Tests. Time to Automate Is Always the Number One Factor. An Extreme Example. A Quantitative Example. What to Automate. A Word About Creating Test Scripts. Conclusion. References.
Start at the Beginning: Define the Test Requirements, Design the Test Data. Software/Test Requirements. Requirements Gathering and Test Planning Automation. From Software Requirement to Test Requirement to Test Condition: An Automated Approach. Requirements Management and Traceability. Functional Test Data Design. Black Box (Requirements-Based) Approaches. Gray Box (Both Requirements- and Code-Based) Approaches. White Box (Code-Based) Approaches. Requirements-Based Approaches. Requirements-Driven Cause-Effect Testing. Equivalence Partitioning, Boundary Analysis, and Error Guessing. Defining Boundary Conditions for Equivalence Classes. Error Guessing. Hybrid (Gray Box) Approaches. Decision Logic Tables. DLT as a Software Testing Tool. An Automated DLT Design Tool. Code-Based Approaches. Basis Testing Overview. The Basis Testing Technique. Conclusion. References.
A Look at the Development of Automated Test Scripts and at the Levels of Test Automation. Developing Automated Test Scripts. Unit-Level Tests. System-Level Tests. Specialized System-Level Tests. Recording versus Programming Test Scripts. Conclusion. References.
Automated Unit Testing. Introduction. Unit Testing Justification. The Unit Testing Process. A Rigorous Approach to Unit Testing. The Unit Test Specification. Unit Testing Tasks. Implementing the Unit Testing Tasks. Rules of Thumb for Unit Testing. Unit Testing Data. A Unit Testing Framework for Object-Oriented Development Using Java. Conclusion. References.
Automated Integration Testing. Introduction. What Is Integration Testing? The Daily Build Smoke Test. Build Smoke Test Objectives. Automated Build Smoke Test Checklist. Conclusion. References.
Automated System/Regression Testing Frameworks. The Data-Driven Approach. Framework-Driven (Structured) Test Scripts. Developing Framework-Driven Test Scripts. The Archer Group Framework. Business Rules Test. GUI Test. Properties Test. Input Data Test. Formatting the Test Data File. Application-Level Errors. Building the External Data Input File. Data File Summary. Code Construction for the Business Rules Test. The Shell Script. The Main Script. After the Data Are Read. Keep Your Code Clean and Robust. Archer Group Summary. Carl Nagle's DDE Framework. DDE Overview. DDE Development Effort. Keith Zambelich's Test Plan Driven Testing Framework for Mercury Interactive Users. Zambelich Approach Summary. "Test Plan Driven" Method Architecture. Using TestDirector to Drive the Test Suite. Conclusion. References.
The Control Synchronized Data-Driven Testing Framework in Depth. Creating Data-Driven Test Scripts. Implementing the CSDDT Approach. Common Problems and Solutions. Problem: Data Input. Solution: Utilize Input Data Text Files. Problem: Program Flow Changes. Solution: Let the Input Data Do the Driving. Problem: Managing Application Changes. Solution: Rerecord or Modify a Very Small Section of Code. Setting Up Common Startup and Ending Test Conditions. Modifying Recorded Code to Accept Input Data. Very Important Practices. Creating Functions for Common Operations: Isolating Command Objects. Continuing with the Program Flow. Using Multiple Input Records to Create a Test Scenario. Utilizing Dynamic Data Input: Keyword Substitution. Using Library or Include Files (*.sbh and *.sbl Files in Rational Robot). Utility Scripts. Debugging Your Scripts: When the Test(s) Uncover a Defect. Implementing the CSDDT Template Scripts. The DDScripts. SQABasic32 Include Files. Utility Scripts. An Example of the CSDDT Framework. Script File List. Library File List. Directions for Installing the Example Files. Conclusion. References.
Facilitating the Manual Testing Process with Automated Tools. Introduction. Semiautomated Manual Testing Process Steps. Step 1: Identify and Document the Test Objectives. Step 2: Translate the Test Objectives into Specific Test Requirements. Step 3: Translate the Test Requirements into Test Conditions. Step 4: Construct the Test Data. Step 5: Execute the Manual Tests. Using the List Box. Manual Testing Artifacts. Conclusion. References.
Managing Automated Tests. Writing Effective Test Scripts and Test Data. Managing Manual and Automated Test Scripts. Test Suite Maintenance. Conclusion. References.
Data-Driven Automation: User Group Discussion.
Automated Testing Terms and Definitions.
Example Test Automation Project Plan Using Rational Suite TestStudio. Introduction. Documentation References. Internal Sources. External Sources. Automation Implementation. Test Management. Test Design Phase. Test Implementation Phase. Test Execution Phase. Automation Environment. Test Development Workstations. Test Data Store Server. Test Execution Workstations. Test Application Servers. Organizational Structure. External Interfaces. Roles and Responsibilities. Roles. Responsibilities. Project Estimates. Test Automation Project Work Plan Template. Work Breakdown Structure. Project Start-Up Phase. Timeline. Implementation Iteration Objectives. Project Schedule. Project Resources. Budget. Project Monitoring and Control. Automation Effort Estimation. Schedule Control Plan. Budget Control Plan. Reporting Plan. Measurement Plan. Supporting Processes. Configuration Management Plan. Defect Tracking and Problem Resolution. Framework Evaluation. Framework Documentation Plan. Process Improvement.
Index.
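The data-driven approach that runs through the contents above (input data files drive a generic test script, rather than each test being hard-coded) can be illustrated with a minimal sketch. This is not code from the book, which builds its frameworks on SQABasic and Rational Robot rather than plain Java; the record format, the toDollars function, and the expected values here are all invented for illustration.

```java
import java.util.List;

// Minimal sketch of data-driven testing: inputs and expected outputs
// live in data records, and one generic driver loops over them.
public class DataDrivenSketch {

    // Hypothetical function under test: format a cent amount as dollars.
    static String toDollars(int cents) {
        return String.format("$%d.%02d", cents / 100, cents % 100);
    }

    public static void main(String[] args) {
        // In a data-driven framework these records would be read from an
        // external data file, so new cases need no new script code.
        List<String[]> records = List.of(
            new String[] {"0",    "$0.00"},
            new String[] {"5",    "$0.05"},
            new String[] {"1234", "$12.34"}
        );
        int failures = 0;
        for (String[] r : records) {
            String actual = toDollars(Integer.parseInt(r[0]));
            if (!actual.equals(r[1])) {
                System.out.println("FAIL: " + r[0] + " -> " + actual
                        + " (expected " + r[1] + ")");
                failures++;
            }
        }
        System.out.println(failures == 0 ? "ALL PASS" : failures + " FAILURES");
    }
}
```

The point of the pattern is that the driver is written once; extending coverage means adding data records, not recording or programming more scripts.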
About the Authors
DANIEL J. MOSLEY is founder and principal of Client-Server Software Testing Technologies and author of The Handbook of MIS Application Software Testing and Client-Server Software Testing on the Desktop and Web (Prentice Hall PTR). A Certified Software Test Engineer (CSTE), Mosley served as senior consultant and seminar leader for the Quality Assurance Institute and authored the TEST-RxTM Methodology. BRUCE A. POSEY specializes in developing and implementing data-driven, framework-based test scripts utilizing SQA Suite/Rational Team Test. He has nearly 30 years' IT experience in diverse roles at MasterCard, Deutsche Financial Services, SBC, and other leading firms. He is owner and principal consultant of The Archer Group, which specializes in software testing and training.
Subtitle: 'Just Enough (Yourdon Press)'. Language: English.
Publisher: PRENTICE HALL
Publication date: July 2002
Page count: 281 pages