Automation Test Plan Template: QA and Testing Tutorial

Generic Project

Prepared for: Project Name
Prepared by: Company Name
Date: July 17, 2001

© 2001, COMPANY NAME. All rights reserved. This documentation is the confidential and proprietary intellectual property of COMPANY NAME. Any unauthorized use, reproduction, preparation of derivative works, performance, or display of this document, or of the software represented by this document, without the express written permission of COMPANY NAME is strictly prohibited. COMPANY NAME and the COMPANY NAME logo design are trademarks and/or service marks of an affiliate of COMPANY NAME. All other trademarks, service marks, and trade names are owned by their respective companies.

PROJECT NAME
Automated Testing Detail Test Plan

DOCUMENT REVISION INFORMATION
The following information is to be included with all versions of the document.

Project Name:
Project Number:
Prepared by:
Date Prepared:

Revised by:    Date Revised:    Revision Reason:    Revision Control No.:
Revised by:    Date Revised:    Revision Reason:    Revision Control No.:
Revised by:    Date Revised:    Revision Reason:    Revision Control No.:

DOCUMENT APPROVAL
This signature page indicates approval from the COMPANY NAME sponsor and the Client sponsor for the attached Automated Testing Detail Test Plan for the PROJECT NAME. All parties have reviewed the attached document and agree with its contents.

COMPANY NAME Project Manager: Name, Title: Project Manager, PROJECT NAME    Date
CUSTOMER Project Manager: Name, Title:    Date
COMPANY NAME/DEPARTMENT Sponsor: Name, Title:    Date
COMPANY NAME Sponsor: Name, Title:    Date
CUSTOMER NAME Sponsor: Name, Title:    Date
COMPANY NAME Manager: Name, Title:    Date

Table of Contents
1 Introduction
1.1 Automated Testing DTP Overview
2 Test Description
2.1 Test Identification
2.2 Test Purpose and Objectives
2.3 Assumptions, Constraints, and Exclusions
2.4 Entry Criteria
2.5 Exit Criteria
2.6 Pass/Fail Criteria
3 Test Scope
3.1 Items to be tested by Automation
3.2 Items not to be tested by Automation
4 Test Approach
4.1 Description of Approach
5 Test Definition
5.1 Test Functionality Definition (Requirements Testing)
5.2 Test Case Definition (Test Design)
5.3 Test Data Requirements
5.4 Automation Recording Standards
5.5 Loadrunner Menu Settings
5.6 Loadrunner Script Naming Conventions
5.7 Loadrunner GUIMAP Naming Conventions
5.8 Loadrunner Result Naming Conventions
5.9 Loadrunner Report Naming Conventions
5.10 Loadrunner Script, Result and Report Repository
6 Test Preparation Specifications
6.1 Test Environment
6.2 Test Team Roles and Responsibilities
6.3 Test Team Training Requirements
6.4 Automation Test Preparation
7 Test Issues and Risks
7.1 Issues
7.2 Risks
8 Appendices
8.1 Traceability Matrix
8.2 Definitions for Use in Testing
8.2.1 Test Requirement
8.2.2 Test Case
8.2.3 Test Procedure
8.3 Automated Test Cases
8.3.1 NAME OF FUNCTION Test Case
9 Project Glossary
9.1 Glossary Reference
9.2 Sample Addresses for Testing
9.3 Test Equipment Example Credit Card Numbers

1 Introduction

1.1 Automated Testing DTP Overview
This Automated Testing Detail Test Plan (ADTP) identifies the specific tests to be performed to ensure the quality of the delivered product. System/Integration Test ensures that the product functions as designed and that all parts work together. This ADTP covers Automated testing during the System/Integration Phase of the project and maps to the specification or requirements documentation for the project.
This mapping is done in conjunction with the Traceability Matrix document, which should be completed along with the ADTP and is referenced in this document. This ADTP refers to the specific portion of the product known as PRODUCT NAME. It provides clear entry and exit criteria, and it identifies the roles and responsibilities of the Automated Test Team so that the team can execute the test.

The objectives of this ADTP are to:
1. Describe the tests to be executed.
2. Identify and assign a unique number for each specific test.
3. Describe the scope of the testing, listing what is and is not to be tested.
4. Describe the test approach, detailing methods, techniques, and tools.
5. Outline the test design, including functionality to be tested, test case definition, and test data requirements.
6. Identify all specifications for preparation.
7. Identify issues and risks.
8. Identify actual test cases.
9. Document the design point or requirement tested for each test case as it is developed.

2 Test Description

2.1 Test Identification
This ADTP is intended to provide information for System/Integration Testing for the PRODUCT NAME module of the PROJECT NAME. The test effort may be referred to by its PROJECT REQUEST (PR) number and its project title for tracking and monitoring of the testing progress.

2.2 Test Purpose and Objectives
Automated testing during the System/Integration Phase, as referenced in this document, is intended to ensure that the product functions as designed directly from customer requirements. The testing goal is to assess the quality of the structure, content, accuracy and consistency, some response times and latency, and performance of the application as defined in the project documentation.

2.3 Assumptions, Constraints, and Exclusions
Factors which may affect the automated testing effort, and may increase the risk associated with the success of the test, include:
1. Completion of development of front-end processes
2. Completion of design and construction of new processes
3. Completion of modifications to the local database
4. Movement or implementation of the solution to the appropriate testing or production environment
5. Stability of the testing or production environment
6. Load discipline
7. Maintaining recording standards and automated processes for the project
8. Completion of manual testing through all applicable paths to ensure that reusable automated scripts are valid

2.4 Entry Criteria
The ADTP is complete, excluding actual test results. The ADTP has been signed off by appropriate sponsor representatives, indicating consent to the plan for testing. The Problem Tracking and Reporting tool is ready for use. The Change Management and Configuration Management rules are in place. The environment for testing, including databases, application programs, and connectivity, has been defined, constructed, and verified.

2.5 Exit Criteria
In establishing the exit/acceptance criteria for Automated Testing during the System/Integration Phase of the test, the Project Completion Criteria defined in the Project Definition Document (PDD) should provide a starting point. All automated test cases have been executed as documented, and the percentage of successfully executed test cases meets the defined criteria. Recommended criteria: no Critical or High severity problem logs remain open, all Medium problem logs have agreed-upon action plans, and the application executes successfully to validate accuracy of data, interfaces, and connectivity.
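Where the exit decision is checked mechanically, it reduces to a simple evaluation against the criteria above. The Python sketch below is illustrative only; the pass-rate threshold and the record layouts are assumptions made for the example, not values defined by this ADTP.

# Illustrative sketch of the recommended exit check in section 2.5.
# The pass-rate threshold and record layouts are assumptions for this
# example, not values defined by the ADTP.

PASS_RATE_THRESHOLD = 0.95  # placeholder for the project's defined criteria

def exit_criteria_met(test_cases, problem_logs):
    """test_cases: dicts with 'executed' and 'passed' booleans.
    problem_logs: dicts with 'severity' (1=Critical .. 4=Low) and 'status'."""
    if not test_cases:
        return False
    all_executed = all(tc["executed"] for tc in test_cases)
    pass_rate = sum(1 for tc in test_cases if tc["passed"]) / len(test_cases)
    # Recommended criterion: no Critical (1) or High (2) problem logs remain open.
    no_open_blockers = not any(
        log["severity"] <= 2 and log["status"] == "open" for log in problem_logs
    )
    return all_executed and pass_rate >= PASS_RATE_THRESHOLD and no_open_blockers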
2.6 Pass/Fail Criteria
The results of each test must be compared to the pre-defined expected test results documented in the ADTP (and in the DTP where applicable). The actual results are logged in the Test Case detail within the Detail Test Plan only if those results differ from the expected results. If the actual results match the expected results, the Test Case can be marked as passed without logging the duplicated results.

A test case passes if it produces the expected results as documented in the ADTP or Detail Test Plan (manual test plan). A test case fails if the actual results produced by its execution do not match the expected results. The source of a failure may be the application under test, the test case, the expected results, or the data in the test environment. Test case failures must be logged regardless of the source of the failure. Any bugs or problems will be logged in the DEFECT TRACKING TOOL.

The responsible application resource corrects the problem and tests the repair. Once this is complete, the tester who generated the problem log is notified and the item is retested. If the retest is successful, the status is updated and the problem log is closed. If the retest is unsuccessful, or if another problem has been identified, the problem log status is updated, the problem description is updated with the new findings, and the log is returned to the responsible application personnel for correction and test.

Severity Codes are used to prioritize work in the test phase. They are assigned by the test group and are not modifiable by any other group. The standard Severity Codes used for identifying defects are:

Table 1: Severity Codes
1 - Critical: Automated tests cannot proceed further within the applicable test case (no workaround).
2 - High: The test case or procedure can be completed, but produces incorrect output when valid information is input.
3 - Medium: The test case or procedure can be completed and produces correct output when valid information is input, but produces incorrect output when invalid information is input. (For example, if the specifications do not allow special characters but the system lets a user continue after entering one, this is a Medium severity.)
4 - Low: All test cases and procedures passed as written, but minor revisions or cosmetic changes could be made. These defects do not impact the functional execution of the system.

The use of the standard Severity Codes produces four major benefits:
1. Standard Severity Codes are objective and can be easily and accurately assigned by those executing the test, minimizing time spent discussing the appropriate priority of a problem.
2. Standard Severity Code definitions allow an independent assessment of the risk to the on-schedule delivery of a product that functions as documented in the requirements and design documents.
3. Standard Severity Codes help ensure consistency in the requirements, design, and test documentation, with an appropriate level of detail throughout.
4. Standard Severity Codes promote effective escalation procedures.
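Expressed in code, the comparison and logging rule above looks roughly like the following Python sketch. It is an illustration only; the record layout is assumed for the example, and real defects are still raised in the DEFECT TRACKING TOOL.

# Minimal sketch of the pass/fail rule in section 2.6: actual results are
# logged only when they differ from the expected results.  The record layout
# is assumed for illustration; real defects are raised in the DEFECT TRACKING
# TOOL and assigned a severity by the test group.

SEVERITY_NAMES = {1: "Critical", 2: "High", 3: "Medium", 4: "Low"}

def evaluate_test_case(case_id, expected, actual, severity_if_failed):
    if actual == expected:
        # Passed: no need to duplicate the expected results in the log.
        return {"case": case_id, "status": "pass"}
    return {
        "case": case_id,
        "status": "fail",
        "expected": expected,
        "actual": actual,  # logged because it differs from the expected results
        "severity": SEVERITY_NAMES[severity_if_failed],
    }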
3 Test Scope
The scope of testing identifies the items which will be tested and the items which will not be tested within the System/Integration Phase of testing.

3.1 Items to be tested by Automation
1. PRODUCT NAME
2. PRODUCT NAME
3. PRODUCT NAME
4. PRODUCT NAME
5. PRODUCT NAME

3.2 Items not to be tested by Automation
1. PRODUCT NAME
2. PRODUCT NAME

4 Test Approach

4.1 Description of Approach
The mission of Automated Testing is to identify recordable test cases through all appropriate paths of a website, create repeatable scripts, interpret test results, and report to project management. For the Generic Project, the automation test team will focus on positive testing and will complement the manual testing performed on the system. Automated test results will be generated, formatted into reports, and provided on a consistent basis to Generic project management.

System testing is the process of testing an integrated hardware and software system to verify that the system meets its specified requirements. It verifies proper execution of the entire set of application components, including interfaces to other applications. Project teams of developers and test analysts are responsible for ensuring that this level of testing is performed.

Integration testing is conducted to determine whether or not all components of the system are working together properly. This testing focuses on how well all parts of the web site hold together, whether the inside and outside of the website are working, and whether all parts of the website are connected. Project teams of developers and test analysts are responsible for ensuring that this level of testing is performed.

For this project, the System and Integration ADTP and Detail Test Plan complement each other. Since the goal of the System and Integration phase testing is to assess the quality of the structure, content, accuracy and consistency, response time and latency, and performance of the application, test cases are included which focus on determining how well this quality goal is accomplished.

Content testing focuses on whether the content of the pages matches what is supposed to be there, whether key phrases persist in changeable pages, and whether the pages maintain quality content from version to version.

Accuracy and consistency testing focuses on whether today's copies of the pages download the same as yesterday's, and whether the data presented to the user is sufficiently accurate.

Response time and latency testing focuses on whether the web site server responds to a browser request within certain performance parameters, whether response time after a SUBMIT is acceptable, and whether parts of the site are so slow that the user stops working. Although Loadrunner provides the full measure of this test, various ad hoc time measurements will be taken within certain Loadrunner scripts as needed.

Performance testing (Loadrunner) focuses on whether performance varies by time of day or by load and usage, and whether performance is adequate for the application.

Completion of automated test cases is denoted in the test cases with an indication of pass/fail and follow-up action.
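For the ad hoc time measurements mentioned above, the underlying idea is simply to time one request/response cycle and compare it to an agreed limit. The Python sketch below illustrates that idea outside of LoadRunner; the URL and the threshold are placeholder assumptions, not values defined by this plan.

# Illustrative only: an ad hoc response-time check of the kind described in
# section 4.1.  This is not the in-script LoadRunner measurement; the URL and
# threshold are placeholder assumptions.
import time
import urllib.request

THRESHOLD_SECONDS = 5.0        # hypothetical acceptable response time
URL = "http://example.com/"    # placeholder page under test

def timed_request(url=URL):
    """Return the elapsed seconds for one request and whether it is acceptable."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as response:
        response.read()
    elapsed = time.perf_counter() - start
    return elapsed, elapsed <= THRESHOLD_SECONDS

if __name__ == "__main__":
    seconds, acceptable = timed_request()
    print(f"response time {seconds:.2f}s, acceptable={acceptable}")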
5 Test Definition
This section addresses the development of the components required for the specific test. It identifies the functionality to be tested by automation and the associated automated test cases and scenarios. The development of the test components parallels, with a slight lag, the development of the associated product components.

5.1 Test Functionality Definition (Requirements Testing)
The functionality to be tested by automation is listed in the Traceability Matrix, attached as an appendix. For each function to undergo testing by automation, the Test Case is identified. Automated Test Cases are given unique identifiers to enable cross-referencing between related test documentation and to facilitate tracking and monitoring of test progress. As much information as is available is entered into the Traceability Matrix in order to complete the scope of automation during the System/Integration Phase of the test.

5.2 Test Case Definition (Test Design)
Each Automated Test Case is designed to validate the associated functionality of a stated requirement. Automated Test Cases include unambiguous input and output specifications. This information is documented within the Automated Test Cases in Appendix 8.3 of this document.

5.3 Test Data Requirements
The automated test data required for the test is described below. The test data will be used to populate the databases and/or files used by the application/system during the System/Integration Phase of the test.

5.4 Automation Recording Standards
Initial automation testing rules for the Generic Project:
1. Ability to move through all paths within the applicable system.
2. Ability to identify and record the GUI Maps for all associated test items in each path.
3. Specific times for loading into the automation test environment.
4. Code frozen between loads into the automation test environment.
5. Minimum acceptable system stability.

5.5 Loadrunner Menu Settings
1. Default recording mode is CONTEXT SENSITIVE.
2. Record owner-drawn buttons as OBJECT.
3. Maximum length of list item to record is 253 characters.
4. Delay for Window Synchronization is 1000 milliseconds (unless Loadrunner is operating in the same environment, in which case the delay must be increased appropriately).
5. Timeout for checkpoints and Context Sensitive statements is 1000 milliseconds.
6. Timeout for Text Recognition is 500 milliseconds.
7. All scripts will stop and start on the main menu page.
8. All recorded scripts will remain short, because debugging is easier. However, entire scripts, or portions of scripts, can be chained together for long runs once the environment has greater stability.

5.6 Loadrunner Script Naming Conventions
1. All automated scripts will begin with the GE abbreviation, representing the Generic Project, and will be filed under the Loadrunner on LAB W Drive/Generic/Scripts folder.
2. GE will be followed by the Product Path name in lower case: air, htl, car.
3. After the automated scripts have been debugged, a date will be attached to the script name: 0710 for July 10. When significant improvements have been made to the same script, the date will be changed.
4. As incremental improvements are made to an automated script, version numbers will be attached to identify the script with the latest improvements, e.g. GEsea0710.1, GEsea0710.2, where the .2 version is the most up to date. (A naming helper sketch follows section 5.7.)

5.7 Loadrunner GUIMAP Naming Conventions
1. All Generic GUI Maps will begin with GE followed by the area of test, e.g. GEsea. The GEpond GUI Map represents all pond paths, the GEmemmainmenu GUI Map represents all membership and main menu concerns, and the GElogin GUI Map represents all GE login concerns.
2. Because there can be only one GUI Map for each object on the site, the GUI Maps are under constant revision when the site is undergoing frequent program loads.
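As noted in 5.6, the script names are regular enough to be generated rather than typed by hand. The following Python sketch illustrates that convention only; the set of recognized path abbreviations is taken from the examples above and would need to match the project's actual product paths.

# Illustrative helper that assembles a script name following the convention in
# section 5.6: GE + product path (lower case) + MMDD date + version suffix,
# e.g. GEsea0710.2.  Not part of the LoadRunner toolset; the recognized path
# abbreviations are assumed from the examples in 5.6 and 5.7.
from datetime import date

KNOWN_PATHS = {"air", "htl", "car", "sea"}

def script_name(path: str, script_date: date, version: int) -> str:
    path = path.lower()
    if path not in KNOWN_PATHS:
        raise ValueError(f"unknown product path: {path}")
    return f"GE{path}{script_date:%m%d}.{version}"

# Example: script_name("sea", date(2001, 7, 10), 2) returns "GEsea0710.2"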
5.8 Loadrunner Result Naming Conventions
1. When beginning a script, allow the default res# name to be filed.
2. After a successful run of a script where the results will be used toward a report, move the file to results and rename it: GE for the project name, res for Test Results, 0718 for the date the script was run, your initials, and the original default number for the script, e.g. GEres0718jr.1.

5.9 Loadrunner Report Naming Conventions
1. When the test result files for the day have been compiled and the statistics confirmed, a report will be filed that is accessible by upper management. The daily report file will be named as follows: GEdaily0718, where GE is the project name, daily indicates a daily report, and 0718 is the date the report was issued.
2. When the test result files for the week have been compiled and the statistics confirmed, a report will be filed that is accessible by upper management. The weekly report file will be named as follows: GEweek0718, where GE is the project name, week indicates a weekly report, and 0718 is the date the report was issued.

5.10 Loadrunner Script, Result and Report Repository
1. LAB 11, located within the GE Test Lab, will house the original Loadrunner Script, Result and Report Repository for automated testing within the Generic Project. WRITE access is granted to Loadrunner Technicians, and READ ONLY access is granted to those who are authorized to run scripts but not to make any improvements. This is meant to maintain the purity of each script version.
2. Loadrunner on LAB W Drive houses all Loadrunner-related documents for GE automated testing.
3. Proje