Objective 1: |
- Explain how and why the timing and level of involvement for the Test Analyst varies when working with different software development lifecycle models.
|
Objective 2: |
- Summarize the appropriate tasks for the Test Analyst when conducting analysis and design activities.
|
Objective 3: |
- Explain why test conditions should be understood by the stakeholders.
|
Objective 4: |
- For a given project scenario, select the appropriate design level for test cases (high-level or low-level).
|
Objective 5: |
- Explain the issues to be considered in test case design.
|
Objective 6: |
- Summarize the appropriate tasks for the Test Analyst when conducting test implementation activities.
|
Objective 7: |
- Summarize the appropriate tasks for the Test Analyst when conducting test execution activities.
|
Objective 8: |
- For a given situation, participate in risk identification, perform risk assessment and propose appropriate risk mitigation.
|
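For Objective 8, a minimal sketch of how identified product risks might be scored and ranked to steer test prioritization. The risk items, the 1-5 likelihood and impact scales, and the multiplicative scoring are illustrative assumptions, not values mandated by the syllabus.

```python
# Minimal sketch: product risk assessment as likelihood x impact.
# Risk items, scales (1-5) and mitigations below are illustrative assumptions.

risks = [
    # (risk description, likelihood 1-5, impact 1-5, proposed mitigation)
    ("Incorrect VAT calculation on invoices", 4, 5, "Boundary value tests on tax rules"),
    ("Screen reader cannot operate the checkout", 2, 4, "Accessibility-focused usability tests"),
    ("Report export fails for very large data sets", 3, 3, "Execute with production-sized data"),
]

def risk_level(likelihood: int, impact: int) -> int:
    """Simple multiplicative score used to rank test depth and ordering."""
    return likelihood * impact

# Rank risks so the highest-scored ones drive test prioritization.
for description, lik, imp, mitigation in sorted(
        risks, key=lambda r: risk_level(r[1], r[2]), reverse=True):
    print(f"score {risk_level(lik, imp):>2}: {description} -> {mitigation}")
```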
Objective 9: |
- Analyze a given specification item(s) and design test cases by applying equivalence partitioning.
|
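As an illustration of Objective 9, a minimal equivalence partitioning sketch. The specification item (an age field accepting 18 to 65 inclusive), the field name and the chosen representative values are assumptions made purely to show the technique.

```python
# Minimal sketch of equivalence partitioning for a hypothetical spec item:
# "The age field accepts whole numbers from 18 to 65 inclusive."

def accepts_age(age: int) -> bool:
    """Behaviour implied by the assumed rule: valid only between 18 and 65."""
    return 18 <= age <= 65

# One representative value per partition; every member of a partition is
# expected to be handled the same way, so one test per partition suffices.
test_cases = [
    ("invalid: below minimum", 17, False),
    ("valid: within range",    40, True),
    ("invalid: above maximum", 70, False),
]

for partition, value, expected in test_cases:
    assert accepts_age(value) == expected, partition
print("one test per partition passed")
```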
Objective 10: |
- Analyze a given specification item(s) and design test cases by applying boundary value analysis.
|
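For Objective 10, a companion sketch applying two-value boundary value analysis to the same assumed age rule (valid from 18 to 65 inclusive); the rule itself remains an illustrative assumption.

```python
# Minimal sketch of two-value boundary value analysis on the assumed rule:
# age is valid from 18 to 65 inclusive. Only the values at each boundary and
# immediately outside it are tested.

def accepts_age(age: int) -> bool:
    return 18 <= age <= 65

boundary_cases = [
    (17, False),  # just below the lower boundary
    (18, True),   # lower boundary itself
    (65, True),   # upper boundary itself
    (66, False),  # just above the upper boundary
]

for value, expected in boundary_cases:
    assert accepts_age(value) == expected, f"age={value}"
print("all boundary values behave as specified")
```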
Objective 11: |
- Analyze a given specification item(s) and design test cases by applying decision table testing.
|
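For Objective 11, a minimal decision table sketch. The business rule (members get a 10% discount, orders over 100 ship free) and the quote function are assumed solely to show how each rule of the full table becomes one test case.

```python
# Minimal sketch of decision table testing for an assumed business rule:
# "Members get 10% off; orders over 100 get free shipping." With 2 boolean
# conditions the full table has 4 rules, and each rule becomes a test case.

from itertools import product

def quote(is_member: bool, total: float) -> tuple:
    """Assumed implementation under test: returns (discount rate, free shipping)."""
    return (0.10 if is_member else 0.0, total > 100)

for is_member, big_order in product([True, False], repeat=2):
    total = 150.0 if big_order else 50.0
    discount, free_shipping = quote(is_member, total)
    assert discount == (0.10 if is_member else 0.0)
    assert free_shipping == big_order
    print(f"member={is_member}, total={total} -> "
          f"discount={discount}, free_shipping={free_shipping}")
```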
Objective 12: |
- Analyze a given specification item(s) and design test cases by applying state transition testing.
|
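For Objective 12, a minimal state transition testing sketch. The order workflow (new, paid, shipped, cancelled) and its events are assumptions; the tests cover every valid transition once and probe a few invalid state/event pairs.

```python
# Minimal sketch of state transition testing for an assumed order workflow:
# new --pay--> paid --ship--> shipped, with cancel allowed from new or paid.

from typing import Optional

VALID = {
    ("new", "pay"): "paid",
    ("new", "cancel"): "cancelled",
    ("paid", "ship"): "shipped",
    ("paid", "cancel"): "cancelled",
}

def next_state(state: str, event: str) -> Optional[str]:
    """Assumed implementation under test; None means the event is rejected."""
    return VALID.get((state, event))

# One test per valid transition (0-switch coverage).
for (state, event), expected in VALID.items():
    assert next_state(state, event) == expected

# A few invalid state/event pairs that the system must reject.
for state, event in [("shipped", "pay"), ("cancelled", "ship"), ("new", "ship")]:
    assert next_state(state, event) is None

print("every valid transition covered; invalid events rejected")
```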
Objective 13: |
- Explain how classification tree diagrams support test techniques.
|
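For Objective 13, a small sketch of how a classification tree can be represented and combined into test cases. The "export report" feature, its classifications and its classes are illustrative assumptions.

```python
# Minimal sketch: a classification tree for an assumed "export report" feature.
# Each top-level key is a classification, each list holds its classes (the
# tree's leaves); test cases combine one leaf per classification.

from itertools import product

classification_tree = {
    "format": ["CSV", "PDF", "XLSX"],
    "data size": ["empty", "typical", "very large"],
    "user role": ["viewer", "editor"],
}

# Full combination of the leaves gives the maximum set of test cases; in
# practice a combinatorial technique (e.g. pairwise) is applied on top of
# the tree to reduce this number.
all_cases = list(product(*classification_tree.values()))
print(f"{len(all_cases)} test cases from full combination, e.g. {all_cases[0]}")
```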
Objective 14: |
- Analyze a given specification item(s) and design test cases by applying pairwise testing.
|
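For Objective 14, a self-contained greedy all-pairs sketch over the same assumed parameters as the classification tree above. Real projects typically use a dedicated pairwise tool, so treat this only as an illustration of the coverage goal: every pair of values from different parameters appears in at least one test case.

```python
# Minimal greedy pairwise (all-pairs) sketch over assumed parameters. The loop
# repeatedly adds the candidate combination covering the most still-uncovered
# value pairs until no uncovered pair remains.

from itertools import combinations, product

parameters = {
    "format": ["CSV", "PDF", "XLSX"],
    "data size": ["empty", "typical", "very large"],
    "user role": ["viewer", "editor"],
}

names = list(parameters)
candidates = list(product(*parameters.values()))

def pairs_of(case):
    """All (parameter, value) pairs contained in one full combination."""
    tagged = list(zip(names, case))
    return set(combinations(tagged, 2))

uncovered = set().union(*(pairs_of(c) for c in candidates))
suite = []
while uncovered:
    best = max(candidates, key=lambda c: len(pairs_of(c) & uncovered))
    suite.append(best)
    uncovered -= pairs_of(best)

print(f"{len(suite)} pairwise tests instead of {len(candidates)} full combinations:")
for case in suite:
    print(dict(zip(names, case)))
```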
Objective 15: |
- Analyze a system, or its requirement specification, in order to determine likely types of defects to be found and select the appropriate black-box test technique(s).
|
Objective 16: |
- Explain the principles of experience-based test techniques, and the benefits and drawbacks compared to black-box and defect-based test techniques.
|
Objective 17: |
- Determine exploratory tests from a given scenario.
|
Objective 18: |
- Describe the application of defect-based test techniques and differentiate their use from black-box test techniques.
|
Objective 19: |
- For a given project situation, determine which black-box or experience-based test techniques should be applied to achieve specific goals.
|
Objective 20: |
- Explain what test techniques are appropriate to test functional completeness, correctness and appropriateness.
|
Objective 21: |
- Define the typical defects to be targeted for the functional completeness, correctness and appropriateness characteristics.
|
Objective 22: |
- Define when the functional completeness, correctness and appropriateness characteristics should be tested in the software development lifecycle.
|
Objective 23: |
- Explain the approaches that would be suitable to verify and validate both the implementation of the usability requirements and the fulfillment of the user's expectations.
|
Objective 24: |
- Explain the role of the Test Analyst in interoperability testing including identification of the defects to be targeted.
|
Objective 25: |
- Explain the role of the Test Analyst in portability testing including identification of the defects to be targeted.
|
Objective 26: |
- For a given set of requirements, determine the test conditions required to verify the functional and/or non-functional quality characteristics within the scope of the Test Analyst.
|
Objective 27: |
- Identify problems in a requirements specification according to checklist information provided in the syllabus.
|
Objective 28: |
- Identify problems in a user story according to checklist information provided in the syllabus.
|
Objective 29: |
- For a given scenario, determine the appropriate activities for a Test Analyst in a keyword-driven automation project.
|
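For Objective 29, a minimal keyword-driven sketch: the test case is a data table of keywords and arguments written by the Test Analyst, executed through a simple dispatcher standing in for the automation framework. The keyword names, implementation functions, URL and login scenario are all hypothetical.

```python
# Minimal keyword-driven testing sketch. The Test Analyst writes test steps as
# keywords plus data; keyword implementations (normally supplied by a test
# automation engineer or framework) sit behind a simple dispatcher.

def open_application(url):
    print(f"open application at {url}")

def enter_credentials(user, password):
    print(f"log in as {user}")

def verify_message(text):
    print(f"check that '{text}' is displayed")

# Mapping from keyword (as written in the test) to its implementation.
KEYWORDS = {
    "Open Application": open_application,
    "Enter Credentials": enter_credentials,
    "Verify Message": verify_message,
}

# The test case itself is pure data: a sequence of (keyword, arguments) rows,
# which keeps it readable and maintainable for a Test Analyst.
login_test = [
    ("Open Application", ["https://example.test/login"]),
    ("Enter Credentials", ["alice", "secret"]),
    ("Verify Message", ["Welcome, alice"]),
]

for keyword, args in login_test:
    KEYWORDS[keyword](*args)
```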
Objective 30: |
- Explain the usage and types of test tools applied in test design, test data preparation and test execution.
|
Official Information |
https://www.istqb.org/certification-path-root/advanced-level/advanced-level-test-analyst.html |