Triaged Tester

March 11, 2009

How to Implement Test Automation Framework Methodology

Filed under: Automation,Checklist,Strategies,Tips — Triaged Tester @ 6:10 am

1. Identification of the Scope of Testing: Company oriented, product oriented or project oriented.
2. Identification of the Needs of Testing: Identify the types of testing (e.g. FT, Web Services) and the applications / modules to be tested.
3. Identification of the Requirements of Testing: Find out the nature of the requirements, identify the type of actions for each requirement & identify the high-priority requirements.
4. Evaluation of the Test Automation Tool: Preparation of an evaluation checklist, identification of the candidate tools available, sample run, rating & selection of the tool, implementation & training.
5. Identification of the Actions to be Automated: Actions, validations & requirements supported by the tool.
6. Design of the Test Automation Framework: Framework guidelines, validations, actions involved, systems involved, tool extensibility support, custom messages & UML documentation.
7. Design of the Input Data Bank: Identification of the types of input file, categorization & design of file prototypes.
8. Development of the Automation Framework: Development of scripts based upon the framework design: driver scripts, worker scripts, record / playback, screen / window / transaction, action / keyword & data driven.
9. Population of the Input Data Bank: Different types of data input, population of data from different data sources, manual input of data and parent–child data hierarchies.
10. Configuration of the Schedulers: Identify scheduler requirements & configure the schedulers.
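The driver / worker split with keyword- and data-driven scripts in step 8 can be sketched as follows. This is a minimal illustration, not a prescribed design: the `run_suite` driver, the `ACTIONS` registry and the CSV data-bank layout are all assumptions.

```python
# Hypothetical sketch of step 8: a driver script that dispatches
# keyword-driven "worker" actions read from an input data bank.
# All names (run_suite, ACTIONS, the CSV layout) are illustrative.
import csv
import io

ACTIONS = {}

def action(name):
    """Register a worker function under a keyword."""
    def register(fn):
        ACTIONS[name] = fn
        return fn
    return register

@action("login")
def login(user, password):
    return f"logged in as {user}"

@action("search")
def search(term):
    return f"searched for {term}"

def run_suite(data_bank):
    """Driver script: read keyword rows and dispatch to worker scripts."""
    results = []
    for row in csv.reader(io.StringIO(data_bank)):
        keyword, args = row[0], row[1:]
        results.append(ACTIONS[keyword](*args))
    return results

data = "login,alice,secret\nsearch,widgets"
print(run_suite(data))  # → ['logged in as alice', 'searched for widgets']
```

Because the data bank (step 7) feeds the driver, new test flows can be added by editing data rather than code.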


February 4, 2009

Logging – Do’s & Don’ts

Filed under: Automation,Checklist,Guidelines — Triaged Tester @ 7:22 am
Tags: , ,

Logging is imperative for the successful release of any product. It is required to ensure that enough test cases pass and that the release criteria are met. Without proper logging, there is no way to determine whether a component meets the prescribed QA needs. The logging infrastructure has to be robust.

Error reporting is one of the crucial aspects of logging for test cases. Errors, warnings and informational messages should be context sensitive and able to point to the source of the problem in the test code. They should also be as unique as possible, so that a wildcard search on a result can identify the test suite and the test case the error message belongs to.
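One way to get such unique, searchable messages is to embed the suite and test-case identifiers in every message. The helper below is a sketch under that assumption; the `format_error` name and the bracketed format are illustrative, not a real logging API.

```python
# Illustrative sketch: composing unique, context-sensitive error
# messages so a wildcard search can trace a failure back to its
# suite and test case. The format itself is an assumption.
def format_error(suite, case, check, detail):
    """Embed suite/case identifiers so the message is searchable."""
    return f"[{suite}::{case}] {check} failed: {detail}"

msg = format_error("OrderSuite", "test_checkout", "total-price check",
                   "expected 42.00, got 0.00")
print(msg)
# A search for "OrderSuite::test_checkout" finds exactly this message.
```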




Don’ts

  • Use custom logging only in conjunction with the logging provided by the operations team.
  • Do not log multiple passes/failures per test case.
  • Do not instantiate your own version of the Logging object provided by the operations team.
  • Do not dismiss failures with “no repro on rerun” explanations.
  • Do not log a failure in place of a warning. A failing test case should report only one failure.




Do’s

  • Do use the logging infrastructure provided by the operations team.
  • Do log exactly one Pass or one Fail per test case.
  • Do ensure every test always logs a result per test case.
  • Do use multiple warnings instead of failures.
  • Do log as much information as needed for easier troubleshooting.
  • Do log the state of your objects for easier troubleshooting.
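The “one result per test case, warnings never count as failures” rules above can be enforced in a thin result-logging wrapper. This is a minimal sketch; the `TestCaseLog` class and its methods are illustrative, not the operations team’s actual API.

```python
# Minimal sketch of a result logger that enforces the rules above:
# exactly one Pass or Fail per test case, any number of warnings.
# Class and method names are illustrative, not a real logging API.
class TestCaseLog:
    def __init__(self, name):
        self.name = name
        self.result = None
        self.warnings = []

    def warn(self, message):
        # Warnings accumulate but never change the single result.
        self.warnings.append(message)

    def set_result(self, passed):
        # Guard against logging multiple passes/failures per test case.
        if self.result is not None:
            raise RuntimeError(f"{self.name}: result already logged")
        self.result = "Pass" if passed else "Fail"

log = TestCaseLog("test_login")
log.warn("retried once before connecting")
log.set_result(True)
print(log.result, len(log.warnings))  # → Pass 1
```

A second call to `set_result` raises, which surfaces the “multiple passes/failures” mistake at the point it is made rather than in the reports.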

February 3, 2009

UI Automation – Do’s & Don’ts

Filed under: Automation,Checklist,Guidelines,Tips — Triaged Tester @ 7:19 am


Do’s

  • Explicitly set focus to the window in which the test case is expected to input data.
  • Capture screenshots on failure; they help in debugging.
  • Point to the dev resource files for localized strings in a UI app. This helps a lot in pseudo-localization testing.
  • Leave the UI in a known state after a test case finishes executing. E.g., even if the test case only exercises the first screen of a wizard, close the wizard by canceling out of it after that screen.
  • Validate the object created through the UI against the one stored in the backend system.
  • Log every valid UI operation performed in the test case.
  • Use resource files for all data entered in the UI.
  • Localize the usage of the UI automation tool as much as possible to avoid deep dependencies on one tool. E.g., encapsulate the tool inside a small framework and write test cases against that framework rather than against the tool directly. This decouples the tool from the test cases to a large extent: if the team decides to adopt a new tool, only the framework needs to be rewritten and the bulk of the test code remains unchanged.



Don’ts

  • Do not log multiple times.
  • Do not sleep after every UI operation.
  • Do not give unusually long timeouts to every major operation.
  • Do not continue past an unexpected screen; fail immediately.
  • Do not re-launch the UI application for every test case, as it is a time-consuming process.
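The alternative to sleeping after every operation or padding everything with long timeouts is to poll for the expected condition with a short interval and a sensible cap. The `wait_until` helper below is a sketch of that pattern; its name and default parameters are assumptions.

```python
# Sketch of polling instead of fixed sleeps: return as soon as the
# expected UI condition holds, give up only at the timeout cap.
# wait_until and its parameter defaults are illustrative.
import time

def wait_until(condition, timeout=5.0, interval=0.1):
    """Return True as soon as condition() holds, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval)
    return False

state = {"ready": False}
state["ready"] = True  # e.g. the dialog under test has appeared

print(wait_until(lambda: state["ready"], timeout=1.0))  # → True
```

A passing test pays only the actual render time instead of the worst-case sleep, and a failing one still stops at the cap.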
