Since Wolfgang introduced the topic of functional and EU-specific ‘testing the ODF translator tool’ in his blog last week, I would like to talk a bit more about the functional, setup, and system testing areas that Aztecsoft is involved with.

High level picture: We started testing the translator tool prototype back in June 2006, developing a test plan to organize our testing efforts and creating test scenarios on an ongoing basis to cover the functional aspects of testing. As the translator tool's feature set grew, we also began creating and executing performance scenarios. With different flavors of the translator being made available for Word 2003 and Word XP, we also created a compatibility matrix, and we have been testing the different translator flavors and investigating installation dependencies when required. The test scenarios for performance and setup, along with the test plan, are available for viewing in the documentation area of the project site. We have also worked out a process with the development team for procuring two builds a week, which lets us focus our testing and file bugs early in the development phase. Identifying a set of common end-user scenarios, or Build Verification Tests (BVTs), helps us in ‘accepting’ a build for further rigorous testing. The Aztecsoft test team also contributed to putting together the end-user feature list for the translator tool.

On processes and tools: To test the functionality of the translator tool, we create test cases per functional area, for instance test scenarios for fonts and formatting, paragraphs, tables, etc., and then in-depth test cases for each. To ensure good test coverage we employ the orthogonal array technique for creating pair-wise test cases: using the ‘pairs’ tool, we pass in parameter values and get a combination of test cases generated. Say, for example, if we pass font styles like Arial and Verdana, font faces like Bold and Italic, and font sizes like 8 and 10, we get test cases such as “Test Arial font with Bold and size 8”, “Test Verdana font with Bold and size 10”, and so on. For the tests identified by this process, data generation is another challenging activity, since the test data must exercise the appropriate code paths. The tests are executed against both the UI and the command-line tool.

Setup test cases ensure that the product installs correctly on the identified platforms, with the appropriate Word and translator flavor combinations, and that the user experience is good. Our performance tests currently cover the document size limit and the translator's behavior under low-memory conditions, besides exercising some negative scenarios.

Apart from testing builds for feature completeness, running timely performance tests, and filing bugs on the results, the Aztecsoft test team has designed and is developing an automation tool to automate most of these scenarios into a regression test suite for the ‘growing’ ODF translator tool. It is based on image/visual comparison techniques for verifying document fidelity after conversion. The framework is being coded in C#, and XML files are used to pass inputs that route execution to the required module, which keeps the automation modularized.
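To give a feel for what pair-wise generation does with the font example above, here is a minimal illustrative sketch in Python. It is not the ‘pairs’ tool we actually use; it is a simple greedy all-pairs generator (the parameter names and values are just the ones from the example) that picks cases from the full cross-product until every pair of values from any two parameters appears in at least one case:

```python
from itertools import combinations, product

def pairwise_cases(parameters):
    """Greedy all-pairs sketch: repeatedly pick the candidate case that
    covers the most not-yet-covered cross-parameter value pairs."""
    names = list(parameters)
    values = [parameters[n] for n in names]
    index_pairs = list(combinations(range(len(names)), 2))
    # every cross-parameter value pair that must appear in some case
    uncovered = set()
    for i, j in index_pairs:
        for va, vb in product(values[i], values[j]):
            uncovered.add((i, va, j, vb))
    candidates = list(product(*values))
    cases = []
    while uncovered:
        best = max(candidates,
                   key=lambda c: sum((i, c[i], j, c[j]) in uncovered
                                     for i, j in index_pairs))
        newly = {(i, best[i], j, best[j]) for i, j in index_pairs} & uncovered
        if not newly:  # safety stop: no candidate covers anything new
            break
        uncovered -= newly
        cases.append(dict(zip(names, best)))
    return cases

if __name__ == "__main__":
    cases = pairwise_cases({
        "style": ["Arial", "Verdana"],
        "face": ["Bold", "Italic"],
        "size": [8, 10],
    })
    for case in cases:
        print(f"Test {case['style']} font with "
              f"{case['face']} and size {case['size']}")
```

For these three two-valued parameters the greedy pass covers all value pairs in fewer cases than the eight-case full cross-product, which is exactly why the technique scales when the parameter lists get long.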
We also plan to make this automation framework flexible enough to test other similar conversions once the necessary code is plugged in. So one thing to watch out for in the upcoming releases is the test automation results!
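The framework itself is in C# and its comparison internals aren't described here, but the image/visual comparison idea behind the fidelity check can be sketched in a few lines. The following Python sketch is illustrative only (the function names and thresholds are hypothetical): it compares two equally sized rendered-page bitmaps, given as flat byte sequences, and reports what fraction of the pixels really changed, with a small per-channel tolerance so that anti-aliasing noise is not flagged as a fidelity loss:

```python
def fidelity_diff(baseline, converted, channel_tolerance=8):
    """Return the fraction of bytes (pixel channels) that differ by
    more than channel_tolerance between two equal-length renderings."""
    if len(baseline) != len(converted):
        raise ValueError("renderings must have identical dimensions")
    differing = sum(abs(a - b) > channel_tolerance
                    for a, b in zip(baseline, converted))
    return differing / len(baseline)

def documents_match(baseline, converted, max_diff_ratio=0.01):
    """Pass/fail wrapper: treat the conversion as faithful when at
    most max_diff_ratio of the pixel channels changed noticeably."""
    return fidelity_diff(baseline, converted) <= max_diff_ratio
```

In practice a run like this would render the original document and the converted document to bitmaps, page by page, and fail the regression test whenever `documents_match` returns false for any page.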