
Setting up Automated Tests in the Script Library

If you have Duette installed, automated tests can easily be added to the script library. To add an automated test to your library:

Right click on your script package and select Add Automated Test.

Alternatively, click the automated test icon on the navigator toolbar.

You can also create a script for the automated test from the Script Library summary screens:

The Edit Automated Test screen will open. Here you can add the Name of the script, assign it to a user and select the type of automated test. Once these fields are completed, the Default Path field will open. In this field you can specify the path in your file system to the automated test tool's results.

Additionally, if you select Unit Test Results, a Sub-Type field will be displayed where you can select the type of unit test results to be imported. If you select a Custom Sub-Type, you must provide the path to a valid XSLT file that transforms your tool's output into JUnit format (a sketch follows below).
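
The XSLT's job is to map your tool's native XML onto the JUnit format documented later in this guide (see "JUnit Format"). Below is a minimal, illustrative sketch only: the input format it assumes – a <results> root holding <test> elements with suite, name, status and duration attributes and a <message> child – is hypothetical, so the match patterns and XPath expressions would need to be adjusted to whatever your tool actually emits.

<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch: transforms a hypothetical <results>/<test> document into JUnit XML -->
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
   <xsl:output method="xml" indent="yes" />
   <xsl:template match="/results">
      <!-- derive the summary counts from the input so the totals stay consistent -->
      <testsuite name="{@name}" tests="{count(test)}" failures="{count(test[@status = 'fail'])}" errors="0" time="0">
         <xsl:for-each select="test">
            <testcase classname="{@suite}" name="{@name}" time="{@duration}">
               <xsl:if test="@status = 'fail'">
                  <!-- failure message and text come from the hypothetical <message> child -->
                  <failure message="{message}" type="AssertionFailure">
                     <xsl:value-of select="message" />
                  </failure>
               </xsl:if>
            </testcase>
         </xsl:for-each>
      </testsuite>
   </xsl:template>
</xsl:stylesheet>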

Similarly to manual test scripts, you can view all relationships associated with the script from the Relationships tab, which displays all associated requirements and execution sets.

When all details have been completed, click Save and Close.  The Automated Test Script will now be listed in the Script Library.

JUnit Format


<?xml version="1.0" encoding="UTF-8"?>
<testsuites>
   <testsuite name="JUnitXmlReporter" errors="0" tests="0" failures="0" time="0" timestamp="2013-05-24T10:23:58" />
   <testsuite name="JUnitXmlReporter.constructor" errors="0" skipped="1" tests="3" failures="1" time="0.006" timestamp="2013-05-24T10:23:58">
      <properties>
         <property name="java.vendor" value="Sun Microsystems Inc." />
         <property name="compiler.debug" value="on" />
         <property name="project.jdk.classpath" value="jdk.classpath.1.6" />
      </properties>
      <testcase classname="JUnitXmlReporter.constructor" name="should default path to an empty string" time="0.006">
         <failure message="test failure">Assertion failed</failure>
      </testcase>
      <testcase classname="JUnitXmlReporter.constructor" name="should default consolidate to true" time="0">
         <skipped />
      </testcase>
      <testcase classname="JUnitXmlReporter.constructor" name="should default useDotNotation to true" time="0" />
   </testsuite>
</testsuites>

Above is an example of a typical JUnit XML report; below is its documented structure. Notice that a report can contain one or more test suites. Each test suite has a set of properties (recording environment information) and contains one or more test cases. Each test case will contain a skipped, failure or error node if the test did not pass; if the test case passed, it will not contain any of these nodes. For details of which attributes are valid for each node, consult the "Schema" section below.

<testsuites>        => the aggregated result of all junit testfiles
  <testsuite>       => the output from a single TestSuite
    <properties>    => the defined properties at test execution
      <property>    => name/value pair for a single property
      ...
    </properties>
    <error></error> => optional information, in place of a test case - normally if the tests in the suite could not be found etc.
    <testcase>      => the results from executing a test method
      <system-out>  => data written to System.out during the test run
      <system-err>  => data written to System.err during the test run
      <skipped/>    => test was skipped
      <failure>     => test failed
      <error>       => test encountered an error
    </testcase>
    ...
  </testsuite>
  ...
</testsuites>

Schema

The JUnit XML report output comes from the Apache Ant build tool, as opposed to the JUnit project itself – thus it can be a little tricky to nail down an official spec for the format, even though it's widely adopted and used. There have been a number of attempts to codify the schema; first off, there is an XSD for JUnit:

XSD

<?xml version="1.0" encoding="UTF-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" elementFormDefault="qualified" attributeFormDefault="unqualified">
   <xs:annotation>
      <xs:documentation xml:lang="en">JUnit test result schema for the Apache Ant JUnit and JUnitReport tasks
Copyright © 2011, Windy Road Technology Pty. Limited
The Apache Ant JUnit XML Schema is distributed under the terms of the GNU Lesser General Public License (LGPL) http://www.gnu.org/licenses/lgpl.html
Permission to waive conditions of this license may be requested from Windy Road Support (http://windyroad.org/support).</xs:documentation>
   </xs:annotation>
   <xs:element name="testsuite" type="testsuite" />
   <xs:simpleType name="ISO8601_DATETIME_PATTERN">
      <xs:restriction base="xs:dateTime">
         <xs:pattern value="[0-9]{4}-[0-9]{2}-[0-9]{2}T[0-9]{2}:[0-9]{2}:[0-9]{2}" />
      </xs:restriction>
   </xs:simpleType>
   <xs:element name="testsuites">
      <xs:annotation>
         <xs:documentation xml:lang="en">Contains an aggregation of testsuite results</xs:documentation>
      </xs:annotation>
      <xs:complexType>
         <xs:sequence>
            <xs:element name="testsuite" minOccurs="0" maxOccurs="unbounded">
               <xs:complexType>
                  <xs:complexContent>
                     <xs:extension base="testsuite">
                        <xs:attribute name="package" type="xs:token" use="required">
                           <xs:annotation>
                              <xs:documentation xml:lang="en">Derived from testsuite/@name in the non-aggregated documents</xs:documentation>
                           </xs:annotation>
                        </xs:attribute>
                        <xs:attribute name="id" type="xs:int" use="required">
                           <xs:annotation>
                              <xs:documentation xml:lang="en">Starts at '0' for the first testsuite and is incremented by 1 for each following testsuite</xs:documentation>
                           </xs:annotation>
                        </xs:attribute>
                     </xs:extension>
                  </xs:complexContent>
               </xs:complexType>
            </xs:element>
         </xs:sequence>
      </xs:complexType>
   </xs:element>
   <xs:complexType name="testsuite">
      <xs:annotation>
         <xs:documentation xml:lang="en">Contains the results of exexuting a testsuite</xs:documentation>
      </xs:annotation>
      <xs:sequence>
         <xs:element name="properties">
            <xs:annotation>
               <xs:documentation xml:lang="en">Properties (e.g., environment settings) set during test execution</xs:documentation>
            </xs:annotation>
            <xs:complexType>
               <xs:sequence>
                  <xs:element name="property" minOccurs="0" maxOccurs="unbounded">
                     <xs:complexType>
                        <xs:attribute name="name" use="required">
                           <xs:simpleType>
                              <xs:restriction base="xs:token">
                                 <xs:minLength value="1" />
                              </xs:restriction>
                           </xs:simpleType>
                        </xs:attribute>
                        <xs:attribute name="value" type="xs:string" use="required" />
                     </xs:complexType>
                  </xs:element>
               </xs:sequence>
            </xs:complexType>
         </xs:element>
         <xs:element name="testcase" minOccurs="0" maxOccurs="unbounded">
            <xs:complexType>
               <xs:choice minOccurs="0">
                  <xs:element name="error">
                     <xs:annotation>
                        <xs:documentation xml:lang="en">Indicates that the test errored.  An errored test is one that had an unanticipated problem. e.g., an unchecked throwable; or a problem with the implementation of the test. Contains as a text node relevant data for the error, e.g., a stack trace</xs:documentation>
                     </xs:annotation>
                     <xs:complexType>
                        <xs:simpleContent>
                           <xs:extension base="pre-string">
                              <xs:attribute name="message" type="xs:string">
                                 <xs:annotation>
                                    <xs:documentation xml:lang="en">The error message. e.g., if a java exception is thrown, the return value of getMessage()</xs:documentation>
                                 </xs:annotation>
                              </xs:attribute>
                              <xs:attribute name="type" type="xs:string" use="required">
                                 <xs:annotation>
                                    <xs:documentation xml:lang="en">The type of error that occured. e.g., if a java execption is thrown the full class name of the exception.</xs:documentation>
                                 </xs:annotation>
                              </xs:attribute>
                           </xs:extension>
                        </xs:simpleContent>
                     </xs:complexType>
                  </xs:element>
                  <xs:element name="failure">
                     <xs:annotation>
                        <xs:documentation xml:lang="en">Indicates that the test failed. A failure is a test which the code has explicitly failed by using the mechanisms for that purpose. e.g., via an assertEquals. Contains as a text node relevant data for the failure, e.g., a stack trace</xs:documentation>
                     </xs:annotation>
                     <xs:complexType>
                        <xs:simpleContent>
                           <xs:extension base="pre-string">
                              <xs:attribute name="message" type="xs:string">
                                 <xs:annotation>
                                    <xs:documentation xml:lang="en">The message specified in the assert</xs:documentation>
                                 </xs:annotation>
                              </xs:attribute>
                              <xs:attribute name="type" type="xs:string" use="required">
                                 <xs:annotation>
                                    <xs:documentation xml:lang="en">The type of the assert.</xs:documentation>
                                 </xs:annotation>
                              </xs:attribute>
                           </xs:extension>
                        </xs:simpleContent>
                     </xs:complexType>
                  </xs:element>
               </xs:choice>
               <xs:attribute name="name" type="xs:token" use="required">
                  <xs:annotation>
                     <xs:documentation xml:lang="en">Name of the test method</xs:documentation>
                  </xs:annotation>
               </xs:attribute>
               <xs:attribute name="classname" type="xs:token" use="required">
                  <xs:annotation>
                     <xs:documentation xml:lang="en">Full class name for the class the test method is in.</xs:documentation>
                  </xs:annotation>
               </xs:attribute>
               <xs:attribute name="time" type="xs:decimal" use="required">
                  <xs:annotation>
                     <xs:documentation xml:lang="en">Time taken (in seconds) to execute the test</xs:documentation>
                  </xs:annotation>
               </xs:attribute>
            </xs:complexType>
         </xs:element>
         <xs:element name="system-out">
            <xs:annotation>
               <xs:documentation xml:lang="en">Data that was written to standard out while the test was executed</xs:documentation>
            </xs:annotation>
            <xs:simpleType>
               <xs:restriction base="pre-string">
                  <xs:whiteSpace value="preserve" />
               </xs:restriction>
            </xs:simpleType>
         </xs:element>
         <xs:element name="system-err">
            <xs:annotation>
               <xs:documentation xml:lang="en">Data that was written to standard error while the test was executed</xs:documentation>
            </xs:annotation>
            <xs:simpleType>
               <xs:restriction base="pre-string">
                  <xs:whiteSpace value="preserve" />
               </xs:restriction>
            </xs:simpleType>
         </xs:element>
      </xs:sequence>
      <xs:attribute name="name" use="required">
         <xs:annotation>
            <xs:documentation xml:lang="en">Full class name of the test for non-aggregated testsuite documents. Class name without the package for aggregated testsuites documents</xs:documentation>
         </xs:annotation>
         <xs:simpleType>
            <xs:restriction base="xs:token">
               <xs:minLength value="1" />
            </xs:restriction>
         </xs:simpleType>
      </xs:attribute>
      <xs:attribute name="timestamp" type="ISO8601_DATETIME_PATTERN" use="required">
         <xs:annotation>
            <xs:documentation xml:lang="en">when the test was executed. Timezone may not be specified.</xs:documentation>
         </xs:annotation>
      </xs:attribute>
      <xs:attribute name="hostname" use="required">
         <xs:annotation>
            <xs:documentation xml:lang="en">Host on which the tests were executed. 'localhost' should be used if the hostname cannot be determined.</xs:documentation>
         </xs:annotation>
         <xs:simpleType>
            <xs:restriction base="xs:token">
               <xs:minLength value="1" />
            </xs:restriction>
         </xs:simpleType>
      </xs:attribute>
      <xs:attribute name="tests" type="xs:int" use="required">
         <xs:annotation>
            <xs:documentation xml:lang="en">The total number of tests in the suite</xs:documentation>
         </xs:annotation>
      </xs:attribute>
      <xs:attribute name="failures" type="xs:int" use="required">
         <xs:annotation>
            <xs:documentation xml:lang="en">The total number of tests in the suite that failed. A failure is a test which the code has explicitly failed by using the mechanisms for that purpose. e.g., via an assertEquals</xs:documentation>
         </xs:annotation>
      </xs:attribute>
      <xs:attribute name="errors" type="xs:int" use="required">
         <xs:annotation>
            <xs:documentation xml:lang="en">The total number of tests in the suite that errorrd. An errored test is one that had an unanticipated problem. e.g., an unchecked throwable; or a problem with the implementation of the test.</xs:documentation>
         </xs:annotation>
      </xs:attribute>
      <xs:attribute name="time" type="xs:decimal" use="required">
         <xs:annotation>
            <xs:documentation xml:lang="en">Time taken (in seconds) to execute the tests in the suite</xs:documentation>
         </xs:annotation>
      </xs:attribute>
   </xs:complexType>
   <xs:simpleType name="pre-string">
      <xs:restriction base="xs:string">
         <xs:whiteSpace value="preserve" />
      </xs:restriction>
   </xs:simpleType>
</xs:schema>
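
If you need to check a report against this XSD before importing it, Ant's built-in schemavalidate task is one way to do so. A minimal sketch follows – the file names JUnit.xsd and TEST-results.xml are placeholders for your own paths:

<project name="validate-junit-report" default="validate">
   <target name="validate">
      <!-- the JUnit XSD declares no target namespace, hence noNamespaceFile -->
      <schemavalidate noNamespaceFile="JUnit.xsd" file="TEST-results.xml" />
   </target>
</project>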

Relax NG Compact Syntax

There is also a Relax NG Compact Syntax Schema:

junit.rnc: 
#---------------------------------------------------------------------------------- 
start = testsuite 
property = element property { 
   attribute name {text}, 
   attribute value {text} 
} 
properties = element properties { 
   property* 
} 
failure = element failure { 
   attribute message {text}, 
   attribute type {text}, 
   text 
} 
testcase = element testcase { 
   attribute classname {text}, 
   attribute name {text}, 
   attribute time {text}, 
   failure? 
} 
testsuite = element testsuite { 
   attribute errors {xsd:integer}, 
   attribute failures {xsd:integer}, 
   attribute hostname {text}, 
   attribute name {text}, 
   attribute tests {xsd:integer}, 
   attribute time {xsd:double}, 
   attribute timestamp {xsd:dateTime}, 
   properties, 
   testcase*, 
   element system-out {text}, 
   element system-err {text} 
} 
#---------------------------------------------------------------------------------- 

and junitreport.rnc 
#---------------------------------------------------------------------------------- 
include "junit.rnc" { 
   start = testsuites 
   testsuite = element testsuite { 
      attribute errors {xsd:integer}, 
      attribute failures {xsd:integer}, 
      attribute hostname {text}, 
      attribute name {text}, 
      attribute tests {xsd:integer}, 
      attribute time {xsd:double}, 
      attribute timestamp {xsd:dateTime}, 
      attribute id {text}, 
      attribute package {text}, 
      properties, 
      testcase*, 
      element system-out {text}, 
      element system-err {text} 
   } 
} 
testsuites = element testsuites { 
   testsuite* 
}

Sourcecode

The JUnit XML report format originates from the JUnit Ant task – this is the definitive source for the JUnit Report XML format – and its source code can be found in the Apache SVN repository here:

http://svn.apache.org/repos/asf/ant/core/trunk/src/main/org/apache/tools/ant/taskdefs/optional/junit/XMLJUnitResultFormatter.java

This can be useful if attempting to do anything “tricky” with JUnit output where you need to confirm it’s compliant.
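
For reference, reports in this format are typically produced by the junit task's XML formatter in an Ant build; the task writes one TEST-*.xml file per test suite into the directory given by todir. A sketch is shown below – the paths and the test.classpath reference are placeholders for your own build:

<junit printsummary="yes" fork="yes" haltonfailure="no">
   <classpath refid="test.classpath" />
   <!-- the "xml" formatter emits the JUnit XML report format documented above -->
   <formatter type="xml" />
   <batchtest todir="build/test-reports">
      <fileset dir="src/test" includes="**/*Test.java" />
   </batchtest>
</junit>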

Stack Overflow Discussions

There are also a number of topics on Stack Overflow that discuss the JUnit XML format. These are useful for learning about the schema for the JUnit XML results format, as well as for hints about the minimum subset that is normally acceptable to most tools such as Jenkins (the same rules generally apply to Enterprise Tester as well).
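
As an illustration of that minimal subset – under the assumption that a consumer only needs the testsuite summary attributes and its testcase children – a report as small as the following is typically accepted (the class and test names are invented for the example):

<?xml version="1.0" encoding="UTF-8"?>
<testsuite name="com.example.LoginTests" tests="2" failures="1" errors="0" time="0.120">
   <testcase classname="com.example.LoginTests" name="validLogin" time="0.050" />
   <testcase classname="com.example.LoginTests" name="invalidLogin" time="0.070">
      <failure message="expected error banner" type="AssertionError">Assertion failed in invalidLogin</failure>
   </testcase>
</testsuite>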

Execution Status Screen

Automated tests are also included in the following execution status screen, displayed when users double click on an execution package:

  • Execution Status Grid

The Execution Status on the execution set package now contains all automated and manual test execution results.

  • Filter By Entity Type

You can filter by entity type in the assignments grid view.

  • Assign Automated Tests

You can assign automated tests to specific Enterprise Tester users if required.

Viewing Automated Run History

Right click on the automated script in the execution set and select Run History.

The Run History screen provides a summary of all runs:

You can select a run in the grid to view the full results.

You can select runs from the grid to change the runs represented in the charts.

Viewing Imported Results

To view the most recently imported automation results, either double click on the run in the execution set grid, or right click on the automated script in the execution set (on the Explorer tab of the Tree view navigator) and select View Automated Test Assignment.

The Automated Test Results screen will open and default to the Summary tab. 

The following tabs are available:

  • Summary
  • Results
  • Test Data
  • Attachments
  • Relationships

Summary Tab

The Summary tab lists all import information. This information is sourced from the imported automated test results, and from the information the user entered when importing these scripts into Enterprise Tester.
The Summary tab provides two charting widgets showing Run Status Over Time and Status Group by Day (each widget is configurable: you can select the field or the date parameter to chart against).

The Summary tab also provides a summary grid of all run results for this automated test script assignment.  Similar to other summary screen grids, the run history grid is configurable.  You can select the columns to display or filter the results using TQL.

Results Tab

The Results tab provides a tree view of the results of your automated test run.  You can drill down to the detail in each node.
In the top right corner is a status filter where you can select the statuses you want displayed, e.g. only failures and warnings. This allows you to quickly narrow the result set to the statuses relevant for review.

Each node has an icon indicating results:

Users can add attachments, add and link incidents and view the node details via the screen.

If a user clicks node details, a Result Node screen is displayed with the following tabs (all of this information is sourced from the automated test tool itself):

  • Details – high level information relating to the specific results node. You can add notes on your results node, which will display in the Results tree view in the Notes column.
  • Parameters – automated tool parameters.
  • Metadata – any metadata utilized in the automated scripts.

Test Data Tab

The Test Data tab displays table-driven variable information from the automated test tool. Currently this tab is only displayed for imported QTP data.

Attachments and Relationships Tabs

The Attachments and Relationships tabs have been implemented in a consistent way with other areas of Enterprise Tester:

Attachments can be added when required from the Attachments link.

Any relationships to the automated test, such as incidents, can also be viewed.

Importing Automated Test Results – Manual Import

To import automated test results, you first need to add an automated test script to the Script Library, just as you do for manual tests. This is not the script that your automated tool will run; it is a placeholder in the Script Library for your automated test that specifies the type of results and the path to the results to import.

To do this, from the Explorer tab in the tree view navigator, click to select the package you wish to add your automated test script to.

Then either click the Add Automated Test icon on the menu bar, or right click on the package and select Add Automated Test.

The New Automated Test screen will appear.  Complete the details:

  • Name – Enter a name for your test.
  • Assigned To – Enter the user name of the person assigned to the automated test.
  • Type – Select the test type from the drop-down list: HP Quick Test Professional, IBM Rational Functional Tester, Selenium or Unit Test Results.
  • Sub-Type – Only applicable for Unit Test Results. Select the type of results you are importing from the drop-down list.
  • Default Path – Enter the path to a specific results file or folder. This is not required; when importing results you can manually select the file to upload.

Once the Automated Test Script is created, you then need to add it to your execution set by dragging and dropping it from the script library to the execution set package, or by using the Create Execution button on the grid tool bar.

Import Results

To import the results, right click on the Automated Test Assignment in the execution set and select Import Result from the menu, or double click on the Automated Test Assignment in the execution set grid to open the script assignment.

A dialogue box will appear that displays the following two tabs:

  • Import from path – If you set up a default path in the Automated Test setup in the script library, it will be displayed in the Results Path field. If the default path is still correct, enter your automated test results file name and click Import.
  • Upload file – Alternatively, you can select a results file to upload.

Progress information will be displayed followed by the Success message.

Duette User Guide

Duette is the automated testing plugin for Enterprise Tester. It enables users to import results from automated testing tools into Enterprise Tester. All testing results, manual and automated, can be managed from Enterprise Tester, giving users full visibility of testing status and progress.

Licensing of Duette is separate from your Enterprise Tester license. To run Duette, current licenses for both Enterprise Tester and Duette are required. If your Duette license expires, you will still be able to view, move and rename automated tests that you have previously imported using Duette, but you will no longer be able to create new automated tests or automated test assignments, or import new runs.

Automated Testing Tools Supported

Enterprise Tester’s automated testing plugin, Duette, supports importing results from the following tools:

  • IBM Rational Functional Tester – 8.1 and 8.2 HTML Output
  • HP Quick Test Professional – 10 and 11 Output
  • Selenium HTML Suite
  • Unit Test Results
  • Custom results using an XSLT transform (converts custom XML to JUnit format)
  • Gallio
  • JSUnit
  • JUnit
  • Microsoft MS Test
  • NUnit
  • Parasoft C++test
  • PHPUnit

To allow automated test results to be imported you must have dragged and dropped your Automated Script from the Script Library into an execution set, as noted previously in this document.

As each automated test within your automated test tool is repeated, you can import new results into the same execution set script. All results are recorded and a run history is available for each imported set of results.

You can import your results manually – either from a path or by uploading a file – or set an automated import schedule.

Guides

Duette – Automated Test Integration

Integrate Automated Test Tools with Enterprise Tester

Enterprise Tester’s automated test integration is called Duette. Duette enables the import and viewing of test results from a variety of automated test tools and formats.

Duette supports the import of results in the following formats:

  • IBM Rational Functional Tester – 8.1 and 8.2 HTML Output
  • HP Quick Test Professional – 10 and 11 Output
  • Selenium HTML Suite
  • Unit Test Results
  • Custom results using an XSLT transform (converts custom XML to JUnit format)
  • Gallio
  • JSUnit
  • JUnit
  • Microsoft MS Test
  • NUnit
  • Parasoft C++test
  • PHPUnit

On import, results are presented in Enterprise Tester and can be filtered or queried as required:

Features

  • Import automated test results from folders and .zip files.
  • View the results of automated tests, including screen shots and test data.
  • Track overall progress across both automated and manual tests.
  • Import results from Rational Functional Tester, QuickTest Professional and Selenium.
  • Scheduled import of automated test results.
  • Status filters and notes for automated results.
  • Duette can also be configured to push results from your automation framework into Enterprise Tester; see the Duette Client example:
    https://github.com/catch-software/EnterpriseTester-API-Examples/tree/master/CSharp/DuetteClient

Note: Duette requires current licenses for both Duette and Enterprise Tester to create new automated tests or automated test assignments, or to import new runs.

See the User Guide