Tapestry has an excellent JUnit test suite, with code coverage figures over 80% at the time of this writing (2.4-alpha-4). It is required that changes to the framework be accompanied by additional JUnit tests (typically, mock tests; see below) to validate the changes. In addition, there is an ongoing effort to fill in the gaps in the existing suite, with the goal of reaching over 90% code coverage.
In order to compile and run the JUnit test suite, you need to download junit.jar and jdom-b8.jar, and place them in the ext-dist directory. The official sites to download the libraries are listed in the README file in that directory.
Some of the JUnit tests now require Jython. You must download and install Jython 2.1, then configure jython.dir in config/build.properties to point to the install directory. As usual, use an absolute path and forward slashes only. To run the JUnit test suite within Eclipse, you must set the JYTHON_DIR classpath variable.
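A minimal sketch of the relevant entry in config/build.properties (the install path shown here is purely illustrative):

# Location of the Jython 2.1 installation; absolute path, forward slashes only
jython.dir=C:/jython-2.1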
JUnit test source code is placed into the junit/src source tree. The package name for JUnit tests is org.apache.tapestry.junit.
Less than half of Tapestry is tested using traditional JUnit tests. The majority of JUnit testing occurs using a system of mock unit tests. Mock testing involves replacing the key classes of the Servlet API (HttpServletRequest, HttpSession, etc.) with our own implementations, with extensions that allow for checks and validations. Instead of processing a series of requests over HTTP, the requests are driven by an XML script file, which includes output checks.
Generally, each bit of functionality can be tested using its own mini-application. Create the application as junit/contextX (where X is a number; the examples below use context6). This is much easier now, using Tapestry 3.0 features such as dynamic lookup of specifications and implicit components.
The Mock Unit Test Suite is driven by scripts (whose structure is described below). The suite searches the directory junit/mock-scripts for files with the ".xml" extension. Each of these is expected to be a test script. The order in which scripts are executed is arbitrary; scripts (and JUnit tests in general) should never rely on any order of execution. Test scripts are named TestName.xml.
Note: The XML script is not validated, and invalid elements are generally ignored.
A test script consists of a <mock-test> element. Within it, the virtual context and servlet are defined.
<mock-test>
  <context name="c6" root="context6"/>

  <servlet name="app" class="org.apache.tapestry.ApplicationServlet">
    <init-parameter
      name="org.apache.tapestry.engine-class"
      value="org.apache.tapestry.junit.mock.c6.C6Engine"/>
  </servlet>
The name for the context becomes the leading term in any generated URLs. Likewise, the servlet name becomes the second term. The above example will generate URLs that reference /c6/app. Specifying a root for a context identifies the root context directory (beneath the top-level junit directory). In this example, HTML templates go in context6 and specifications go in context6/WEB-INF.
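A sketch of the resulting layout for this example (the individual file names are illustrative, not taken from the actual suite):

junit/
  mock-scripts/
    TestC6.xml            test script driving the mini-application
  context6/
    Home.html             HTML templates
    WEB-INF/
      app.application     application specification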
Following the <servlet> and <context> elements comes a series of <request> elements. Each such element simulates a request. A request specifies any query parameters passed as part of the request, and contains a number of assertions that test the results, generally by searching for strings or regular expressions within the HTML response.
<request>
  <parameter name="service" value="direct"/>
  <parameter name="context" value="0/Home/$DirectLink"/>

  <assert-output name="Page Title">
<![CDATA[
<title>Persistant Page Property</title>
]]>
  </assert-output>
Warning: As in the above example, it is very important that HTML tags be properly escaped with the XML CDATA construct.
Adding failover="true"
to the
<request>
simulates a failover. The contents
of the HttpSession
are serialized,
then deserialized. This ensures that all the data stored into the
HttpSession
will survive a failover to a new server within
a cluster.
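A minimal sketch of such a request (the query parameter and expected output shown here are illustrative, not from an actual test):

<request failover="true">
  <parameter name="service" value="home"/>

  <assert-output name="Page Title">
<![CDATA[
<title>Home</title>
]]>
  </assert-output>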
All of the assertion elements expect a name attribute, which is incorporated into any error message if the assertion fails (that is, if the expected output is not present).
The <assert-output> element checks for the presence of the literal output contained within the element. Leading and trailing whitespace is trimmed before the check is made.
<assert name="Session Attribute"> request.session.getAttribute("app/Home/message").equals("Changed") </assert>
The <assert> element checks that the provided OGNL expression evaluates to true.
<assert-regexp name="Error Message">
<![CDATA[
<span class="error">\s*You must enter a value for Last Name\.\s*</span>
]]>
</assert-regexp>
The <assert-regexp> element looks for a regular expression in the result, instead of a simple literal string.
<assert-output-matches name="Selected Radio" subgroup="1">
<![CDATA[
<input type="radio" name="inputSex" checked="checked" value="(.*?)"/>
]]>
  <match>1</match>
</assert-output-matches>
The <assert-output-matches> element is the most complicated assertion. It contains a regular expression which is evaluated against the response. For each match, the subgroup value is extracted and compared to the next <match> value. The count of matches is also checked against the number of <match> elements.
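A hedged sketch of a multi-match assertion (the markup and values are hypothetical): here the regular expression is expected to match twice in the response, so two <match> elements are required, and subgroup 1 of each match is compared, in order, against them.

<assert-output-matches name="Radio Values" subgroup="1">
<![CDATA[
<input type="radio" name="inputSex" value="(.*?)"/>
]]>
  <match>0</match>
  <match>1</match>
</assert-output-matches>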
<assert-output-stream name="Asset Content" content-type="image/gif" path="foo/bar/baz.gif"/>
The <assert-output-stream> element is used to compare the entire response to a static file (this is normally associated with private assets). A content type must be specified, as well as a relative path to a file to compare against. The path is relative to the junit directory. The response must match the specified content type and actual content.
<assert-exception name="Exception">
  File foo not found.
</assert-exception>
The <assert-exception> element is used to check when a request fails entirely (is unable to send back a response). This only occurs when the application specification contains invalid data (such as an incorrect class for the engine), or when the Exception page is unable to execute. The body of the element is matched against the exception's message property.
Force a failure, then check for correctness: Sometimes the tests themselves have bugs. A useful technique is to purposely break the test to ensure that it is checking for what it should check, then fix the test.
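For instance (an illustrative sketch, not from the original text), temporarily corrupting the expected output of an <assert-output> should cause the test to fail; once the failure has been observed, restore the assertion:

  <!-- Deliberately broken: the trailing "XXX" should not appear in the response,
       so this assertion must fail. Remove it once the failure has been confirmed. -->
  <assert-output name="Page Title">
<![CDATA[
<title>Persistant Page Property XXX</title>
]]>
  </assert-output>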