Monday, April 23, 2012

Video recording of Interactive Selenium with Groovy

A fifteen-minute video shows how Selenium can be used interactively with a Groovy console: http://www.youtube.com/watch?v=IlfLfLuceWk

Monday, March 26, 2012

Create a groovy console and bind to selenium

Required groovy files

In the previous posting we defined the pom file that we need for our build environment. Now we will set up some Groovy files to get Selenium and Groovy running interactively.

ConsoleWaiter.groovy

I found the idea of using the Groovy Console on some other sites; credit goes, for instance, to http://josefbetancourt.wordpress.com/tag/eclipse-2/. I copied some of that code and put it under src/test/groovy/com/jankester/selenium/test/utils:
package com.jankester.selenium.test.utils

/**
 * File:  ConsoleWaiter.groovy
 */

import groovy.lang.Binding;
import groovy.ui.Console;

/**
 * Provides a wrapper for the console.
 *
 * Based on source by John Green
 * Adapted from:  http://www.oehive.org/files/ConsoleWaiter.groovy
 * Released under the Eclipse Public License
 * http://www.eclipse.org/legal/epl-v10.html
 *
 * I added methods to allow use from Java.
 *
 * The run() method launches the console and causes this thread
 * to sleep until the console's window is closed.
 * Allows easy interaction with the objects alive at a given
 * point in an application's execution, like in a debugger
 * session.
 *
 * Example 1:
 * new ConsoleWaiter().run()
 *
 * Example 2:
 * def waiter = new ConsoleWaiter()
 * waiter.console.setVariable("node", node)
 * waiter.run()
 *
 */
class ConsoleWaiter {

    Console console
    Object source
    boolean done = false

    public ConsoleWaiter(Console inConsole) {
        this.console = inConsole
    }

    public ConsoleWaiter(Object source) {
        console = new Console(getClass().classLoader, new Binding())
        this.source = source
        console.setVariable("source", source)
    }

    public void setVar(String key, Object value) {
        console.setVariable(key, value)
    }

    public void setVar(String key, List values) {
        console.setVariable(key, values)
    }

    public void setVar(String key, Object[] values) {
        console.setVariable(key, values)
    }

    public void run() {
        console.run()
        // I'm a little surprised that this exit() can be private.
        console.frame.windowClosing = this.&exit
        console.frame.windowClosed = this.&exit
        while (!done) {
            sleep 1000
        }
    }

    public boolean isDone() {
        return done
    }

    public void exit(EventObject evt = null) {
        done = true
    }

    public Console getConsole() {
        return console
    }
}
The Groovy console is started by my main program and blocks the main program's thread. Only when you close the Groovy console will the main program continue (and exit). You can give the Groovy console a binding, and thus put it in direct contact with any context you set up before. In our case, we want to have the Selenium driver loaded in that context.

Selenium setup

The Groovy class that starts our Selenium session uses the following code:
package com.jankester.selenium.test

import java.io.File
import java.util.logging.Level

import org.apache.log4j.LogManager
import org.apache.log4j.Logger
import org.openqa.selenium.By
import org.openqa.selenium.WebDriver
import org.openqa.selenium.chrome.ChromeDriverService
import org.openqa.selenium.firefox.FirefoxDriver
import org.openqa.selenium.firefox.FirefoxProfile
import org.openqa.selenium.ie.InternetExplorerDriver
import org.openqa.selenium.logging.LoggingPreferences
import org.openqa.selenium.remote.DesiredCapabilities
import org.openqa.selenium.remote.RemoteWebDriver
import org.openqa.selenium.logging.LogType
import org.openqa.selenium.remote.CapabilityType
import com.opera.core.systems.OperaDriver


class WebDriverSetup {

 private static WebDriverSetup setup;
 protected static Logger logger = LogManager.getLogger(WebDriverSetup.class);
 protected WebDriver driver;
 protected Utils utils;
 protected String startUrl;
 protected String username;
 protected String password;

 public static WebDriverSetup getInstance() {
  if (setup == null) {
   setup = new WebDriverSetup();
  }
  return setup;
 }
 
 private WebDriverSetup() {
  startUrl = PropertyHolder.testProperties.getProperty("StartUrl");
  String browser = PropertyHolder.testProperties.getProperty("BrowserType");
  username = PropertyHolder.testProperties.getProperty("LoginUserName");
  password = PropertyHolder.testProperties.getProperty("LoginPassword");

  if (browser.equalsIgnoreCase("*firefox")) {
   driver = getFirefoxDriver();
   logger.info("Started firefox driver");
  }
  else if (browser.equalsIgnoreCase("*iexplore")) {
   driver = getIEDriver();
   logger.info("Started internetexplorer driver");
  }
  else if (browser.equalsIgnoreCase("*googlechrome")) {
   driver = getGoogleChromeDriver();
   logger.info("Started Googlechrome driver");
  }
  else if (browser.equalsIgnoreCase("*opera")) {
   driver = new OperaDriver();
   logger.info("Started opera driver");
  }

  /*  open the url */
  logger.info("Connecting to starturl: " + startUrl);
  driver.get(startUrl);
  logger.info("Connected to starturl");
  
 }

 private WebDriver getFirefoxDriver() {
  logger.info("Starting firefox driver");
  DesiredCapabilities caps = DesiredCapabilities.firefox(); 
  FirefoxProfile firefoxProfile = new FirefoxProfile();
  caps.setCapability(FirefoxDriver.PROFILE, firefoxProfile);
  
  //use setting in log4j to switch on logging of firefox driver
  Logger log4jLogger = LogManager.getLogger("org.openqa");
  if (log4jLogger.isInfoEnabled()) {
   LoggingPreferences logs = new LoggingPreferences(); 
   logs.enable(LogType.DRIVER, Level.INFO); 
   caps.setCapability(CapabilityType.LOGGING_PREFS, logs); 
   logger.info("Logging of firefox driver is enabled");
   String userDir = System.getProperty("user.dir"); 
   System.setProperty("webdriver.firefox.logfile", "target/firefox-console.log");
   System.setProperty("webdriver.log.file","${userDir}/target/firefox-driver.log");
  }

  WebDriver ffDriver = new FirefoxDriver(caps);
  return ffDriver;
 }



 private WebDriver getIEDriver() {
  logger.info("Starting iexplorer driver");
  DesiredCapabilities ieCapabilities = DesiredCapabilities.internetExplorer();
  ieCapabilities.setCapability(InternetExplorerDriver.INTRODUCE_FLAKINESS_BY_IGNORING_SECURITY_DOMAINS, true);
  InternetExplorerDriver ieDriver = new InternetExplorerDriver(ieCapabilities);
  return ieDriver;

 }

 private WebDriver getGoogleChromeDriver() {
  logger.info("Starting googlechrome driver");
  DesiredCapabilities chromeCapabilities = DesiredCapabilities.chrome();

  System.setProperty("webdriver.chrome.driver", PropertyHolder.testProperties.("ChromedriverPath"));

  ChromeDriverService service = new ChromeDriverService.Builder()
    .usingChromeDriverExecutable(new File(PropertyHolder.testProperties.getProperty("ChromedriverPath")))
    .usingAnyFreePort().build();

  logger.info("Starting chrome driver service..");
  service.start();

  WebDriver driverGC = new RemoteWebDriver(service.getUrl(),chromeCapabilities);
  return driverGC;
 }
 
 public void close() {
  logger.info("Closing driver now");
  driver.close();
 }
}

PropertyHolder

In our pom profiles we define several variables. They get filtered into src/test/resources/test.properties. Our Groovy classes need these variables and access them via a PropertyHolder class:
package com.jankester.selenium.test

class PropertyHolder {

 public static Properties testProperties = new Properties(); 
 
 static {
  InputStream is = PropertyHolder.class.classLoader.getResourceAsStream('test.properties');
  testProperties.load(is);
        //def theConfig = new ConfigSlurper().parse(is.getText());
 }
}
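For reference, a test.properties along the following lines would match the property names that WebDriverSetup reads and the Maven properties defined in the pom profiles of the previous posting; the exact key-to-placeholder mapping below is my assumption, not taken from the original project:
# sketch of src/test/resources/test.properties (mapping assumed)
StartUrl=${start.url}
BrowserType=${browser.type}
LoginUserName=${login.username}
LoginPassword=${login.password}
ChromedriverPath=${webdriver.chrome.driver}
Maven resource filtering replaces the ${...} placeholders with the values of the active profiles at build time.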

RunSeleniumConsole

Now the final part is about starting the main program. You can do this with:
package com.jankester.selenium.test

import org.apache.log4j.LogManager;
import org.apache.log4j.Logger
import org.openqa.selenium.WebElement;
import org.openqa.selenium.interactions.Actions;
import org.openqa.selenium.By as By;
import com.jankester.selenium.test.SeleniumConstants as SeleniumConstants;

import com.jankester.selenium.test.utils.ConsoleWaiter;
import com.jankester.selenium.test.utils.Utils;

class RunSeleniumConsole {

 private static Logger logger = LogManager.getLogger(RunSeleniumConsole.class);


 static main(args) {
  logger.info("Starting selenium session with console");
  WebDriverSetup setup = WebDriverSetup.getInstance();

  //set bindings
  Actions actions = new Actions(setup.driver);

  ConsoleWaiter waiter = new ConsoleWaiter(setup);

  logger.info("Setting bindings for driver,actions,utils,logger");
  waiter.setVar("driver", setup.driver);
  waiter.setVar("actions",actions);
  waiter.setVar("logger",logger);
  waiter.setVar("startUrl",setup.startUrl);
  waiter.setVar("By",By);
  waiter.run();
  
  setup.close();
 }
}

With all this in place, you should be able to run your first console with: mvn clean test -P firefox,development,run-console. You may get a failure for log4j initialisation, but you can just add a log4j.xml to your src/main/resources folder and that problem should be solved.
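A minimal log4j.xml along these lines should do; the appender and level chosen here are just an illustration, not taken from the original project:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<!-- minimal sketch: one console appender, root level INFO -->
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
  <appender name="console" class="org.apache.log4j.ConsoleAppender">
    <layout class="org.apache.log4j.PatternLayout">
      <param name="ConversionPattern" value="%d{HH:mm:ss} %-5p %c{1} - %m%n"/>
    </layout>
  </appender>
  <root>
    <priority value="INFO"/>
    <appender-ref ref="console"/>
  </root>
</log4j:configuration>
Good luck!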

Next

Remaining topics are:
  • create your own utils directory
  • examples
  • save your groovy scripts
  • replay your saved groovy scripts as junit tests
  • add firebug xpi during startup
At the end of the series I will also add a complete source code example.

Setting up maven project to do first interactive test


The pom file

A basic pom file that sets up all dependencies for your interactive Selenium-Groovy testing contains the following:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.jankester.selenium</groupId>
  <artifactId>selenium-groovy-public</artifactId>
  <version>2012.03-SNAPSHOT</version>
  <packaging>jar</packaging>

  <name>selenium-groovy</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <project.selenium.version>2.20.0</project.selenium.version>    
    <log.root>./target</log.root>
  </properties>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.8.2</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.codehaus.groovy.maven.runtime</groupId>
      <artifactId>gmaven-runtime-1.6</artifactId>
      <version>1.0</version>
    </dependency>
    <dependency>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
      <version>1.2.16</version>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-simple</artifactId>
      <version>1.4.2</version>
    </dependency>    
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-java</artifactId>
      <version>${project.selenium.version}</version>
    </dependency>
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-firefox-driver</artifactId>
      <version>${project.selenium.version}</version>
    </dependency>
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-chrome-driver</artifactId>
      <version>${project.selenium.version}</version>
    </dependency>
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-ie-driver</artifactId>
      <version>${project.selenium.version}</version>
    </dependency>
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-htmlunit-driver</artifactId>
      <version>${project.selenium.version}</version>
    </dependency>
    <dependency>
      <groupId>org.seleniumhq.selenium</groupId>
      <artifactId>selenium-server</artifactId>
      <version>${project.selenium.version}</version>
    </dependency>    
<!--     <dependency>
          <groupId>org.seleniumhq.webdriver</groupId>
          <artifactId>webdriver-common</artifactId>
          <version>${project.selenium.version}</version>
        </dependency>       -->
    <dependency>
          <groupId>com.opera</groupId>
          <artifactId>operadriver</artifactId>
          <version>0.8.1</version>
        </dependency>    
<dependency>
  <groupId>pl.pragmatists</groupId>
  <artifactId>JUnitParams</artifactId>
  <version>0.4.0</version>
  <scope>test</scope>
</dependency>        
  </dependencies>

  <build>
    <resources>
      <resource>
        <directory>src/main/resources</directory>
        <filtering>true</filtering>
      </resource>
    </resources>
    <testResources>
      <testResource>
        <directory>src/test/resources</directory>
        <filtering>true</filtering>
      </testResource>
    </testResources>

    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>2.3.2</version>
        <configuration>
          <compilerVersion>1.6</compilerVersion>
          <source>1.6</source>
          <target>1.6</target>
          <encoding>ISO-8859-1</encoding>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.codehaus.gmaven</groupId>
        <artifactId>gmaven-plugin</artifactId>
        <executions>
          <execution>
            <goals>
              <goal>generateStubs</goal>
              <goal>compile</goal>
              <goal>generateTestStubs</goal>
              <goal>testCompile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.7.2</version>
        <configuration>
          <skip>true</skip>
        </configuration>
        <executions>
          <execution>
            <id>unit-tests</id>
            <phase>test</phase>
            <goals>
              <goal>test</goal>
            </goals>
            <configuration>
              <skip>false</skip>
              <includes>
                <include>${testcase.include.pattern}</include>
              </includes>
              <excludes>
                <exclude>**/*InProgressTest.java</exclude>
              </excludes>
            </configuration>
          </execution>
        </executions>
      </plugin>      
    </plugins>
    <pluginManagement>
      <plugins>
        <!--This plugin's configuration is used to store Eclipse m2e settings 
          only. It has no influence on the Maven build itself. -->
        <plugin>
          <groupId>org.eclipse.m2e</groupId>
          <artifactId>lifecycle-mapping</artifactId>
          <version>1.0.0</version>
          <configuration>
            <lifecycleMappingMetadata>
              <pluginExecutions>
                <pluginExecution>
                  <pluginExecutionFilter>
                    <groupId>
                      org.codehaus.gmaven
                    </groupId>
                    <artifactId>
                      gmaven-plugin
                    </artifactId>
                    <versionRange>
                      [1.3,)
                    </versionRange>
                    <goals>
                      <goal>generateTestStubs</goal>
                      <goal>generateStubs</goal>
                      <goal>testCompile</goal>
                      <goal>compile</goal>
                    </goals>
                  </pluginExecutionFilter>
                  <action>
                    <ignore></ignore>
                  </action>
                </pluginExecution>
              </pluginExecutions>
            </lifecycleMappingMetadata>
          </configuration>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>

  <profiles>
  
    <profile>
      <id>development</id>
      <properties>
        <start.url>https://groups.google.com/forum/?fromgroups#!forum/selenium-users</start.url>
        <login.username>XXX</login.username>
        <login.password>YYY</login.password>      
        <log.root>./target</log.root>        
      </properties>
    </profile>  

    <profile>
      <id>integration</id>
      <properties>
      </properties>
    </profile>  

    <profile>
      <id>firefox</id>
      <properties>
        <browser.type>*firefox</browser.type>
        <add.firebug.to.firefox>true</add.firebug.to.firefox>
      </properties>
    </profile>

    <profile>
      <id>iexplorer</id>
      <properties>
        <browser.type>*iexplore</browser.type>
      </properties>
    </profile>

    <profile>
      <id>googlechrome</id>
      <properties>
        <browser.type>*googlechrome</browser.type>
        <webdriver.chrome.driver>${user.home}\map_creator-files\chromedriver.exe</webdriver.chrome.driver>
      </properties>
    </profile>

    <profile>
      <id>opera</id>
      <properties>
        <browser.type>*opera</browser.type>
      </properties>
    </profile>


  
    <profile>
      <activation>
        <activeByDefault>true</activeByDefault>
      </activation>
      <id>run-console</id>
      <properties>
        <testcase.include.pattern>XXX</testcase.include.pattern>
      </properties>      
      <build>
        <plugins>
          <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>exec-maven-plugin</artifactId>
            <version>1.1.1</version>
            <executions>
              <execution>
                <phase>test</phase>
                <goals>
                  <goal>java</goal>
                </goals>
                <configuration>
                  <classpathScope>test</classpathScope>
                  <mainClass>com.jankester.selenium.test.RunSeleniumConsole</mainClass>
                  <arguments />
                </configuration>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
    </profile>
    
    <profile>
      <id>run-script-cat1</id>
      <properties>
        <testcase.include.pattern>**/Cat1Test*</testcase.include.pattern>
      </properties>            
    </profile>    
  </profiles>

  <repositories>
  </repositories>

  <pluginRepositories>
  </pluginRepositories>

</project>

I don't want to explain Maven or go through this pom file in detail, but it will help you with:
  • resolving all dependencies of Selenium 2.20
  • setting up a profile for each browser
  • setting up a profile for a test environment
  • setting up a profile for a specific test target, or for the run-console.
All dependencies should be available in the public Maven repositories, so the repository settings can remain empty. Once we have implemented some code, we can start our Selenium Groovy console with:
mvn clean test -P firefox,development,run-console  

Wednesday, March 21, 2012

Interactive selenium testing

The problem


Location of elements

When I started using Selenium, I noticed that it is not easy to do it right. First you start with the Selenium IDE, but you notice that the IDE does not really record a lot. As a next step I added Firebug and started analyzing how the elements were to be located: by tag, id, class, etc.

Junit testcase

With this information I could then create my junit testcase:
 @Test
 public void testMapView() throws Exception {

  //assert that we cannot see submenu of MapCreator
  elem = driver.findElement(By.className(SeleniumConstants.MAP_SUB_MENU));
  String style = elem.getAttribute("style");
  assertTrue("Element must have style display off.",style.matches("display: none.*"));
  logger.debug("Located element " + SeleniumConstants.MAP_SUB_MENU);

  //find menu and click on mapview
  elem = driver.findElement(By.id(SeleniumConstants.MAP_CONTROL));
  actions.moveToElement(elem).click().perform();
  //assert submenu is shown now
  elem = driver.findElement(By.className(SeleniumConstants.MAP_SUB_MENU));
  style = elem.getAttribute("style");
  assertTrue("Element must have style display on.",style.matches("display: block.*"));
Now this all works very nicely, but not on the first attempt. It took many iterations to get here.

selenium iterations

Typical failures are that an element cannot be found:
org.openqa.selenium.NoSuchElementException: Unable to locate element: {"method":"id","selector":"
or that you cannot click on an element because it is not the top element or not visible:
Element is not currently visible and so may not be interacted with Build info: 
These kinds of failures eat a lot of time. They forced me to analyze, use Firebug, make some changes, and restart my JUnit test case. Every failure meant restarting the test case ...

The first solution: eclipse debugger

I started running my test cases from Eclipse in debug mode. I put my breakpoint at the location where it started to get interesting. Now I could see the exact state of my application. Also, with the debugger's expression viewer, I could create expressions that actually executed something:
new Actions(driver).moveToElement(elem).click().moveByOffset(100,100).click().perform()
Although the expression viewer is actually meant for inspecting calculated values, you can also use it to issue new commands. It runs in the context of your JVM, so it can pass commands directly to your Selenium driver.

Disadvantage of debugger expressions

This helped a lot: I could analyze better and try things out before adding them to my test code. However, on any exception or failure the debugger session was over, and I still had to restart my JVM. JUnit has the same problem: when you get an exception, your test is over. There is no way to retry.

The new solution: groovy console

The new solution is a lot more elegant:
  • No JUnit test, just a simple main program
  • Initialisation of the Selenium driver takes place in the program; then control is handed over to a Groovy console
  • Groovy allows for runtime compilation
  • Test scripts can be saved and replayed.
This was the real winner, and I still love it. Suddenly test automation with Selenium starts to be fun again.

A simple example

When I start up my Selenium-Groovy setup, I get a Swing-based Groovy console that allows me to type Selenium commands:
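The screenshot is not reproduced here, but the commands typed into the console look roughly like this (the locators are made up for illustration; driver, actions, logger, By and startUrl are the bindings set up in RunSeleniumConsole):
// illustrative console session, element locators are hypothetical
driver.get(startUrl)
def menu = driver.findElement(By.id("map-control"))
actions.moveToElement(menu).click().perform()
def subMenu = driver.findElement(By.className("map-sub-menu"))
logger.info("submenu style: " + subMenu.getAttribute("style"))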
These are just a few lines of Selenium. The nice thing is: if they are wrong, you get a failure in your console and you can simply retry. Likewise, you can select just a few lines of your script and run only those again.

Coming topics

In future blogs, I will continue on this topic:
  • code explained to setup the groovy console
  • how to setup your maven project to get this working
  • making your own utilities
  • a few more examples
  • running your recorded scripts in junit tests.

Stay tuned!

Monday, March 05, 2012

Logging in selenium

For some time I have been trying to get more out of Selenium. A lot has changed here: Selenium 2 is a big improvement, and since 2.15 Google has also contributed its Advanced User Interactions API to the Selenium code base, which seems to be a big improvement for mouse handling.
To get a better understanding of what is happening, and especially why my test scripts are failing, I wanted to control the logging of my WebDriver. I noticed that this was a bit of a pain. The Selenium Firefox driver uses java.util.logging, so you are bound to the mechanisms that java.util.logging offers for log configuration.

This is the approach I ended up with.
  • Before I start up my Firefox WebDriver, I use the setLogLevel method to set the Level that is used for all driver logs. When you set it to WARNING, all logs the driver writes are treated as WARNING logs; when you set it to FINE, they are all treated as FINE logs. So depending on the settings in your logging.properties, you will or will not see these logs appear on your system out. I set the level to INFO, and now they always appear in my System.out output. During test development I do not mind; I will switch it off when I move the scripts to an automated build environment.
  • To change the settings for java.util.logging, you can either edit the logging.properties file under java_home/jre/lib, or point to a config file explicitly on the command line with -Djava.util.logging.config.file=myfile (a sketch of such a file follows below).
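A minimal logging.properties for this purpose could look like this; the levels chosen are just an illustration:
# sketch: log everything to the console, show INFO and up
handlers=java.util.logging.ConsoleHandler
.level=INFO
java.util.logging.ConsoleHandler.level=INFO
java.util.logging.ConsoleHandler.formatter=java.util.logging.SimpleFormatter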
I am still interested in routing Java Util Logging (JUL) to log4j. Maybe this is a solution:
http://www.slf4j.org/legacy.html#jul-to-slf4j
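If that route is taken, the wiring would be roughly the following sketch: add the org.slf4j:jul-to-slf4j dependency plus the slf4j-log4j12 binding (instead of the slf4j-simple used elsewhere in this series), and install the bridge handler early in the test setup. This is the standard SLF4J mechanism, not something I have verified in this project:
// sketch only: route java.util.logging (JUL) through SLF4J, and from there to log4j
import org.slf4j.bridge.SLF4JBridgeHandler;

public class JulToLog4jSetup {
    public static void install() {
        // needs SLF4J >= 1.6.5; removes the default JUL console handler to avoid double logging
        SLF4JBridgeHandler.removeHandlersForRootLogger();
        // from now on, java.util.logging records are routed to SLF4J (and thus to log4j)
        SLF4JBridgeHandler.install();
    }
}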

Wednesday, October 19, 2011

Junit4 running parallel junit classes

To run JUnit test cases in parallel, you can create your own runner class for JUnit:

Add this annotation to your class declaration.
@RunWith(Parallelized.class)

Implementation of this class looks like:
package mypackage;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

import org.junit.runners.Parameterized;
import org.junit.runners.model.RunnerScheduler;

public class Parallelized extends Parameterized
{
   
    private static class ThreadPoolScheduler implements RunnerScheduler
    {
        private ExecutorService executor;
       
        public ThreadPoolScheduler()
        {
            String threads = System.getProperty("junit.parallel.threads", "16");
            int numThreads = Integer.parseInt(threads);
            executor = Executors.newFixedThreadPool(numThreads);
        }
       
        public void finished()
        {
            executor.shutdown();
            try
            {
                executor.awaitTermination(10, TimeUnit.MINUTES);
            }
            catch (InterruptedException exc)
            {
                throw new RuntimeException(exc);
            }
        }

        public void schedule(Runnable childStatement)
        {
            executor.submit(childStatement);
        }
    }

    public Parallelized(Class klass) throws Throwable
    {
        super(klass);
        setScheduler(new ThreadPoolScheduler());
    }
}

Now inside your test class, you will need to create a method annotated with
@Parameters
that feeds the constructor:

    public PocAddPoiTest(String browser) {
        super();
        this.browser = browser;
    }

    @Parameters
    public static Collection browsersStrings() {
        // return Arrays.asList(new Object[][] { { "*firefox" },
        // { "*googlechrome" } });
        return Arrays.asList(new Object[][] { { "*firefox" }, { "*iexplore" } });
    }

When JUnit starts, it will run the two tests in parallel (see the sketch below for how the pieces fit together).
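Put together, a minimal parameterized test class looks roughly like this; the class and test names are made up for illustration:
package mypackage;

import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized.Parameters;

// hypothetical example: each browser string becomes one test instance,
// and Parallelized runs the instances on its thread pool
@RunWith(Parallelized.class)
public class BrowserSmokeTest {

    private final String browser;

    public BrowserSmokeTest(String browser) {
        this.browser = browser;
    }

    @Parameters
    public static Collection<Object[]> browsers() {
        return Arrays.asList(new Object[][] { { "*firefox" }, { "*iexplore" } });
    }

    @Test
    public void startsBrowser() {
        System.out.println("running against " + browser);
        // start the Selenium driver for 'browser' and do the real checks here
    }
}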

Not sure how it handles output .. :-).

Thursday, September 22, 2011

Run jmeter from eclipse

Download jmeter source and binaries: http://archive.apache.org/dist/jakarta/jmeter/binaries/jakarta-jmeter-2.3.4.zip http://archive.apache.org/dist/jakarta/jmeter/source/jakarta-jmeter-2.3.4_src.zip

Unpack the JMeter source archive and rename eclipse.classpath to .classpath. Add a .project file to the same directory:
<?xml version="1.0" encoding="UTF-8"?>
<projectDescription>
  <name>jakarta-jmeter-2.3.4</name>
  <comment></comment>
  <projects>
  </projects>
  <buildSpec>
    <buildCommand>
      <name>org.eclipse.jdt.core.javabuilder</name>
      <arguments>
      </arguments>
    </buildCommand>
  </buildSpec>
  <natures>
    <nature>org.eclipse.jdt.core.javanature</nature>
  </natures>
</projectDescription>
Now import the source code as an Eclipse project.

Add all libraries of the binary distribution (lib/*.jar) to the new project's lib dir.
Add all files from the binary distribution's bin directory (bin/**/*), except ApacheJmeter.jar, to the new project's bin dir.


Clean eclipse project.
Run ant package on the build.xml file to create the RMI jars.


Create a launcher for org.apache.jmeter.NewDriver and set arguments/working directory to: ${workspace_loc:jakarta-jmeter-2.3.4/bin}.
When you start the launcher, jmeter comes up.

With the same configuration you can also debug. Add a breakpoint, for instance in org.apache.jmeter.engine.StandardJMeterEngine.runTest().
Start your debug session now and run an example script. Execution will stop there when you start a test run.

To debug your own project (with sampler for instance):
  • make your own project dependent on jakarta-jmeter project
  • check that all jars from jakarta-jmeter project are exported
  • duplicate the launcher of above, but replace project by your own now
  • make sure your working directory is still pointing to jakarta-jmeter project's bin directory
  • check that your own src directory of sampler is on the classpath of the launcher
  • launch jmeter with this project's launcher
  • add debug breakpoints to sampler's code

Debugging against jmeter

Inside jmeter startup script, add:
set DEBUGJDWP=-agentlib:jdwp=transport=dt_socket,address=localhost:9009,server=y,suspend=y
set ARGS=%DUMP% %HEAP% %NEW% %SURVIVOR% %TENURING% %EVACUATION% %RMIGC% %PERM% %DDRAW% %DEBUGJDWP%
Now start JMeter. It will hang and wait for the debugger.

In Eclipse, add a new Remote Java Application debug configuration, attach the JMeter source code to it, and connect. JMeter will start now.

Attach jmeter source code. Lookup class:
org.apache.jmeter.engine.StandardJMeterEngine
Set debug point in runTest() method and start your script.

Wednesday, August 03, 2011

Good principles of test automation

Core principles of good tests
When you start automating your tests, there are a few important principles to stick to.

Reliable
First, your tests must be reliable. This means that they must be repeatable and always give the same results. They should not contain any environment dependencies, and they should be executable in any environment. To get this working, all hard-coded settings must be configurable, a Maven or Ant script is required to build and configure the test environment, and all code, test data and configuration scripts must be checked into the source code repository. Having reliable tests that run everywhere and always also means that your tests must take care to set up the necessary test data and clean it up again afterwards.


Fast
Next, it would be nice if your tests were fast. Over the lifetime of a project you will build up ever more tests; with every sprint you will need to run more of them. You can't run all your tests on every check-in, so for certain groups of tests you need to schedule execution at less frequent intervals.

Self explaining tests
Your tests should also be easy to understand and produce clean reports. Easy to understand means that an outsider can quickly grasp what the test is about. Proper naming helps a lot here; additionally you can add documentation tags to the comments. It also helps to stick to the domain language so that it is easy to map test cases to use cases.



Clean reports
Test reports should have a clean baseline. Try to avoid including known bugs, as they clog up the test results and make new bugs less visible. Also, add meaningful messages to your asserts so that they help in finding the root cause of a failure.

Continuous integration
Furthermore, you want your tests to be integrated into the continuous integration build process. You will probably want to create several groups and have some of them executed more frequently than others. Once they run automatically, you are sure that they actually get executed, and you also make sure that they are really reproducible. Before you add your tests to CI, though, you must make sure that your test reports have a clean baseline; otherwise people will ignore the reports.


Continuous integration of test environment setup
Deployment and execution of tests in CI environments is the easy part. It may be far more difficult to set up and configure your test environment. This includes setting up your application server, initialising your database, starting the application server, deploying your application, and configuring both the application server and the application. This requires a lot of specialist knowledge.
One step further, you may also want to mock the proxies to external connections, for instance a credit card provider. You don't want to send the payments you generate in tests to a real provider, so you need to mock the provider's functionality and set up your test environment with this mock interface instead of the real one.

Testability
A last point of attention concerns testability. Some of your application's functionality may be difficult to test because you can't reach it with your testing code. An example is the status of an object you have just changed, which can only be checked through a UI. As a tester you would like direct access to this functionality so that you can use it in the asserts of the test. These testability features are additional requests to the developers: they have to build them into the code just for the sake of testing. Other important testability examples are fixed IDs for HTML elements, or some delete functionality to clean up test-generated objects.

Test automation in agile

Test automation importance
With our change to agile, we have made an important paradigm shift. In the old days we would stick to our planned roadmap, and compromise either on delivery dates or software quality. In Agile software development we are willing to compromise on scope, but not on software quality or delivery dates. This means in short: no code goes out untested.
 
What tests to execute?
It is nearly impossible to test everything. There are so many types of tests, like: regression, story acceptance, developer, integration, usability, performance, security, upgrade, user acceptance etc. In our release planning, we must make a conscious decision which of those types we want to execute. Some of the test types may require additional experts, some of the tests may not be relevant, some may be just too costly.  

The usual scrum tests
Within our sprints, we tend to stick to regression tests, story acceptance tests, developer tests and integration tests. Developer tests are internal component tests that test a component directly, without the need for other components. Integration tests review the interaction between two or more components. Story acceptance tests verify the acceptance criteria. Regression tests are the story acceptance tests of previous sprints and may also include parts of the integration tests.
 
Which tests to automate?
For some tests it may not make sense to automate. Will the test ever be executed again? If not, automating it will probably not pay off. Also consider the cost of test automation versus the time needed to test manually. Especially for UI tests, manual testing may be cheaper: drag and drop, and visual checks of the result, are very hard to automate. For normal web forms or low-level service tests, automation is efficient.

What are the risks of not automating?
It is likely that the tests we decided not to automate will not be executed in future sprints. So future sprints may break existing functionality without us knowing it. With less test coverage, we become afraid to refactor our code; in the extreme case, we end up with an application that we don't dare to touch. So it is very important to have automated regression tests with good coverage.