Thursday, December 12, 2019

Problem: Eclipse - An internal error occurred during "Importing Maven projects": java.lang.NullPointerException

Got an NPE when importing or force-updating a Maven project?

 

This is most likely an Eclipse cache problem triggered by changes to your POM file.

To solve the problem, you will need to:
1. Close Eclipse.
2. Delete the .project file and the .settings folder under your project folder.
3. Run 'mvn eclipse:clean' in a terminal under your project folder.
4. Delete the Eclipse workspace file or the .metadata folder under the workspace.
5. Reopen Eclipse with the '-clean' option.
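On a Unix-like system, the steps above can be sketched as shell commands (the workspace location is illustrative; adjust it to your machine):

```shell
# Run from your project folder after closing Eclipse.
WORKSPACE="${WORKSPACE:-$HOME/workspace}"   # your Eclipse workspace (illustrative path)

rm -f  .project                 # stale Eclipse project descriptor
rm -rf .settings                # stale per-project settings
mvn eclipse:clean || true       # strip remaining Eclipse metadata (ignored if mvn is absent)
rm -rf "$WORKSPACE/.metadata"   # drop the workspace-level caches
# Finally, relaunch Eclipse with its OSGi caches cleared:
#   eclipse -clean
```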

The problem should now be resolved.

Thursday, February 21, 2019

How to use array input parameters with ForEach Controller in JMeter

Running calls based on an input array is a very basic and common requirement in JMeter scripts. The ForEach Controller iterates through an array of variables and invokes the actions below it for each value.

In this tutorial, we will cover the following cases of using the ForEach Controller.

Case A: Convert an input string and use it in the ForEach Controller
Case B: Parse a JSON response and use it in the ForEach Controller

Definition of ForEach Controller

Before we start, let's look at the official definition of the ForEach Controller.
A ForEach controller loops through the values of a set of related variables. When you add samplers (or controllers) to a ForEach controller, every sample (or controller) is executed one or more times, where during every loop the variable has a new value. The input should consist of several variables, each extended with an underscore and a number. Each such variable must have a value. So for example when the input variable has the name inputVar, the following variables should have been defined:
  • inputVar_1 = wendy
  • inputVar_2 = charles
  • inputVar_3 = peter
  • inputVar_4 = john
Note: the "_" separator is now optional.
The ForEach Controller does not run any samples if inputVar_1 is null. This would be the case if the Regular Expression returned no matches.

From the definition we know that the ForEach Controller accepts a set of variables sharing the same prefix and formatted as Name_IndexNumber, where the index starts from 1. It passes each value in turn to all the Samplers inside the loop, and it will not run at all if the first value is null.

Case A: Convert an input string and use it in the ForEach Controller

Suppose there is an input value "a,b,c,d"; how can we use it in the ForEach Controller? First, we need to parse the input and store it as a set of variables.
Here is the JSR223 Sampler "Sample 3" that we use in this tutorial:
// Read the input property (e.g. "a,b,c,d") and split it into an array
String inputValue = (String) props.get("inputValue");
log.info(inputValue);
String[] testvars = inputValue.split(",");
// Store the whole array as a single JMeter object variable
vars.putObject("inputVarA", testvars);

// Also expose each value as inputVarB_1, inputVarB_2, ... (index starts at 1)
String[] testv = (String[]) vars.getObject("inputVarA");
for (int i = 0; i < testv.length; i++) {
    vars.put("inputVarB_" + (i + 1), testv[i]);
}
It reads the property inputValue and splits it into a string array. inputVarA is stored as a single object in JMeter, while in the for loop each value is stored in a separate variable named inputVarB_[index]. We then create two ForEach Controllers; the first one uses inputVarA


And the second one uses inputVarB.
Run it, and we will see:

The ForEach loop using inputVarB runs successfully, while the one using inputVarA does not run. This proves that the ForEach Controller only accepts input in the Name_IndexNumber format.

Case B: Parse a JSON response and use it in the ForEach Controller

Then, what if we want to use a JSON array as the input for the ForEach Controller? After all, JSON is a very common response format for web services.

Suppose we have an API that returns the response below
[{"id":1, "attributeName":"a"},{"id":2, "attributeName":"b"},{"id":3, "attributeName":"c"},{"id":4, "attributeName":"d"}]
And we want to use attributeName as the input for the ForEach Controller.
To extract all the attributeName values, we need to use the JSON Extractor.
Then use the extracted variable as the input of the ForEach Controller.
Run it, and you will see the values of attributeName used in the ForEach loop.
Some may ask how the ForEach Controller recognizes the extracted variable, since there is no step that appends the underscore and number to it. This is because the JSON Extractor automatically stores the extracted values in the VariableName_Index format.
Use a JSR223 PostProcessor to verify this:
log.info("log for input vars_1: " + vars.get("inputVar_1"));
log.info("log for input vars list: " + vars.getObject("inputVar"));
And we will get the log as below at debug view:
 INFO  - jmeter.extractor.JSR223PostProcessor: log for input vars_1: a
 INFO  - jmeter.extractor.JSR223PostProcessor: log for input vars list: null
For more information about how the JSON Extractor works, please refer to the doc here.





Thursday, December 20, 2018

How to run the shell scripts through the Jenkins Pipeline

It is a very common requirement to run shell scripts as a step of a Jenkins Pipeline. Today I would like to introduce the Jenkins plugin "SSH Pipeline Steps", which makes it much easier to call a shell script from your Jenkins pipeline. It can upload/download files to and from remote machines, and it can also run commands or shell scripts on those machines. Each function opens a session to do the work and closes the session when done.

Below are the functions in this plugin:
  • sshCommand: Executes the given command on a remote node.
  • sshScript: Executes the given shell script on a remote node.
  • sshGet: Gets a file/directory from the remote node to current workspace.
  • sshPut: Puts a file/directory from the current workspace to remote node.
  • sshRemove: Removes a file/directory from the remote node.
Let's see how we can use it in a Jenkins Pipeline script.

step 1. Install the plugin into your Jenkins.
step 2. Create a new pipeline job.


step 3. Choose definition as Pipeline script, where we can test scripts with this plugin


step 4. Create the pipeline script

node {
    stage('test plugin') {

    }
}

step 5. To use the plugin functions, we first need to create a remote variable.

node {
    def remote = [:]
    remote.name = 'testPlugin'
    remote.host = 'remoteMachineName'
    remote.user = 'loginuser'
    remote.password = 'loginpassword'

    stage('test plugin') {

    }
}

The remote variable is a key/value map; it stores the connection information that the functions will use.

step 6. Call the remote functions.

node {
    def remote = [:]
    remote.name = 'testPlugin'
    remote.host = 'remoteMachineName'
    remote.user = 'loginuser'
    remote.password = 'loginpassword'

    stage('test plugin') {
        sshPut remote: remote, from: 'test.txt', into: '.'
        sshCommand remote: remote, command: "ls"
    }
}
    
In this sample, we first upload the test.txt file from the Jenkins machine to the remote machine at /home/loginuser, and then run the shell command ls to check whether the file exists.

This is a basic sample of using SSH Pipeline Steps; for further information, please refer to their Git repository.

Little tip: because each plugin command closes its session and disconnects from the machine, any process it started will be terminated automatically as well. To keep a process running for the whole pipeline lifecycle, you will need to use the 'nohup' command. In the next chapter, I will talk about how 'nohup' works.
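As a sketch of the idea (the start script and log file names are hypothetical), the command string you pass to sshCommand would look like the line below, so the process survives after the SSH session disconnects:

```shell
# nohup : ignore the hangup signal sent when the SSH session closes
# > log : redirect output so no open pipe keeps the session waiting
# &     : background the process so the command returns immediately
# The sh -c '...' stands in for a real start script such as ./start-server.sh
nohup sh -c 'echo server up; sleep 1' > server.log 2>&1 &
SERVER_PID=$!
echo "started with pid $SERVER_PID"
wait "$SERVER_PID"   # only for this demo; a real pipeline step would just return
```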

Tuesday, October 30, 2018

Code Coverage - How to do on-the-fly instrumentation for JaCoCo

Code coverage shows which lines of the code have been executed by the tests. High test coverage cannot guarantee a high-quality project, but it does suggest a lower chance of undetected software bugs.

For Java projects, the best way to measure code coverage is by instrumenting the code, and there are several ways to do the instrumentation.

Offline instrumentation vs. on-the-fly instrumentation

Offline instrumentation injects the data-collector calls into the source code or byte code. It usually works side by side with the project and does the modification at compile time. In my previous blogs, I described how to configure JaCoCo for Maven projects, which is a typical example of offline instrumentation.

On-the-fly instrumentation, by contrast, instruments at a lower level, during class loading, usually through a Java agent. Today we will talk about how to do on-the-fly instrumentation with JaCoCo.

Who benefits from on-the-fly instrumentation

On-the-fly instrumentation is very useful if your integration tests are not located with the dev project. The instrumentation happens on the server machine, which can give out a code coverage report after the integration tests run against it.

How to do the On-the-fly instrumentation for Jacoco

JaCoCo uses the Java agent mechanism for in-memory pre-processing of all class files during class loading. For details, please refer to the official documentation.

Step 1. Copy the JaCoCo agent jar from the official site to your machine.
Step 2. Set JAVA_OPTS to include the JaCoCo agent configuration. Let's walk through the configuration using this sample:

"-javaagent:/jacocoHomePath/jacocoagent.jar=output=tcpserver,address=testerMachineAddress,port=8084,includes=com.testjacoco.*"

In this configuration, we set up 4 parameters: output, address, port and includes. It sets up the JaCoCo agent as a TCP socket server, so external tools such as jacococli can connect through the configured port to dump the code coverage data. You can use includes/excludes to control which files are counted in the code coverage data.

The official site mentions that JaCoCo provides three different modes for execution data output:
  • File System: At JVM termination execution data is written to a local file.
  • TCP Socket Server: External tools can connect to the JVM and retrieve execution data over the socket connection. Optional execution data reset and execution data dump on VM exit is possible.
  • TCP Socket Client: At startup the JaCoCo agent connects to a given TCP endpoint. Execution data is written to the socket connection on request. Optional execution data reset and execution data dump on VM exit is possible.
If you don't know how to set up the Java opts for a web service, this document has details for Tomcat and JBoss.
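For Tomcat, for instance, a common approach is to append the agent to CATALINA_OPTS in bin/setenv.sh. The snippet below is a sketch using the illustrative agent path, address, port and package pattern from the sample above:

```shell
# bin/setenv.sh -- sourced by catalina.sh at startup
CATALINA_OPTS="$CATALINA_OPTS -javaagent:/jacocoHomePath/jacocoagent.jar=output=tcpserver,address=testerMachineAddress,port=8084,includes=com.testjacoco.*"
export CATALINA_OPTS
```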

Step 3. Restart the service and run the tests against it.
Step 4. Collect the code coverage using the jacococli tool. A sample command is below:

"java -jar ~/Downloads/temp/lib/jacococli.jar dump --address testerMachineAddress --port 8084 --destfile jacocoTest.exec"

Now you have the coverage data file on your local machine.
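From that exec file you can also generate an HTML report with the same jacococli jar. This is a sketch only: the class and source paths are illustrative, and the report subcommand must be pointed at the exact classes that produced the dumped data.

```shell
# Arguments for jacococli's "report" subcommand: the exec file from the dump,
# the matching compiled classes and sources, and the HTML output directory.
set -- report jacocoTest.exec \
       --classfiles  target/classes \
       --sourcefiles src/main/java \
       --html coverage-report
if [ -f jacococli.jar ] && [ -f jacocoTest.exec ]; then
    java -jar jacococli.jar "$@"
else
    echo "skipping: run this next to jacococli.jar and the dumped exec file"
fi
```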


Sunday, December 10, 2017

Code Coverage - JaCoCo with Maven multi-module project

In this post, we will see how to configure the JaCoCo for Maven multi-module projects.
Previously, JaCoCo did not support multi-module Maven projects, but recent versions have added this support. JaCoCo provides a sample on Git under its Maven plugin tests at it-report-aggregate. Let's use this sample project to see how it works.

First, download the code using the command
svn checkout https://github.com/jacoco/jacoco/trunk/jacoco-maven-plugin.test/it/it-report-aggregate

This is a parent-child project. It contains four modules: child1, child1-test, child2 and report. The report module is used to create a centralized code coverage report.


By default, this project is a child project inside the JaCoCo repository; its POM is configured with "setup-parent" as its parent project. To make it work as an independent project, we need a few small modifications.
First, remove the parent configuration in it-report-aggregate/pom.xml and add the groupId and version as below (remove the <parent> block and keep the lines after it)


  <parent>
    <groupId>jacoco</groupId>
    <artifactId>setup-parent</artifactId>
    <version>1.0-SNAPSHOT</version>
  </parent>

  <groupId>jacoco</groupId>
  <artifactId>it-report-aggregate</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>

Second, copy all the dependencies and plugin configurations from jacoco/jacoco-maven-plugin.test/it/setup-parent/pom.xml into this pom.xml.

Third, change "@project.groupId@" to "org.jacoco" in the plugin configuration of this pom.xml and also in the one under it-report-aggregate/report.

Run the command "mvn clean verify" and the report will be generated under report/target/site/jacoco-aggregate. However, it only includes the report for child1 and child1-test. Check the file report/pom.xml: you will see that the modules child1, child1-test and child2 are configured as dependencies with different scopes.

  <dependencies>
    <dependency>
      <groupId>jacoco</groupId>
      <artifactId>child1</artifactId>
      <version>${project.version}</version>
      <scope>compile</scope>
    </dependency>
    <dependency>
      <groupId>jacoco</groupId>
      <artifactId>child1-test</artifactId>
      <version>${project.version}</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>jacoco</groupId>
      <artifactId>child2</artifactId>
      <version>${project.version}</version>
      <scope>runtime</scope>
    </dependency>
  </dependencies>

Change the scope of child2 to compile and run "mvn clean verify" again; this time you will see the report generated for all three modules.


Let's step back and see what configuration was done.
Checking all the pom.xml files in the projects, we find that only the pom files under it-report-aggregate and it-report-aggregate/report contain JaCoCo configuration.
  • it-report-aggregate/pom.xml: it defines the JaCoCo plugin with the "prepare-agent" goal. All child modules inherit this configuration, which is why you will see jacoco.exec generated under every child module's target folder.
  <build>
    <plugins>
      <plugin>
        <groupId>@project.groupId@</groupId>
        <artifactId>jacoco-maven-plugin</artifactId>
        <executions>
          <execution>
            <id>prepare-agent</id>
            <goals>
              <goal>prepare-agent</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
  • it-report-aggregate/report/pom.xml: it defines the other three modules as dependencies and adds the JaCoCo plugin with the "report-aggregate" goal. This is the feature JaCoCo added in version 0.7.7 to support the multi-module case.

  <dependencies>
    <dependency>
      <groupId>jacoco</groupId>
      <artifactId>child1</artifactId>
      <version>${project.version}</version>
      <scope>compile</scope>
    </dependency>
    <dependency>
      <groupId>jacoco</groupId>
      <artifactId>child1-test</artifactId>
      <version>${project.version}</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>jacoco</groupId>
      <artifactId>child2</artifactId>
      <version>${project.version}</version>
      <scope>runtime</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.jacoco</groupId>
        <artifactId>jacoco-maven-plugin</artifactId>
        <executions>
          <execution>
            <id>report-aggregate</id>
            <phase>verify</phase>
            <goals>
              <goal>report-aggregate</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>

The JaCoCo configuration should now be clear; here are the summary steps.
1. Add the JaCoCo prepare-agent plugin execution to the parent POM.
2. Create a separate module in the project and declare the parent as its parent.
3. Add all the other modules as dependencies of this new module.
4. Add the JaCoCo report-aggregate plugin execution to the new module's POM.

Refer to the official wiki for more information.

Work-Arounds
Life is much better when we have a backup plan B. The JaCoCo Ant tasks provide an alternative way to create a centralized report for a Maven multi-module project. To use the Ant tasks from a Maven plugin, we need the "maven-antrun-plugin". Let's continue with the same sample project to see how to make it work.

Create a new module named "report2" to demonstrate this method. In its pom.xml, declare the same parent and dependencies as the report module. Then add the maven-dependency-plugin to copy the jacocoant.jar file locally.

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <!-- Copy the ant tasks jar. Needed for ts.jacoco.report-ant . -->
    <execution>
      <id>jacoco-dependency-ant</id>
      <goals>
        <goal>copy</goal>
      </goals>
      <phase>process-test-resources</phase>
      <inherited>false</inherited>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>org.jacoco</groupId>
            <artifactId>org.jacoco.ant</artifactId>
            <version>0.7.6.201602180812</version>
          </artifactItem>
        </artifactItems>
        <stripVersion>true</stripVersion>
        <outputDirectory>${basedir}/target/jacoco-jars</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>

With the jar copied, configure the Ant task; it will use the local jacocoant.jar to compute the coverage. The configuration is very similar to the standalone JaCoCo Ant task.

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <version>1.6</version><!--$NO-MVN-MAN-VER$ -->
  <inherited>false</inherited>
  <executions>
    <execution>
      <phase>post-integration-test</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <target>
          <echo message="Generating JaCoCo Reports" />
          <taskdef name="report" classname="org.jacoco.ant.ReportTask">
            <classpath path="../target/jacoco-jars/org.jacoco.ant.jar" />
          </taskdef>
          <report>
            <executiondata>
              <fileset dir="../child1/target">
                <include name="jacoco.exec" />
              </fileset>
              <fileset dir="../child1-test/target">
                <include name="jacoco.exec" />
              </fileset>
              <fileset dir="../child2/target">
                <include name="jacoco.exec" />
              </fileset>
            </executiondata>
            <structure name="rmdaalser Coverage Project">
              <group name="rmdaalser">
                <sourcefiles encoding="UTF-8">
                  <fileset dir="../child1/src/main/java" />
                  <fileset dir="../child2/src/main/java" />
                </sourcefiles>
                <classfiles>
                  <fileset dir="../child1/target/classes" />
                  <fileset dir="../child2/target/classes" />
                </classfiles>
              </group>
            </structure>
            <html destdir="${basedir}/target/jacoco-coverage-report/html" />
          </report>
        </target>
      </configuration>
    </execution>
  </executions>
  <dependencies>
    <dependency>
      <groupId>org.jacoco</groupId>
      <artifactId>org.jacoco.ant</artifactId>
      <version>0.7.6.201602180812</version>
    </dependency>
  </dependencies>
</plugin>

Run "mvn clean verify", you will see the coverage report under report2/target/jacoco-coverage-report/html.











Wednesday, December 6, 2017

Problem: JaCoCo - Skipping JaCoCo execution due to missing execution data file

When configuring JaCoCo with a Maven project, it is very common to hit the message "Skipping JaCoCo execution due to missing execution data file" when running the tests. It is not an error and the build will still succeed; however, JaCoCo will not generate the report because it could not find the jacoco.exec file.

There are many possible reasons for this problem, but the most common one is that the JaCoCo agent failed to be configured onto the test command line. Use the command "mvn -X clean verify" to check the debug logs. At the "TESTS" step, you will be able to see a line starting with "Forking command line".

If the JaCoCo agent is configured correctly, you will be able to see similar logs as below.

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

[DEBUG] boot classpath:  /Users/jhuang8/.m2/raptor2/org/apache/maven/surefire/surefire-booter/2.17/surefire-booter-2.17.jar  /Users/jhuang8/.m2/raptor2/org/apache/maven/surefire/surefire-api/2.17/surefire-api-2.17.jar  /Users/jhuang8/Documents/workspace/JaCoCoPractice/target/test-classes  /Users/jhuang8/Documents/workspace/JaCoCoPractice/target/classes  /Users/jhuang8/.m2/raptor2/junit/junit/4.8.1/junit-4.8.1.jar  /Users/jhuang8/.m2/raptor2/org/apache/maven/surefire/surefire-junit4/2.17/surefire-junit4-2.17.jar
[DEBUG] boot(compact) classpath:  surefire-booter-2.17.jar  surefire-api-2.17.jar  test-classes  classes  junit-4.8.1.jar  surefire-junit4-2.17.jar

Forking command line: /bin/sh -c cd /Users/jhuang8/Documents/workspace/JaCoCoPractice && /Library/Java/JavaVirtualMachines/jdk1.8.0_144.jdk/Contents/Home/jre/bin/java -javaagent:/Users/jhuang8/.m2/raptor2/org/jacoco/org.jacoco.agent/0.7.6.201602180812/org.jacoco.agent-0.7.6.201602180812-runtime.jar=destfile=/Users/jhuang8/Documents/workspace/JaCoCoPractice/target/jacoco.exec -Xmx1024m -XX:MaxPermSize=512m -jar /Users/jhuang8/Documents/workspace/JaCoCoPractice/target/surefire/surefirebooter3359878682012651006.jar /Users/jhuang8/Documents/workspace/JaCoCoPractice/target/surefire/surefire8553706350128891199tmp /Users/jhuang8/Documents/workspace/JaCoCoPractice/target/surefire/surefire_04828513279405888630tmp

The JaCoCo part is the -javaagent option in the forked command line above.

So how do we add this JaCoCo agent info? According to the official site for jacoco:prepare-agent, you can simply add @{argLine} to your maven-surefire-plugin configuration:

<plugin>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <forkMode>once</forkMode>
    <forkCount>1</forkCount>
    <argLine>@{argLine} -Xmx1024m</argLine>
    <reuseForks>false</reuseForks>
  </configuration>
</plugin>

The default property name is argLine, but sometimes the JaCoCo configuration gets overwritten by another argLine setup or plugin in the POM. A way to avoid this is to configure a new property name in the JaCoCo plugin and reference it in the maven-surefire-plugin. Below is a sample configuration.

Set a new propertyName at JaCoCo plugin

<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.7.6.201602180812</version>
  <executions>
    <execution>
      <id>prepare-agent</id>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
      <configuration>
        <propertyName>jaCoCoArgLine</propertyName>
      </configuration>
    </execution>
    <execution>
      <id>post-unit-test</id>
      <phase>test</phase>
      <goals>
        <goal>report</goal>
      </goals>
    </execution>
  </executions>
</plugin>

and reference the new property name in the maven-surefire-plugin like below

<argLine>@{jaCoCoArgLine} -Xmx1024m</argLine>
This time, the Maven logs will show jaCoCoArgLine being set by the JaCoCo Maven plugin during the prepare-agent stage, and you will be able to see the jacoco.exec file generated at the configured path.

[INFO] --- jacoco-maven-plugin:0.7.6.201602180812:prepare-agent (prepare-agent) @ JaCoCoPractice ---
[INFO] jaCoCoArgLine set to -javaagent:/Users/jhuang8/.m2/raptor2/org/jacoco/org.jacoco.agent/0.7.6.201602180812/org.jacoco.agent-0.7.6.201602180812-runtime.jar=destfile=/Users/jhuang8/Documents/workspace/JaCoCoPractice/target/jacoco.exec

If the problem still shows up, double-check the maven-surefire-plugin; according to the official Maven plug-in documentation, it has the restriction below.

When using the maven-surefire-plugin or maven-failsafe-plugin you must not use a forkCount of 0 or set the forkMode to never, as this would prevent the execution of the tests with the javaagent set and no coverage would be recorded.