Finding the Pesky Todo Javadoc Tags

A common pattern while developing code is to drop a TODO in the code. FIXME and XXX are other markers used to indicate that something has to be done to connect the pieces or finish an implementation.

package test;

public class Test {
    // TODO add an implementation.
}
At some point, you lose track of how many TODOs there are, and those pesky tags are forgotten. I found that the Maven taglist plugin finds them.

I ran the command

mvn taglist:taglist -f fhir-parent/pom.xml -Daggregate=true
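If you just want a quick look without running Maven, a plain grep also surfaces the markers. A minimal sketch; the sample file created here is purely illustrative, so point grep at your own source root:

```shell
# Quick look without Maven: grep the source tree for the common markers.
# The sample file is illustrative; replace 'src' with your module's source root.
mkdir -p src/test
printf '// TODO add an implementation.\n' > src/test/Test.java
grep -rn -E 'TODO|FIXME|XXX' src
```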

I wanted to transform the report, so I added the following to the module's target/taglist.xml:

<?xml-stylesheet type="text/xml" href="#stylesheet"?>
<!DOCTYPE report [
<!ATTLIST xsl:stylesheet id ID #REQUIRED>
]>
<xsl:stylesheet id="stylesheet" version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/">
    <table>
      <xsl:for-each select="report/tags/tag/files/file/comments/comment">
        <tr>
          <td><xsl:value-of select="../../../../@name"/></td>
          <td><xsl:value-of select="../../@name"/></td>
          <td><xsl:value-of select="lineNumber"/></td>
          <td><xsl:value-of select="comment"/></td>
        </tr>
      </xsl:for-each>
    </table>
  </xsl:template>
  <xsl:template match="xsl:stylesheet"/>
</xsl:stylesheet>

When loaded in Firefox, it generates a table, and you can open an issue to address or fix each TODO.

I hope this helps you out. (GIST below)


Random EAR Projects

I had a similar experience and ran across .

m2e-wtp can also be disabled by setting the m2e.wtp.activation property to false in your project's pom.xml.
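In the pom.xml that property looks like this (a minimal sketch of the properties section):

```xml
<properties>
  <!-- tell m2e-wtp not to activate for this project -->
  <m2e.wtp.activation>false</m2e.wtp.activation>
</properties>
```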

I created a new workspace and cloned a fresh copy of my code into a temp directory. I then updated the parent pom.xml with the following:


I then imported via Existing Maven Projects, and no extra EAR projects were created.

The JAX-RS, JPA, and JSF configurators can be enabled or disabled at the workspace level from Window > Preferences > Maven > Java EE Integration.

Lessons Learned for 20-SEPT-2019

XStream Illegal Reflective Access

If you are compiling with AdoptOpenJDK 11, you might hit an "Illegal reflective access" warning from com.thoughtworks.xstream.converters.collections.TreeMapConverter.

If your build and your dependencies don't show xstream, check your plugins:
mvn dependency:resolve-plugins -f fhir-parent/pom.xml | grep -B20 -i xst
[INFO] Plugin Resolved: maven-war-plugin-3.2.3.jar
[INFO] Plugin Dependency Resolved: maven-plugin-api-3.0.jar
[INFO] Plugin Dependency Resolved: maven-core-3.0.jar
[INFO] Plugin Dependency Resolved: maven-archiver-3.4.0.jar
[INFO] Plugin Dependency Resolved: commons-io-2.5.jar
[INFO] Plugin Dependency Resolved: plexus-archiver-4.1.0.jar
[INFO] Plugin Dependency Resolved: plexus-interpolation-1.25.jar
[INFO] Plugin Dependency Resolved: xstream-1.4.10.jar

I upgraded to the latest maven-war-plugin, and the problem was solved.
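Pinning the plugin version happens in the parent pom. Something like the following sketch; the version shown is the one from the resolve-plugins listing above, so substitute the current release:

```xml
<build>
  <pluginManagement>
    <plugins>
      <!-- pin a newer maven-war-plugin; pick the current release -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-war-plugin</artifactId>
        <version>3.2.3</version>
      </plugin>
    </plugins>
  </pluginManagement>
</build>
```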

Depgraph Maven Plugin

Use the depgraph-maven-plugin to aggregate the output as an image. (I used this in the FHIR project to see if there were any unknown dependencies.)

 mvn com.github.ferstl:depgraph-maven-plugin:3.3.0:aggregate -f fhir-parent/pom.xml -DshowAllAttributesForJson=true -DcreateImage=true -DreduceEdges=true -DmergeClassifiers=true -DmergeTypes=true -Dexcludes=testng:: -DgraphFormat=json

Extract the Certs (All of Them)

A quick way to extract the leaf certificate, the intermediate CA, and the root CA from a host:

echo "" | openssl s_client -showcerts -prexit -connect HOSTNAME:443 2> /dev/null | sed -n -e '/BEGIN CERTIFICATE/,/END CERTIFICATE/ p'

You'll get the PEM-encoded certificates as output (just capture it into a file you can use).
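If you need each certificate in its own file, a short awk split works. A sketch; the dummy two-certificate chain below stands in for real openssl output, and the file names are mine:

```shell
# Stand-in for the captured openssl output: a chain of two PEM certs.
printf -- '-----BEGIN CERTIFICATE-----\nAAA\n-----END CERTIFICATE-----\n-----BEGIN CERTIFICATE-----\nBBB\n-----END CERTIFICATE-----\n' > chain.pem

# Split the chain into numbered files: cert.1.pem, cert.2.pem, ...
awk '/BEGIN CERTIFICATE/{n++} n{print > ("cert." n ".pem")}' chain.pem
ls cert.*.pem
```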

Travis APIs

Linting the Travis YAML

curl -X POST -H "Accept: application/vnd.travis-ci.2+json" -H "Authorization: token <MYTOKEN>" --data-binary "@./.travis.yml" -v


Example response:

{
  "lint": {
    "warnings": [
      {"key": ["deploy"], "message": "dropping \"file\" section: unexpected sequence"},
      {"key": [], "message": "value for \"addons\" section is empty, dropping"},
      {"key": ["addons"], "message": "unexpected key \"apt\", dropping"}
    ]
  }
}

A clean file lints with an empty warnings array:

{
  "lint": {
    "warnings": []
  }
}

More detail at

SonarQube – Maven and Docker analysis

SonarLint is one tool I gravitate towards for inline analysis of my code in Eclipse. I have finally broken down and investigated using SonarQube with Maven, the heavyweight tool for evaluating code. It's exciting.

You need to pull the sonarqube Docker image. You can find more details at

:~/my-repo$ docker pull sonarqube
Using default tag: latest
latest: Pulling from library/sonarqube
b8f262c62ec6: Pull complete
377e264464dd: Pull complete
bde67c1ff89f: Pull complete
6ba84ddbf1b2: Pull complete
ee22adb378a6: Pull complete
41d339c20e4f: Pull complete
25c2c6b7a1f3: Pull complete
4b36ae3e85ab: Pull complete
1062305937e9: Pull complete
Digest: sha256:032ae6e1021533a3731d5c6c0547615dea8d888dcec58802f8db3a9bd6f26237
Status: Downloaded newer image for sonarqube:latest

Start the container with a localhost hostname using --hostname localhost:

$ docker run --hostname localhost -d --name sonarqube -p 9000:9000 sonarqube

Now that SonarQube is started, you can execute Maven to generate the report.

mvn clean verify jacoco:report-aggregate sonar:sonar -f my-parent/pom.xml -DskipTests -pl '!../my-big-model/'

Once you execute the Maven goals, you don't want to see any 'SKIPPED'. If you do, add clean and verify to the goals you pass to Maven.

[INFO] Analysis total time: 59.857 s
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for My-Project:
[INFO] my-maven-parent ...................... SUCCESS [01:01 min]
[INFO] my-maven-project .......................................... SUCCESS [ 1.193 s]
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:34 min
[INFO] Finished at: 2019-09-15T13:24:10-04:00
[INFO] ------------------------------------------------------------------------

For each project in the output, you see details about the execution; check that there are no WARNING or ERROR lines. If there are, you should check out some of the troubleshooting I did (at the end).

[INFO] ------------- Run sensors on module my-maven-project
[INFO] Sensor JavaSquidSensor [java]
[INFO] Configured Java source version (sonar.java.source): 8
[INFO] JavaClasspath initialization
[INFO] JavaClasspath initialization (done) | time=1ms
[INFO] JavaTestClasspath initialization
[INFO] JavaTestClasspath initialization (done) | time=0ms
[INFO] Java Main Files AST scan
[INFO] 1 source files to be analyzed
[INFO] Java Main Files AST scan (done) | time=51ms
[INFO] Java Test Files AST scan
[INFO] 1/1 source files have been analyzed
[INFO] 0 source files to be analyzed
[INFO] 0/0 source files have been analyzed
[INFO] Java Test Files AST scan (done) | time=1ms
[INFO] Sensor JavaSquidSensor [java] (done) | time=65ms
[INFO] Sensor JaCoCo XML Report Importer [jacoco]
[INFO] Sensor JaCoCo XML Report Importer [jacoco] (done) | time=0ms
[INFO] Sensor SurefireSensor [java]
[INFO] parsing [/repo-folder/my-maven-project/target/surefire-reports]
[INFO] Sensor SurefireSensor [java] (done) | time=1ms
[INFO] Sensor JaCoCoSensor [java]
[INFO] Sensor JaCoCoSensor [java] (done) | time=0ms
[INFO] Sensor JavaXmlSensor [java]
[INFO] 1 source files to be analyzed
[INFO] Sensor JavaXmlSensor [java] (done) | time=5ms
[INFO] 1/1 source files have been analyzed
[INFO] Sensor HTML [web]
[INFO] Sensor HTML [web] (done) | time=0ms
[INFO] Sensor XML Sensor [xml]
[INFO] 1 source files to be analyzed
[INFO] Sensor XML Sensor [xml] (done) | time=4ms
[INFO] 1/1 source files have been analyzed

Towards the end of the execution you see:

[INFO] ANALYSIS SUCCESSFUL, you can browse http://localhost:9000/dashboard?
[INFO] Note that you will be able to access the updated dashboard once the server has processed the submitted analysis report
[INFO] More about the report processing at http://localhost:9000/api/ce/task?id=AW01-ot7J-mI0tW_q5b5

Checking the Report Processing, I can see the successful result.

I can dig into the report to see various recommendations and errors.

I can re-run and see the differences on demand.  This tool is awesome.

Finally, stop the container.

docker stop d2c698884d4d01a527afd8f2231fcb6bbd514c5ed7c56d2dc3f7f7a758b4977d

Good luck, I hope this helps.


AST Out of Memory

If you see

Exception in thread "Report about progress of Java AST analyzer" java.lang.OutOfMemoryError: Java heap space

then set the memory boundaries:

export SONAR_SCANNER_OPTS="-Xmx3062m -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=128m"

Exclude a Project

If you hit an Out of Memory issue for specific projects, you can exclude them. Use -pl '!../my-big-model/' to skip the offending project (specifically if you have a parent in a different folder).

Missing Class

If you see

[WARNING] Classes not found during the analysis : [com.mypackage.MyClass]
[INFO] Java Test Files AST scan (done) | time=25ms

make sure you have clean and verify in the goal list (the byte code should exist); you can also use package and install.

Alternatively, if you see
[WARNING] The following dependencies could not be resolved at this point of the build but seem to be part of the reactor:
[WARNING] o com.mygroup:my-jar:jar:4.0.0-SNAPSHOT (compile)
[WARNING] Try running the build up to the lifecycle phase "package"

Then make sure you can package: mvn clean package -f my-parent/pom.xml -DskipTests


Migrating Source Code Git-to-Git

Migrating source code is a pain in the butt, I know. There are about 9 million variations, and one of interest to me: git to git.

There are a number of tools to clean up your git history and prepare to move.

  • Git and Scripting
  • BFG Repo Cleaner
  • Git-Python
  • JGit

I found Git-Python a bit cumbersome, BFG Repo Cleaner more than I needed/wanted, and Git / scripting too much work. After some prototyping, I opted for JGit from Eclipse and some Git know-how.

First, I switched to the source Git Repo branch I wanted to migrate and exported the commit list.

git rev-list HEAD > commits.txt

which results in




This commits.txt is useful down the line.

I am a Maven disciple, so I created a Maven Java project with Java 1.8 and the following dependencies:
















I used JGit to check the list of commits (note: the repo path here must end with .git).

try (Git git = Git.open(new File(SOURCE_GIT_REPO))) {
    System.out.println("Starting Branch is " + git.getRepository().getBranch());

    Iterator<RevCommit> iter = git.log().call().iterator();
    while (iter.hasNext()) {
        RevCommit commit = iter.next();
        String binSha = commit.getName();
        // process each commit SHA (log returns newest first)
    }
}

I flip the list around so I can process commits from OLDEST to NEWEST.
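On the command line, the same oldest-to-newest ordering comes straight out of rev-list with --reverse. A sketch in a throwaway repo (the /tmp path and commit messages are mine):

```shell
# Demo in a throwaway repo: rev-list prints newest-first; --reverse flips it.
mkdir -p /tmp/revdemo
cd /tmp/revdemo
git init -q .
git -c user.email=a@b.c -c user.name=a commit -q --allow-empty -m first
git -c user.email=a@b.c -c user.name=a commit -q --allow-empty -m second

git rev-list --reverse HEAD > commits-oldest-first.txt
# the first line is now the OLDEST commit
git log -1 --format=%s "$(head -1 commits-oldest-first.txt)"
```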


I used git log (LogCommand in JGit) to find all the times a FILE was modified and do custom processing:

try (Git git = Git.open(new File(REPO))) {
    LogCommand logCommand = git.log().add(git.getRepository().resolve(Constants.HEAD)).addPath(fileName.replace(REPO, ""));

    Set<String> years = new HashSet<>();
    for (RevCommit revCommit : logCommand.call()) {
        Instant instant = Instant.ofEpochSecond(revCommit.getCommitTime());
        // custom processing, e.g. collect the year of each modification
    }
}


To find the files specifically in the HEAD of the repo, walk the tree, get the files and paths, and put them in a List:

try (Git git = Git.open(new File("test/.git"))) {
    List<String> files = new ArrayList<>();

    Iterator<RevCommit> iter = git.log().call().iterator();
    if (iter.hasNext()) {
        RevCommit commit = iter.next();
        try (RevWalk walk = new RevWalk(git.getRepository())) {
            RevTree tree = walk.parseTree(commit.getId());
            try (TreeWalk treeWalk = new TreeWalk(git.getRepository())) {
                treeWalk.addTree(tree);
                treeWalk.setRecursive(true);
                while (treeWalk.next()) {
                    files.add(treeWalk.getPathString());
                }
            }
        }
    }
}

I built a history of changes.

try (Git git = Git.open(new File("test/.git"))) {
    Iterator<RevCommit> iter = git.log().call().iterator();
    while (iter.hasNext()) {
        RevCommit commit = iter.next();
        try (DiffFormatter df = new DiffFormatter(DisabledOutputStream.INSTANCE)) {
            df.setRepository(git.getRepository());
            df.setDiffComparator(RawTextComparator.DEFAULT);

            CommitHistoryEntry.Builder builder =
                    CommitHistoryEntry.builder().binsha(commit.getName());

            RevCommit[] parents = commit.getParents();
            if (parents != null && parents.length > 0) {
                // diff against the first parent (old side first, new side second)
                List<DiffEntry> diffs = df.scan(parents[0], commit);
                // record each DiffEntry in the builder
            } else {
                // the first commit has no parent: every file in the tree is an ADD
                try (RevWalk walk = new RevWalk(git.getRepository())) {
                    RevTree tree = walk.parseTree(commit.getId());
                    try (TreeWalk treeWalk = new TreeWalk(git.getRepository())) {
                        treeWalk.addTree(tree);
                        treeWalk.setRecursive(true);
                        while (treeWalk.next()) {
                            // record treeWalk.getPathString() as added
                        }
                    }
                }
            }
        }
    }
}

I implemented the Visitor pattern to optimize the modifications to the commit details and clean up any bad identity mappings (folks had many emails and names, which I unified), and cleaned up the email addresses.

Next, I created a destination git (all fresh and ready to go):

try (Git thisGit = Git.init().setDirectory(new File(REPO_DIR)).call()) {
    git = thisGit;
}

Make sure the path exists; it doesn't matter if you have files in it.

Commit the files in the git directory… you can commit without FILES!

CommitCommand commitCommand = git.commit();

// Set up the identity and date (multiply as a long to avoid int overflow)
Date aWhen = new Date(entry.getCommitTime() * 1000L);
PersonIdent authorIdent =
        new PersonIdent(entry.getAuthorName(), entry.getAuthorEmail(), aWhen, TimeZone.getDefault());
commitCommand.setAuthor(authorIdent);
commitCommand.setCommitter(authorIdent);
commitCommand.setMessage(entry.getMessage());
commitCommand.call();

Note, you can set the commit time to almost any point in time. As long as you don't sign it, it'll be OK. I don't recommend this as a general practice.

To grab a file, do a tree walk and resolve the object ID.

try (TreeWalk treeWalk = new TreeWalk(git.getRepository())) {
    treeWalk.addTree(tree);
    treeWalk.setRecursive(true);

    while (treeWalk.next()) {
        String fileNameWithRelativePath = treeWalk.getPathString();
        ObjectId objectId = treeWalk.getObjectId(0);
        ObjectLoader loader = git.getRepository().open(objectId);

        String fileOutput = GIT_OUTPUT + "/" + binSha + "/" + fileNameWithRelativePath;
        int last = fileOutput.lastIndexOf('/');
        String fileOutputDir = fileOutput.substring(0, last);
        File dir = new File(fileOutputDir);
        dir.mkdirs();

        // use the loader to read the file contents
        try (FileOutputStream out = new FileOutputStream(fileOutput)) {
            byte[] bytes = loader.getBytes();
            if (hasBeenModified(bytes, fileNameWithRelativePath)) {
                out.write(bytes);
                result = true;
            }
        }
    }
}



Note, I did check whether the file was a duplicate; it saved a couple of steps.

If you want to add files, you can set:



Git, in the background, builds the diffs for any file that is not specifically treated as binary in the .gitattributes file.
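For reference, marking types as binary in .gitattributes looks like this (the patterns are illustrative):

```
# treat these as binary: no text diffs, no newline conversion
*.jar binary
*.png binary
```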

For each commit, I loaded the file, checked it for stop-words, checked the copyright header, checked the file type, and compared it against the HEAD.

Tip: if you need to reset from a bad test:

git reset --hard origin # reset the branch

rm -rf .git # reset the repo (also be sure to remove the files)

To move the branch, you can execute:


git checkout <BRANCH_TO_MIGRATE>

git reset --hard origin

git pull

git gc --aggressive --prune=now

git push <REMOTE_URL>/<MyOrg>/<DEST_REPO>.git <BRANCH_TO_MIGRATE>:master

Note, I renamed the master branch in the repo beforehand. Voila. A 550M+ repo moved and cleaned up.

The repo is now migrated, and up-to-date.  I hope this helps you.


Rename Branch in Git


Rewrite History


BFG Repo Cleaner




Changing a Keystore and Key’s Password

Create a list of keystores

cat << EOF > keystore-list.txt
Iterate over the list to check status and process:

for KEYSTORE in $(cat keystore-list.txt)
do
    [ ! -f "$KEYSTORE" ] && echo NOT

    VAL=$(cat "$KEYSTORE" | wc -l)
    [ "${VAL}" -eq "1" ] && echo NOT_RIGHT

    # show the private key / trust key
    keytool -keystore "$KEYSTORE" -list -storepass ACTUAL_PASS 2>&1 | grep -v Warn | grep -v PKCS12 | grep -i PrivateKey
    keytool -keystore "$KEYSTORE" -list -storepass ACTUAL_PASS 2>&1 | grep -v Warn | grep -v PKCS12 | grep -i Trust
done

Change the Passwords for the Key 

keytool -keypasswd -alias default -keypass OLDKEYPASS -new NEWpassword -keystore testKeystore.jks -storepass OLDPassword

keytool -storepasswd -keystore ./fhir-server-test/src/test/resources/fhirClientKeystore.jks -new change-password -storepass password


EAR Projects Generated for JavaEE Import

Importing my project into Eclipse, I found many additional EAR projects were being generated. To stop this, I went to Eclipse > Preferences > Maven > Java EE Integration and unchecked Enable Java EE Configuration. I removed the cached EAR projects (deleting them from disk), removed the regular projects, and imported again. Voila… it worked.



Using Maven to Port Multiple Versions of the Same Code

As you can tell, I am a Maven power user. I use it for most of my projects (sometimes we run into pure Eclipse, Gradle, or Ant).

I started work on porting one version of a model to a new version of the same model (same classes). I needed to use both JARs on the same classpath. This leads to a problem: how to use both at the same time.

I like to use the maven-shade-plugin and relocate the class files such that package.ClassFile is relocated to package.old.ClassFile. Generally this works well (except where you use reflective lookups such as Class.forName). I was able to prepend a string before shading the class.

The pom.xml I used is much like the attached project.  I hope this helps you all.
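A sketch of the relocation configuration; the package names and plugin version here are illustrative, not the project's actual values:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- move the old model classes under a .old subpackage -->
            <pattern>com.example.model</pattern>
            <shadedPattern>com.example.model.old</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With the old version shaded this way, both JARs can sit on the same classpath without the class names colliding.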