Asciidoctor-latex patch for html backend offline usage

After you have installed the asciidoctor-latex package from GitHub following the installation instructions, you can generate pretty cool HTML pages with mathematical formulas. All the advantages of asciidoc can be combined with LaTeX features to generate HTML(5) web pages.

At the time of writing this blog (29/01/2016), the newest version of asciidoctor-latex generates HTML files that contain a couple of links/references to internet resources:

  1. a CSS style sheet that loads Google web fonts
  2. the JavaScript library jQuery
  3. the JavaScript library MathJax

Because of these dependencies, the HTML documents generated with asciidoctor-latex work fine on a web server as long as an internet connection is available, but without an internet connection the mathematical formulas will not render correctly.

To get rid of this restriction, the named resources can be downloaded from their internet locations and stored locally in corresponding subdirectories. If you don't want to do this yourself, I have bundled all resources in a zip archive for you.

[download internet resources]

To make the generated HTML files usable offline, unzip the downloaded file into the same directory as the HTML file.

Now you have to make some changes to your asciidoctor-latex installation, so that the generated HTML files contain links to the local resources instead of the internet links from the original distribution.
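As a rough sketch of the idea (here patching a generated file directly instead of the installation itself — the URLs and the local directory names below are examples, not the exact links from the distribution), the external references can be redirected with sed:

```shell
# Sketch: rewrite CDN references in a generated HTML file to local paths.
# The URLs and local directory names are assumptions; adapt them to the
# links you actually find in your generated document.
cat > doc.html <<'EOF'
<link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Open+Sans">
<script src="https://code.jquery.com/jquery-2.1.4.min.js"></script>
<script src="https://cdn.mathjax.org/mathjax/latest/MathJax.js"></script>
EOF

sed -i \
  -e 's|https://fonts.googleapis.com/css?family=Open+Sans|fonts/open-sans.css|' \
  -e 's|https://code.jquery.com/jquery-2.1.4.min.js|js/jquery.min.js|' \
  -e 's|https://cdn.mathjax.org/mathjax/latest/MathJax.js|mathjax/MathJax.js|' \
  doc.html

grep -c 'https://' doc.html || true   # expect 0 external links left
```

The local paths must of course match the subdirectories created by unzipping the resource archive.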

A few days ago I configured a public bind9 DNS server with an IPv4 and an IPv6 address.

You can get the IP addresses with:

dig A

dig AAAA
=> 2a03:4000:6:5::1

Feel free to use the server; usage is free, but there are no warranties from our side.






Yarn / Tez / Google protocol buffer error

Some people seem to get errors when executing Hive on Tez. I saw the same stack trace in several posts:

Searching the Hadoop 2.3.0 source code for details about this error, I couldn't find the relevant classes from the stack trace. The trick is that the class


is auto-generated during the Hadoop build:

mvn install -DskipTests=true -Dmaven.javadoc.skip=true

After the build has finished, a search for this class succeeds:
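Such a search can be sketched with find and grep; the class name and paths below are only stand-ins for the generated protobuf class from your stack trace:

```shell
# Sketch: locate an auto-generated source file in the Hadoop build tree.
# "YarnProtos" is a stand-in name; substitute the class from your stack
# trace, and point the search at your actual Hadoop checkout.
mkdir -p hadoop/target/generated-sources/java
echo 'public final class YarnProtos {}' \
  > hadoop/target/generated-sources/java/YarnProtos.java

find hadoop -name '*.java' -path '*generated-sources*' \
  | xargs grep -l 'class YarnProtos'
```

Before the build, the `generated-sources` directories simply do not exist, which is why the class cannot be found in the plain source tree.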

The lines that cause the trouble are

Searching the internet for this method, I found a relevant post.

Using classfinder, we can search all projects on our server for this class; it turns up relevant matches in several Hive projects.
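A classfinder-style scan can be approximated in the shell; the project layout and class name below are hypothetical stand-ins for illustration:

```shell
# Sketch: search all projects below a directory for a class, both as
# extracted .class files and (where unzip is available) inside jars.
# The demo layout and the class name are placeholders.
CLASS='ProtobufHelper'

mkdir -p projects/hive-0.13/build/classes
touch "projects/hive-0.13/build/classes/${CLASS}.class"

# extracted class files
find projects -name "${CLASS}.class"

# jar entries, where unzip is available
for j in $(find projects -name '*.jar'); do
  unzip -l "$j" 2>/dev/null | grep -q "$CLASS" && echo "$j"
done
```

A dedicated tool like classfinder is faster and also resolves nested archives, but this loop is often enough to find out which project bundles a conflicting class.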

In the 0.14 snapshot build, the explicit dependency on version 2.4.1 of the Google protocol buffer jar has been removed. After switching Hive to this snapshot version, the error in combination with Tez disappears.

Issue using CXF SOAP to send IDoc XML to SAP ERP

We created our own function module which takes an arbitrary IDoc in XML format as its input parameter. The function module maps the IDoc XML to the corresponding IDoc structure in SAP and afterwards calls standard function modules to send the IDoc to core SAP.

In the corresponding transaction we had to set "trigger immediately" because of business restrictions.

When sending IDoc messages with the code (fragment) below, we observed different IDoc statuses in WE02 depending on the timeout values.

If the timeouts are big enough, the IDoc gets the status "IDoc processed", as wanted.

Decreasing the timeouts first leads to an IDoc with status 64 ("IDoc ready to be transferred to application"), which will never be processed without further action.

Further decreasing the timeout leads to a situation where the IDoc cannot be saved in SAP at all.

In any case, the timeout exception looks like this:


With the code above it is possible to adjust the timeout values to your needs, so that the IDocs are processed in SAP as intended.

Attention: in the case of a timeout exception you cannot simply resend the IDoc, because of the scenarios described above. Before sending the IDoc again, further checks have to be done:

For example:

1) Check the table EDIDC

2) Get the content of the last IDocs sent to the system (info from 1) and check it using a remote-enabled function module with the following code

Grep-like search in Windows