Logstash filter for Robot Framework

We are currently working on a CI/CD setup at work. As part of that, the tests need to run as a part of the pipeline.
Generally, the pipeline consists of steps/stages defined with a Jenkins pipeline. The benefit of this is that the whole process and the definition of the stages (deploy, test, etc.) are done by the development team, stored in the team's own repository, and therefore also controlled by the team. That is definitely a great step towards teams having more freedom and more responsibility when it comes to delivering applications and solutions to production. Needless to say, it also affects the visibility of quality and the need for tests.

Plus, it will definitely keep the test team on their toes. Staying ahead becomes a really neat challenge 😀

Now that also adds more requirements on the testing tools. First of all, the tools we use should be usable from containers, which means that everything is dockerized. The test code itself lives in the repository, but the engines running the tests are in containers.
Whenever we can, we use general Docker images from Docker Hub.
Sometimes that won't work, so we end up re-inventing the wheel.
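
For example, running a Robot Framework suite from a container could look roughly like this (the image name and paths here are illustrative, not our actual setup):

docker run --rm -v "$PWD/tests:/tests" -v "$PWD/results:/results" example/robot-framework robot --outputdir /results /tests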

That was the case with Logstash. We need to be able to filter Robot Framework's output.xml and send it to Elasticsearch. There were two possibilities to do that: Logstash filtering or XML parsing. The XML parsing remains to be done (I am going to do it), but I did manage to create the Logstash filter. It is not completely flawless, nor the most elegant, but at the moment it seems to be working as it should. To be honest, I was aiming to have one more blunt instrument for our testing needs.

The filter:

robot-results.conf

input {
  # Tail the Robot Framework results file; the path is inside the
  # container, so output.xml has to be mounted there (see the run
  # command at the end of the post).
  file {
    path => [ "/output.xml" ]
  }
}

filter {
  # Parse each event as XML and pull the interesting parts of
  # output.xml into fields with XPath.
  xml {
    source    => "message"
    store_xml => true
    target    => "doc"
    xpath => [
      "msg",            "doc.msg",
      "arguments",      "doc.args",
      "kw",             "doc.keyword",
      "status",         "doc.status",
      "status/@status", "doc.test.status",
      "robot",          "doc.robot",
      "errors",         "doc.errors",
      "statistics",     "doc.statistics",
      "suite",          "doc.suite",
      "tag",            "doc.tag",
      "total",          "doc.total",
      "/kw",            "leftovers",
      "/arguments",     "leftovers"
    ]
  }
}

output {
  # Ship the parsed events to Elasticsearch, into a daily index.
  elasticsearch {
    hosts => ["elastic"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
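
To make sense of the XPath expressions, here is a heavily abridged sketch of what Robot Framework's output.xml looks like (the element names come from the real schema; the names, ids and timestamps are placeholders):

<robot generator="Robot 3.0" generated="20170101 12:00:00.000">
  <suite id="s1" name="Example" source="/tests/example.robot">
    <test id="s1-t1" name="My Test">
      <kw name="Log">
        <arguments>
          <arg>Hello</arg>
        </arguments>
        <msg timestamp="20170101 12:00:00.100" level="INFO">Hello</msg>
        <status status="PASS" starttime="20170101 12:00:00.000" endtime="20170101 12:00:00.100"/>
      </kw>
      <status status="PASS" starttime="20170101 12:00:00.000" endtime="20170101 12:00:00.100"/>
    </test>
    <status status="PASS" starttime="20170101 12:00:00.000" endtime="20170101 12:00:00.100"/>
  </suite>
  <statistics>
    <total>
      <stat pass="1" fail="0">All Tests</stat>
    </total>
  </statistics>
  <errors/>
</robot>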

Dockerfile:

# Build on the official Logstash image from Docker Hub
FROM logstash

# Add the filter config and start Logstash pointed at that config directory
ADD robot-results.conf /etc/logstash/conf.d/robot/results.conf
CMD logstash -f /etc/logstash/conf.d/robot/
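
Building the image, tagged with the same name used in the run command below:

docker build -t janmat/logstash-robot .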

Running the container:

docker run --add-host=elastic:127.0.0.1 janmat/logstash-robot
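
Two things worth noting here. The --add-host flag simply writes an elastic entry into the container's /etc/hosts, so the hosts => ["elastic"] line in the output section resolves; since 127.0.0.1 inside the container is the container itself, this particular mapping assumes Elasticsearch is reachable on the container's own loopback (for instance when running with --net=host). And because the config reads /output.xml inside the container, the results file needs to be mounted in. A sketch, assuming output.xml sits in the current directory:

docker run -v "$PWD/output.xml:/output.xml" --add-host=elastic:127.0.0.1 janmat/logstash-robot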