Audit logging - Log4j layout

Hi, I am trying to separate the audit messages that ROR logs, because we have to send them to another destination for audit logs, in this case a separate Kafka topic.
Fluent Bit routes the messages based on a flag in the message.

Fluent Bit expects the messages in JSON format.

I created a separate appender in the log4j2.properties configuration.
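Roughly like this, simplified (I am not even sure tech.beshu.ror is the right logger name to hook into, which is part of the problem, so treat the fragment below as a sketch):

    # Dedicated rolling file for ROR messages, so Fluent Bit can tail it separately
    appender.ror_audit.type = RollingFile
    appender.ror_audit.name = ror_audit
    appender.ror_audit.fileName = ${sys:es.logs.base_path}${sys:file.separator}ror_audit.log
    appender.ror_audit.filePattern = ${sys:es.logs.base_path}${sys:file.separator}ror_audit-%d{yyyy-MM-dd}.log
    appender.ror_audit.layout.type = PatternLayout
    appender.ror_audit.layout.pattern = %m%n
    appender.ror_audit.policies.type = Policies
    appender.ror_audit.policies.time.type = TimeBasedTriggeringPolicy

    # Route the (assumed) ROR logger only to that appender
    logger.ror_audit.name = tech.beshu.ror
    logger.ror_audit.level = info
    logger.ror_audit.additivity = false
    logger.ror_audit.appenderRef.ror_audit.ref = ror_audit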

Here is what I found during this task.

It would be nice to be able to format all messages with a pattern, for example the way it is done for the Elasticsearch audit log.

But I cannot use a map in the pattern layout…
Is there a way to configure this?
I found that you emit only an already formatted message, and that format is hard to parse and adds extra work…
I found that the transformation is done here: elasticsearch-readonlyrest-plugin/RequestContext.scala at da21876359ed6f08334c44a858bbe5624f768c39 · sscarduzio/elasticsearch-readonlyrest-plugin · GitHub
Is there room for improvement here? Audit logging is one of the reasons to use ROR, but as it is, it is not comfortable to use or easy to adapt to our needs. Enterprises usually have a standard format for audit, and I would expect it to be easy to adapt to it without changing code or writing a custom class for audit logging. Thanks, Rudolf

Hello @rjan, excellent question.

The audit log lines we print to the log file are engineered as compact, human-readable entries, meant to help integrators debug in test environments.

Your use case is different: you actually need structured events to be consumed by a pipeline, where something like JSON would be more suitable.

I suggest you create a small log serializer class and load it as a custom serializer.
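A minimal sketch of what such a serializer can look like, in Scala, is below. The exact package, class and method names (AuditResponseContext, DefaultAuditLogSerializer, onResponse) may differ between ROR versions, so please double-check them, and the readonlyrest.yml setting used to load the class, against our audit docs:

    import org.json.JSONObject
    import tech.beshu.ror.audit.AuditResponseContext
    import tech.beshu.ror.audit.instances.DefaultAuditLogSerializer

    // Sketch: reuse the built-in serializer and decorate its JSON output with an
    // extra field that a log pipeline (e.g. Fluent Bit) could route on.
    class RoutedAuditLogSerializer extends DefaultAuditLogSerializer {

      override def onResponse(context: AuditResponseContext): Option[JSONObject] =
        super.onResponse(context).map(_.put("log_type", "ror_audit"))
    }

You package the class as a small jar, place it next to the plugin, and reference it by its fully qualified name in the audit serializer setting of readonlyrest.yml.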

Please reach out to us if you find the process too steep.

Hi Simone, I know about the possibility of a custom serializer, and it is the most customizable option for achieving a custom format.
Let me give you some hints on why I think it would be nice to have more possibilities besides a custom serializer.

ROR has an Enterprise edition with an audit feature, which implies it should be easy to integrate auditing in an enterprise. So it should be easy to have separate audit outputs and to integrate them, for example to send the events to a SIEM or another centralised audit solution.
A custom serializer requires development skills to integrate, which is not common for pure ops engineers, and it means extra work: debugging, rebuilding on upgrades, etc. So the customer expectation is audit, as I described, out of the box, similar to the audit feature configuration in the Elasticsearch paid editions. Why not get more from the Log4j configuration and have more control without development?
If there is only a human-readable format that you cannot change, it is not audit, just a troubleshooting tool.
I understand there is a collector sending JSON to an Elasticsearch index, so why not add the possibility to log it to a file or the console? Then you can add some extra parsing of the JSON, which is more suitable than the compact format, for example in Fluent Bit, Filebeat or Logstash… This is my opinion, and I think it would be helpful for more users…
Thanks Rudolf

I agree a lot with this, definitely something of great utility that we can implement quickly.

To begin, we could start offering more pre-built serializers (we currently have just two: default and QueryAuditLogSerializer). So we could add a JSONAuditLogSerializer to the party.

@coutoPL WDYT?

Sounds good Simone :+1: Let's start with an additional serializer out of the box…

R


I don’t know if I understand both of you correctly.

The current state is as follows:
ROR’s auditing feature offers the ACL result (matched/mismatched) in the form of a JSON document stored in the ES index.
Moreover, we have two serializers that encode our internal representation of the ACL result to JSON (the QueryAuditLogSerializer adds an additional field). There is an official way to create your own serializers.

AFAIU, you would like to:

  1. control the sink of the audit entries (not only the ES index, but also a file, console, etc)
  2. control the layout of the audit entry (JSON with different fields, and/or maybe CSV, etc). It would be good to be able to configure it in a no-code fashion

Is it correct, or did I overinterpret it?

Actually, the way I imagine it would be: as a user, I can specify a serializer for each sink. For example:

  • Human readable format in the logs for stderr/stdout (via Log4J)
  • Full QueryAuditLogSerializer to a remote cluster
  • Standard JSON serializer to a separate file appender (via Log4J)

This could be achieved by a mix of:

  1. ROR supporting arbitrary assignment of serializers to sinks
  2. Log4J configuration

Not sure how programmatic we can go with point #2 in the ROR code, but otherwise the user just tweaks the Log4J file.


On the other hand, as you point out, I’m not sure if @rjan is aware that ROR can already ship all the audit logs in JSON format, without the need for external log shipping agents, to either ES indices or even to external ES clusters via HTTP? Just making sure.
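Roughly, in readonlyrest.yml it is something along these lines (I am writing the key names from memory, so please double-check them against the audit documentation for your version):

    readonlyrest:
      audit_collector: true
      # optional: custom index name pattern and serializer
      audit_index_template: "'readonlyrest_audit'-yyyy-MM-dd"
      audit_serializer: "tech.beshu.ror.audit.instances.QueryAuditLogSerializer"
      # optional: ship the audit documents to a separate ES cluster over HTTP
      audit_cluster: ["https://audit-cluster:9200"]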

Hi, we created a custom serializer class, and we got the custom format working for the audit log that is sent to the ES index.
What we cannot figure out is how to configure this serializer to format the output sent to the console or a file log via Log4j.
Can I ask you for some hints on how to do it? Could you help us with a Log4j config to log to the console with the custom serializer, please?

Hi @rjan,

Currently, our audit has only one sink available: index. In our backlog, we have a task to add a new type of sink: log. It seems that this is what you want (ref: Remove audit log information from elasticsearch log - #9 by sscarduzio)

I’ll move the task to the top of our current sprint and we will try to implement it as fast as we can.

@rjan we have this implemented. Do you want to try the pre-build? Please let me know which ES version you use.

Hi Mateusz, sorry, I missed your feedback.
Of course we would like to test it if you send us the pre-build.

Thanks

R

The feature was released with ROR 1.48.0. See our docs.
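In short, the audit section can now declare multiple outputs, along these lines (just a sketch; see the documentation for the full set of options and exact keys):

    readonlyrest:
      audit:
        enabled: true
        outputs:
        - type: index   # the existing ES index sink
        - type: log     # the new sink: audit entries are written to the ES log via Log4j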
