Troubles getting audit to log file

I have configured audit logging:

  audit_collector: true
  audit_serializer: tech.beshu.ror.requestcontext.DefaultAuditLogSerializer


#---- BEGIN ReadOnlyRest
appender.access_log_rolling.type = RollingFile
appender.access_log_rolling.name = access_log_rolling
appender.access_log_rolling.fileName = ${sys:es.logs.base_path}${sys:file.separator}${sys:es.logs.cluster_
appender.access_log_rolling.filePattern = ${sys:es.logs.base_path}${sys:file.separator}${sys:es.logs.clust
appender.access_log_rolling.layout.type = LogstashLayout
appender.access_log_rolling.policies.type = Policies
appender.access_log_rolling.policies.time.type = TimeBasedTriggeringPolicy
appender.access_log_rolling.policies.time.interval = 1
appender.access_log_rolling.policies.time.modulate = true
logger.access_log_rolling.name = org.elasticsearch.plugin.readonlyrest.acl
logger.access_log_rolling.level = info
logger.access_log_rolling.appenderRef.access_log_rolling.ref = access_log_rolling
logger.access_log_rolling.additivity = false

# exclude kibana, beat and logstash users as they generate too much noise
logger.access_log_rolling.filter.regex.type = RegexFilter
logger.access_log_rolling.filter.regex.regex = .*USR:(kibana|beat|logstash),.*
logger.access_log_rolling.filter.regex.onMatch = DENY
logger.access_log_rolling.filter.regex.onMismatch = ACCEPT
#---- END ReadOnlyRest
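As a side note, the RegexFilter pattern above can be sanity-checked outside Elasticsearch with plain `java.util.regex` (the sample log lines below are made up for illustration; the class name is mine):

```java
import java.util.regex.Pattern;

// Verifies that the filter regex matches lines from the noisy users
// (kibana, beat, logstash) and leaves other users alone.
public class RorFilterRegexCheck {
    static final Pattern NOISY = Pattern.compile(".*USR:(kibana|beat|logstash),.*");

    static boolean isNoisy(String logLine) {
        // RegexFilter uses a full match against the log message
        return NOISY.matcher(logLine).matches();
    }

    public static void main(String[] args) {
        System.out.println(isNoisy("ALLOWED by ... USR:kibana, ACT:search"));   // true  -> DENY
        System.out.println(isNoisy("ALLOWED by ... USR:alice, ACT:search"));    // false -> ACCEPT
    }
}
```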

(I drop the log4j2-logstash-layout jar into es/lib and set every layout.type to LogstashLayout – works great.)

Relevant bits from the es logs include

{"@version":1,"source_host":"s-ror-es-1","message":"Using custom serializer: tech.beshu.ror.requestcontext.DefaultAuditLogSerializer","thread_name":"main","@timestamp":"2018-12-17T01:44:39.5

The audit entries are being created in the ES index, and the desired ROR audit log file gets created (ror-ror-audit.log):

$ ll
total 2.1M
-r--rw---- 1 logstash logstash 323K Dec 17 01:52 gc.log.0.current
-r--rw---- 1 logstash logstash    0 Dec 14 23:47 ror_access.log
-r--rw---- 1 logstash logstash    0 Dec 14 23:47 ror_audit.log
-r--rw---- 1 logstash logstash  19K Dec 17 01:44 ror_deprecation.log
-r--rw---- 1 logstash logstash    0 Dec 14 23:47 ror_index_indexing_slowlog.log
-r--rw---- 1 logstash logstash    0 Dec 14 23:47 ror_index_search_slowlog.log
-r--rw---- 1 logstash logstash 1.7M Dec 17 01:52 ror.log
-r--rw---- 1 logstash logstash    0 Dec 17 01:23 ror-ror-audit.log

But, as the size shows, nothing ever gets written to it. The ROR audit logs (in JSON format) are instead being written to the default Elasticsearch log (ror.log).

I clearly have something misconfigured (probably the appender name?). I had to make a small tweak to the log4j snippet from the docs (…regex.onMisMatch -> regex.onMismatch), so maybe the snippet is a bit out of date.

I’ll keep looking into this, but if the problem is obvious to someone, please let me know.

Digging through the code, now I’m not so sure audit_serializer: tech.beshu.ror.requestcontext.DefaultAuditLogSerializer has anything to do with logging to a file. That class is used to create the JSON to store in the ES audit index, no?

I guess I should say what I am trying to accomplish:

Rather than the text-based ACL log lines in the log of the ES cluster protected by ROR, I want to log the same JSON that goes into the audit index on the protected cluster to a log file, so I can route it to our logging ELK cluster.
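For the routing step, a shipper such as Filebeat could tail the file and forward each JSON line to the logging cluster. A minimal sketch, assuming Filebeat with its log input – the path and host below are placeholders, not values from this setup:

    filebeat.inputs:
      - type: log
        paths:
          - /var/log/elasticsearch/ror-ror-audit.log
        json.keys_under_root: true   # parse each line as JSON instead of shipping it as a string
    output.logstash:
      hosts: ["logstash.example.com:5044"]

This only works once the file actually contains one JSON document per line, which is the problem being chased here.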

So the problem was that the logger name

    = org.elasticsearch.plugin.readonlyrest.acl

needed to be

    = tech.beshu.ror.acl
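For reference, the corrected logger block then reads as follows (the `.name` key is the standard log4j2 properties syntax for setting a logger’s name):

    logger.access_log_rolling.name = tech.beshu.ror.acl
    logger.access_log_rolling.level = info
    logger.access_log_rolling.appenderRef.access_log_rolling.ref = access_log_rolling
    logger.access_log_rolling.additivity = false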

But this does not really get me what I want, because I just get the human-readable ACL log in the message field.

What I think I want is for ACL#doLog to have something along the lines of

    if (context.getSettings().isAclJsonLogging()) {
      // emit the serialized JSON here instead of the text line
    } else {
      StringBuilder sb = new StringBuilder();
      sb.append(...);
    }

Actually the above is not quite right: it will just log the JSON in the ‘message’ field. Some additional log4j magic is needed.
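If ACL#doLog could be made to emit the raw JSON string, one way to avoid the wrapping (a sketch, not verified against this setup) would be to give the file appender a layout that prints nothing but the message, so each line of the file is the JSON document itself:

    appender.access_log_rolling.layout.type = PatternLayout
    appender.access_log_rolling.layout.pattern = %m%n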

We support custom/alternative audit loggers only for in-index audit logs at the moment!