Installed Elasticsearch and Kibana plugins, having some issues

I have a 3-node Elasticsearch cluster with one dedicated Kibana host. I have the free Elasticsearch plugin installed and configured on each Elasticsearch node, and the free Kibana plugin installed and configured on the Kibana host.

elasticsearch.yml:

cluster.initial_master_nodes: ["es-node-01", "es-node-02", "es-node-03"]
cluster.name: elasticsearch
discovery.seed_hosts: ["es-node-01.com", "es-node-02.com", "es-node-03.com"]
node.name: es-node-01
http.port: 9200
transport.port: 9300
network.host: 0.0.0.0
xpack.security.enabled: false
http.type: ssl_netty4
transport.type: ror_ssl_internode

readonlyrest.yml:

readonlyrest:
    enable: true
    ssl:
      enable: true
      keystore_file: "keystonefile.jks"
      keystore_pass: "examplepw"
      key_pass: "examplepw"
    ssl_internode:
      keystore_file: "keystonefile.jks"
      keystore_pass: "examplepw"
      key_pass: "examplepw"
    ldaps:
    - name: myldap
      host: myldaphost.com
      port: 389
      ssl_enabled: false
      bind_dn: "cn=appUser,ou=someOU,dc=dcName,dc=com"
      bind_password: "examplepw"
      user_id_attribute: cn
      search_user_base_DN: "dc=dcName,dc=com"
      search_groups_base_DN: "dc=dcName,dc=com"
      cache_ttl_in_sec: 3600
    access_control_rules:
    - name: "Allow Test Users"
      ldap_authentication: myldap
      ldap_authorization:
        name: myldap
        groups: ['testUsers']
    - name: "Kibana Host"
      hosts: ["192.168.4.56"]
      type: allow

kibana.yml:

elasticsearch.hosts: 
- https://es-node-01.com:9200
- https://es-node-02.com:9200
- https://es-node-03.com:9200
elasticsearch.username: "elastic"
elasticsearch.password: "myelasticpw"
xpack.security.enabled: false
server.ssl.enabled: true
server.ssl.certificate: "/path/to/certificate.cer"
server.ssl.key: "/path/to/certificate.key"
elasticsearch.ssl.certificateAuthorities: ["/path/to/ca.pem"]

The Elasticsearch nodes seem to be working fine and I can run…
curl -XGET https://es-node-01.com:9200/_cluster/health?pretty --user "davebloggs"
…and I am able to authenticate as that LDAP user. Authentication fails if I deliberately use an incorrect username/password, so I know that part is working fine.
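
For completeness, the negative test looks like this (the wrong password here is just a placeholder):

curl -XGET "https://es-node-01.com:9200/_cluster/health?pretty" --user "davebloggs:wrongpassword"

That request gets rejected (ReadonlyREST's default response for a denied request is 403 Forbidden), which is what I'd expect.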

My problem seems to be with Kibana.

My first question is:
I was initially unable to get Kibana to authenticate to the ES nodes (kibana.log was full of ‘forbidden’ errors), so I added a block to allow the Kibana host in readonlyrest.yml (as you can see above). That seemed to work, as those permission errors have now gone. Was that the correct way to do it?

Second question is:
Kibana is still not happy: I can’t get the GUI up (ERR_CONNECTION_REFUSED). It was working previously, so it seems to be the current config that has broken it.
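
To rule out the obvious, I can at least check from the Kibana host itself whether the daemon is listening at all (assuming the default port 5601, since I haven't set server.port):

# is anything bound to Kibana's port?
ss -tlnp | grep 5601
# and does the status API answer locally?
curl -vk https://localhost:5601/api/status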

I’m also seeing these two errors in kibana.log:

{"type":"log","@timestamp":"2021-09-10T07:34:45+00:00","tags":["error","plugins","eventLog"],"pid":32271,"message":"initialization failed, events will not be indexed"}
{"type":"log","@timestamp":"2021-09-10T07:56:12+00:00","tags":["error","plugins","eventLog"],"pid":32271,"message":"initialization elasticsearch resources, error creating initial index: invalid_alias_name_exception"}
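
From what I've read, invalid_alias_name_exception usually means an index already exists with the same name the alias wants, so I can list what's on the Elasticsearch side for the event log (the .kibana-event-log* pattern below is my guess based on the plugin name):

curl -XGET "https://es-node-01.com:9200/_cat/indices/.kibana-event-log*?v" --user "davebloggs"
curl -XGET "https://es-node-01.com:9200/_cat/aliases/.kibana-event-log*?v" --user "davebloggs"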

For context, what I am really after at the moment is to just get the Kibana GUI working where it displays a log in page that authenticates to LDAP.

In my experience, using the hosts rule alone is super dangerous, full of pitfalls and unintended security/administrative consequences. I recommend using (hashed) basic HTTP authentication for the Kibana daemon.

I would do:

    - name: "Kibana Host"
      #hosts: ["192.168.4.56"]
      type: allow
      auth_key_sha512: kibana:XXXXX
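
Where XXXXX is the SHA-512 of the password (if I remember correctly, ROR also accepts a single hash of the whole "user:password" string). Something like this generates it:

# SHA-512 of the Kibana password; 'examplepw' is just a placeholder
printf '%s' 'examplepw' | sha512sum | awk '{print $1}'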

And in the kibana.yml I’d add:

elasticsearch.username: kibana
elasticsearch.password: <password>

Basic auth between machines over SSL is just fine. I would just inject the password into kibana.yml with some devops witchcraft rather than leaving it there in clear text; the same is advisable for the LDAP bind password in readonlyrest.yml.
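
For example, since elasticsearch.password is one of Kibana's secure settings, it can go into the built-in keystore instead of sitting in kibana.yml:

# store the password in Kibana's keystore (run from the Kibana install directory)
bin/kibana-keystore create
bin/kibana-keystore add elasticsearch.password   # prompts for the value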

@joemo2023 all good with your configuration?

Thank you very much, my configuration worked after this :slight_smile:
I do have another question to post soon, currently stuck on something else!
