Where is the upgraded plugin for ES? Can you link me to the build, please?
Ah OK, I'm seeing it now.
No luck
I have 4 different users:
- freddie: group 013
- brian: group 013
- roger: group 011
- john: group 011
Steps:
- I connect with user freddie and I see the visualizations and index patterns; moreover, I see the data correctly filtered by firmid=011
- Logout
- I connect with user roger and I don't see the index pattern or the visualizations either; the user is correctly logged in, however.
I see this error in the Kibana log:
lookout-kibana | {"type":"error","@timestamp":"2019-03-18T12:46:32Z","tags":["error","readonlyrest_kbn"],"pid":1,"level":"error","error":{"message":"[doc][config:6.6.1]: version conflict, document already exists (current version [9]): [version_conflict_engine_exception] [doc][config:6.6.1]: version conflict, document already exists (current version [9]), with { index_uuid=\"NNliK8kkQb2HDLo7rjWGfA\" & shard=\"0\" & index=\".kibana_roger\" }","name":"Error","stack":"[version_conflict_engine_exception] [doc][config:6.6.1]: version conflict, document already exists (current version [9]), with { index_uuid=\"NNliK8kkQb2HDLo7rjWGfA\" & shard=\"0\" & index=\".kibana_roger\" } :: {\"path\":\"/.kibana_roger/doc/config%3A6.6.1/_create\",\"query\":{\"refresh\":\"wait_for\"},\"body\":\"{\\\"config\\\":{\\\"buildNum\\\":19513},\\\"type\\\":\\\"config\\\",\\\"updated_at\\\":\\\"2019-03-18T12:46:32.441Z\\\"}\",\"statusCode\":409,\"response\":\"{\\\"error\\\":{\\\"root_cause\\\":[{\\\"type\\\":\\\"version_conflict_engine_exception\\\",\\\"reason\\\":\\\"[doc][config:6.6.1]: version conflict, document already exists (current version [9])\\\",\\\"index_uuid\\\":\\\"NNliK8kkQb2HDLo7rjWGfA\\\",\\\"shard\\\":\\\"0\\\",\\\"index\\\":\\\".kibana_roger\\\"}],\\\"type\\\":\\\"version_conflict_engine_exception\\\",\\\"reason\\\":\\\"[doc][config:6.6.1]: version conflict, document already exists (current version [9])\\\",\\\"index_uuid\\\":\\\"NNliK8kkQb2HDLo7rjWGfA\\\",\\\"shard\\\":\\\"0\\\",\\\"index\\\":\\\".kibana_roger\\\"},\\\"status\\\":409}\"}\n at respond (/usr/share/kibana/node_modules/elasticsearch/src/lib/transport.js:308:15)\n at checkRespForFailure (/usr/share/kibana/node_modules/elasticsearch/src/lib/transport.js:267:7)\n at HttpConnector.<anonymous> (/usr/share/kibana/node_modules/elasticsearch/src/lib/connectors/http.js:166:7)\n at IncomingMessage.wrapper (/usr/share/kibana/node_modules/elasticsearch/node_modules/lodash/lodash.js:4935:19)\n at IncomingMessage.emit 
(events.js:194:15)\n at endReadableNT (_stream_readable.js:1103:12)\n at process._tickCallback (internal/process/next_tick.js:63:19)"},"message":"[doc][config:6.6.1]: version conflict, document already exists (current version [9]): [version_conflict_engine_exception] [doc][config:6.6.1]: version conflict, document already exists (current version [9]), with { index_uuid=\"NNliK8kkQb2HDLo7rjWGfA\" & shard=\"0\" & index=\".kibana_roger\" }"}
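For context (my reading of the error, not stated in the log): Kibana writes its config document through the create-only `_create` endpoint, which returns 409 when a document with that id already exists. A minimal in-memory sketch of that semantic:

```javascript
// Sketch of ES "_create" semantics (simplified, in-memory model).
// PUT /<index>/doc/<id>/_create fails with 409 if the id already exists,
// while a plain index or _update on the same id would succeed.
function createOnly(store, id, doc) {
  if (Object.prototype.hasOwnProperty.call(store, id)) {
    return { statusCode: 409, error: 'version_conflict_engine_exception' };
  }
  store[id] = { doc: doc, version: 1 };
  return { statusCode: 201 };
}

const kibanaIndex = {};
console.log(createOnly(kibanaIndex, 'config:6.6.1', { buildNum: 19513 }).statusCode); // 201
console.log(createOnly(kibanaIndex, 'config:6.6.1', { buildNum: 19513 }).statusCode); // 409
```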
I think you should delete the users' own Kibana indices and let the template feature copy things across all over again, now that we have a working reindex in ES.
I guess I did not understand your suggestion
I did not create the .kibana_@{user} index manually; I let the readonlyrest API create it. In fact, in kibana.yml there is
readonlyrest_kbn.kibanaIndexTemplate: ".kibana_template"
and the rule is
- name: "011"
  external_authentication: "ExternalAuthService"
  groups_provider_authorization:
    user_groups_provider: "ExternalGroupService"
    groups: ["011"]
  kibana_access: rw
  kibana_index: ".kibana_@{user}"
  kibana_hide_apps: ["readonlyrest_kbn", "kibana:dev_tools", "kibana:management"]
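If it helps to be explicit: `@{user}` in `kibana_index` is expanded with the logged-in username, so each user gets a private tenancy index. A sketch of that substitution (`resolveKibanaIndex` is an illustrative name, not the plugin's actual API):

```javascript
// Hypothetical sketch of the @{user} placeholder expansion in kibana_index.
// resolveKibanaIndex is an illustrative helper, not part of the plugin.
function resolveKibanaIndex(template, username) {
  return template.replace('@{user}', username);
}

console.log(resolveKibanaIndex('.kibana_@{user}', 'freddie')); // .kibana_freddie
console.log(resolveKibanaIndex('.kibana_@{user}', 'roger'));   // .kibana_roger
```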
The .kibana_@{user} indices are created when the user logs in (for the first time, I guess).
However, should I delete all the .kibana_@{user} indices except .kibana_template every time a user logs in?
I deleted all the .kibana_@{user} indices. I only left the .kibana_template index.
I connected as user freddie, BUT again I'm not seeing the index pattern and visualizations.
Moreover, Kibana recreated the .kibana_freddie index:
Here is my _cat/indices output:
green open .kibana Dxdak-HgRvaHklKxzZajrQ 1 0 3 0 13.3kb 13.3kb
green open .kibana_template 4j96UE7gTUa4KSsWdLZxXQ 1 0 3 0 19.7kb 19.7kb
green open .kibana_freddie h6ZNsuZLSHe7v9ahmHOkiA 1 0 3 1 16.6kb 16.6kb
Same Kibana error:
lookout-kibana | {"type":"error","@timestamp":"2019-03-18T13:29:20Z","tags":["error","readonlyrest_kbn"],"pid":1,"level":"error",
"error":{"message":"[doc][config:6.6.1]: version conflict, document already exists (current version [4]): [version_conflict_engine_exception] [doc][config:6.6.1]: version conflict, document already exists (current version [4]), with { index_uuid=\"h6ZNsuZLSHe7v9ahmHOkiA\" & shard=\"0\" & index=\".kibana_freddie\" }","name":"Error","stack":"[version_conflict_engine_exception] [doc][config:6.6.1]: version conflict, document already exists (current version [4]), with { index_uuid=\"h6ZNsuZLSHe7v9ahmHOkiA\" & shard=\"0\" & index=\".kibana_freddie\" } :: {\"path\":\"/.kibana_freddie/doc/config%3A6.6.1/_create\",\"query\":{\"refresh\":\"wait_for\"},\"body\":\"{\\\"config\\\":{\\\"buildNum\\\":19513},\\\"type\\\":\\\"config\\\",\\\"updated_at\\\":\\\"2019-03-18T13:29:20.185Z\\\"}\",\"statusCode\":409,\"response\":\"{\\\"error\\\":{\\\"root_cause\\\":[{\\\"type\\\":\\\"version_conflict_engine_exception\\\",\\\"reason\\\":\\\"[doc][config:6.6.1]: version conflict, document already exists (current version [4])\\\",\\\"index_uuid\\\":\\\"h6ZNsuZLSHe7v9ahmHOkiA\\\",\\\"shard\\\":\\\"0\\\",\\\"index\\\":\\\".kibana_freddie\\\"}],\\\"type\\\":\\\"version_conflict_engine_exception\\\",\\\"reason\\\":\\\"[doc][config:6.6.1]: version conflict, document already exists (current version [4])\\\",\\\"index_uuid\\\":\\\"h6ZNsuZLSHe7v9ahmHOkiA\\\",\\\"shard\\\":\\\"0\\\",\\\"index\\\":\\\".kibana_freddie\\\"},\\\"status\\\":409}\"}\n at respond (/usr/share/kibana/node_modules/elasticsearch/src/lib/transport.js:308:15)\n at checkRespForFailure (/usr/share/kibana/node_modules/elasticsearch/src/lib/transport.js:267:7)\n at HttpConnector.<anonymous> (/usr/share/kibana/node_modules/elasticsearch/src/lib/connectors/http.js:166:7)\n at IncomingMessage.wrapper (/usr/share/kibana/node_modules/elasticsearch/node_modules/lodash/lodash.js:4935:19)\n at IncomingMessage.emit (events.js:194:15)\n at endReadableNT (_stream_readable.js:1103:12)\n at process._tickCallback 
(internal/process/next_tick.js:63:19)"},"message":"[doc][config:6.6.1]: version conflict, document already exists (current version [4]): [version_conflict_engine_exception] [doc][config:6.6.1]: version conflict, document already exists (current version [4]), with { index_uuid=\"h6ZNsuZLSHe7v9ahmHOkiA\" & shard=\"0\" & index=\".kibana_freddie\" }"}
Wait, I went through your settings once again.
This:
- name: "011-data"
  auth_key: zeroundici:zeroundici
  indices: ["negotiation-*", "trade-*", "transation-*"]
  filter: '{"bool": { "must": { "match": { "firmid": "011" }}}}'
- name: "011"
  auth_key: zeroundici:zeroundici
  kibana_access: rw
  kibana_index: ".kibana_011"
  kibana_hide_apps: ["readonlyrest_kbn", "kibana:dev_tools", "kibana:management"]
Should be:
- name: "011-data"
  auth_key: zeroundici:zeroundici
  indices: ["negotiation-*", "trade-*", "transation-*"]
  filter: '{"bool": { "must": { "match": { "firmid": "011" }}}}'
  kibana_access: rw
  kibana_index: ".kibana_011"
  kibana_hide_apps: ["readonlyrest_kbn", "kibana:dev_tools", "kibana:management"]
- name: "011"
  auth_key: zeroundici:zeroundici
  indices: ["negotiation-*", "trade-*", "transation-*"]
  kibana_access: rw
  kibana_index: ".kibana_011"
  kibana_hide_apps: ["readonlyrest_kbn", "kibana:dev_tools", "kibana:management"]
The idea is that you duplicate the block, omitting the filter/fields rules, as those two rules only admit read requests.
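Since only the firmid value differs between the per-group data blocks, the filter string could also be generated instead of hand-copied, to avoid copy-paste drift. A sketch (`buildFirmFilter` is an illustrative helper, not part of ROR):

```javascript
// Illustrative helper that builds the ROR "filter" rule value for a firm id.
// The output matches the shape used in the blocks above.
function buildFirmFilter(firmid) {
  return JSON.stringify({ bool: { must: { match: { firmid: firmid } } } });
}

console.log(buildFirmFilter('011'));
// {"bool":{"must":{"match":{"firmid":"011"}}}}
```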
This is the static version; I switched to the dynamic version, where authentication and group association are external,
so I have this one:
- name: "011-data"
  external_authentication: "ExternalAuthService"
  groups_provider_authorization:
    user_groups_provider: "ExternalGroupService"
    groups: ["011"]
  indices: ["negotiation-*", "trade-*", "transation-*"]
  filter: '{"bool": { "must": { "match": { "firmid": "011" }}}}'
- name: "011"
  external_authentication: "ExternalAuthService"
  groups_provider_authorization:
    user_groups_provider: "ExternalGroupService"
    groups: ["011"]
  kibana_access: rw
  kibana_index: ".kibana_@{user}"
  kibana_hide_apps: ["readonlyrest_kbn", "kibana:dev_tools", "kibana:management"]
Should I change it to this?
- name: "011-data"
  external_authentication: "ExternalAuthService"
  groups_provider_authorization:
    user_groups_provider: "ExternalGroupService"
    groups: ["011"]
  indices: ["negotiation-*", "trade-*", "transation-*"]
  filter: '{"bool": { "must": { "match": { "firmid": "011" }}}}'
  kibana_access: rw
  kibana_index: ".kibana_@{user}"
  kibana_hide_apps: ["readonlyrest_kbn", "kibana:dev_tools", "kibana:management"]
- name: "011"
  external_authentication: "ExternalAuthService"
  groups_provider_authorization:
    user_groups_provider: "ExternalGroupService"
    groups: ["011"]
  kibana_access: rw
  kibana_index: ".kibana_@{user}"
  kibana_hide_apps: ["readonlyrest_kbn", "kibana:dev_tools", "kibana:management"]
Yeah sure, the same principle applies!
I’ve got the same error again.
I have reingested all the data:
- logged in as admin
- switched to the template tenant
- created the index pattern and visualization
- logged out
- logged in as freddie; I see the index pattern and visualization
- logged out
- logged in as freddie again; the index pattern and visualization have disappeared
Same error in the Kibana log:
{"type":"log","@timestamp":"2019-03-18T14:02:18Z","tags":["error","readonlyrest_kbn"],"pid":1,"message":"got an error [409] Conflict for path /api/kibana/settings"}
lookout-kibana | {"type":"error","@timestamp":"2019-03-18T14:02:18Z","tags":["error","readonlyrest_kbn"],"pid":1,"level":"error","error":{"message":"[doc][config:6.6.1]: version conflict, document already exists (current version [3]): [version_conflict_engine_exception] [doc][config:6.6.1]: version conflict, document already exists (current version [3]), with { index_uuid=\"jC-aW02HR7aLsd_Hh5dxQg\" & shard=\"0\" & index=\".kibana_freddie\" }","name":"Error","stack":"[version_conflict_engine_exception] [doc][config:6.6.1]: version conflict, document already exists (current version [3]), with { index_uuid=\"jC-aW02HR7aLsd_Hh5dxQg\" & shard=\"0\" & index=\".kibana_freddie\" } :: {\"path\":\"/.kibana_freddie/doc/config%3A6.6.1/_create\",\"query\":{\"refresh\":\"wait_for\"},\"body\":\"{\\\"config\\\":{\\\"buildNum\\\":19513},\\\"type\\\":\\\"config\\\",\\\"updated_at\\\":\\\"2019-03-18T14:02:18.735Z\\\"}\",\"statusCode\":409,\"response\":\"{\\\"error\\\":{\\\"root_cause\\\":[{\\\"type\\\":\\\"version_conflict_engine_exception\\\",\\\"reason\\\":\\\"[doc][config:6.6.1]: version conflict, document already exists (current version [3])\\\",\\\"index_uuid\\\":\\\"jC-aW02HR7aLsd_Hh5dxQg\\\",\\\"shard\\\":\\\"0\\\",\\\"index\\\":\\\".kibana_freddie\\\"}],\\\"type\\\":\\\"version_conflict_engine_exception\\\",\\\"reason\\\":\\\"[doc][config:6.6.1]: version conflict, document already exists (current version [3])\\\",\\\"index_uuid\\\":\\\"jC-aW02HR7aLsd_Hh5dxQg\\\",\\\"shard\\\":\\\"0\\\",\\\"index\\\":\\\".kibana_freddie\\\"},\\\"status\\\":409}\"}\n at respond (/usr/share/kibana/node_modules/elasticsearch/src/lib/transport.js:308:15)\n at checkRespForFailure (/usr/share/kibana/node_modules/elasticsearch/src/lib/transport.js:267:7)\n at HttpConnector.<anonymous> (/usr/share/kibana/node_modules/elasticsearch/src/lib/connectors/http.js:166:7)\n at IncomingMessage.wrapper (/usr/share/kibana/node_modules/elasticsearch/node_modules/lodash/lodash.js:4935:19)\n at IncomingMessage.emit 
(events.js:194:15)\n at endReadableNT (_stream_readable.js:1103:12)\n at process._tickCallback (internal/process/next_tick.js:63:19)"},"message":"[doc][config:6.6.1]: version conflict, document already exists (current version [3]): [version_conflict_engine_exception] [doc][config:6.6.1]: version conflict, document already exists (current version [3]), with { index_uuid=\"jC-aW02HR7aLsd_Hh5dxQg\" & shard=\"0\" & index=\".kibana_freddie\" }"}
OK I think I know what it is. Build coming soon.
Here is a new build for you
I’m getting this error (same steps).
Kibana log:
lookout-kibana | { AssertionError [ERR_ASSERTION]: New route /elasticsearch/.kibana_freddie/{paths*} conflicts with existing /elasticsearch/.kibana_freddie/{paths*}
lookout-kibana | at new AssertionError (internal/assert.js:269:11)
lookout-kibana | at Object.exports.assert (/usr/share/kibana/node_modules/hoek/lib/index.js:736:11)
lookout-kibana | at module.exports.internals.Segment.internals.Segment.add (/usr/share/kibana/node_modules/call/lib/segment.js:67:14)
lookout-kibana | at module.exports.internals.Segment.internals.Segment.add (/usr/share/kibana/node_modules/call/lib/segment.js:61:40)
lookout-kibana | at module.exports.internals.Segment.internals.Segment.add (/usr/share/kibana/node_modules/call/lib/segment.js:61:40)
lookout-kibana | at exports.Router.internals.Router.internals.Router.add (/usr/share/kibana/node_modules/call/lib/index.js:63:26)
lookout-kibana | at internals.Server._addRoute (/usr/share/kibana/node_modules/hapi/lib/server.js:467:46)
lookout-kibana | at internals.Server.route (/usr/share/kibana/node_modules/hapi/lib/server.js:452:26)
lookout-kibana | at createProxy (/usr/share/kibana/plugins/readonlyrest_kbn/server/routes/lib/multitenancy/create_proxy.js:1:1094)
lookout-kibana | at k (/usr/share/kibana/plugins/readonlyrest_kbn/server/routes/lib/kbnIndexUtils.js:1:1678)
lookout-kibana | at exports.default (/usr/share/kibana/plugins/readonlyrest_kbn/server/routes/lib/kbnIndexUtils.js:1:2343)
lookout-kibana | generatedMessage: false,
lookout-kibana | name: 'AssertionError [ERR_ASSERTION]',
lookout-kibana | code: 'ERR_ASSERTION',
lookout-kibana | actual: false,
lookout-kibana | expected: true,
lookout-kibana | operator: '==' }
lookout-kibana | { AssertionError [ERR_ASSERTION]: New route /es_admin/.kibana_freddie/{paths*} conflicts with existing /es_admin/.kibana_freddie/{paths*}
lookout-kibana | at new AssertionError (internal/assert.js:269:11)
lookout-kibana | at Object.exports.assert (/usr/share/kibana/node_modules/hoek/lib/index.js:736:11)
lookout-kibana | at module.exports.internals.Segment.internals.Segment.add (/usr/share/kibana/node_modules/call/lib/segment.js:67:14)
lookout-kibana | at module.exports.internals.Segment.internals.Segment.add (/usr/share/kibana/node_modules/call/lib/segment.js:61:40)
lookout-kibana | at module.exports.internals.Segment.internals.Segment.add (/usr/share/kibana/node_modules/call/lib/segment.js:61:40)
lookout-kibana | at exports.Router.internals.Router.internals.Router.add (/usr/share/kibana/node_modules/call/lib/index.js:63:26)
lookout-kibana | at internals.Server._addRoute (/usr/share/kibana/node_modules/hapi/lib/server.js:467:46)
lookout-kibana | at internals.Server.route (/usr/share/kibana/node_modules/hapi/lib/server.js:452:26)
lookout-kibana | at createProxy (/usr/share/kibana/plugins/readonlyrest_kbn/server/routes/lib/multitenancy/create_proxy.js:1:1094)
lookout-kibana | at k (/usr/share/kibana/plugins/readonlyrest_kbn/server/routes/lib/kbnIndexUtils.js:1:1678)
lookout-kibana | at exports.default (/usr/share/kibana/plugins/readonlyrest_kbn/server/routes/lib/kbnIndexUtils.js:1:2343)
lookout-kibana | generatedMessage: false,
lookout-kibana | name: 'AssertionError [ERR_ASSERTION]',
lookout-kibana | code: 'ERR_ASSERTION',
lookout-kibana | actual: false,
lookout-kibana | expected: true,
lookout-kibana | operator: '==' }
Elasticsearch log:
lookout-elasticsearch | [2019-03-18T15:51:24,662][WARN ][o.e.d.a.a.i.t.p.PutIndexTemplateRequest] [j3M0fMm] Deprecated field [template] used, replaced by [index_patterns]
lookout-elasticsearch | [2019-03-18T15:51:24,669][INFO ][o.e.c.m.MetaDataIndexTemplateService] [j3M0fMm] adding template [kibana_index_template:.kibana_freddie] for index patterns [.kibana_freddie]
lookout-elasticsearch | [2019-03-18T15:51:24,684][INFO ][t.b.r.a.ACL ] [j3M0fMm] ALLOWED by { name: '013', policy: ALLOW, rules: [external_authentication, groups_provider_authorization, kibana_access, kibana_index, kibana_hide_apps]} req={ ID:1242422130-546335540#29824, TYP:UpdateRequest, CGR:013, USR:freddie, BRS:false, KDX:.kibana_freddie, ACT:indices:data/write/update, OA:172.18.0.7, DA:172.18.0.4, IDX:.kibana_freddie, MET:POST, PTH:/.kibana_freddie/doc/config%3A6.6.1/_update?refresh=wait_for, CNT:<OMITTED, LENGTH=80>, HDR:{authorization=<OMITTED>, x-ror-current-group=013, Connection=keep-alive, content-type=application/json, Host=elasticsearch:9200, Content-Length=80}, HIS:[::KIBANA-SRV::->[auth_key->false]], [::LOGSTASH::->[auth_key->false]], [Admin Tenancy->[groups->false]], [Template Tenancy->[groups->false]], [011-data->[kibana_access->true, indices->false, external_authentication->true]], [011->[kibana_access->true, groups_provider_authorization->false, external_authentication->true]], [013-data->[kibana_access->true, indices->false, external_authentication->true]], [013->[kibana_access->true, groups_provider_authorization->true, kibana_hide_apps->true, kibana_index->true, external_authentication->true]] }
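Reading that log: the HIS field is the evaluation history, and ROR walks the ACL blocks in order until the first block whose rules all match (here "013") allows the request. A toy first-match evaluator, just to illustrate the ordering:

```javascript
// Toy sketch of first-match ACL evaluation: each block is a list of
// predicates over the request; the first block where all rules pass wins.
function evaluateAcl(aclBlocks, request) {
  for (const block of aclBlocks) {
    if (block.rules.every((rule) => rule(request))) return block.name;
  }
  return null; // no block matched -> FORBIDDEN
}

const blocks = [
  { name: '011', rules: [(r) => r.groups.includes('011')] },
  { name: '013', rules: [(r) => r.groups.includes('013')] },
];
console.log(evaluateAcl(blocks, { groups: ['013'] })); // 013
```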
does it work though?
Nope, the visualizations and index pattern have disappeared. Same as before.
This is my complete configuration (with the changes you suggested). Could you reproduce the scenario on your own?
- 1 admin user with 2 tenancies
- 4 users: freddie, brian, roger, john
- 2 groups: 011 and 013 (freddie and brian belong to 013; john and roger belong to 011). Each group filters the firmid field by its own value (011 or 013).
- index patterns and visualizations shared between the users (ro) and the admin
readonlyrest:
  audit_collector: true
  access_control_rules:

  - name: "::KIBANA-SRV::"
    auth_key: kibana:kibana
    verbosity: error

  - name: "::LOGSTASH::"
    auth_key: logstash:logstash
    actions: ["cluster:monitor/main","indices:admin/types/exists","indices:data/read/*","indices:data/write/*","indices:admin/template/*","indices:admin/create"]
    indices: ["negotiation-*", "trade-*", "transation-*", "logstash-*"]
    verbosity: error

  - name: "Admin Tenancy"
    groups: ["Admins"]
    kibana_access: admin
    kibana_index: ".kibana"
    verbosity: error

  - name: "Template Tenancy"
    groups: ["Template"]
    kibana_access: admin
    kibana_index: ".kibana_template"
    verbosity: error

  - name: "011-data"
    external_authentication: "ExternalAuthService"
    groups_provider_authorization:
      user_groups_provider: "ExternalGroupService"
      groups: ["011"]
    indices: ["negotiation-*", "trade-*", "transation-*"]
    filter: '{"bool": { "must": { "match": { "firmid": "011" }}}}'
    kibana_access: rw
    kibana_index: ".kibana_@{user}"
    kibana_hide_apps: ["readonlyrest_kbn", "kibana:dev_tools", "kibana:management"]

  - name: "011"
    external_authentication: "ExternalAuthService"
    groups_provider_authorization:
      user_groups_provider: "ExternalGroupService"
      groups: ["011"]
    kibana_access: rw
    kibana_index: ".kibana_@{user}"
    kibana_hide_apps: ["readonlyrest_kbn", "kibana:dev_tools", "kibana:management"]

  - name: "013-data"
    external_authentication: "ExternalAuthService"
    groups_provider_authorization:
      user_groups_provider: "ExternalGroupService"
      groups: ["013"]
    indices: ["negotiation-*", "trade-*", "transation-*"]
    filter: '{"bool": { "must": { "match": { "firmid": "013" }}}}'
    kibana_access: rw
    kibana_index: ".kibana_@{user}"
    kibana_hide_apps: ["readonlyrest_kbn", "kibana:dev_tools", "kibana:management"]

  - name: "013"
    external_authentication: "ExternalAuthService"
    groups_provider_authorization:
      user_groups_provider: "ExternalGroupService"
      groups: ["013"]
    kibana_access: rw
    kibana_index: ".kibana_@{user}"
    kibana_hide_apps: ["readonlyrest_kbn", "kibana:dev_tools", "kibana:management"]
    #verbosity: error

  external_authentication_service_configs:

  - name: "ExternalAuthService"
    authentication_endpoint: "http://authprovider:8080/auth"
    success_status_code: 200
    cache_ttl_in_sec: 1
    validate: false # SSL certificate validation (defaults to true)

  user_groups_providers:

  - name: "ExternalGroupService"
    groups_endpoint: "http://authprovider:8080/groups"
    auth_token_name: "token"
    auth_token_passed_as: QUERY_PARAM # HEADER or QUERY_PARAM
    response_groups_json_path: "$..groups[?(@.name)].name" # see: https://github.com/json-path/JsonPath
    cache_ttl_in_sec: 1
    http_connection_settings:
      connection_timeout_in_sec: 5 # default 2
      socket_timeout_in_sec: 3 # default 5
      connection_request_timeout_in_sec: 3 # default 5
      connection_pool_size: 10

  users:

  - username: admin
    auth_key: admin:admin
    groups: ["Admins", "Template"] # can hop between the two tenancies with the top-left drop-down menu
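For reference, the `response_groups_json_path` value `$..groups[?(@.name)].name` collects every group name from the provider's JSON response. A plain-JS equivalent of that extraction (hand-rolled here; the plugin actually uses the JsonPath library), assuming the response shape the mock service returns:

```javascript
// Plain-JS equivalent of the JsonPath $..groups[?(@.name)].name extraction,
// for responses shaped like { "groups": [{ "name": "..." }, ...] }.
function extractGroupNames(responseBody) {
  const parsed = JSON.parse(responseBody);
  return (parsed.groups || [])
    .filter((g) => g.name !== undefined)
    .map((g) => g.name);
}

const body = '{"groups": [{"name": "013"},{"name": "011"}]}';
console.log(extractGroupNames(body)); // [ '013', '011' ]
```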
I wrote this mock microservice that responds with code 200 and { "groups": [{"name": "013"},{"name": "011"}] } to any request:
var url = require('url');

// Mock auth/groups service: logs the incoming query string and answers every
// request with the same groups payload and a 200 status.
var s = require('http').createServer(function (req, res) {
  var query = url.parse(req.url, true).query;
  console.log(query);
  var groups = '{ "groups": [{"name": "013"},{"name": "011"}] }';
  res.write(groups);
  res.end();
});

s.listen(5001);
console.log('mock service listening on http://localhost:5001');
I used it for authentication and authorization.
So I’m able to replicate your test script with almost identical readonlyrest.yml and kibana.yml.
I’m using Kibana and ES 6.6.1 (non OSS), and 1.17-pre4 of ROR plugin.
I was able to reproduce your assertion error due to the route clash, but I definitely could still see the index pattern I created in the template tenancy.
I can fix the route clash exception, but IMO it should not be what deletes the index pattern and visualizations.