Elasticsearch and Logstash giving an exception for long and float type casting

Hi Team,
I have to create an index dynamically using the Logstash JDBC input, with a SQL query configured in Logstash. A few of my columns contain a mix of decimal and long values. After starting Logstash I get this exception:
cannot be changed from type [long] to [float]
Please help us resolve this issue; it occurs both as [long] to [float] and as [float] to [long].
Note: I have used the mutate filter to convert the data types, but it is not working, and I don't want to create the mapping for the index manually.
Below is my Logstash configuration.

input {
    jdbc {
        jdbc_validate_connection => true
        jdbc_connection_string => "jdbc:oracle:thin:@DEV:8080/SID"
        jdbc_user => "AJIT"
        jdbc_password => "pass#1234"
        jdbc_driver_library => "/opt/READONLYREST/OSS/logstash-6.3.0/ojdbc7.jar"
        jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"

        statement => "Select Distinct Bm.Crmcompanyid,Cd.Company_Name,
  Bm.Datatype,Bm.Template,
  Bm.Isconsolidated,
  Bm.Periodendson,
  Bm.Crr_Date,
  cast(bm.period_code as decimal(15,2)) PERIOD_CODE,
cast(Column1 as decimal(15,2))  TOTAL_INCOME  ,
cast(Column10 as decimal(15,2))  TOTAL_OPERATING_EXPENDITURE ,
cast(Column11 as decimal(15,2))  Total_Provisions_Cont ,
cast(Column12 as decimal(15,2))  Adjusted_PROFIT_AFTER_TAX ,
cast(Column13 as decimal(15,2))  Net_Intrst_Incm_Avg_Ttl_Assts ,
cast(Column14 as decimal(15,2))  Non_Int_Income_Avg_Ttl_Assts  ,
cast(Column15 as decimal(15,2))  Non_Int_expenses_Avg_Ttl_Assts  ,
cast(Column16 as decimal(15,2))  PAT_Adjusted_Avg_Total_Assets ,
cast(Column17 as decimal(15,2))  Intrst_Paid_on_Dep_Avg_Dep  ,
cast(Column18 as decimal(15,2))  Tier_I_Capital_Percentage ,
cast(Column19 as decimal(15,2))  Capital_Adequacy_Ratio  ,
cast(Column2 as decimal(15,2)) TOTAL_ASSETS  ,
cast(Column20 as decimal(15,2))  Gross_NPA_Loans_Advances  ,
cast(Column21 as decimal(15,2))  Net_NPA_Loans_Advances  ,
cast(Column22 as decimal(15,2))  Networth_Net_NPA  ,
cast(Column23 as decimal(15,2))  CASA  ,
cast(Column24 as decimal(15,2))  Operating_Expenses  ,
cast(Column25 as decimal(15,2))  Equity_Share_Capital  ,
cast(Column3 as decimal(15,2)) TOTAL_LOANS_ADVANCES  ,
cast(Column4 as decimal(15,2)) TANGIBLE_NETWORTH ,
cast(Column5 as decimal(15,2)) TOTAL_DEPOSITS  ,
cast(Column6 as decimal(15,2)) TOTAL_INTEREST_INCOME ,
cast(Column7 as decimal(15,2)) TOTAL_INTEREST_PAID ,
cast(Column8 as decimal(15,2)) NET_INTEREST_INCOME ,
cast(Column9 as decimal(15,2)) TOTAL_OTHER_INCOME  
From Banknbfc_Periodmaster_Synm Bm,
  TEMP Bd,
  company_details_mv_synm cd
Where Bm.Period_Code = Bd.Period_Code
And Cd.Company_Code = Bm.Crmcompanyid
and bm.template = 'Bank'
and cd.company_status = 'Active'
--and column1 in (1322394.26,1299694)
ORDER BY Periodendson "       
 }
}

filter {

    # A single mutate is enough; convert accepts a hash of field => type pairs.
    mutate {
        convert => {
            "TOTAL_INCOME"                   => "float"
            "Capital_Adequacy_Ratio"         => "float"
            "Networth_Net_NPA"               => "float"
            "TANGIBLE_NETWORTH"              => "float"
            "Non_Int_expenses_Avg_Ttl_Assts" => "float"
            "Total_Provisions_Cont"          => "float"
            "TOTAL_INTEREST_INCOME"          => "float"
            "TOTAL_LOANS_ADVANCES"           => "float"
            "Net_NPA_Loans_Advances"         => "float"
            "Tier_I_Capital_Percentage"      => "float"
        }
    }
}

output {

    elasticsearch {
        hosts => "localhost:9200"
        index => "ajit_test"
        user => "c-ajitb"
        password => "pass#1234"
        ssl => true
        ssl_certificate_verification => false
        truststore => "/opt/READONLYREST/elasticsearch-6.3.0/config/keystore.jks"
        truststore_password => "readonlyrest"
    }
}

Please give us a solution to this issue.

Hi,
I know this is not the place to ask the above question, but I am getting the above exception in Logstash, and we are not getting support from the Elastic team. After installing ReadonlyREST, Elasticsearch no longer supports long and float data type conversion.
If you have any solution for this, please let us know.

Can you see if it works with:

output {
    stdout { }
} 

Instead of the elasticsearch output?
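
For instance (just a sketch), the rubydebug codec prints every event to the console, so you can see exactly how each field is typed after the mutate filter and before anything reaches Elasticsearch:

output {
    stdout { codec => rubydebug }
}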

Yes,
I have 188 records, of which 180 get pushed into Elasticsearch. In the same column a few values are decimal and a few are long, even though I cast everything to a decimal format in the SQL query. I have also used mutate, but it is not working. Logstash randomly throws the exception for long and float type casting. Please give us a solution to this issue.

If it’s something that happens only when you send to Elasticsearch, does Elasticsearch show any errors in its logs? Because it should, right?

Yes, we are getting the same exception in Elasticsearch as well. I have checked the mapping Elasticsearch created for that index (the Logstash index), and the column (key) has the float data type, yet I am still getting the long-to-float exception, and sometimes, randomly, the float-to-long exception. In Kibana the field shows up with the number data type.
I don’t have a solution for this issue anywhere, because it happens randomly. After the index is created in Elasticsearch with 8 records failing out of 188, I restart Logstash with the same query, and then all 188 records get pushed, which means Elasticsearch ends up with 180 + 188 = 368 records.
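
For reference, I check the mapping with the get-mapping API, roughly like this (credentials and index name as in my configuration above; -k because certificate verification is disabled):

curl -k -u c-ajitb:pass#1234 "https://localhost:9200/ajit_test/_mapping?pretty"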

Hi,

I may not understand correctly, but is this issue related to ReadonlyREST?

kr,

Fred

Yeah, as much as I want to help, this seems quite specific, and I feel quite powerless about this bug. Moreover, I struggle to understand how ROR is involved in it.

However, @ajit, maybe sharing the Java stack trace will be helpful.

Well, I also had some trouble with integer, long and float, but it was related to my index definition, and especially to how the fields were defined.

In fact, I replaced all long and float definitions with double.

But it was not related to RoR.
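
For illustration, something along these lines (a sketch for Elasticsearch 6.x; the template name and index pattern are made up, and "doc" is assumed as the document type because that is the Logstash 6.x default). The dynamic templates force every dynamically detected numeric field to double, so it no longer matters whether the first value to arrive in a field is a whole number or a decimal:

PUT _template/force_double
{
  "index_patterns": ["ajit_test*"],
  "mappings": {
    "doc": {
      "dynamic_templates": [
        { "longs_as_double":  { "match_mapping_type": "long",   "mapping": { "type": "double" } } },
        { "floats_as_double": { "match_mapping_type": "double", "mapping": { "type": "double" } } }
      ]
    }
  }
}

After putting the template in place, delete the index and let Logstash recreate it so the new mapping takes effect.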
