SPARK-34212: For a Parquet table, after changing the precision and scale of a decimal column in Hive, Spark reads an incorrect value


Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 2.4.5, 3.0.1, 3.1.1
    • Fix Version/s: 2.4.8, 3.0.2, 3.1.1
    • Component/s: SQL

    Description

      In Hive, 

      create table test_decimal(amt decimal(18,2)) stored as parquet; 
      insert into test_decimal select 100;
      alter table test_decimal change amt amt decimal(19,3);
      

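      Hive's ALTER TABLE ... CHANGE only updates the column type recorded in the
      metastore; the Parquet file written by the INSERT above still declares amt
      as decimal(18,2). As a rough check (the warehouse path below is
      hypothetical), the mismatch can be seen from spark-shell by comparing the
      catalog schema with the schema inferred from the Parquet footer:

      // Schema reported by the Hive metastore after the ALTER: decimal(19,3).
      spark.table("test_decimal").printSchema()

      // Schema read back from the data file itself (hypothetical warehouse path):
      // the footer still reports decimal(18,2).
      spark.read.parquet("/user/hive/warehouse/test_decimal").printSchema()
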
      In Spark,

      select * from test_decimal;
      
      +--------+
      |    amt |
      +--------+
      | 10.000 |
      +--------+
      

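      The query returns 10.000, while the expected value is 100.000. A plausible
      reading of the symptom (an assumption, not stated in this report) is that
      100.00 was persisted under decimal(18,2) as the unscaled integer 10000, and
      the reader then applies the table's new scale of 3 to that unscaled value
      without first rescaling from the file's scale of 2. A minimal Scala sketch
      of that arithmetic:

      // 100.00 written as decimal(18,2) is stored as the unscaled integer 10000.
      val unscaledFromFile = java.math.BigInteger.valueOf(10000L)

      // Applying the table's new scale (3) directly to the file's unscaled value
      // reproduces the wrong answer shown above.
      val withoutRescale = new java.math.BigDecimal(unscaledFromFile, 3)
      println(withoutRescale)  // 10.000

      // Rescaling from the file's scale (2) to the table's scale (3) yields the
      // expected value.
      val withRescale = new java.math.BigDecimal(unscaledFromFile, 2).setScale(3)
      println(withRescale)     // 100.000

      Under that reading, the fix would be to rescale the stored unscaled value
      using the scale recorded in the file rather than the catalog scale.
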
          People

            Assignee: Dongjoon Hyun (dongjoon)
            Reporter: Yahui Liu (jack86596)
