Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Incomplete
- Affects Version: 2.3.1
- Fix Version: None
- Environment: Ubuntu 18.04, Spark 2.3.1, org.postgresql:postgresql:42.2.4
Description
Whenever I try to save a DataFrame that has a column containing a JSON string to the latest Postgres, I get org.apache.spark.sql.catalyst.parser.ParseException: DataType json is not supported. Since Postgres has good JSON support and I am using the latest postgresql client, I expected this to work. Here is an example of the code that crashes:
val columnTypes = """id integer, parameters json, title text, gsm text, gse text, organism text, characteristics text, molecule text, model text, description text, treatment_protocol text, extract_protocol text, source_name text, data_processing text, submission_date text, last_update_date text, status text, type text, contact text, gpl text"""

myDataframe.write
  .format("jdbc")
  .option("url", "jdbc:postgresql://db/sequencing")
  .option("customSchema", columnTypes)
  .option("dbtable", "test")
  .option("user", "postgres")
  .option("password", "changeme")
  .save()
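The error comes from Spark's Catalyst SQL parser, which validates the types listed in customSchema and has no json type. One commonly suggested workaround (a sketch, not a confirmed fix for this ticket) is to create the target table in Postgres beforehand with a json column, keep the column as a plain string on the Spark side, and add stringtype=unspecified to the JDBC URL so the Postgres driver lets the server implicitly cast the string to json on insert. The table and column names below are illustrative:

```scala
// Assumes the table was pre-created in Postgres, e.g.:
//   CREATE TABLE test (id integer, parameters json, title text /* ... */);
// With stringtype=unspecified, pgJDBC sends string parameters untyped,
// so Postgres can cast them to the json column type on the server side.
myDataframe.write
  .format("jdbc")
  .option("url", "jdbc:postgresql://db/sequencing?stringtype=unspecified")
  .option("dbtable", "test")
  .option("user", "postgres")
  .option("password", "changeme")
  .mode("append") // append into the pre-created table instead of recreating it
  .save()
```

Note that customSchema is dropped entirely here: the cast happens in Postgres, so Spark never needs to parse the json type.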