Details
- Type: Bug
- Status: Resolved
- Priority: Major
- Resolution: Invalid
- Affects Version/s: 2.3.0
- Fix Version/s: None
- Component/s: None
Description
I am executing a jar file with spark-submit. The jar is a Scala program that combines Spark and non-Spark operations.

The issue arises when I execute a stored procedure from Scala using JDBC. This SP lives in a Microsoft SQL Server database and, basically, performs some operations and populates a table with about 500 rows, one by one.

The next step in the program reads that table and performs some additional calculations. This step consistently retrieves fewer rows than the stored procedure created, because it is not properly synchronized with the previous one: it starts executing without waiting for the stored procedure to finish.
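For context, the two steps look roughly like this. This is a minimal sketch, not my exact code: the connection string, procedure name, and table name are hypothetical placeholders.

```scala
import java.sql.DriverManager
import org.apache.spark.sql.SparkSession

// Hypothetical connection details; adjust to your environment.
val jdbcUrl = "jdbc:sqlserver://myhost:1433;databaseName=mydb;user=u;password=p"

// Step 1: run the stored procedure over plain JDBC.
// The procedure inserts ~500 rows into a table, one by one.
val conn = DriverManager.getConnection(jdbcUrl)
try {
  val stmt = conn.prepareCall("{call dbo.populate_table}") // hypothetical SP name
  stmt.execute()
  stmt.close()
} finally {
  conn.close()
}

// Step 2: read the populated table back with Spark.
// This read sometimes sees fewer rows than the SP inserted.
val spark = SparkSession.builder.appName("example").getOrCreate()
val df = spark.read
  .format("jdbc")
  .option("url", jdbcUrl)
  .option("dbtable", "dbo.populated_table") // hypothetical table name
  .load()
```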
I have tried:
- Inserting a Thread.sleep(10000) between the two instructions, which seems to work.
- Running the program with just one executor, which does not help.

I would like to know why this is happening and how I can solve it without the sleep, because that is not an acceptable solution.
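One direction I am considering: with the SQL Server JDBC driver, a CallableStatement.execute() can return as soon as the first result set or update count is available, while the procedure is still running; each per-row INSERT can produce its own update count. A sketch of draining every result set and update count so the call only returns when the procedure has completely finished (this is an assumption about the cause, and the procedure name is hypothetical):

```scala
import java.sql.{CallableStatement, Connection}

// Consume all result sets and update counts produced by the call, so
// control does not return to the caller until the procedure is done.
def executeAndDrain(conn: Connection, call: String): Unit = {
  val stmt: CallableStatement = conn.prepareCall(call)
  try {
    var hasResultSet = stmt.execute()
    var done = false
    while (!done) {
      if (hasResultSet) {
        // A result set: read/close it before moving on.
        stmt.getResultSet.close()
      } else if (stmt.getUpdateCount == -1) {
        // No more results of either kind: the call is finished.
        done = true
      }
      if (!done) hasResultSet = stmt.getMoreResults()
    }
  } finally {
    stmt.close()
  }
}

// Usage (hypothetical SP name):
// executeAndDrain(conn, "{call dbo.populate_table}")
```

Adding SET NOCOUNT ON at the top of the stored procedure might also suppress the per-row counts, but I have not confirmed either approach beyond the sleep workaround.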
Thank you very much!!