Details
- Type: Improvement
- Status: Open
- Priority: Minor
- Resolution: Unresolved
- Affects Version/s: 3.1.0
- Fix Version/s: None
- Component/s: None
Sure, if you can figure out how to enable deploying of javadoc artifacts with the release, that'd do it. I think they're already built of course, just not part of what gets pushed. From a quick look at the build, I don't see what if anything would disable those artifacts, and I see them built locally. The copy script also seems to take all .jar files. It's a legit enhancement just not something I immediately see how to fix.
Description
Currently the Spark javadoc is only accessible here: http://spark.apache.org/docs/latest/api/java
Maven cannot find it via mvn dependency:resolve -Dclassifier=javadoc, so Eclipse, for example, cannot automatically set the javadoc location for each jar.
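If Spark published javadoc jars under the standard classifier, a downstream build could resolve them in one step. A minimal sketch of the commands involved (the exact goals shown are from the stock maven-dependency-plugin and maven-eclipse-plugin, not anything Spark-specific):

```shell
# Resolve the -javadoc.jar for every dependency of the project
mvn dependency:resolve -Dclassifier=javadoc

# Have the Eclipse plugin wire the downloaded javadoc into .classpath
mvn eclipse:eclipse -DdownloadJavadocs=true
```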
Therefore each developer, in every sub-project that depends on Spark, must edit each Spark-related jar (about ten of them) and set the HTTP location of the javadoc manually.
Today 99% of the APIs available on Maven follow the standard of delivering a separate downloadable javadoc through the javadoc classifier.
Can Spark respect this standard too?
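For reference, the conventional way to attach a -javadoc.jar to the deployed artifacts is the jar goal of maven-javadoc-plugin. A hedged sketch of the pom.xml fragment (the plugin version shown is an assumption, and whether this slots cleanly into Spark's existing build is exactly the open question in the comment above):

```xml
<!-- Sketch: attach a -javadoc.jar under the javadoc classifier
     so it is pushed alongside the main jar on deploy -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <version>2.9.1</version>
  <executions>
    <execution>
      <id>attach-javadocs</id>
      <goals>
        <goal>jar</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```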