
Successfully built Spark's JavaDocs on Windows

Hi Spark Community,

In case anyone has had trouble building Spark's JavaDocs under Windows (like me), here is the solution:

I've extracted from Ant's debug output the needed information about which sources should be processed and with which options, and created a javadoc argument file which can be used to build the JavaDocs for the Spark 2.5.1 source code release.
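For anyone curious what such an argument file looks like: javadoc accepts a file of options via the `@file` syntax, so the extracted Ant output can be saved along these lines. This is only an illustrative sketch; the actual paths, window title, and package names come from your own Ant debug output, not from this post:

```
-d ..\target\javadoc
-sourcepath ..\src\java
-windowtitle "Spark 2.5.1 API"
-subpackages org.jivesoftware
```

You would then pass it to the tool as `javadoc @javadoc_args.txt`.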

If anyone is interested, I can mail her/him the batch file and the argument file, and also the whole JavaDocs package as a RAR archive.

Thanks for the attention.

Kind regards,

sbogus.

sbogus: Would you mind emailing me those files? I have been trying to build the Spark JavaDocs myself. Please email: michael.botsakos@gmail.com. Thanks.

Hi Michael,

I've sent you the build files; you just need to place them in the build sub-folder of the Spark source tree and then execute the build_javadocs.bat batch file.
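For anyone following along without the mailed files, the batch file amounts to little more than invoking javadoc with an @-prefixed argument file. A minimal sketch, assuming a JDK's javadoc.exe is on the PATH and the argument file sits next to the batch file (the file name here is an assumption, not necessarily what the mailed file is called):

```bat
@echo off
rem build_javadocs.bat -- build the Spark JavaDocs from a javadoc argument file
rem Run this from the build sub-folder of the Spark source tree.
javadoc @javadoc_args.txt
if errorlevel 1 echo JavaDoc build failed.
```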

Hope this helps.

Kind regards,

sbogus.

Please send me those files. (yuan_tiger@sina.com.cn)

I'm also looking forward to you telling me how to make Spark's install file under Windows (such as spark_2_5_1.msi).

Thanks

Hi sbogus,

please send me those files at shweta0110@gmail.com.

Thanks and Regards,

Shweta

Hi Shweta,

Hi Yuan,

I've sent the JavaDocs package to you both; hope it helps.

Kind regards,

sbogus.

I've received it. Great work!


Hi Yuan,

I've re-read your post; you're asking for a way to build MS Windows Installer setup files from the Spark sources.

In order to successfully build the MS Windows Installer targets in the Spark build file, you need a proper installation of Advanced Installer, a commercial software package for embedding Java applications in MS Windows executable proxies (Java hosts).

If you'd like to be able to create MS Windows executable setups for your Java applications (including Spark), you'll need the commercial product Install4j. The corresponding build target has already been integrated into Spark's build.xml file.
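For context, an install4j build target in an Ant build file typically looks something like the following sketch. The taskdef classname and ant.jar location follow install4j's own Ant task documentation; the target name and project file path here are illustrative assumptions, so check Spark's actual build.xml for the real ones:

```xml
<!-- Load the install4j Ant task (ships with the commercial product). -->
<taskdef name="install4j"
         classname="com.install4j.Install4JTask"
         classpath="${install4j.home}/bin/ant.jar"/>

<target name="installer" depends="jar">
  <!-- Compiles the .install4j project file into native setup executables. -->
  <install4j projectfile="build/installer/spark.install4j"/>
</target>
```

Running `ant installer` (with install4j installed and `install4j.home` set) would then produce the Windows setup executable.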

Hope this helps.

Kind regards,

sbogus.