Refactor Travis CI to support multiple Java Builds
* Moved the mapred-site and yarn-site XML files into newly created folders containing the artifacts for either Hadoop 2.6 or 3.2; the right set is picked up depending on the testing needs in travis.yml.
* Moved the spark-env file into newly created folders containing the artifacts for either Spark 1.6 or 2.4; the right set is picked up depending on the testing needs in travis.yml.
* Created a hadoop-env.sh file for Hadoop 3.2 to store the environment variables required to start the HDFS and YARN services.
* Removed hardcoded values from hadoop.conf and spark.conf; these are now filled in depending on the testing needs.
* Added an `install_hadoop_spark` script that downloads the Hadoop and Spark binaries depending on the testing needs.
* Added a `config_hadoop_spark` script that sets up Hadoop, Spark, and HiBench depending on the testing needs.
* Added a `jdk_ver` script to detect the Java version currently installed on the Travis CI worker (a rough sketch follows below).
* Modified the `restart_hadoop_spark` script to be agnostic to the binaries required for testing.
* travis/config_hadoop_spark.sh:
  * For Java 8 and 11, the `sql` test is skipped since Hive is no longer used to perform queries. Newer Spark versions perform queries through `SparkSession` and no longer use `import org.apache.spark.sql`.
* .travis.yml:
  * Added `dist: trusty` to keep using this distro; Travis picks up xenial if none is specified, and newer Ubuntu versions in Travis do not support OpenJDK 7.
  * Refactored the CI flow to download, set up, run, and test Hadoop and Spark depending on the required JDK (7, 8, or 11).
  * HiBench is configured depending on the required JDK (7, 8, or 11).
  * HiBench is built depending on the required JDK (7, 8, or 11).
  * Benchmarks are run for all configured JDK versions.

Signed-off-by: Luis Ponce <[email protected]>
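
Below is a minimal sketch of how a `jdk_ver` helper and the per-JDK artifact selection could work. The function body, the variable names, and the JDK-to-Hadoop/Spark mapping (7 -> Hadoop 2.6 / Spark 1.6, 8 and 11 -> Hadoop 3.2 / Spark 2.4) are assumptions for illustration, not the exact contents of the scripts added in this commit.

```bash
#!/usr/bin/env bash
# Sketch: detect the major Java version (7, 8, or 11) on the Travis worker.
# The real jdk_ver script in this commit may differ.

jdk_ver() {
  # `java -version` prints e.g. `openjdk version "1.8.0_222"` or
  # `openjdk version "11.0.4"` to stderr.
  local raw
  raw=$(java -version 2>&1 | awk -F '"' '/version/ {print $2}')
  if [[ "$raw" == 1.* ]]; then
    echo "$raw" | cut -d. -f2   # pre-9 scheme: 1.7.0_95 -> 7, 1.8.0_222 -> 8
  else
    echo "$raw" | cut -d. -f1   # post-9 scheme: 11.0.4 -> 11
  fi
}

# Assumed mapping from JDK to the Hadoop/Spark artifact folders described above.
case "$(jdk_ver)" in
  7)    HADOOP_VER=2.6 SPARK_VER=1.6 ;;
  8|11) HADOOP_VER=3.2 SPARK_VER=2.4 ;;
  *)    echo "unsupported JDK" >&2; exit 1 ;;
esac
echo "Using Hadoop ${HADOOP_VER} and Spark ${SPARK_VER}"
```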