Refactor Travis CI to support multiple Java Builds
* Moved the mapred-site and yarn-site XML files into newly created folders
containing the artifacts for either Hadoop 2.6 or 3.2; these will be picked
up by .travis.yml depending on the testing needs.

* Moved the spark-env file into newly created folders containing the
artifacts for either Spark 1.6 or 2.4; these will be picked up by .travis.yml
depending on the testing needs (see the layout sketch below).
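
A possible layout for these per-version artifact folders (the exact folder
names are an assumption for illustration, not taken from this commit):

```
travis/
├── hadoop2.6/   # mapred-site.xml, yarn-site.xml for Hadoop 2.6
├── hadoop3.2/   # mapred-site.xml, yarn-site.xml, hadoop-env.sh for Hadoop 3.2
├── spark1.6/    # spark-env.sh for Spark 1.6
└── spark2.4/    # spark-env.sh for Spark 2.4
```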

* Created a hadoop-env.sh file for Hadoop 3.2 to store the environment
variables required to start the HDFS and YARN services (sketched below).
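
A minimal sketch of what such a hadoop-env.sh could contain; Hadoop 3.x
refuses to start the HDFS/YARN daemons as root unless the *_USER variables
are set, and the exact values below (including JAVA_HOME) are assumptions:

```sh
# travis/hadoop3.2/hadoop-env.sh (sketch; paths and users are assumptions)
export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-11-openjdk-amd64}
# Hadoop 3.x requires these when the daemons are launched as root (sudo -E)
export HDFS_NAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root
```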

* Removed hardcoded values from hadoop.conf and spark.conf; these will be
filled in depending on the testing needs.
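
For illustration, filling those values in could look like the following; the
conf keys exist in HiBench, but the sed-based substitution is an assumed
mechanism, not necessarily what the scripts in this commit do:

```sh
# Sketch: point the HiBench conf files at whichever Hadoop/Spark was installed.
# HADOOP_BINARIES_FOLDER and SPARK_BINARIES_FOLDER are exported in .travis.yml.
sed -i "s|^hibench.hadoop.home.*|hibench.hadoop.home /opt/${HADOOP_BINARIES_FOLDER}|" conf/hadoop.conf
sed -i "s|^hibench.spark.home.*|hibench.spark.home /opt/${SPARK_BINARIES_FOLDER}|" conf/spark.conf
```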

* Added an `install_hadoop_spark` script that downloads the Hadoop and Spark
binaries depending on the testing needs.
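
A minimal sketch of what that download step could look like, driven by the
HADOOP_VER, SPARK_VER and SPARK_BINARIES_FOLDER variables exported in
.travis.yml (the Spark mirror URL is an assumption):

```sh
#!/bin/bash
# Sketch of travis/install_hadoop_spark.sh: fetch the binaries selected in .travis.yml
set -e
cd /opt

# Hadoop tarball, e.g. hadoop-2.6.5 or hadoop-3.2.0
wget -q "https://archive.apache.org/dist/hadoop/core/hadoop-${HADOOP_VER}/hadoop-${HADOOP_VER}.tar.gz"
tar -xzf "hadoop-${HADOOP_VER}.tar.gz"

# Spark tarball, e.g. spark-1.6.0-bin-hadoop2.6 or spark-2.4.3-bin-without-hadoop
wget -q "https://archive.apache.org/dist/spark/spark-${SPARK_VER}/${SPARK_BINARIES_FOLDER}.tgz"
tar -xzf "${SPARK_BINARIES_FOLDER}.tgz"
```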

* Added a `config_hadoop_spark` script that sets up Hadoop, Spark and
HiBench depending on the testing needs.
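
A rough sketch of what that setup could do: copy the version-specific
artifacts into the installed trees (the per-version folder names follow the
layout assumed earlier):

```sh
#!/bin/bash
# Sketch of travis/config_hadoop_spark.sh: wire in the version-specific artifacts
set -e
hadoop_conf=/opt/${HADOOP_BINARIES_FOLDER}/etc/hadoop
spark_conf=/opt/${SPARK_BINARIES_FOLDER}/conf

# Common Hadoop site files plus the version-specific ones (folder names assumed)
cp ./travis/core-site.xml ./travis/hdfs-site.xml "${hadoop_conf}/"
if [[ "${HADOOP_VER}" == 3.2* ]]; then
    cp ./travis/hadoop3.2/*.xml ./travis/hadoop3.2/hadoop-env.sh "${hadoop_conf}/"
    cp ./travis/spark2.4/spark-env.sh "${spark_conf}/"
else
    cp ./travis/hadoop2.6/*.xml "${hadoop_conf}/"
    cp ./travis/spark1.6/spark-env.sh "${spark_conf}/"
fi

# HiBench configuration used by the benchmarks
cp ./travis/hibench.conf ./travis/benchmarks.lst ./conf/
```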

* Added a `jdk_ver` script to detect the Java version currently installed on
the Travis CI worker.
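
A small sketch of how such a probe can work, parsing `java -version`; the
real script's parsing may differ:

```sh
#!/bin/bash
# Sketch of travis/jdk_ver.sh: print the major Java version (7, 8, 11, ...)
ver=$(java -version 2>&1 | awk -F '"' '/version/ {print $2}')
case "$ver" in
    1.*) echo "$ver" | cut -d. -f2 ;;   # e.g. 1.7.0_95 -> 7, 1.8.0_222 -> 8
    *)   echo "$ver" | cut -d. -f1 ;;   # e.g. 11.0.4 -> 11
esac
```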

* Modified the `restart_hadoop_spark` script to be agnostic to which
binaries are required for testing.
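
For illustration, a version-agnostic restart can key off the exported folder
variable instead of a hardcoded hadoop-2.6.5 path (a sketch, not the script's
exact contents):

```sh
# Sketch: restart the daemons from whichever Hadoop version was installed
/opt/${HADOOP_BINARIES_FOLDER}/sbin/stop-yarn.sh || true
/opt/${HADOOP_BINARIES_FOLDER}/sbin/stop-dfs.sh  || true
/opt/${HADOOP_BINARIES_FOLDER}/bin/hdfs namenode -format -force
/opt/${HADOOP_BINARIES_FOLDER}/sbin/start-dfs.sh
/opt/${HADOOP_BINARIES_FOLDER}/sbin/start-yarn.sh
```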

* travis/config_hadoop_spark.sh:
	* For Java 8 and 11, skip the `sql` tests since Hive is no longer used
to perform the queries; newer Spark versions run them through `SparkSession`
rather than `import org.apache.spark.sql` (see the sketch below).
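
One way the skip could be implemented is by dropping the `sql` workloads from
the benchmark list when a newer JDK is detected; the file edited and the
filter below are assumptions:

```sh
# Sketch: drop Hive-based sql workloads on Java 8/11 before the benchmarks run
if [[ "${java_ver}" == 8 || "${java_ver}" == 11 ]]; then
    sed -i '/sql/d' ./conf/benchmarks.lst
fi
```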

* .travis.yml:
	* Added `dist: trusty` to keep using this distro; Travis picks xenial
if none is specified, and any newer Ubuntu version on Travis won't support
openjdk 7.
	* Refactored the CI flow to download, set up, run and test Hadoop and
Spark depending on the required JDK version (7, 8 or 11).
	* HiBench will be configured depending on the required JDK version
(7, 8 or 11).
	* HiBench will be built depending on the required JDK version
(7, 8 or 11).
	* Benchmarks will be run for every JDK version in the matrix.

Signed-off-by: Luis Ponce <[email protected]>
luisfponce committed Aug 22, 2019
1 parent 4eb1191 commit 2e04a1d
Showing 16 changed files with 773 additions and 35 deletions.
67 changes: 44 additions & 23 deletions .travis.yml
@@ -1,6 +1,9 @@
dist: trusty
sudo: required
language: java
jdk:
- openjdk11
- openjdk8
- openjdk7
before_install:
- cat /etc/hosts # optionally check the content *before*
@@ -10,32 +13,50 @@ before_install:
- cat /proc/cpuinfo | grep cores | wc -l
- free -h
install:
- hibench=$(pwd)
- cd /opt/
- wget http://d3kbcqa49mib13.cloudfront.net/spark-1.6.0-bin-hadoop2.6.tgz
- tar -xzf spark-1.6.0-bin-hadoop2.6.tgz
- wget https://archive.apache.org/dist/hadoop/core/hadoop-2.6.5/hadoop-2.6.5.tar.gz
- tar -xzf hadoop-2.6.5.tar.gz
- cd ${hibench}
- cp ./travis/spark-env.sh /opt/spark-1.6.0-bin-hadoop2.6/conf/
- cp ./travis/core-site.xml /opt/hadoop-2.6.5/etc/hadoop/
- cp ./travis/hdfs-site.xml /opt/hadoop-2.6.5/etc/hadoop/
- cp ./travis/mapred-site.xml /opt/hadoop-2.6.5/etc/hadoop/
- cp ./travis/yarn-site.xml /opt/hadoop-2.6.5/etc/hadoop/
- cp ./travis/hibench.conf ./conf/
- cp ./travis/benchmarks.lst ./conf/
- |
export java_ver=$(./travis/jdk_ver.sh)
if [[ "$java_ver" == 11 ]]; then
export HADOOP_VER=3.2.0
export SPARK_VER=2.4.3
export SPARK_PACKAGE_TYPE=without-hadoop-scala-2.12
elif [[ "$java_ver" == 8 ]]; then
export HADOOP_VER=3.2.0
export SPARK_VER=2.4.3
export SPARK_PACKAGE_TYPE=without-hadoop
elif [[ "$java_ver" == 7 ]]; then
export HADOOP_VER=2.6.5
export SPARK_VER=1.6.0
export SPARK_PACKAGE_TYPE=hadoop2.6
else
exit 1
fi
# Folders where Spark and Hadoop are stored, depending on the required version
export SPARK_BINARIES_FOLDER=spark-$SPARK_VER-bin-$SPARK_PACKAGE_TYPE
export HADOOP_BINARIES_FOLDER=hadoop-$HADOOP_VER
export HADOOP_CONF_DIR=/opt/$HADOOP_BINARIES_FOLDER/etc/hadoop/
sudo -E ./travis/install_hadoop_spark.sh
sudo -E ./travis/config_hadoop_spark.sh
before_script:
- "export JAVA_OPTS=-Xmx512m"
cache:
directories:
- $HOME/.m2
script:
- mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.2 -Dscala=2.11
- mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.0 -Dscala=2.11
- mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=1.6 -Dscala=2.10
- sudo -E ./travis/configssh.sh
- sudo -E ./travis/restart_hadoop_spark.sh
- cp ./travis/hadoop.conf ./conf/
- cp ./travis/spark.conf ./conf/
- /opt/hadoop-2.6.5/bin/yarn node -list 2
- sudo -E ./bin/run_all.sh
- |
if [[ "$java_ver" == 11 ]]; then
mvn clean package -q -Psparkbench -Phadoopbench -Dmaven.javadoc.skip=true -Dhadoop=3.2 -Dspark=2.4 -Dscala=2.12 -Dmaven-compiler-plugin.version=3.8.0 -Dexclude-streaming
elif [[ "$java_ver" == 8 ]]; then
mvn clean package -q -Dmaven.javadoc.skip=true -Dhadoop=3.2 -Dspark=2.4 -Dscala=2.11
elif [[ "$java_ver" == 7 ]]; then
mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.2 -Dscala=2.11
mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=2.0 -Dscala=2.11
mvn clean package -q -Dmaven.javadoc.skip=true -Dspark=1.6 -Dscala=2.10
else
exit 1
fi
sudo -E ./travis/configssh.sh
sudo -E ./travis/restart_hadoop_spark.sh
sudo -E ./bin/run_all.sh
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.