Update code to support newer java versions #586
Open
luisfponce wants to merge 6 commits into Intel-bigdata:master from luisfponce:update_code_to_support_newer_java_versions
Commits on Jul 23, 2019
Scala < 2.12 does not compile on the Java 11 JDK, and Scala 2.12 requires upgrading org.apache.kafka from `0.8.2.1` to at least `0.10.2.2`. That Kafka version would in turn require porting the following classes:

* KafkaCollector.scala
* KafkaConsumer.scala
* MetricsUtil.scala

To avoid breaking the streaming benchmarks on Scala 2.11 and 2.10, the following changes are implemented:

* Added a profile that skips compiling the Scala code via the `maven.assembly.plugin.goal` variable.
* Changed the goal of the `scala-compile-first` execution to vary with this profile when the Kafka Scala code does not need to be compiled.

Signed-off-by: Luis Ponce <[email protected]>
Commit 18b4234
Update code to prepare for Hadoop 3.2.0

* Created a copyMerge method in TestDFSIOEnh.java because the hadoop-common 3.2.0 jar has deprecated it ([HADOOP-12967](https://issues.apache.org/jira/browse/HADOOP-12967) removed FileUtil.copyMerge).
* Created a checkDest method in TestDFSIOEnh.java because it is private in org.apache.hadoop.fs.FileUtil and is required by copyMerge.
* Changed LoggerFactory to org.slf4j because org.slf4j.Logger cannot be converted to org.apache.commons.logging.Log.

Signed-off-by: Luis Ponce <[email protected]>
Commit 40dfcce
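The removed `FileUtil.copyMerge` concatenated every part file in a source directory into one destination file, first checking (via the private `checkDest`) that the destination does not already exist. The sketch below illustrates that logic with a simplified local-filesystem analogue in plain `java.nio`; the actual re-implementation in TestDFSIOEnh.java operates on `org.apache.hadoop.fs.FileSystem` paths instead, and the class and sort order here are assumptions for illustration.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Comparator;
import java.util.stream.Stream;

/** Simplified local-filesystem sketch of the copyMerge logic the commit
 *  re-implements: concatenate every regular file in srcDir into dstFile,
 *  in name order, mirroring what FileUtil.copyMerge used to do. */
public class CopyMergeSketch {

    public static void copyMerge(Path srcDir, Path dstFile) throws IOException {
        checkDest(dstFile);
        try (OutputStream out = Files.newOutputStream(dstFile);
             Stream<Path> parts = Files.list(srcDir)) {
            for (Path p : parts.filter(Files::isRegularFile)
                               .sorted(Comparator.comparing(Path::getFileName))
                               .toArray(Path[]::new)) {
                Files.copy(p, out);  // append each part file in order
            }
        }
    }

    /** Analogue of the private FileUtil.checkDest: refuse to overwrite. */
    private static void checkDest(Path dst) throws IOException {
        if (Files.exists(dst)) {
            throw new IOException("Target " + dst + " already exists");
        }
    }
}
```

In the Hadoop version, `Files.newOutputStream`/`Files.copy` would be replaced by `FileSystem.create` and a read/write loop over `FSDataInputStream`.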
Commits on Aug 22, 2019
* sparkbench/assembly/pom.xml:
  * Changed the property-name activation on the `allModules` profile.
  * Added a new profile that excludes the `sparkbench-streaming` artifact.
* sparkbench/pom.xml:
  * Changed the property-name activation on the `allModules` profile.
  * Added a new profile that excludes the `streaming` module.
  * Added a spark2.4 profile because spark-core_2.12 supports only versions >= 2.4.0.
  * Added a scala2.12 profile; Scala < 2.12 does not compile on the Java 11 JDK.
  * Added a hadoop3.2 profile to propagate this variable to all Spark benchmarks.
* sparkbench/streaming/pom.xml:
  * Added the spark2.4 profile to the sparkbench-streaming POM with spark-streaming-kafka-0-8_2.11 version 2.4.0.

Signed-off-by: Luis Ponce <[email protected]>
Commit 4eb1191
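The module-exclusion profiles described above would follow Maven's property-based activation pattern, along the lines of this sketch (the profile id, property name, and module names are assumptions; only the idea of "activate on a property, omit `streaming`" comes from the commit message):

```xml
<!-- Hypothetical sketch of a profile that builds every sparkbench module
     except streaming, activated when the given property is set. -->
<profiles>
  <profile>
    <id>noStreaming</id>
    <activation>
      <property>
        <name>skipStreaming</name>
      </property>
    </activation>
    <modules>
      <module>common</module>
      <module>assembly</module>
      <!-- streaming intentionally omitted for Scala 2.12 / JDK 11 builds -->
    </modules>
  </profile>
</profiles>
```

A build such as `mvn package -DskipStreaming` would then select this profile instead of `allModules`.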
Commits on Aug 27, 2019
Refactor Travis CI to support multiple Java builds
* Moved the mapred-site and yarn-site XML files into a new folder containing the artifacts for either Hadoop 2.6 or 3.2; the right set is picked up depending on the testing needs in travis.yml.
* Moved the spark-env file into a new folder containing the artifacts for either Spark 1.6 or 2.4; the right one is picked up depending on the testing needs in travis.yml.
* Created a hadoop-env.sh file for Hadoop 3.2 to store the environment variables required to start the HDFS and YARN services.
* Removed hardcoded values from hadoop.conf and spark.conf; these are filled in depending on the testing needs.
* Added an `install_hadoop_spark` script that downloads the Hadoop and Spark binaries depending on the testing needs.
* Added a `config_hadoop_spark` script that sets up Hadoop, Spark, and HiBench depending on the testing needs.
* Added a `jdk_ver` script that picks up the Java version currently installed in Travis CI.
* Modified the `restart_hadoop_spark` script to be agnostic to the binaries required for testing.
* travis/config_hadoop_spark.sh:
  * For Java 8 and 11, skipping the `sql` test since Hive is no longer used to perform queries; newer Spark versions run queries through `SparkSession` rather than `import org.apache.spark.sql`.
* .travis.yml:
  * Added `dist: trusty` to keep using this distro; Travis picks xenial if not specified, and newer Ubuntu versions do not support OpenJDK 7.
  * Refactored the CI flow to download, set up, run, and test Hadoop and Spark depending on the required JDK (7, 8, or 11).
  * HiBench is configured and built depending on the required JDK (7, 8, or 11).
  * Benchmarks are run for every JDK version set.

Signed-off-by: Luis Ponce <[email protected]>
Commit e5bedc0
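A `jdk_ver`-style helper has to cope with both the old `1.x` version scheme (JDK 7/8) and the new one (JDK 9+). The actual script is not shown on this page; the sketch below is a hypothetical implementation of that normalisation (the function name `jdk_major` and the parsing approach are assumptions):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the jdk_ver helper: normalise a version string as
# reported by `java -version` (e.g. "1.8.0_222" or "11.0.4") to its major
# number, so the CI flow can branch on 7, 8, or 11.
jdk_major() {
  local v="$1"
  case "$v" in
    1.*) v="${v#1.}"; echo "${v%%.*}" ;;  # old scheme: 1.7.x -> 7, 1.8.x -> 8
    *)   echo "${v%%.*}" ;;               # new scheme: 11.0.4 -> 11
  esac
}

# In the real script the argument would come from the installed JDK:
#   jdk_major "$(java -version 2>&1 | awk -F '"' '/version/ {print $2}')"
jdk_major "1.8.0_222"   # prints 8
```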
* autogen/pom.xml:
  * Added a hadoop mr2 profile to be used for the hadoop hdfs and client dependencies.

Signed-off-by: Luis Ponce <[email protected]>
Commit 27085de
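Such a profile would typically just pin the HDFS and client artifacts behind a profile switch, roughly as below (a sketch only: the version property and artifact list are assumptions; the commit message names only "hdfs and client"):

```xml
<!-- Hypothetical sketch of the mr2 profile in autogen/pom.xml: supplies the
     hadoop-hdfs and hadoop-client dependencies when the profile is active. -->
<profile>
  <id>mr2</id>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
  </dependencies>
</profile>
```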
* docs/build-hibench.md:
  * Updated the 2.4 references to specify the Spark version.
  * Added documentation for specifying the Hadoop version.
  * Added documentation for building with JDK 11.
* README.md:
  * Updated the supported Hadoop/Spark releases to Hadoop 3.2 and Spark 2.4.

Signed-off-by: Luis Ponce <[email protected]>
Commit 0e48596