Compiling Spark 3.0.1 from Source with the Huawei Cloud Mirror

Community contribution · 714 · 2022-05-30

1. Environment Preparation

      git version 1.8.3.1

      java version "1.8.0_221"

      scala version 2.12.8

      apache-maven-3.6.1

      [root@hadoop001 spark]# yum install -y git

      git clone https://github.com/apache/spark.git

The following problems may occur at this step.

Problem 1:

      Cloning into 'spark'...

      fatal: unable to access 'https://github.com/apache/spark.git/': Could not resolve host: github.com; Unknown error

The fix for this is as follows:

      [root@hadoop001 ~]# ping github.com

      PING github.com (192.30.255.113) 56(84) bytes of data.

# Add the github.com address to /etc/hosts

      [root@hadoop001 sourcecode]$ vim /etc/hosts

      127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4

      ::1 localhost localhost.localdomain localhost6 localhost6.localdomain6

      192.168.100.100 hadoop001

      192.30.255.113 github.com
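The manual hosts edit above can also be scripted. A minimal sketch, made idempotent so re-running it does not duplicate the entry (the IP is the one resolved in this walkthrough and changes over time; the scratch file path stands in for /etc/hosts, which requires root):

```shell
# Append the GitHub host entry only if it is not already present.
HOSTS_FILE=./hosts.demo          # use /etc/hosts on a real machine (as root)
ENTRY="192.30.255.113 github.com"

touch "$HOSTS_FILE"
if ! grep -qF "github.com" "$HOSTS_FILE"; then
    echo "$ENTRY" >> "$HOSTS_FILE"
fi
grep github.com "$HOSTS_FILE"
```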

Problem 2:

      [root@hadoop001 ~]# git clone https://github.com/apache/spark.git

      Cloning into 'spark'...

      fatal: unable to access 'https://github.com/apache/spark.git/': Failed connect to github.com:443; Connection refused

Solution: first set the proxy globally, then unset it again, which clears any stale global proxy configuration:

      [root@hadoop001 ~]# git config --global http.proxy http://127.0.0.1:1080

      [root@hadoop001 ~]# git config --global https.proxy http://127.0.0.1:1080

      [root@hadoop001 ~]# git config --global --unset http.proxy

      [root@hadoop001 ~]# git config --global --unset https.proxy

Inspect the Spark source tree:

      [root@hadoop001 ~]# cd spark/

      [root@hadoop001 spark]# ll

      total 380

      -rw-r--r-- 1 root root 2643 Nov 28 11:24 appveyor.yml

      drwxr-xr-x 3 root root 4096 Nov 28 11:24 assembly

      drwxr-xr-x 2 root root 4096 Nov 28 11:24 bin

      drwxr-xr-x 2 root root 4096 Nov 28 11:24 binder

      drwxr-xr-x 2 root root 4096 Nov 28 11:24 build

      drwxr-xr-x 9 root root 4096 Nov 28 11:24 common

      drwxr-xr-x 2 root root 4096 Nov 28 11:24 conf

      -rw-r--r-- 1 root root 997 Nov 28 11:24 CONTRIBUTING.md

      drwxr-xr-x 4 root root 4096 Nov 28 11:24 core

      drwxr-xr-x 5 root root 4096 Nov 28 11:24 data

      drwxr-xr-x 6 root root 4096 Nov 28 11:24 dev

      drwxr-xr-x 9 root root 12288 Nov 28 11:24 docs

      drwxr-xr-x 3 root root 4096 Nov 28 11:24 examples

      drwxr-xr-x 12 root root 4096 Nov 28 11:24 external

      drwxr-xr-x 3 root root 4096 Nov 28 11:24 graphx

      drwxr-xr-x 3 root root 4096 Nov 28 11:24 hadoop-cloud

      drwxr-xr-x 3 root root 4096 Nov 28 11:24 launcher

      -rw-r--r-- 1 root root 13453 Nov 28 11:24 LICENSE

      -rw-r--r-- 1 root root 23221 Nov 28 11:24 LICENSE-binary

      drwxr-xr-x 2 root root 4096 Nov 28 11:24 licenses

      drwxr-xr-x 2 root root 4096 Nov 28 11:24 licenses-binary

      drwxr-xr-x 4 root root 4096 Nov 28 11:24 mllib

      drwxr-xr-x 3 root root 4096 Nov 28 11:24 mllib-local

      -rw-r--r-- 1 root root 2002 Nov 28 11:24 NOTICE

      -rw-r--r-- 1 root root 57677 Nov 28 11:24 NOTICE-binary

      -rw-r--r-- 1 root root 122016 Nov 28 11:24 pom.xml

      drwxr-xr-x 2 root root 4096 Nov 28 11:24 project

      drwxr-xr-x 7 root root 4096 Nov 28 11:24 python

      drwxr-xr-x 3 root root 4096 Nov 28 11:24 R

      -rw-r--r-- 1 root root 4488 Nov 28 11:24 README.md

      drwxr-xr-x 3 root root 4096 Nov 28 11:24 repl

      drwxr-xr-x 5 root root 4096 Nov 28 11:24 resource-managers

      drwxr-xr-x 2 root root 4096 Nov 28 11:24 sbin

      -rw-r--r-- 1 root root 20431 Nov 28 11:24 scalastyle-config.xml

      drwxr-xr-x 6 root root 4096 Nov 28 11:24 sql

      drwxr-xr-x 3 root root 4096 Nov 28 11:24 streaming

      drwxr-xr-x 3 root root 4096 Nov 28 11:24 tools

List the Spark branches:

      [root@hadoop001 spark]# git branch -a

      * master

      remotes/origin/HEAD -> origin/master

      remotes/origin/branch-0.5

      remotes/origin/branch-0.6

      remotes/origin/branch-0.7

      remotes/origin/branch-0.8

      remotes/origin/branch-0.9

      remotes/origin/branch-1.0

      remotes/origin/branch-1.0-jdbc

      remotes/origin/branch-1.1

      remotes/origin/branch-1.2

      remotes/origin/branch-1.3

      remotes/origin/branch-1.4

      remotes/origin/branch-1.5

      remotes/origin/branch-1.6

      remotes/origin/branch-2.0

      remotes/origin/branch-2.1

      remotes/origin/branch-2.2

      remotes/origin/branch-2.3

      remotes/origin/branch-2.4

      remotes/origin/branch-3.0

      remotes/origin/master


Switch to Spark 3.0.1:

      [root@hadoop001 spark]# git checkout v3.0.1

      Note: checking out 'v3.0.1'.

      You are in 'detached HEAD' state. You can look around, make experimental

      changes and commit them, and you can discard any commits you make in this

      state without impacting any branches by performing another checkout.

      If you want to create a new branch to retain commits you create, you may

      do so (now or later) by using -b with the checkout command again. Example:

      git checkout -b new_branch_name

      HEAD is now at 2b147c4... Preparing Spark release v3.0.1-rc3
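As the message says, commits made in detached HEAD state are easy to lose. If you intend to commit the source changes in the next section, it is safer to put the tag on a branch first. A sketch in a throwaway repository (the branch name spark-3.0.1-build is my own choice; in the real checkout the command is simply `git checkout -b spark-3.0.1-build v3.0.1`):

```shell
set -e
# Throwaway repo so the example runs anywhere.
rm -rf /tmp/tag-demo && mkdir /tmp/tag-demo && cd /tmp/tag-demo
git init -q
git -c user.email=demo@example.com -c user.name=demo commit -q --allow-empty -m "initial"
git tag v3.0.1

# -b creates a branch at the tag instead of leaving HEAD detached.
git checkout -q -b spark-3.0.1-build v3.0.1
git rev-parse --abbrev-ref HEAD   # prints: spark-3.0.1-build
```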

2. Modifying the Spark Source

Make a small change to the Spark source before compiling. The result we want is that, after the change, running spark-shell prints an extra line, "Hitman who wakes up at five in the morning", in the console.

      [root@hadoop001 deploy]# vim /root/spark/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala
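The post does not show the actual edit. The idea is to add a single println; a sketch of what the change looks like (the insertion point shown here is illustrative, not the real structure of SparkSubmit.scala, and as the verification section later shows, this file turns out to be the wrong place anyway):

```scala
// Illustrative sketch only -- somewhere on the spark-submit startup path:
def main(args: Array[String]): Unit = {
  println("Hitman who wakes up at five in the morning")  // added line
  // ... existing submit logic ...
}
```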

3. Compiling the Spark Source

Add the following to Maven's settings.xml. I used the Huawei Cloud mirror; the Alibaba Cloud mirror also works, and if you have unrestricted network access you do not need a domestic mirror at all.

[root@hadoop001 ~]# vim /usr/maven/apache-maven-3.6.1/conf/settings.xml

<mirrors>
  <mirror>
    <id>huaweicloud</id>
    <mirrorOf>*</mirrorOf>
    <url>https://mirrors.huaweicloud.com/repository/maven/</url>
  </mirror>
  <!--
  <mirror>
    <id>nexus-aliyun</id>
    <mirrorOf>central</mirrorOf>
    <name>Nexus aliyun</name>
    <url>http://maven.aliyun.com/nexus/content/groups/public</url>
  </mirror>
  -->
</mirrors>

......

<profiles>
  <profile>
    <id>cloudera-profile</id>
    <repositories>
      <repository>
        <id>cloudera</id>
        <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
        <releases>
          <enabled>true</enabled>
        </releases>
        <snapshots>
          <enabled>false</enabled>
        </snapshots>
      </repository>
    </repositories>
  </profile>
</profiles>

<activeProfiles>
  <activeProfile>cloudera-profile</activeProfile>
</activeProfiles>

Since my Hadoop build was compiled from the CDH source, the CDH Maven repository also has to be added to the pom.xml in the Spark source directory:

[root@hadoop001 spark]# vim /root/spark/pom.xml

<repository>
  <id>cloudera</id>
  <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
</repository>

Note: before compiling, check that all required software is installed:

      [root@hadoop001 spark]# echo $HADOOP_HOME

      /home/hadoop/app/hadoop-2.6.0-cdh5.7.0

      [root@hadoop001 spark]#

      [root@hadoop001 spark]# echo $JAVA_HOME

      /usr/java/jdk1.8.0_221

      [root@hadoop001 spark]# echo $SCALA_HOME

      /usr/scala/scala-2.12.8

      [root@hadoop001 spark]# echo $MAVEN_HOME

      /usr/maven/apache-maven-3.6.1

[root@hadoop001 spark]# /root/spark/dev/make-distribution.sh --name 2.6.0-cdh5.7.0 --tgz -Phadoop-2.6 -Phive -Phive-thriftserver -Pyarn -Pkubernetes -Dhadoop.version=2.6.0-cdh5.7.0

      [INFO] ------------------------------------------------------------------------

      [INFO] Reactor Summary for Spark Project Parent POM 3.0.1:

      [INFO]

      [INFO] Spark Project Parent POM ........................... SUCCESS [ 2.695 s]

      [INFO] Spark Project Tags ................................. SUCCESS [ 5.812 s]

      [INFO] Spark Project Sketch ............................... SUCCESS [ 7.493 s]

      [INFO] Spark Project Local DB ............................. SUCCESS [ 2.248 s]

      [INFO] Spark Project Networking ........................... SUCCESS [ 4.814 s]

      [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 1.749 s]

      [INFO] Spark Project Unsafe ............................... SUCCESS [ 11.155 s]

      [INFO] Spark Project Launcher ............................. SUCCESS [01:24 min]

      [INFO] Spark Project Core ................................. SUCCESS [08:23 min]

      [INFO] Spark Project ML Local Library ..................... SUCCESS [01:20 min]

      [INFO] Spark Project GraphX ............................... SUCCESS [01:00 min]

      [INFO] Spark Project Streaming ............................ SUCCESS [01:42 min]

      [INFO] Spark Project Catalyst ............................. SUCCESS [05:16 min]

      [INFO] Spark Project SQL .................................. SUCCESS [06:20 min]

      [INFO] Spark Project ML Library ........................... SUCCESS [04:25 min]

      [INFO] Spark Project Tools ................................ SUCCESS [ 31.632 s]

      [INFO] Spark Project Hive ................................. SUCCESS [04:32 min]

      [INFO] Spark Project REPL ................................. SUCCESS [ 29.968 s]

      [INFO] Spark Project Assembly ............................. SUCCESS [ 5.102 s]

      [INFO] Kafka 0.10+ Token Provider for Streaming ........... SUCCESS [ 36.694 s]

      [INFO] Spark Integration for Kafka 0.10 ................... SUCCESS [01:48 min]

      [INFO] Kafka 0.10+ Source for Structured Streaming ........ SUCCESS [01:26 min]

      [INFO] Spark Project Examples ............................. SUCCESS [ 59.937 s]

      [INFO] Spark Integration for Kafka 0.10 Assembly .......... SUCCESS [ 4.603 s]

      [INFO] Spark Avro ......................................... SUCCESS [01:01 min]

      [INFO] ------------------------------------------------------------------------

      [INFO] BUILD SUCCESS

      [INFO] ------------------------------------------------------------------------

      [INFO] Total time: 42:07 min

      [INFO] Finished at: 2020-11-28T14:31:20+08:00

      [INFO] ------------------------------------------------------------------------

4. Verifying the Result

[root@hadoop001 spark]# tar -zxvf /root/spark/spark-3.0.1-bin-2.6.0-cdh5.7.0.tgz -C /home/hadoop/app/
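Note that tar only extracts into a target directory when it is passed with -C; without that flag, tar treats a trailing path as the name of an archive member to extract rather than as a destination. A self-contained check with scratch files:

```shell
set -e
rm -rf /tmp/tar-demo && mkdir -p /tmp/tar-demo/src /tmp/tar-demo/dest
echo hello > /tmp/tar-demo/src/file.txt

# Pack one file, then unpack it into a different directory with -C.
tar -czf /tmp/tar-demo/pkg.tgz -C /tmp/tar-demo/src file.txt
tar -zxf /tmp/tar-demo/pkg.tgz -C /tmp/tar-demo/dest
cat /tmp/tar-demo/dest/file.txt   # prints: hello
```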

      [root@hadoop001 bin]# ./spark-shell

      20/11/28 15:45:01 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

      Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

      Setting default log level to "WARN".

      To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).

      Spark context Web UI available at http://hadoop001:4040

      Spark context available as 'sc' (master = local[*], app id = local-1606549507863).

      Spark session available as 'spark'.

      Welcome to

      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.1
      /_/

      Using Scala version 2.12.10 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_221)

      Type in expressions to have them evaluated.

      Type :help for more information.

      scala>

Unfortunately, this is not the result we wanted. Checking the compiled build shows it is indeed Spark 3.0.1, exactly the version we intended, so where did it go wrong?

      [root@hadoop001 deploy]# git branch -vv

      * (detached from v3.0.1) 2b147c4 Preparing Spark release v3.0.1-rc3

      master 13fd272 [origin/master] Spelling r common dev mlib external project streaming resource managers python

Run spark-submit --version and spark-shell --version and inspect the output.

spark-shell --version does show the expected line, but a plain spark-shell does not. The only explanation is that the source was modified in the wrong place. After some searching, that was indeed the case: to get the result we assumed, the source of the spark repl module has to be modified instead, as shown below:

      [root@hadoop001 repl]# vim /root/spark/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala
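The post elides the diff here too. In the repl module, SparkILoop is what prints the welcome banner, so the change is the same one-line println, now in the right file. A sketch from memory (the surrounding code is abbreviated and should be treated as illustrative, not a verbatim diff):

```scala
// Inside SparkILoop.scala (illustrative sketch):
override def printWelcome(): Unit = {
  // ... existing banner and version output ...
  println("Hitman who wakes up at five in the morning")  // added line
}
```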

      ......

Since the whole of Spark has already been compiled and only the repl module was changed, it is enough to rebuild just the repl submodule. How? The Spark website describes this in detail. First open the pom.xml of the repl module and note that it defines spark-repl_2.12:

      [root@hadoop001 repl]# vim /root/spark/repl/pom.xml

      ......

<parent>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-parent_2.12</artifactId>
  <version>3.0.1</version>
  <relativePath>../pom.xml</relativePath>
</parent>

<artifactId>spark-repl_2.12</artifactId>
<packaging>jar</packaging>
<name>Spark Project REPL</name>
<url>http://spark.apache.org/</url>

The Apache Spark documentation explains how to build individual submodules (a screenshot of that section appeared here in the original post).

The repl submodule is therefore built as follows; -pl selects the module by its artifact id (adding -am would also rebuild the modules it depends on, which is unnecessary here since the full build has already run):

      [root@hadoop001 spark]# ./build/mvn -pl :spark-repl_2.12 clean install

      ......

      [INFO] --- maven-install-plugin:3.0.0-M1:install (default-install) @ spark-repl_2.12 ---

      [INFO] Installing /root/spark/repl/target/spark-repl_2.12-3.0.1.jar to /root/.m2/repository/org/apache/spark/spark-repl_2.12/3.0.1/spark-repl_2.12-3.0.1.jar

      [INFO] Installing /root/spark/repl/dependency-reduced-pom.xml to /root/.m2/repository/org/apache/spark/spark-repl_2.12/3.0.1/spark-repl_2.12-3.0.1.pom

      [INFO] Installing /root/spark/repl/target/spark-repl_2.12-3.0.1-tests.jar to /root/.m2/repository/org/apache/spark/spark-repl_2.12/3.0.1/spark-repl_2.12-3.0.1-tests.jar

      [INFO] Installing /root/spark/repl/target/spark-repl_2.12-3.0.1-sources.jar to /root/.m2/repository/org/apache/spark/spark-repl_2.12/3.0.1/spark-repl_2.12-3.0.1-sources.jar

      [INFO] Installing /root/spark/repl/target/spark-repl_2.12-3.0.1-test-sources.jar to /root/.m2/repository/org/apache/spark/spark-repl_2.12/3.0.1/spark-repl_2.12-3.0.1-test-sources.jar

      [INFO] Installing /root/spark/repl/target/spark-repl_2.12-3.0.1-javadoc.jar to /root/.m2/repository/org/apache/spark/spark-repl_2.12/3.0.1/spark-repl_2.12-3.0.1-javadoc.jar

      [INFO] ------------------------------------------------------------------------

      [INFO] BUILD SUCCESS

      [INFO] ------------------------------------------------------------------------

      [INFO] Total time: 04:31 min

      [INFO] Finished at: 2020-11-28T16:47:29+08:00

      [INFO] ------------------------------------------------------------------------

Replace the original spark-repl jar:

      [root@hadoop001 jars]# mv /home/hadoop/app/spark-3.0.1-bin-2.6.0-cdh5.7.0/jars/spark-repl_2.12-3.0.1.jar spark-repl_2.12-3.0.1.jar_bak

      [root@hadoop001 target]# cp /root/spark/repl/target/spark-repl_2.12-3.0.1.jar /home/hadoop/app/spark-3.0.1-bin-2.6.0-cdh5.7.0/jars/
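The two commands above are a manual backup-then-replace. The same pattern as a small script, demonstrated on scratch files so it runs anywhere (substitute the real jars and target directories on the actual machine):

```shell
set -e
# Scratch layout standing in for the Spark jars dir and the rebuilt jar.
rm -rf /tmp/jar-demo && mkdir -p /tmp/jar-demo/jars /tmp/jar-demo/target
echo old > /tmp/jar-demo/jars/spark-repl_2.12-3.0.1.jar
echo new > /tmp/jar-demo/target/spark-repl_2.12-3.0.1.jar

JAR=spark-repl_2.12-3.0.1.jar
JARS_DIR=/tmp/jar-demo/jars      # real: /home/hadoop/app/spark-3.0.1-bin-2.6.0-cdh5.7.0/jars
BUILD_DIR=/tmp/jar-demo/target   # real: /root/spark/repl/target

# Keep the original under a .bak suffix, then drop in the rebuilt jar.
mv "$JARS_DIR/$JAR" "$JARS_DIR/$JAR.bak"
cp "$BUILD_DIR/$JAR" "$JARS_DIR/"
cat "$JARS_DIR/$JAR"   # prints: new
```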

Running spark-shell again now prints the result we assumed.

5. Summary

Be careful when modifying source code not to edit the wrong place. Compiling from source is also a fundamental skill worth mastering thoroughly. Without unrestricted network access you are likely to hit quite a few errors during the build, most of them caused by network problems.

References

The official build guide:

http://spark.apache.org/docs/latest/building-spark.html

