Problem background
When submitting jobs with pyspark to YARN, every submission took a long time to start, whereas at my previous company submitting jobs to YARN was fast. So I looked into it, and noticed a WARN message during submission:
WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
A quick search showed that every time we run a job without specifying spark.yarn.archive or spark.yarn.jars, Spark packs up all the jars under the jars directory of the installation path and uploads them to the distributed cache. The official documentation puts it this way: To make Spark runtime jars accessible from YARN side, you can specify spark.yarn.archive or spark.yarn.jars. For details please refer to Spark Properties. If neither spark.yarn.archive nor spark.yarn.jars is specified, Spark will create a zip file with all jars under $SPARK_HOME/jars and upload it to the distributed cache.
Tuning
* First, upload all the jars under the Spark installation path to HDFS
* Then add the following to spark-defaults.conf in Spark's conf directory:
spark.yarn.archive hdfs://ycluster-3/data/hadoop/spark-jars/*jar
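The two steps above can be sketched as follows. This is a sketch, not a verified recipe: the HDFS paths reuse the ycluster-3 path from the config line above, and the archive filename spark-libs.jar is an assumption. Note that per the Spark documentation the two properties behave differently: spark.yarn.jars accepts a list of jars (globs allowed), while spark.yarn.archive expects a single archive file.

```shell
# Option 1: upload the individual jars and point spark.yarn.jars at them.
# (spark.yarn.jars accepts globs; paths are assumptions.)
hdfs dfs -mkdir -p hdfs://ycluster-3/data/hadoop/spark-jars
hdfs dfs -put "$SPARK_HOME"/jars/* hdfs://ycluster-3/data/hadoop/spark-jars/
# spark-defaults.conf:
#   spark.yarn.jars    hdfs://ycluster-3/data/hadoop/spark-jars/*.jar

# Option 2: pack all jars into one uncompressed archive and point
# spark.yarn.archive at that single file (not a glob).
jar cv0f spark-libs.jar -C "$SPARK_HOME"/jars/ .
hdfs dfs -put spark-libs.jar hdfs://ycluster-3/data/hadoop/
# spark-defaults.conf:
#   spark.yarn.archive hdfs://ycluster-3/data/hadoop/spark-libs.jar
```

With either option in place, the WARN message disappears and submission no longer waits for the local jars to be zipped and uploaded on every run.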
A bug
I remember that after making these changes and submitting a job to YARN, it failed with the following error:
ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
    at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
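This SparkContext error only says that the YARN application ended; the real cause is usually in the ApplicationMaster log on the cluster. One thing worth checking in this situation is whether spark.yarn.archive points at a single archive file rather than a glob, since globs are only supported by spark.yarn.jars. A typical way to dig out the underlying error (the application ID below is a placeholder):

```shell
# List recently failed/killed applications to find the application ID.
yarn application -list -appStates KILLED,FAILED

# Fetch the aggregated container logs, including the ApplicationMaster's
# stderr, which usually contains the actual launch failure.
yarn logs -applicationId application_1500000000000_0001 | less
```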
As for how I fixed it, I have forgotten; I will come back and fill this in when I remember.