Spark (11): Running Spark on YARN and Killing an Application
Configuration:
When running Spark on YARN, add the following line to spark-env.sh:
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop

Note: check that $HADOOP_HOME/etc/hadoop is the correct path in your environment, and make sure spark-env.sh also exports HADOOP_HOME.
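For reference, a minimal spark-env.sh sketch with both exports (the Hadoop install path here is assumed to match the yarn-site.xml location used below; adjust it for your environment):

# spark-env.sh: point Spark at the Hadoop client configuration
export HADOOP_HOME=/root/apps/hadoop-2.8.1        # assumed install path; adjust as needed
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop    # directory holding core-site.xml, yarn-site.xml, etc.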
Edit /root/apps/hadoop-2.8.1/etc/hadoop/yarn-site.xml and add the following two properties to the existing configuration:
<property>
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
  <description>Whether virtual memory limits will be enforced for containers</description>
</property>
<property>
  <name>yarn.nodemanager.vmem-pmem-ratio</name>
  <value>4</value>
  <description>Ratio between virtual memory to physical memory when setting memory limits for containers</description>
</property>
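Background: the NodeManager kills any container whose virtual memory usage exceeds vmem-pmem-ratio times its allocated physical memory, and Spark executors often trip this check, so it is disabled (and the ratio raised) here. The new properties only take effect after YARN is restarted, for example with the standard Hadoop sbin scripts:

# restart YARN so the updated yarn-site.xml is picked up (run on the ResourceManager node)
$HADOOP_HOME/sbin/stop-yarn.sh
$HADOOP_HOME/sbin/start-yarn.sh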
Launching
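A typical submission in YARN cluster mode, shown here with the bundled SparkPi example (the jar path, Scala/Spark versions, and the final argument are illustrative; substitute your own application class and jar):

spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_2.11-2.2.0.jar 100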
Killing
yarn application -list
yarn application -kill <Application-Id>
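The -list output shows each running application's Application-Id (of the form application_<clusterTimestamp>_<sequence>); pass that id to -kill. For example, with a hypothetical id:

yarn application -kill application_1520000000000_0001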
