Hive job throws java.io.FileNotFoundException: /.../container_000001 (Is a directory)

Summary: A Hive job hangs while running; the ResourceManager Web UI shows log4j failing with java.io.FileNotFoundException (Is a directory) on the container log path.

Scenario: a Hive job hangs while running.

Error: checking the ResourceManager Web UI shows the following log:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/hadoop-2.7.2/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop-2.7.2/tmp/nm-local-dir/usercache/root/appcache/application_1468583637020_0006/filecache/10/job.jar/job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:ERROR setFile(null,true) call failed.
java.io.FileNotFoundException: /hadoop/hadoop-2.7.2/logs/userlogs/application_1468583637020_0006/container_e58_1468583637020_0006_01_000001 (Is a directory)
 at java.io.FileOutputStream.open(Native Method)
 at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
 at java.io.FileOutputStream.<init>(FileOutputStream.java:142)
 at org.apache.log4j.FileAppender.setFile(FileAppender.java:294)
 at org.apache.log4j.FileAppender.activateOptions(FileAppender.java:165)
 at org.apache.hadoop.yarn.ContainerLogAppender.activateOptions(ContainerLogAppender.java:55)
 at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
 at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
 at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
 at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
 at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
 at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
 at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
 at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
 at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
 at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
 at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:64)
 at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:285)
 at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:156)
 at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
 at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:657)
 at org.apache.hadoop.service.AbstractService.<clinit>(AbstractService.java:43)
Jul 19, 2016 7:32:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver as a provider class
Jul 19, 2016 7:32:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.yarn.webapp.GenericExceptionHandler as a provider class
Jul 19, 2016 7:32:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory register
INFO: Registering org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices as a root resource class
Jul 19, 2016 7:32:00 PM com.sun.jersey.server.impl.application.WebApplicationImpl _initiate
INFO: Initiating Jersey application, version 'Jersey: 1.9 09/02/2011 11:17 AM'
Jul 19, 2016 7:32:00 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.JAXBContextResolver to GuiceManagedComponentProvider with the scope "Singleton"
Jul 19, 2016 7:32:01 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.yarn.webapp.GenericExceptionHandler to GuiceManagedComponentProvider with the scope "Singleton"
Jul 19, 2016 7:32:01 PM com.sun.jersey.guice.spi.container.GuiceComponentProviderFactory getComponentProvider
INFO: Binding org.apache.hadoop.mapreduce.v2.app.webapp.AMWebServices to GuiceManagedComponentProvider with the scope "PerRequest"


Analysis:
1>. Ran the stock wordcount MapReduce job to confirm that plain MR jobs still complete -- OK (a smoke-test sketch follows this list).
2>. The YARN resource parameters needed tuning.
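For reference, a minimal sketch of that wordcount smoke test, assuming the install path seen in the logs (/hadoop/hadoop-2.7.2); the HDFS paths /tmp/wc-in and /tmp/wc-out are placeholders, not taken from the original run:

# Run the bundled wordcount example to verify that plain MapReduce jobs complete
cd /hadoop/hadoop-2.7.2
bin/hdfs dfs -mkdir -p /tmp/wc-in
bin/hdfs dfs -put etc/hadoop/core-site.xml /tmp/wc-in/
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.2.jar \
    wordcount /tmp/wc-in /tmp/wc-out
bin/hdfs dfs -cat /tmp/wc-out/part-r-00000 | head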

Solution: modify yarn-site.xml on every node, then restart the YARN services for the change to take effect (a restart sketch follows the configuration below).


<property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>8192</value>
    <description>Total amount of memory, in MB, that the NodeManager on this node may allocate to containers</description>
</property>
<property>
    <name>yarn.scheduler.minimum-allocation-mb</name>
    <value>1024</value>
    <description>Minimum memory a single container can request; the default is 1024 MB</description>
</property>
<property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>8192</value>
    <description>Maximum memory a single container can request; the default is 8192 MB</description>
</property>
<property>
    <name>yarn.nodemanager.resource.cpu-vcores</name>
    <value>4</value>
    <description>Number of vcores the NodeManager on this node may allocate to containers</description>
</property>
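To apply the change, the updated yarn-site.xml has to be copied to every node and the YARN daemons restarted. A rough sketch using the stock Hadoop 2.7.2 scripts; node1..node3 are placeholder hostnames, not from the original cluster:

# Distribute the updated yarn-site.xml to every node (node1..node3 are placeholders)
for host in node1 node2 node3; do
    scp /hadoop/hadoop-2.7.2/etc/hadoop/yarn-site.xml \
        $host:/hadoop/hadoop-2.7.2/etc/hadoop/
done

# Restart YARN so the new memory/vcore limits are picked up
/hadoop/hadoop-2.7.2/sbin/stop-yarn.sh
/hadoop/hadoop-2.7.2/sbin/start-yarn.sh

# Confirm the NodeManagers re-registered with the new resources
/hadoop/hadoop-2.7.2/bin/yarn node -list -all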
