HDFS: Failed to detect a valid hadoop home directory

I have set up a Hadoop 2.2.0 single node and started it. I can browse the filesystem at http://localhost:50070/.
I then tried to write a dummy file using the following code.

import java.io.File;
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.*;

public class Test {
    public void write(File file) throws IOException {
        FileSystem fs = FileSystem.get(new Configuration());
        Path outFile = new Path("test.jpg");
        FSDataOutputStream out = fs.create(outFile); // the exception below is raised here
        out.close();
    }
}

I get the following exception:

INFO:   DEBUG - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
    INFO:   DEBUG - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)], about=, type=DEFAULT, always=false, sampleName=Ops)
    INFO:   DEBUG - UgiMetrics, User and group related metrics
    INFO:   DEBUG -  Creating new Groups object
    INFO:   DEBUG - Trying to load the custom-built native-hadoop library...
    INFO:   DEBUG - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
    INFO:   DEBUG - java.library.path=/usr/lib/jvm/jdk1.7.0/jre/lib/amd64:/usr/lib/jvm/jdk1.7.0/jre/lib/i386::/usr/java/packages/lib/i386:/lib:/usr/lib
    INFO:   WARN - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    INFO:   DEBUG - Falling back to shell based
    INFO:   DEBUG - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
    INFO:   DEBUG - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000
    INFO:   DEBUG - hadoop login
    INFO:   DEBUG - hadoop login commit
    INFO:   DEBUG - using local user:UnixPrincipal: qualebs
    INFO:   DEBUG - UGI loginUser:qualebs (auth:SIMPLE)
    INFO:   DEBUG - Failed to detect a valid hadoop home directory
    java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
        at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:225)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:250)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
        at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
        at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
        at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:905)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:783)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:772)
        at com.qualebs.managers.HadoopDFS.writer(HadoopDFS.java:41)

Where do I set HADOOP_HOME or hadoop.home.dir?
The operating system is Ubuntu 11.10.

The only configuration files I have edited are the ones below, adding these properties:

  • core-site.xml
    <configuration>
        <property>
            <name>fs.default.name</name>
            <value>hdfs://localhost:9000</value>
        </property>
    </configuration>
  • hdfs-site.xml
    <configuration>
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
    </configuration>

  • mapred-site.xml.template
    <configuration>
        <property>
            <name>mapred.job.tracker</name>
            <value>localhost:9001</value>
        </property>
    </configuration>

    Eagerly awaiting your reply.


    I found the solution by doing the following:

    System.setProperty("hadoop.home.dir","/");

    This exception is thrown by checkHadoopHome() in org.apache.hadoop.util.Shell.

    Hope this helps!
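
    To show where such a call could go, here is a minimal sketch (my own illustration, not part of the original answer) that sets the property at the top of the question's write() method, before the first FileSystem call reaches Hadoop's Shell class:

    public void write(File file) throws IOException {
        // Workaround value from this answer: "/" is not a real Hadoop home,
        // it only stops checkHadoopHome() from complaining. It must be set
        // before org.apache.hadoop.util.Shell is first loaded.
        System.setProperty("hadoop.home.dir", "/");

        FileSystem fs = FileSystem.get(new Configuration());
        FSDataOutputStream out = fs.create(new Path("test.jpg"));
        out.close();
    }

    Setting it in a static initializer (as in the Windows answer below) works just as well, as long as it runs before Shell's static initializer, which is where the stack trace above originates (Shell.<clinit>).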


    This setting does not work on Windows. One workaround is to create a folder in the project (e.g. winutils/bin), put winutils.exe into it (see https://wiki.apache.org/hadoop/WindowsProblems), and then add the following to the Java code:

    // requires: import java.nio.file.Paths;
    static {
        String OS = System.getProperty("os.name").toLowerCase();

        if (OS.contains("win")) {
            // point hadoop.home.dir at the folder whose bin subfolder holds winutils.exe
            System.setProperty("hadoop.home.dir", Paths.get("winutils").toAbsolutePath().toString());
        } else {
            System.setProperty("hadoop.home.dir", "/");
        }
    }

    Hope this helps you.


    I got the same error message from something that, as far as I can tell, had nothing to do with paths at all: my logger was set up incorrectly.

    This is what was causing the error:

    import org.apache.log4j._

    trait Logger {
      val logger = LogManager.getRootLogger
    }

    Fixed:

    import org.apache.log4j._

    trait Logger {
      val logger = LogManager.getRootLogger
      logger.setLevel(Level.INFO)
    }

    So the solution may not be about changing paths at all.


    If you are not using a dedicated user for Hadoop, add this to your terminal's bash file:

    1. start a terminal
    2. sudo vi .bashrc
    3. export HADOOP_HOME=YOUR_HADOOP_HOME_DIRECTORY (do not include the bin folder)
    4. save
    5. restart the terminal and check that it was saved by typing: echo $HADOOP_HOME
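
    Note that a JVM started from an IDE rather than from a terminal may not see variables exported only in .bashrc. As a quick sanity check (my own sketch, not part of the original answer), you can print the variable from inside the application:

    public static void main(String[] args) {
        // prints null if HADOOP_HOME is not visible to this JVM
        System.out.println("HADOOP_HOME = " + System.getenv("HADOOP_HOME"));
    }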