Why does ./bin/spark-shell warn: WARN NativeCodeLoader: Unable to load native-hadoop library for your platform?

Content sourced from Stack Overflow; translated and used under the CC BY-SA 3.0 license.


On Mac OS X, I built Spark from the sources with the following command:

jacek:~/oss/spark
$ SPARK_HADOOP_VERSION=2.4.0 SPARK_YARN=true SPARK_HIVE=true SPARK_GANGLIA_LGPL=true xsbt
...

[info] Set current project to root (in build file:/Users/jacek/oss/spark/)
> ; clean ; assembly
...
[info] Packaging /Users/jacek/oss/spark/examples/target/scala-2.10/spark-examples-1.0.0-SNAPSHOT-hadoop2.4.0.jar ...
[info] Done packaging.
[info] Done packaging.
[success] Total time: 1964 s, completed May 9, 2014 5:07:45 AM

When I start ./bin/spark-shell, I notice the following warning message:

WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

What could be wrong?

jacek:~/oss/spark
$ ./bin/spark-shell
Spark assembly has been built with Hive, including Datanucleus jars on classpath
14/05/09 21:11:17 INFO SecurityManager: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
14/05/09 21:11:17 INFO SecurityManager: Changing view acls to: jacek
14/05/09 21:11:17 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(jacek)
14/05/09 21:11:17 INFO HttpServer: Starting HTTP Server
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.0.0-SNAPSHOT
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0)
Type in expressions to have them evaluated.
Type :help for more information.
...
14/05/09 21:11:49 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
...
Answer:

The Native Libraries guide in the Apache Hadoop documentation says:

The native hadoop library is supported on *nix platforms only. The library does not work with Cygwin or the Mac OS X platform. The native hadoop library is mainly used on the GNU/Linux platform and has been tested on these distributions:

  • RHEL4/Fedora
  • Ubuntu
  • Gentoo

On all the above distributions a 32/64-bit native hadoop library will work with a respective 32/64-bit JVM.

On Mac OS X, then, the warning message can safely be ignored: the native library simply is not available for that platform, and Hadoop falls back to its builtin pure-Java classes, as the message itself says.
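If you want to confirm which native components Hadoop can actually load on your machine, Hadoop releases from 2.4.0 onward ship a `checknative` diagnostic. A minimal check, assuming the `hadoop` binary is on your PATH:

```shell
# Print the availability of each native component (hadoop, zlib, snappy, lz4, ...).
# On Mac OS X every entry is expected to report "false", which matches the WARN
# message from NativeCodeLoader.
hadoop checknative -a
```

The `-a` flag makes the command check all native components and exit non-zero if any is missing, so it will "fail" on Mac OS X even though everything is working as designed.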

Another answer:

If you cd into /sparkDir/conf, rename spark-env.sh.template to spark-env.sh, and then set JAVA_OPTS and the Hadoop directory variable in it, it works.

You also have to add this line to /etc/profile:

export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native/:$LD_LIBRARY_PATH
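Taken together, the steps in this answer can be sketched as the shell commands below. The HADOOP_HOME value and the /sparkDir path are assumptions for illustration; substitute your own installation paths.

```shell
# Create spark-env.sh from the shipped template (path assumed, use your Spark dir).
cd /sparkDir/conf
cp spark-env.sh.template spark-env.sh

# Point Spark at the Hadoop installation inside spark-env.sh.
# HADOOP_HOME below is an assumed install path.
cat >> spark-env.sh <<'EOF'
export HADOOP_HOME=/usr/local/hadoop
export JAVA_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
EOF

# Make the native libraries visible to the JVM system-wide.
echo 'export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native/:$LD_LIBRARY_PATH' >> /etc/profile
```

Note that on Mac OS X the dynamic-linker variable is DYLD_LIBRARY_PATH rather than LD_LIBRARY_PATH, and Hadoop does not ship Mac native libraries in any case, so this fix only helps on Linux.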
