
Hadoop Usage Study Notes (5)

Author: 干货满满张哈希
Published 2021-04-12 15:11:47, in the column 干货满满张哈希

Hadoop Usage Study Notes

3. Fully Local Debugging of Map-Reduce (Part 1)

In the project from the previous posts, delete everything under the Resource directory except the log4j configuration. Then add the native Hadoop distribution (the Hadoop folder copied down from the cluster earlier) to the project: register all the jars under its share/hadoop directory as a single library, as shown below:

[Screenshots: adding the jars under share/hadoop as a project library in the IDE]

Next, comment out the line that deletes /test/output, because a local run cannot delete the remote HDFS directory this way:

// First delete the output directory
//deleteDir(jobConf, args[1]);
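When the job runs fully locally (with file:// input and output paths), the output directory can still be cleaned up programmatically with plain JDK file APIs instead of the HDFS-based deleteDir. A minimal sketch under that assumption; LocalOutputCleaner and deleteRecursively are hypothetical names, not part of the original project:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Comparator;
import java.util.stream.Stream;

public class LocalOutputCleaner {

    // Recursively delete a local output directory before re-running the job.
    // This is a local-filesystem stand-in for the commented-out deleteDir;
    // it does NOT work for remote hdfs:// paths.
    public static void deleteRecursively(Path dir) throws IOException {
        if (!Files.exists(dir)) {
            return; // nothing to clean up
        }
        try (Stream<Path> walk = Files.walk(dir)) {
            // Delete children before parents (deepest paths sort last, so reverse).
            walk.sorted(Comparator.reverseOrder()).forEach(p -> {
                try {
                    Files.delete(p);
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
        }
    }

    public static void main(String[] args) throws IOException {
        Path out = Files.createTempDirectory("test-output");
        Files.write(out.resolve("part-r-00000"), "hello\t1\n".getBytes());
        deleteRecursively(out);
        System.out.println("output dir exists: " + Files.exists(out));
    }
}
```

This avoids touching the remote cluster at all during local debugging; the manual hdfs command below is still needed whenever the job writes to real HDFS.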

Instead, delete the directory manually on a cluster machine:

./bin/hdfs dfs -rm -r /test/output

Run the job, and an exception appears:

16/08/04 19:17:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/08/04 19:17:52 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
16/08/04 19:17:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
16/08/04 19:17:53 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
16/08/04 19:17:54 INFO input.FileInputFormat: Total input paths to process : 2
16/08/04 19:17:55 INFO mapreduce.JobSubmitter: number of splits:2
16/08/04 19:17:56 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_local2007553514_0001
16/08/04 19:17:56 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/tmp/hadoop-862911/mapred/staging/sfdba2007553514/.staging/job_local2007553514_0001
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
    at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:977)
    at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:187)
    at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:174)
    at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:108)
    at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:285)
    at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:344)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:150)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:131)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:115)
    at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:125)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.(LocalJobRunner.java:163)
    at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:731)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:240)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
    at com.hash.test.hadoop.mapred.wordcount.WordCount.run(WordCount.java:54)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at com.hash.test.hadoop.mapred.wordcount.WordCount.main(WordCount.java:59)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

This is caused by the Windows directory-permission check. Modify the code at NativeIO.java:609:

public static boolean access(String path, NativeIO.Windows.AccessRight desiredAccess) throws IOException {
    return true;
    // return access0(path, desiredAccess.accessRight());
}

The idea is to make this method return true unconditionally. (The way to override it: create a class with the same package and name, NativeIO, in your own source tree, copy the entire class from the Hadoop source, and replace only the code above.) Run again, and the job completes successfully. Now we can set breakpoints and debug.
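The shadowing class described above keeps everything from the Hadoop source except the patched method. A standalone sketch of just that patch (NativeIOPatchSketch and its AccessRight enum are illustrative stand-ins; in the real fix the class must be named org.apache.hadoop.io.nativeio.NativeIO and contain the full copied source):

```java
public class NativeIOPatchSketch {

    // Stand-in for Hadoop's NativeIO.Windows.AccessRight enum.
    public enum AccessRight { ACCESS_READ, ACCESS_WRITE, ACCESS_EXECUTE }

    // Patched version of NativeIO.Windows.access(): instead of delegating to
    // the access0 native method (which throws UnsatisfiedLinkError when the
    // Windows native library is missing), report every path as accessible.
    public static boolean access(String path, AccessRight desiredAccess) {
        // return access0(path, desiredAccess.accessRight()); // original native call
        return true; // bypass the Windows directory-permission check
    }

    public static void main(String[] args) {
        System.out.println(access("C:\\tmp\\hadoop", AccessRight.ACCESS_READ));
    }
}
```

This works because the JVM loads the first matching class on the classpath: a copied NativeIO in the project's own source tree takes precedence over the one inside the Hadoop jars. It is a debugging shortcut only, not something to ship.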

Originally published 2016-08-08 on the author's personal site/blog.