java.net.ConnectException: Call From slaver1/192.168.19.128 to slaver1:8020 failed on connection exception

1: While practicing Spark, I read a file from HDFS. Because Spark evaluates lazily, nothing went wrong until I ran an action to view the file's contents, at which point the error below was thrown. It is a minor error, but I think it is worth recording, because it was caused by unfamiliarity with the command syntax:
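
Because textFile is lazy, defining the RDD against the bad URI succeeds without complaint; the connection is only attempted once an action runs. Roughly what the session looked like up to that point (a sketch; the exact RDD printout varies by Spark version):

scala> var text = sc.textFile("hdfs://slaver1:/input.txt");
text: org.apache.spark.rdd.RDD[String] = MapPartitionsRDD[1] at textFile at <console>:27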

scala> text.collect
java.net.ConnectException: Call From slaver1/192.168.19.128 to slaver1:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:731)
    at org.apache.hadoop.ipc.Client.call(Client.java:1472)
    at org.apache.hadoop.ipc.Client.call(Client.java:1399)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
    at com.sun.proxy.$Proxy36.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy37.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1988)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1118)
    at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
    at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:57)
    at org.apache.hadoop.fs.Globber.glob(Globber.java:252)
    at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1644)
    at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:257)
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
    at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:199)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
    at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:927)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:926)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:30)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
    at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
    at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
    at $iwC$$iwC$$iwC$$iwC.<init>(<console>:41)
    at $iwC$$iwC$$iwC.<init>(<console>:43)
    at $iwC$$iwC.<init>(<console>:45)
    at $iwC.<init>(<console>:47)
    at <init>(<console>:49)
    at .<init>(<console>:53)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
    at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:530)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:494)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:607)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:705)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:368)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1521)
    at org.apache.hadoop.ipc.Client.call(Client.java:1438)
    ... 84 more

2: The cause of the error is as follows:

I had used the following command to read the file from HDFS: scala> var text = sc.textFile("hdfs://slaver1:/input.txt"); and then ran text.collect to view its contents; that is when the error above appeared. The cause is that I left the port number out of the HDFS URI. With no port given, the Hadoop client falls back to the default NameNode RPC port, 8020, but this cluster's NameNode listens on 9000, so the connection to slaver1:8020 was refused.
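
If you are unsure which address and port the cluster is actually configured with, you can read it back from the Hadoop configuration that spark-shell already carries. A minimal check (assuming fs.defaultFS is set in core-site.xml, as it is here with hdfs://slaver1:9000; older setups may use fs.default.name instead):

scala> sc.hadoopConfiguration.get("fs.defaultFS")
res0: String = hdfs://slaver1:9000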

Changing the command to include the port fixes it:

scala> var text = sc.textFile("hdfs://slaver1:9000/input.txt");
scala> text.collect
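
The 9000 here is simply whatever port fs.defaultFS names in the cluster's core-site.xml; on a cluster that keeps Hadoop's default of 8020, the original URI would have worked as-is. Alternatively, when spark-shell is started with the cluster's Hadoop configuration on its classpath (HADOOP_CONF_DIR), you can let fs.defaultFS supply the scheme, host, and port and pass a bare path. A sketch under that assumption:

scala> var text = sc.textFile("/input.txt");
scala> text.collect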
