exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.

Author: 别先生 (published 2018-05-16)
1. This is not a serious error, but it is worth posting. It appeared when I ran the test example `run-example streaming.NetworkWordCount localhost 9999`; my first instinct was that the cluster services had not been started:

```
18/04/23 03:21:58 ERROR SparkContext: Error initializing SparkContext.
java.net.ConnectException: Call From slaver1/192.168.19.131 to slaver1:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1414)
    at org.apache.hadoop.ipc.Client.call(Client.java:1363)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
    at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
    at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:541)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:864)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
    at org.apache.spark.examples.streaming.NetworkWordCount$.main(NetworkWordCount.scala:47)
    at org.apache.spark.examples.streaming.NetworkWordCount.main(NetworkWordCount.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
    at org.apache.hadoop.ipc.Client.call(Client.java:1381)
    ... 31 more
18/04/23 03:21:59 INFO SparkUI: Stopped Spark web UI at http://192.168.19.131:4040
18/04/23 03:21:59 INFO DAGScheduler: Stopping DAGScheduler
18/04/23 03:21:59 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/04/23 03:21:59 INFO MemoryStore: MemoryStore cleared
18/04/23 03:21:59 INFO BlockManager: BlockManager stopped
18/04/23 03:21:59 INFO BlockManagerMaster: BlockManagerMaster stopped
18/04/23 03:21:59 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.net.ConnectException: Call From slaver1/192.168.19.131 to slaver1:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
    at org.apache.hadoop.ipc.Client.call(Client.java:1414)
    at org.apache.hadoop.ipc.Client.call(Client.java:1363)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
    at com.sun.proxy.$Proxy12.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
    at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:100)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:541)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:864)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
    at org.apache.spark.examples.streaming.NetworkWordCount$.main(NetworkWordCount.scala:47)
    at org.apache.spark.examples.streaming.NetworkWordCount.main(NetworkWordCount.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:739)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
    at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
    at org.apache.hadoop.ipc.Client.call(Client.java:1381)
    ... 31 more
18/04/23 03:21:59 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/04/23 03:21:59 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
18/04/23 03:21:59 INFO ShutdownHookManager: Shutdown hook called
18/04/23 03:21:59 INFO ShutdownHookManager: Deleting directory /tmp/spark-7ef5c2da-0b57-4553-a9f9-6e215885c7ba
18/04/23 03:21:59 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
```
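The key line is `Call From slaver1/192.168.19.131 to slaver1:9000 failed on connection exception`: the TCP connection to port 9000 on slaver1 was refused, i.e. nothing was listening there. Before rerunning the example, you can confirm whether that port is reachable with a quick probe. A minimal sketch (the `slaver1`/`9000` target below is this cluster's address, an assumption on any other setup):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds, False if it is
    refused or times out (the situation in the stack trace above)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe the address from the trace (hypothetical cluster host):
# print(port_open("slaver1", 9000))
```

If this returns `False`, the service behind port 9000 simply is not running yet, and no amount of retrying the Spark example will help until it is started.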

2. Script commands to start Spark and rerun the example:

```
[hadoop@slaver1 spark-1.5.1-bin-hadoop2.4]$ sbin/start-all.sh

[hadoop@slaver2 ~]$ run-example streaming.NetworkWordCount localhost 9999
```
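Worth noting where `slaver1:9000` comes from in the first place: it is the `fs.defaultFS` value in Hadoop's `core-site.xml`, which Spark's `EventLoggingListener` dials when its log directory lives on HDFS. A sketch of reading that value back, using a hypothetical sample file shaped like this cluster's config (the real file lives under the Hadoop config directory on the cluster nodes):

```shell
# Hypothetical sample of core-site.xml, mirroring the cluster in this post
cat > /tmp/core-site-sample.xml <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://slaver1:9000</value>
  </property>
</configuration>
EOF

# Extract the NameNode URI that HDFS clients will try to connect to
grep -A1 'fs.defaultFS' /tmp/core-site-sample.xml \
  | sed -n 's:.*<value>\(.*\)</value>.*:\1:p'
```

So if that URI points at a NameNode that is not up, any Spark job that touches HDFS (including event logging) will fail with exactly the Connection refused trace shown above.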

Originally published 2018-04-23 on the author's personal blog; shared via the Tencent Cloud self-media program.
