别先生

Caused by: java.net.ConnectException: Connection refused/Caused by: java.lang.RuntimeException: com.

1. The following error occurred while importing MySQL data into Hive with Sqoop:

The first attempt used this command:

[hadoop@slaver1 sqoop-1.4.5-cdh5.3.6]$ bin/sqoop import --connect jdbc:mysql://localhost:3306/test --username root --password 123456 --table tb_user --hive-import --m 1
Warning: /home/hadoop/soft/sqoop-1.4.5-cdh5.3.6/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/soft/sqoop-1.4.5-cdh5.3.6/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/05/18 19:57:51 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.6
18/05/18 19:57:51 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/05/18 19:57:51 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
18/05/18 19:57:51 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
18/05/18 19:57:51 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/05/18 19:57:51 INFO tool.CodeGenTool: Beginning code generation
18/05/18 19:57:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
18/05/18 19:57:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
18/05/18 19:57:51 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/soft/hadoop-2.5.0-cdh5.3.6
Note: /tmp/sqoop-hadoop/compile/cb8f61449b0ae521eecbd2bccba40b07/tb_user.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/05/18 19:57:54 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/cb8f61449b0ae521eecbd2bccba40b07/tb_user.jar
18/05/18 19:57:54 WARN manager.MySQLManager: It looks like you are importing from mysql.
18/05/18 19:57:54 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
18/05/18 19:57:54 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
18/05/18 19:57:54 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
18/05/18 19:57:54 INFO mapreduce.ImportJobBase: Beginning import of tb_user
18/05/18 19:57:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/05/18 19:57:55 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
18/05/18 19:57:56 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
18/05/18 19:57:56 INFO client.RMProxy: Connecting to ResourceManager at slaver1/192.168.19.131:8032
18/05/18 19:57:59 INFO db.DBInputFormat: Using read commited transaction isolation
18/05/18 19:58:00 INFO mapreduce.JobSubmitter: number of splits:1
18/05/18 19:58:00 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1526642793183_0002
18/05/18 19:58:01 INFO impl.YarnClientImpl: Submitted application application_1526642793183_0002
18/05/18 19:58:01 INFO mapreduce.Job: The url to track the job: http://slaver1:8088/proxy/application_1526642793183_0002/
18/05/18 19:58:01 INFO mapreduce.Job: Running job: job_1526642793183_0002
18/05/18 19:58:14 INFO mapreduce.Job: Job job_1526642793183_0002 running in uber mode : false
18/05/18 19:58:14 INFO mapreduce.Job:  map 0% reduce 0%
18/05/18 19:58:30 INFO mapreduce.Job: Task Id : attempt_1526642793183_0002_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
    at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:746)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.RuntimeException: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
    at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:220)
    at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:165)
    ... 9 more
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure

The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:408)
    at com.mysql.jdbc.SQLError.createCommunicationsException(SQLError.java:1137)
    at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:356)
    at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2504)
    at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2541)
    at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2323)
    at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:832)
    at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:46)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:408)
    at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:417)
    at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:344)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:215)
    at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
    at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:213)
    ... 10 more
Caused by: java.net.ConnectException: Connection refused
    at java.net.PlainSocketImpl.socketConnect(Native Method)
    at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
    at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
    at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
    at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
    at java.net.Socket.connect(Socket.java:579)
    at java.net.Socket.connect(Socket.java:528)
    at java.net.Socket.<init>(Socket.java:425)
    at java.net.Socket.<init>(Socket.java:241)
    at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:258)
    at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:306)
    ... 26 more

18/05/18 19:58:37 INFO mapreduce.Job: Task Id : attempt_1526642793183_0002_m_000000_1, Status : FAILED
18/05/18 19:58:44 INFO mapreduce.Job: Task Id : attempt_1526642793183_0002_m_000000_2, Status : FAILED
(both retry attempts fail with the identical stack trace shown above; the repeated output is omitted here)

^C[hadoop@slaver1 sqoop-1.4.5-cdh5.3.6]$
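Before re-running the import, it helps to confirm from a *worker* node that the host in the JDBC URL actually has the MySQL port open. A minimal pure-bash probe (the `port_open` helper is my own sketch, not part of Sqoop or Hadoop; `slaver1`/3306 match this article's setup):

```shell
# Probe a TCP port using bash's built-in /dev/tcp redirection.
# Returns 0 if a connection succeeds within 2 seconds, non-zero otherwise.
port_open() {  # usage: port_open HOST PORT
  timeout 2 bash -c "echo > /dev/tcp/$1/$2" 2>/dev/null
}

# Run this on each node that executes map tasks:
#   port_open slaver1 3306 && echo reachable || echo "connection refused"
```

If the probe fails on the worker nodes, the problem is network/MySQL configuration (bind address, firewall, or account grants), not Sqoop itself.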

2. The second import used the following command:

The key fix is the hostname in the JDBC URL (and, of course, correct username and password). The import runs as map tasks on the cluster's worker nodes, so `localhost` in `--connect` resolves to each worker node itself rather than to the machine running MySQL, and the connection is refused. Use the real MySQL hostname (here `slaver1`) and make sure that account is allowed to connect from the worker nodes.
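Because this mistake is so easy to make, the URL can be sanity-checked before submitting the job. A small sketch (the `check_jdbc_host` helper is hypothetical, for illustration only):

```shell
# Warn if a Sqoop --connect URL points at localhost, which can never work
# from the cluster's worker nodes.
check_jdbc_host() {  # usage: check_jdbc_host JDBC_URL
  local host
  # Extract the host part from jdbc:mysql://HOST:PORT/DB
  host=$(echo "$1" | sed -n 's#jdbc:mysql://\([^:/]*\).*#\1#p')
  if [ "$host" = "localhost" ] || [ "$host" = "127.0.0.1" ]; then
    echo "WARN: --connect uses $host; use the real MySQL hostname instead"
    return 1
  fi
  echo "OK: will connect to $host"
}
```

For example, `check_jdbc_host "jdbc:mysql://localhost:3306/test"` flags the URL that failed above, while the `slaver1` URL passes.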

[hadoop@slaver1 sqoop-1.4.5-cdh5.3.6]$ bin/sqoop import \
> --connect jdbc:mysql://slaver1:3306/test \
> --username root \
> --password 123456 \
> --table tb_user \
> --hive-import \
> --m 1
Warning: /home/hadoop/soft/sqoop-1.4.5-cdh5.3.6/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/soft/sqoop-1.4.5-cdh5.3.6/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/05/18 20:01:43 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.6
18/05/18 20:01:43 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/05/18 20:01:43 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
18/05/18 20:01:43 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
18/05/18 20:01:43 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/05/18 20:01:43 INFO tool.CodeGenTool: Beginning code generation
18/05/18 20:01:44 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
18/05/18 20:01:44 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
18/05/18 20:01:44 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/soft/hadoop-2.5.0-cdh5.3.6
Note: /tmp/sqoop-hadoop/compile/02d8e06c0a500fbe72ac09d7f0dca9c3/tb_user.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/05/18 20:01:47 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/02d8e06c0a500fbe72ac09d7f0dca9c3/tb_user.jar
18/05/18 20:01:47 WARN manager.MySQLManager: It looks like you are importing from mysql.
18/05/18 20:01:47 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
18/05/18 20:01:47 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
18/05/18 20:01:47 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
18/05/18 20:01:47 INFO mapreduce.ImportJobBase: Beginning import of tb_user
18/05/18 20:01:47 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/05/18 20:01:48 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
18/05/18 20:01:49 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
18/05/18 20:01:49 INFO client.RMProxy: Connecting to ResourceManager at slaver1/192.168.19.131:8032
18/05/18 20:01:52 INFO db.DBInputFormat: Using read commited transaction isolation
18/05/18 20:01:52 INFO mapreduce.JobSubmitter: number of splits:1
18/05/18 20:01:53 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1526642793183_0003
18/05/18 20:01:53 INFO impl.YarnClientImpl: Submitted application application_1526642793183_0003
18/05/18 20:01:54 INFO mapreduce.Job: The url to track the job: http://slaver1:8088/proxy/application_1526642793183_0003/
18/05/18 20:01:54 INFO mapreduce.Job: Running job: job_1526642793183_0003
18/05/18 20:02:05 INFO mapreduce.Job: Job job_1526642793183_0003 running in uber mode : false
18/05/18 20:02:05 INFO mapreduce.Job:  map 0% reduce 0%
18/05/18 20:02:16 INFO mapreduce.Job:  map 100% reduce 0%
18/05/18 20:02:16 INFO mapreduce.Job: Job job_1526642793183_0003 completed successfully
18/05/18 20:02:16 INFO mapreduce.Job: Counters: 30
    File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=132972
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=87
        HDFS: Number of bytes written=153
        HDFS: Number of read operations=4
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=2
    Job Counters 
        Launched map tasks=1
        Other local map tasks=1
        Total time spent by all maps in occupied slots (ms)=7821
        Total time spent by all reduces in occupied slots (ms)=0
        Total time spent by all map tasks (ms)=7821
        Total vcore-seconds taken by all map tasks=7821
        Total megabyte-seconds taken by all map tasks=8008704
    Map-Reduce Framework
        Map input records=10
        Map output records=10
        Input split bytes=87
        Spilled Records=0
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=88
        CPU time spent (ms)=1230
        Physical memory (bytes) snapshot=100917248
        Virtual memory (bytes) snapshot=841768960
        Total committed heap usage (bytes)=15794176
    File Input Format Counters 
        Bytes Read=0
    File Output Format Counters 
        Bytes Written=153
18/05/18 20:02:16 INFO mapreduce.ImportJobBase: Transferred 153 bytes in 27.1712 seconds (5.631 bytes/sec)
18/05/18 20:02:16 INFO mapreduce.ImportJobBase: Retrieved 10 records.
18/05/18 20:02:16 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tb_user` AS t LIMIT 1
18/05/18 20:02:16 INFO hive.HiveImport: Loading uploaded data into Hive
18/05/18 20:02:23 INFO hive.HiveImport: 
18/05/18 20:02:23 INFO hive.HiveImport: Logging initialized using configuration in jar:file:/home/hadoop/soft/hive-0.13.1-cdh5.3.6/lib/hive-common-0.13.1-cdh5.3.6.jar!/hive-log4j.properties
18/05/18 20:02:36 INFO hive.HiveImport: OK
18/05/18 20:02:36 INFO hive.HiveImport: Time taken: 4.423 seconds
18/05/18 20:02:36 INFO hive.HiveImport: Loading data to table default.tb_user
18/05/18 20:02:38 INFO hive.HiveImport: Table default.tb_user stats: [numFiles=1, numRows=0, totalSize=153, rawDataSize=0]
18/05/18 20:02:38 INFO hive.HiveImport: OK
18/05/18 20:02:38 INFO hive.HiveImport: Time taken: 1.68 seconds
18/05/18 20:02:38 INFO hive.HiveImport: Hive import complete.
18/05/18 20:02:38 INFO hive.HiveImport: Export directory is not empty, keeping it.
[hadoop@slaver1 sqoop-1.4.5-cdh5.3.6]$ 

3. Since no target database was specified, the import writes the table into Hive's default database, so that is where to look for it:

[hadoop@slaver1 ~]$ hive

Logging initialized using configuration in jar:file:/home/hadoop/soft/hive-0.13.1-cdh5.3.6/lib/hive-common-0.13.1-cdh5.3.6.jar!/hive-log4j.properties
hive> show databases;
OK
course
default
test0516
test20180509
Time taken: 0.759 seconds, Fetched: 4 row(s)
hive> use default;
OK
Time taken: 0.025 seconds
hive> show tables;
OK
tb_user
user
Time taken: 0.035 seconds, Fetched: 2 row(s)
hive> select * from tb_user;
OK
1    张三    15236083001
2    李四    15236083001
3    王五    15236083001
4    小明    15236083001
5    小红    15236083001
6    小别    15236083001
7    7    7
8    8    8
9    9    9
10    10    10
Time taken: 1.14 seconds, Fetched: 10 row(s)
hive> 
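If you would rather not land the table in `default`, Sqoop's `--hive-table` option also accepts a database-qualified name. A sketch, reusing `test0516` from the database list above (behavior can vary by Sqoop version; if your build rejects the qualified name, check `bin/sqoop import --help` for a separate `--hive-database` option, and note `-P` is used instead of a plain-text password):

```shell
# Import into test0516.tb_user instead of default.tb_user.
# The target Hive database must already exist.
bin/sqoop import \
  --connect jdbc:mysql://slaver1:3306/test \
  --username root \
  -P \
  --table tb_user \
  --hive-import \
  --hive-table test0516.tb_user \
  --m 1
```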
