The problem and solution described in this article also apply to Tencent Cloud Elasticsearch Service (ES).
Spark version: 2.3.1
Elasticsearch version: 7.14.2
Writing to ES from Spark fails with the following error:
[HEAD] on [yuqing_info1] failed; server[https://es-8gp5f0ej.public.tencentelasticsearch.com:9200] returned [403|Forbidden:]
The root cause: the write targeted an index that did not exist. Before writing, the elasticsearch-hadoop connector issues a HEAD request to check whether the index is there (the [HEAD] on [yuqing_info1] in the error above), and that request is what gets rejected. Either create the index in advance (see the sketch below) or set es.index.auto.create to true so the connector creates it automatically, as in the working example that follows.
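To confirm the diagnosis, you can reproduce the connector's existence check and pre-create the index yourself. A minimal sketch, assuming the official elasticsearch-rest-high-level-client 7.14.x is on the classpath (that dependency and the EnsureIndex class name are illustrative; the endpoint, user, password, and index name are taken from the example below):

import org.apache.http.HttpHost;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.CredentialsProvider;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.client.indices.CreateIndexRequest;
import org.elasticsearch.client.indices.GetIndexRequest;

public class EnsureIndex {
    public static void main(String[] args) throws Exception {
        CredentialsProvider credentials = new BasicCredentialsProvider();
        credentials.setCredentials(AuthScope.ANY,
                new UsernamePasswordCredentials("elastic", "mc!VaY@9ng#kI^Q*"));
        try (RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("es-8gp5f0ej.public.tencentelasticsearch.com", 9200, "https"))
                        .setHttpClientConfigCallback(b -> b.setDefaultCredentialsProvider(credentials)))) {
            // The same check the connector performs before writing: HEAD /spark2
            boolean exists = client.indices().exists(new GetIndexRequest("spark2"), RequestOptions.DEFAULT);
            if (!exists) {
                // Create the index up front so the Spark write no longer fails
                client.indices().create(new CreateIndexRequest("spark2"), RequestOptions.DEFAULT);
            }
        }
    }
}

With the index in place (or with es.index.auto.create enabled), the following job writes without the 403: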
import java.util.Map;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;
import org.elasticsearch.spark.rdd.api.java.JavaEsSpark;

import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableMap;

SparkConf sparkConf = new SparkConf().setAppName("TestEs").setMaster("local[*]")
        // Create the target index automatically if it does not exist
        .set("es.index.auto.create", "true")
        .set("es.nodes", "https://es-8gp5f0ej.public.tencentelasticsearch.com")
        // .set("es.nodes", "129.204.98.76")
        .set("es.port", "9200")
        // Required when the cluster is only reachable through a public endpoint
        .set("es.nodes.wan.only", "true")
        .set("es.http.timeout", "3000")
        .set("es.http.retries", "5")
        .set("es.net.http.auth.user", "elastic")
        .set("es.net.http.auth.pass", "mc!VaY@9ng#kI^Q*");
SparkSession sparkSession = SparkSession.builder().config(sparkConf).getOrCreate();
JavaSparkContext javaSparkContext = new JavaSparkContext(sparkSession.sparkContext());
Map<String, ?> number = ImmutableMap.of("one", 1, "two", 2);
Map<String, ?> airports = ImmutableMap.of("OTP", "Otopeni", "SFO", "San Fran");
JavaRDD<Map<String, ?>> javaRDD = javaSparkContext.parallelize(ImmutableList.of(number, airports));
// Write both documents into the "spark2" index
JavaEsSpark.saveToEs(javaRDD, "spark2");
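As a quick sanity check, the same JavaEsSpark API can read the index back after the write; a short sketch reusing the javaSparkContext from above (one extra import, org.apache.spark.api.java.JavaPairRDD, is needed):

// Read the documents back from "spark2" to confirm the write succeeded
JavaPairRDD<String, Map<String, Object>> readBack = JavaEsSpark.esRDD(javaSparkContext, "spark2");
for (Map<String, Object> doc : readBack.values().collect()) {
    System.out.println(doc);
}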
The corresponding pom.xml dependencies:
<!-- elasticsearch-hadoop Spark connector; the Scala suffix (_2.11) must match the Spark artifacts below -->
<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch-spark-20_2.11</artifactId>
    <version>7.14.2</version>
</dependency>
<!-- Spark dependency -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.3.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.1</version>
</dependency>
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-compiler</artifactId>
    <version>2.11.8</version>
</dependency>
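Note that the Scala suffix of the connector artifact has to match the Scala build of the Spark artifacts (2.11 here, per spark-sql_2.11 and spark-core_2.11); mixing, for example, a _2.10 connector with _2.11 Spark fails at runtime with binary-incompatibility errors such as NoSuchMethodError. For Spark 3.x clusters, elasticsearch-hadoop publishes a separate elasticsearch-spark-30 artifact instead.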
Original-content notice: this article is published on the Tencent Cloud Developer Community with the author's authorization; reproduction without permission is prohibited.
For infringement concerns, contact cloudcommunity@tencent.com for removal.