The failing method belongs to SparkSession and is named getOrCreate().
The full exception is:
java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.JsonMappingException.<init>(Ljava/io/Closeable;Ljava/lang/String;)V
at com.fasterxml.jackson.module.scala.JacksonModule.setupModule(JacksonModule.scala:61)
at com.fasterxml.jackson.module.scala.JacksonModule.setupModule$(JacksonModule.scala:46)
at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:17)
at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:718)
at org.apache.spark.util.JsonProtocol$.<init>(JsonProtocol.scala:62)
at org.apache.spark.util.JsonProtocol$.<clinit>(JsonProtocol.scala)
at org.apache.spark.scheduler.EventLoggingListener.initEventLog(EventLoggingListener.scala:89)
at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:84)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:610)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
at com.hiido.server.service.impl.SparkSqlJob.executing(SparkSqlJob.java:56)
at com.hiido.server.service.impl.SparkSqlJob.main(SparkSqlJob.java:47)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:737)
Some people say this is caused by a version conflict, but I disagree, because I have already checked my versions: the Spark artifact is spark-core_2.12-3.2.1, the Jackson version is 2.12.3, and the Spark distribution is spark-3.2.1-bin-hadoop2.7, so I don't understand this problem. It only happens on the YARN cluster; when I run in local mode everything works fine. Thanks.
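For what it's worth, one way to check whether a different Jackson jar wins on the cluster classpath is to log where the class named in the stack trace is actually loaded from at runtime. Below is a minimal diagnostic sketch; the JacksonProbe class name is hypothetical and not part of the project above, and the same two print statements could equally be placed in SparkSqlJob before the session is built:

import java.security.CodeSource;

// Hypothetical diagnostic class: prints which jar com.fasterxml.jackson.databind classes
// are actually loaded from, to confirm whether the cluster supplies an older Jackson
// than the one declared in the application's pom.xml.
public class JacksonProbe {
    public static void main(String[] args) {
        Class<?> clazz = com.fasterxml.jackson.databind.JsonMappingException.class;
        CodeSource src = clazz.getProtectionDomain().getCodeSource();
        // getCodeSource() can return null for bootstrap-loaded classes
        System.out.println("JsonMappingException loaded from: "
                + (src != null ? src.getLocation() : "unknown"));
        // PackageVersion reports the jackson-databind version found on the classpath
        System.out.println("jackson-databind version: "
                + com.fasterxml.jackson.databind.cfg.PackageVersion.VERSION);
    }
}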
Update: this is my pom.xml (I am only showing my dependencies, sorry):
<properties>
<java.version>1.8</java.version>
<geospark.version>1.2.0</geospark.version>
<spark.compatible.verison>2.3</spark.compatible.verison>
<!--<spark.version>2.3.4</spark.version>-->
<spark.version>3.2.1</spark.version>
<hadoop.version>2.7.2</hadoop.version>
<geotools.version>19.0</geotools.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- https://mvnrepository.com/artifact/io.netty/netty-all -->
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-all</artifactId>
<version>4.1.68.Final</version>
</dependency>
<!-- <dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.4.4</version>
</dependency>-->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>2.12.3</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.12.3</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.12.3</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.codehaus.janino</groupId>
<artifactId>commons-compiler</artifactId>
<version>3.0.8</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.12.15</version>
</dependency>
<!-- geospark -->
<dependency>
<groupId>org.datasyslab</groupId>
<artifactId>geospark</artifactId>
<version>${geospark.version}</version>
</dependency>
<dependency>
<groupId>org.datasyslab</groupId>
<artifactId>geospark-sql_${spark.compatible.verison}</artifactId>
<version>${geospark.version}</version>
</dependency>
<dependency>
<groupId>org.datasyslab</groupId>
<artifactId>geospark-viz_${spark.compatible.verison}</artifactId>
<version>${geospark.version}</version>
</dependency>
<!-- geospark -->
<dependency>
<groupId>org.apache.sedona</groupId>
<artifactId>sedona-core-3.0_2.12</artifactId>
<version>1.1.1-incubating</version>
</dependency>
<dependency>
<groupId>org.apache.sedona</groupId>
<artifactId>sedona-sql-3.0_2.12</artifactId>
<version>1.1.1-incubating</version>
</dependency>
<dependency>
<groupId>org.apache.sedona</groupId>
<artifactId>sedona-viz-3.0_2.12</artifactId>
<version>1.1.1-incubating</version>
</dependency>
<dependency>
<groupId>org.datasyslab</groupId>
<artifactId>sernetcdf</artifactId>
<version>0.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.12</artifactId>
<!-- <version>2.3.4</version>-->
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.12</artifactId>
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.12</artifactId>
<version>3.2.1</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.12</artifactId>
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.12</artifactId>
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>${hadoop.version}</version>
<scope>${dependency.scope}</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>${hadoop.version}</version>
<scope>${dependency.scope}</scope>
</dependency>
<dependency>
<groupId>net.sourceforge.javacsv</groupId>
<artifactId>javacsv</artifactId>
<version>2.0</version>
</dependency>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>42.2.5</version>
</dependency>
<dependency>
<groupId>org.codehaus.janino</groupId>
<artifactId>janino</artifactId>
<version>3.0.8</version>
</dependency>
<dependency>
<groupId>org.geotools</groupId>
<artifactId>gt-grid</artifactId>
<version>${geotools.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.locationtech.spatial4j/spatial4j -->
<dependency>
<groupId>org.locationtech.spatial4j</groupId>
<artifactId>spatial4j</artifactId>
<version>0.8</version>
</dependency>
<!-- JTS is essentially only used for polygons. -->
<!-- https://mvnrepository.com/artifact/org.locationtech.jts/jts-core -->
<dependency>
<groupId>org.locationtech.jts</groupId>
<artifactId>jts-core</artifactId>
<version>1.18.1</version>
</dependency>
</dependencies>
Answered on 2022-11-22 16:19:30
This is most likely caused by a wrong packaging strategy in your Maven pom.xml. Use the compile scope for local testing and change it to the provided scope when you deploy to the cluster.
There are also many version conflicts among your GeoSpark and Sedona dependencies; follow the instructions below.
Here is a runnable example of a Sedona + Spark project: https://github.com/apache/incubator-sedona/blob/master/examples/sql/build.sbt#L59. Although it is written in sbt, a Maven POM shares the same logic. Pay close attention to the dependency scope and exclude parts.
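As a rough illustration of the scope change described above (a sketch only, not the poster's exact pom.xml; the spark.scope property name is made up for this example), the Spark artifacts can be marked as provided so the cluster's own Spark and Jackson jars are used at runtime, while compile is kept only for local runs:

<!-- Sketch: switch Spark to cluster-provided jars when building for deployment.
     The spark.scope property name is illustrative, not from the original pom. -->
<properties>
    <spark.scope>provided</spark.scope> <!-- set to "compile" for local testing -->
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>3.2.1</version>
        <scope>${spark.scope}</scope>
    </dependency>
    <!-- apply the same scope to spark-core, spark-streaming, spark-hive and spark-mllib -->
</dependencies>

Running mvn dependency:tree -Dincludes=com.fasterxml.jackson.core afterwards also shows which of the GeoSpark/Sedona artifacts pull in conflicting Jackson versions.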
https://stackoverflow.com/questions/74484771