
spark-submit: NoSuchMethodError

Stack Overflow user
Asked 2022-11-18 03:55:51
1 answer · 27 views · 0 followers · 0 votes

The method in question belongs to SparkSession and is named getOrCreate().

The full exception is:

java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.JsonMappingException.<init>(Ljava/io/Closeable;Ljava/lang/String;)V
    at com.fasterxml.jackson.module.scala.JacksonModule.setupModule(JacksonModule.scala:61)
    at com.fasterxml.jackson.module.scala.JacksonModule.setupModule$(JacksonModule.scala:46)
    at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:17)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:718)
    at org.apache.spark.util.JsonProtocol$.<init>(JsonProtocol.scala:62)
    at org.apache.spark.util.JsonProtocol$.<clinit>(JsonProtocol.scala)
    at org.apache.spark.scheduler.EventLoggingListener.initEventLog(EventLoggingListener.scala:89)
    at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:84)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:610)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
    at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
    at scala.Option.getOrElse(Option.scala:189)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
    at com.hiido.server.service.impl.SparkSqlJob.executing(SparkSqlJob.java:56)
    at com.hiido.server.service.impl.SparkSqlJob.main(SparkSqlJob.java:47)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:737)

Some people say this is caused by a version conflict, but I disagree, because I have checked my versions: the Spark core artifact is spark_core_2.12-3.2.1, the Jackson version is 2.12.3, and the Spark distribution is 3.2.1-bin-hadoop2.7. I don't understand this problem. It only occurs on the Spark cluster; when I run locally everything is fine. Thanks.

Update: this is my pom.xml; I am only showing my dependencies.

    <properties>
        <java.version>1.8</java.version>
        <geospark.version>1.2.0</geospark.version>
        <spark.compatible.verison>2.3</spark.compatible.verison>
        <!--<spark.version>2.3.4</spark.version>-->
        <spark.version>3.2.1</spark.version>
        <hadoop.version>2.7.2</hadoop.version>
        <geotools.version>19.0</geotools.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>


        <!-- https://mvnrepository.com/artifact/io.netty/netty-all -->
        <dependency>
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
            <version>4.1.68.Final</version>
        </dependency>


        <!--        <dependency>
                    <groupId>com.fasterxml.jackson.core</groupId>
                    <artifactId>jackson-databind</artifactId>
                    <version>2.4.4</version>
                </dependency>-->
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId> jackson-annotations</artifactId>
            <version>2.12.3</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId> jackson-core</artifactId>
            <version>2.12.3</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.12.3</version>
            <scope>compile</scope>
        </dependency>

        <dependency>
            <groupId>org.codehaus.janino</groupId>
            <artifactId>commons-compiler</artifactId>
            <version>3.0.8</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.12.15</version>
        </dependency>
        <!-- geospark -->
        <dependency>
            <groupId>org.datasyslab</groupId>
            <artifactId>geospark</artifactId>
            <version>${geospark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.datasyslab</groupId>
            <artifactId>geospark-sql_${spark.compatible.verison}</artifactId>
            <version>${geospark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.datasyslab</groupId>
            <artifactId>geospark-viz_${spark.compatible.verison}</artifactId>
            <version>${geospark.version}</version>
        </dependency>
        <!-- geospark -->
        <dependency>
            <groupId>org.apache.sedona</groupId>
            <artifactId>sedona-core-3.0_2.12</artifactId>
            <version>1.1.1-incubating</version>
        </dependency>
        <dependency>
            <groupId>org.apache.sedona</groupId>
            <artifactId>sedona-sql-3.0_2.12</artifactId>
            <version>1.1.1-incubating</version>
        </dependency>
        <dependency>
            <groupId>org.apache.sedona</groupId>
            <artifactId>sedona-viz-3.0_2.12</artifactId>
            <version>1.1.1-incubating</version>
        </dependency>


        <dependency>
            <groupId>org.datasyslab</groupId>
            <artifactId>sernetcdf</artifactId>
            <version>0.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.12</artifactId>
<!--            <version>2.3.4</version>-->
            <version>3.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.12</artifactId>
            <version>3.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.12</artifactId>
            <version>3.2.1</version>
            <!--<scope>provided</scope>-->
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.12</artifactId>
            <version>3.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.12</artifactId>
            <version>3.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-mapreduce-client-core</artifactId>
            <version>${hadoop.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>${hadoop.version}</version>
            <scope>${dependency.scope}</scope>
        </dependency>
        <dependency>
            <groupId>net.sourceforge.javacsv</groupId>
            <artifactId>javacsv</artifactId>
            <version>2.0</version>
        </dependency>
        <dependency>
            <groupId>org.postgresql</groupId>
            <artifactId>postgresql</artifactId>
            <version>42.2.5</version>
        </dependency>
        <dependency>
            <groupId>org.codehaus.janino</groupId>
            <artifactId>janino</artifactId>
            <version>3.0.8</version>
        </dependency>


        <dependency>
            <groupId>org.geotools</groupId>
            <artifactId>gt-grid</artifactId>
            <version>${geotools.version}</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.locationtech.spatial4j/spatial4j -->
        <dependency>
            <groupId>org.locationtech.spatial4j</groupId>
            <artifactId>spatial4j</artifactId>
            <version>0.8</version>
        </dependency>
        <!-- JTS is essentially only used for polygons. -->
        <!-- https://mvnrepository.com/artifact/org.locationtech.jts/jts-core -->
        <dependency>
            <groupId>org.locationtech.jts</groupId>
            <artifactId>jts-core</artifactId>
            <version>1.18.1</version>
        </dependency>
    </dependencies>

1 Answer

Stack Overflow user

Answered 2022-11-23 00:19:30

This is most likely caused by an incorrect packaging strategy in your Maven pom.xml.

  1. Your pom.xml includes many packages that should not be in "compile" scope, for example the Spark and Hadoop dependencies. Spark clusters usually already ship all of these libraries. If you mistakenly bundle them into your jar, it becomes nondeterministic which Jackson version will actually be used. Please change them to "provided" scope. What Spark and Sedona developers usually do is use compile scope for local testing and switch to provided scope when deploying to a cluster.

  2. There is usually no need to include the Hadoop dependencies, because Spark already ships with many of them. Including them will cause many version conflicts in Jackson.

  3. Your GeoSpark dependencies are wrong. Please remove both the old GeoSpark and the Sedona dependencies, and follow the instructions here: https://sedona.apache.org/setup/maven-coordinates/#use-sedona-fat-jars
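Concretely, the scope change in point 1 means the Spark and Hadoop entries would look like this (a minimal sketch; coordinates and versions copied from the pom.xml in the question):

```xml
<!-- Sketch: mark cluster-provided libraries as "provided" so they are
     available at compile time but NOT bundled into the application jar.
     The jars then come from the cluster's own classpath (e.g. $SPARK_HOME/jars),
     so the cluster's Jackson version is used consistently. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.2.1</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
    <scope>provided</scope>
</dependency>
```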

Here is a runnable example of a Sedona + Spark project: https://github.com/apache/incubator-sedona/blob/master/examples/sql/build.sbt#L59. Although it is written in sbt, a POM shares the same logic.

Please pay close attention to the dependency scopes and the exclude sections.
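The "compile locally, provided on the cluster" workflow can also be automated with a Maven profile instead of editing scopes by hand. A sketch (the property name `spark.scope` and profile id `cluster` are made up for illustration, not from the original POM):

```xml
<!-- Sketch: a property controls the scope; the default suits local runs,
     and the "cluster" profile flips it for deployment builds. -->
<properties>
    <!-- default for local testing: bundle Spark into the jar -->
    <spark.scope>compile</spark.scope>
</properties>

<profiles>
    <profile>
        <!-- activate with: mvn package -Pcluster -->
        <id>cluster</id>
        <properties>
            <spark.scope>provided</spark.scope>
        </properties>
    </profile>
</profiles>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>3.2.1</version>
        <scope>${spark.scope}</scope>
    </dependency>
</dependencies>
```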

0 votes
The original page content was provided by Stack Overflow; translation supported by Tencent Cloud's machine-translation engine.
Original link: https://stackoverflow.com/questions/74484771
