Spark Cassandra Java connection NoSuchMethodError or NoClassDefFoundError

Asked by a Stack Overflow user on 2016-09-08 15:32:57
2 answers · 792 views · 0 followers · score 1

From a Spark application submitted to a Spark cluster hosted on my machine, I am trying to connect to a Cassandra DB hosted on my machine at 127.0.0.1:9042, and my Spring application fails to start.

Approach 1 -

**Based on the link, I included the following in my POM file:**
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>2.0.0-M3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.0.0</version>
    </dependency>

Approach 1 - NoSuchMethodError - log file:
16/09/08 15:12:50 ERROR SpringApplication: Application startup failed
java.lang.NoSuchMethodError: com.datastax.driver.core.KeyspaceMetadata.getMaterializedViews()Ljava/util/Collection;
    at com.datastax.spark.connector.cql.Schema$.com$datastax$spark$connector$cql$Schema$$fetchTables$1(Schema.scala:281)
    at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1$2.apply(Schema.scala:305)
    at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1$2.apply(Schema.scala:304)
    at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:683)
    at scala.collection.immutable.HashSet$HashSet1.foreach(HashSet.scala:316)
    at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:972)
    at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:972)
    at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:682)
    at com.datastax.spark.connector.cql.Schema$.com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1(Schema.scala:304)
    at com.datastax.spark.connector.cql.Schema$$anonfun$fromCassandra$1.apply(Schema.scala:325)
    at com.datastax.spark.connector.cql.Schema$$anonfun$fromCassandra$1.apply(Schema.scala:322)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withClusterDo$1.apply(CassandraConnector.scala:122)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withClusterDo$1.apply(CassandraConnector.scala:121)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:111)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110)
    at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:140)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:110)
    at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:121)
    at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:322)
    at com.datastax.spark.connector.cql.Schema$.tableFromCassandra(Schema.scala:342)
    at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.tableDef(CassandraTableRowReaderProvider.scala:50)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef$lzycompute(CassandraTableScanRDD.scala:60)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef(CassandraTableScanRDD.scala:60)
    at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.verify(CassandraTableRowReaderProvider.scala:137)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.verify(CassandraTableScanRDD.scala:60)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.getPartitions(CassandraTableScanRDD.scala:232)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1911)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:875)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:873)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
    at org.apache.spark.rdd.RDD.foreach(RDD.scala:873)
    at org.apache.spark.api.java.JavaRDDLike$class.foreach(JavaRDDLike.scala:350)
    at org.apache.spark.api.java.AbstractJavaRDDLike.foreach(JavaRDDLike.scala:45)
    at com.initech.myapp.cassandra.service.CassandraDataService.getMatches(CassandraDataService.java:45)
    at com.initech.myapp.processunit.MySparkApp.receive(MySparkApp.java:120)
    at com.initech.myapp.processunit.MySparkApp.process(MySparkApp.java:61)
    at com.initech.myapp.processunit.MySparkApp.run(MySparkApp.java:144)
    at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:789)
    at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:779)
    at org.springframework.boot.SpringApplication.afterRefresh(SpringApplication.java:769)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:314)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1185)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1174)
    at com.initech.myapp.MySparkAppBootApp.main(MyAppProcessingUnitsApplication.java:20)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
    at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
    at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
    at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:58)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58)
    at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
16/09/08 15:12:50 INFO AnnotationConfigApplicationContext: Closing org.springframework.context.annotation.AnnotationConfigApplicationContext@3381b4fc: startup date [Thu Sep 08 15:12:40 PDT 2016]; root of context hierarchy
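The two failure modes in this question can be told apart at runtime: a NoSuchMethodError means the class was found, but in an older jar that lacks the method the caller was compiled against, while NoClassDefFoundError/ClassNotFoundException means the class is absent from the classpath entirely. A minimal, generic probe (the class and method names here are illustrative, not part of the original post):

```java
import java.lang.reflect.Method;

public class ClasspathProbe {
    // Distinguishes the two failure modes seen in this question:
    //   "class present, method missing" -> manifests as NoSuchMethodError (wrong jar version)
    //   "class missing"                 -> manifests as NoClassDefFoundError / ClassNotFoundException
    static String probe(String className, String methodName) {
        try {
            Class<?> c = Class.forName(className);
            for (Method m : c.getMethods()) {
                if (m.getName().equals(methodName)) {
                    return "method present";
                }
            }
            return "class present, method missing";
        } catch (ClassNotFoundException e) {
            return "class missing";
        }
    }

    public static void main(String[] args) {
        // e.g. probe("com.datastax.driver.core.KeyspaceMetadata", "getMaterializedViews")
        System.out.println(probe("java.lang.String", "isEmpty")); // prints "method present"
    }
}
```

Running this probe on the driver and inside a Spark task can reveal whether the mismatch exists only on the executors.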

Approach 2 -

**Since I am developing a Java application, I wanted to use the Java API, and included the following in my POM file:**
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.11</artifactId>
            <version>2.0.0-M3</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector-java_2.11</artifactId>
            <version>1.2.6</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.11</artifactId>
            <version>2.0.0</version>
        </dependency>

and finally got this

Approach 2 - SelectableColumnRef NoClassDefFoundError - log file:

16/09/08 16:28:07 ERROR SpringApplication: Application startup failed java.lang.NoClassDefFoundError: com.initech.myApp.cassandra.service.CassandraDataService.getMatches(CassandraDataService.java:41)

**My Spark app calls the process() method below**
    public boolean process() throws InterruptedException {

    logger.debug("In the process() method");

    SparkConf sparkConf = new SparkConf().setAppName("My Process Unit");

    sparkConf.set("spark.cassandra.connection.host", "127.0.0.1");
    sparkConf.set("spark.cassandra.connection.port","9042");

    logger.debug("SparkConf = " + sparkConf);

    JavaStreamingContext javaStreamingContext = new JavaStreamingContext(sparkConf, new Duration(1000));

    logger.debug("JavaStreamingContext = " + javaStreamingContext);

    JavaSparkContext javaSparkContext = javaStreamingContext.sparkContext();

    logger.debug("Java Spark context = " + javaSparkContext);

    JavaRDD<MyData> myDataJavaRDD = receive(javaSparkContext);

    myDataJavaRDD.foreach(myData -> {
        logger.debug("myData = " + myData);
    });

    javaStreamingContext.start();
    javaStreamingContext.awaitTermination();

    return true;
}

**which calls receive() below**
    private JavaRDD<MyData> receive(JavaSparkContext javaSparkContext) {
    logger.debug("receive method called...");

    List<String> myAppConfigsStrings = myAppConfiguration.get();
    logger.debug("Received ..." + myAppConfigsStrings);

    for(String myAppConfigStr : myAppConfigsStrings)
    {
        ObjectMapper mapper = new ObjectMapper();
        MyAppConfig myAppConfig;
        try {

            logger.debug("Parsing the myAppConfigStr..." + myAppConfigStr);

            myAppConfig = mapper.readValue(myAppConfigStr, MyAppConfig.class);

            logger.debug("Parse Complete...");

            // Check for matching data in Cassandra
            JavaRDD<MyData> cassandraRowsRDD = cassandraDataService.getMatches(myAppConfig, javaSparkContext);

            cassandraRowsRDD.foreach(myData -> {
                logger.debug("myData = " + myData);
            });

            return cassandraRowsRDD;

        } catch (IOException e) {
            e.printStackTrace();
        }

    }

    return null;
}

**which finally calls the CassandraDataService getMatches() below**
@Service    
public class CassandraDataService implements Serializable {    

    private static final Log logger = LogFactory.getLog(CassandraDataService.class);    

    public JavaRDD<MyData> getMatches(MyAppConfig myAppConfig, JavaSparkContext javaSparkContext) {    

        logger.debug("Creating the MyDataID...");    

        MyDataID myDataID = new MyDataID();    
        myDataID.set...(myAppConfig.get...);    
        myDataID.set...(myAppConfig.get...);    
        myDataID.set...(myAppConfig.get...);    

        logger.debug("MyDataID = " + myDataID);    

        JavaRDD<MyData> cassandraRowsRDD = javaFunctions(javaSparkContext).cassandraTable("myKeySpace", "myData", mapRowTo(MyData.class));    

        cassandraRowsRDD.foreach(myData -> {    
            logger.debug("====== Cassandra Data Service ========");    
            logger.debug("myData = " + myData);    
            logger.debug("====== Cassandra Data Service ========");    
        });    

        return cassandraRowsRDD;    
    }    
}    

Has anyone experienced a similar error, or can anyone give me some direction? I have tried searching and reading through several items, but none of them helped. Thanks.

Update 9/9/2016 2:15 PM PST

I tried the approach above. Here is what I did:

  1. Ran the Spark cluster with one worker
  2. Submitted my Spark application as a Spring Boot uber jar using the spark-submit command below: ./bin/spark-submit --class org.springframework.boot.loader.JarLauncher --master spark://localhost:6066 --deploy-mode cluster /Users/apple/Repos/Initech/Officespace/target/my-spring-spark-boot-streaming-app-0.1-SNAPSHOT.jar
  3. The Spark driver started successfully, launched my Spark application, and went into "WAITING" state, since the only running worker was assigned to the driver
  4. I then started another worker, and that worker failed with a "java.lang.ClassNotFoundException". The stack trace is below.

In case it is useful in some way, here is the stack I am using.

1. cqlsh 5.0.1 | Cassandra 2.2.7 | CQL spec 3.3.1
2. Spark - 2.0.0
3. Spring Boot - 1.4.0.RELEASE
4. Jar's listed in the Approach 1 above

Exception stack trace:
    16/09/09 14:13:24 ERROR SpringApplication: Application startup failed
    java.lang.IllegalStateException: Failed to execute ApplicationRunner
        at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:792)
        at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:779)
        at org.springframework.boot.SpringApplication.afterRefresh(SpringApplication.java:769)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:314)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:1185)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:1174)
        at com.initech.officespace.MySpringBootSparkApp.main(MySpringBootSparkApp.java:23)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
        at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:58)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58)
        at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
    Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 6, 192.168.0.30): java.lang.ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:253)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

    Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1450)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1438)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1437)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1437)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:811)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1659)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1618)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1607)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:632)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1871)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1884)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1897)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1911)
        at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:875)
        at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:873)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
        at org.apache.spark.rdd.RDD.foreach(RDD.scala:873)
        at org.apache.spark.api.java.JavaRDDLike$class.foreach(JavaRDDLike.scala:350)
        at org.apache.spark.api.java.AbstractJavaRDDLike.foreach(JavaRDDLike.scala:45)
        at com.initech.officespace.cassandra.service.CassandraDataService.getMatches(CassandraDataService.java:43)
        at com.initech.officespace.processunit.MyApp.receive(MyApp.java:120)
        at com.initech.officespace.processunit.MyApp.process(MyApp.java:61)
        at com.initech.officespace.processunit.MyApp.run(MyApp.java:144)
        at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:789)
        ... 20 more
    Caused by: java.lang.ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:253)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    16/09/09 14:13:24 INFO AnnotationConfigApplicationContext: Closing org.springframework.context.annotation.AnnotationConfigApplicationContext@3381b4fc: startup date [Fri Sep 09 14:10:40 PDT 2016]; root of context hierarchy

Update 2 - 9/9/2016 3:20 PM PST

The issue is now resolved, based on the answer provided by RussS at "Issues with datastax spark-cassandra connector".

After updating my spark-submit to the command below, I can see that the workers pick up the connector and start working on the RDDs :)

./bin/spark-submit --class org.springframework.boot.loader.JarLauncher --master spark://localhost:6066 --deploy-mode cluster  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M3  /Users/apple/Repos/Initech/Officespace/target/my-spring-spark-boot-streaming-app-0.1-SNAPSHOT.jar
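With the --packages fix in place, a quick way to double-check which jar a given class is actually resolved from (useful when an older driver jar is still shadowing the intended one) is to inspect its code source. This is a generic sketch, not part of the original post:

```java
public class JarLocator {
    // Returns the URL of the jar (or class directory) a class was loaded from,
    // or "bootstrap/JDK" for core classes that have no code source.
    static String locate(Class<?> c) {
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        return src == null ? "bootstrap/JDK" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // e.g. locate(com.datastax.driver.core.KeyspaceMetadata.class)
        System.out.println(locate(String.class)); // prints "bootstrap/JDK"
    }
}
```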
2 Answers

Stack Overflow user
Answered on 2020-02-11 02:33:37

The solution may be different in your case.

I ran into this exception when I was trying to reach Cassandra running on my PC (the driver) from Java.

You can add the jar of the spark-cassandra-connector to the SparkContext; in my case, it looked like this:
JavaSparkContext sc = new JavaSparkContext(conf);
    sc.addJar("./build/libs/spark-cassandra-connector_2.11-2.4.2.jar"); // location of driver could be different.
Votes: 0

Stack Overflow user
Answered on 2016-09-08 16:53:22

com.datastax.driver.core.KeyspaceMetadata.getMaterializedViews was introduced in driver version 3.0.

Try adding this dependency to approach 1:
<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-core</artifactId>
    <version>3.1.0</version>
</dependency>
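If forcing the driver version this way, it may also help to pin it in Maven's dependencyManagement so that every transitive path resolves to the same version. A sketch (not from the original answer), assuming the 3.1.0 driver suggested above:

```xml
<!-- Sketch: pin a single java driver version across all transitive paths,
     so the connector and the application agree on the driver API. -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.datastax.cassandra</groupId>
            <artifactId>cassandra-driver-core</artifactId>
            <version>3.1.0</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```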
Votes: -1
Original content provided by Stack Overflow; translated with support from Tencent Cloud's translation engine.
Source: https://stackoverflow.com/questions/39401640