Spark Cassandra Java connection NoSuchMethodError or NoClassDefFoundError

Stack Overflow user
Asked on 2016-09-08 23:32:57
2 answers · 792 views · 0 following · 1 vote

From a Spark application submitted to a Spark cluster hosted on my machine, I am trying to connect to a Cassandra DB hosted on my machine at 127.0.0.1:9042, and my Spring application fails to start.

Approach 1 -

**Based on the link, I included the following in my POM file:**

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>2.0.0-M3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.0.0</version>
    </dependency>

Approach 1 - NoSuchMethodError - log file:

16/09/08 15:12:50 ERROR SpringApplication: Application startup failed
java.lang.NoSuchMethodError: com.datastax.driver.core.KeyspaceMetadata.getMaterializedViews()Ljava/util/Collection;
    at com.datastax.spark.connector.cql.Schema$.com$datastax$spark$connector$cql$Schema$$fetchTables$1(Schema.scala:281)
    at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1$2.apply(Schema.scala:305)
    at com.datastax.spark.connector.cql.Schema$$anonfun$com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1$2.apply(Schema.scala:304)
    at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:683)
    at scala.collection.immutable.HashSet$HashSet1.foreach(HashSet.scala:316)
    at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:972)
    at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:972)
    at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:682)
    at com.datastax.spark.connector.cql.Schema$.com$datastax$spark$connector$cql$Schema$$fetchKeyspaces$1(Schema.scala:304)
    at com.datastax.spark.connector.cql.Schema$$anonfun$fromCassandra$1.apply(Schema.scala:325)
    at com.datastax.spark.connector.cql.Schema$$anonfun$fromCassandra$1.apply(Schema.scala:322)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withClusterDo$1.apply(CassandraConnector.scala:122)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withClusterDo$1.apply(CassandraConnector.scala:121)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:111)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$withSessionDo$1.apply(CassandraConnector.scala:110)
    at com.datastax.spark.connector.cql.CassandraConnector.closeResourceAfterUse(CassandraConnector.scala:140)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:110)
    at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:121)
    at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:322)
    at com.datastax.spark.connector.cql.Schema$.tableFromCassandra(Schema.scala:342)
    at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.tableDef(CassandraTableRowReaderProvider.scala:50)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef$lzycompute(CassandraTableScanRDD.scala:60)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.tableDef(CassandraTableScanRDD.scala:60)
    at com.datastax.spark.connector.rdd.CassandraTableRowReaderProvider$class.verify(CassandraTableRowReaderProvider.scala:137)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.verify(CassandraTableScanRDD.scala:60)
    at com.datastax.spark.connector.rdd.CassandraTableScanRDD.getPartitions(CassandraTableScanRDD.scala:232)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248)
    at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.rdd.RDD.partitions(RDD.scala:246)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1911)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:875)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:873)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
    at org.apache.spark.rdd.RDD.foreach(RDD.scala:873)
    at org.apache.spark.api.java.JavaRDDLike$class.foreach(JavaRDDLike.scala:350)
    at org.apache.spark.api.java.AbstractJavaRDDLike.foreach(JavaRDDLike.scala:45)
    at com.initech.myapp.cassandra.service.CassandraDataService.getMatches(CassandraDataService.java:45)
    at com.initech.myapp.processunit.MySparkApp.receive(MySparkApp.java:120)
    at com.initech.myapp.processunit.MySparkApp.process(MySparkApp.java:61)
    at com.initech.myapp.processunit.MySparkApp.run(MySparkApp.java:144)
    at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:789)
    at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:779)
    at org.springframework.boot.SpringApplication.afterRefresh(SpringApplication.java:769)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:314)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1185)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1174)
    at com.initech.myapp.MySparkAppBootApp.main(MyAppProcessingUnitsApplication.java:20)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
    at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
    at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
    at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:58)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58)
    at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
16/09/08 15:12:50 INFO AnnotationConfigApplicationContext: Closing org.springframework.context.annotation.AnnotationConfigApplicationContext@3381b4fc: startup date [Thu Sep 08 15:12:40 PDT 2016]; root of context hierarchy
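
For what it's worth, a NoSuchMethodError like this typically means an older (pre-3.0) Cassandra Java driver won the dependency resolution, since KeyspaceMetadata.getMaterializedViews() only exists from driver 3.0 onwards (see the second answer below). A minimal diagnostic sketch of my own (DriverVersionCheck is not part of the app) that prints which jar the driver class was actually loaded from:

    import com.datastax.driver.core.KeyspaceMetadata;

    public class DriverVersionCheck {
        public static void main(String[] args) {
            // Which jar did KeyspaceMetadata come from? A pre-3.0 jar here explains the error.
            System.out.println(KeyspaceMetadata.class
                    .getProtectionDomain().getCodeSource().getLocation());
            // Implementation-Version from that jar's manifest, when present (may be null).
            System.out.println(KeyspaceMetadata.class.getPackage().getImplementationVersion());
        }
    }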

Approach 2 -

**Since what I am developing is a Java application, I wanted to use the Java connector, and included the following in my POM file:**

        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.11</artifactId>
            <version>2.0.0-M3</version>
        </dependency>
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector-java_2.11</artifactId>
            <version>1.2.6</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.0.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.11</artifactId>
            <version>2.0.0</version>
        </dependency>

and ended up getting this:

Approach 2 - SelectableColumnRef NoClassDefFoundError - log file:

16/09/08 16:28:07 ERROR SpringApplication: Application startup failed java.lang.NoClassDefFoundError: com.initech.myApp.cassandra.service.CassandraDataService.getMatches(CassandraDataService.java:41)
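
One way to catch this kind of classpath mismatch early, before Spark runs any job, is a fail-fast check at startup. This is a sketch of my own, not something from the app; the two class names are real connector classes (the second is exactly what the workers later fail to load in the update below):

    // Fails fast if connector classes are missing or mismatched on the driver classpath.
    String[] required = {
        "com.datastax.spark.connector.japi.CassandraJavaUtil",
        "com.datastax.spark.connector.rdd.partitioner.CassandraPartition"
    };
    for (String className : required) {
        try {
            Class.forName(className);
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            throw new IllegalStateException("Connector class not loadable: " + className, e);
        }
    }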

**My Spark app calls the process() method below,**

    public boolean process() throws InterruptedException {

    logger.debug("In the process() method");

    SparkConf sparkConf = new SparkConf().setAppName("My Process Unit");

    sparkConf.set("spark.cassandra.connection.host", "127.0.0.1");
    sparkConf.set("spark.cassandra.connection.port","9042");

    logger.debug("SparkConf = " + sparkConf);

    JavaStreamingContext javaStreamingContext = new JavaStreamingContext(sparkConf, new Duration(1000));

    logger.debug("JavaStreamingContext = " + javaStreamingContext);

    JavaSparkContext javaSparkContext = javaStreamingContext.sparkContext();

    logger.debug("Java Spark context = " + javaSparkContext);

    JavaRDD<MyData> myDataJavaRDD = receive(javaSparkContext);

    myDataJavaRDD.foreach(myData -> {
        logger.debug("myData = " + myData);
    });

    javaStreamingContext.start();
    javaStreamingContext.awaitTermination();

    return true;
    }
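
One caveat about process() above: RDD.foreach runs on the executors, so the logger.debug calls inside it end up in the worker logs, not the driver log. A debug-only sketch that brings a small sample back to the driver instead (fine for a handful of rows, not for a large RDD):

    // take() materializes up to 10 elements as a java.util.List on the driver,
    // so this logging happens locally rather than on the executors.
    for (MyData myData : myDataJavaRDD.take(10)) {
        logger.debug("myData = " + myData);
    }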

**which calls receive() below,**

    private JavaRDD<MyData> receive(JavaSparkContext javaSparkContext) {
    logger.debug("receive method called...");

    List<String> myAppConfigsStrings = myAppConfiguration.get();
    logger.debug("Received ..." + myAppConfigsStrings);

    for(String myAppConfigStr : myAppConfigsStrings)
    {
        ObjectMapper mapper = new ObjectMapper();
        MyAppConfig myAppConfig;
        try {

            logger.debug("Parsing the myAppConfigStr..." + myAppConfigStr);

            myAppConfig = mapper.readValue(myAppConfigStr, MyAppConfig.class);

            logger.debug("Parse Complete...");

            // Check for matching data in Cassandra
            JavaRDD<MyData> cassandraRowsRDD = cassandraDataService.getMatches(myAppConfig, javaSparkContext);

            cassandraRowsRDD.foreach(myData -> {
                logger.debug("myData = " + myData);
            });

            return cassandraRowsRDD;

        } catch (IOException e) {
            e.printStackTrace();
        }

    }

    return null;
}
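
Two small design notes on receive(): Jackson's ObjectMapper is thread-safe once configured and relatively expensive to build, so it can be hoisted out of the loop; and as written, the method returns the RDD for the first config string that parses successfully, ignoring the rest. A sketch of the hoisted parsing step (parseFirstConfig is a hypothetical helper of mine, not part of the app):

    import com.fasterxml.jackson.databind.ObjectMapper;
    import java.io.IOException;
    import java.util.List;

    // Built once and reused; ObjectMapper is thread-safe after configuration.
    private static final ObjectMapper MAPPER = new ObjectMapper();

    private MyAppConfig parseFirstConfig(List<String> configStrings) {
        for (String configStr : configStrings) {
            try {
                return MAPPER.readValue(configStr, MyAppConfig.class); // first parse wins
            } catch (IOException e) {
                logger.debug("Skipping unparseable config: " + configStr, e);
            }
        }
        return null;
    }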

**which finally calls the Cassandra data service getMatches() below**

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapRowTo;

@Service
public class CassandraDataService implements Serializable {

    private static final Log logger = LogFactory.getLog(CassandraDataService.class);    

    public JavaRDD<MyData> getMatches(MyAppConfig myAppConfig, JavaSparkContext javaSparkContext) {    

        logger.debug("Creating the MyDataID...");    

        MyDataID myDataID = new MyDataID();    
        myDataID.set...(myAppConfig.get...);    
        myDataID.set...(myAppConfig.get...);    
        myDataID.set...(myAppConfig.get...);    

        logger.debug("MyDataID = " + myDataID);    

        JavaRDD<MyData> cassandraRowsRDD = javaFunctions(javaSparkContext).cassandraTable("myKeySpace", "myData", mapRowTo(MyData.class));    

        cassandraRowsRDD.foreach(myData -> {    
            logger.debug("====== Cassandra Data Service ========");    
            logger.debug("myData = " + myData);    
            logger.debug("====== Cassandra Data Service ========");    
        });    

        return cassandraRowsRDD;    
    }    
}    
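
For mapRowTo(MyData.class) to work, MyData has to be mappable from a Cassandra row: a JavaBean-style class with a no-argument constructor and getters/setters whose names line up with the table's columns, and it must be Serializable because instances travel between the driver and the executors. A minimal sketch, where the field names are assumptions for illustration:

    import java.io.Serializable;

    public class MyData implements Serializable {
        private MyDataID myDataID;   // hypothetical fields; align them with your columns
        private String payload;

        public MyData() { }          // no-arg constructor required by the row mapper

        public MyDataID getMyDataID() { return myDataID; }
        public void setMyDataID(MyDataID myDataID) { this.myDataID = myDataID; }
        public String getPayload() { return payload; }
        public void setPayload(String payload) { this.payload = payload; }
    }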

Has anyone run into a similar error, or can anyone give me some direction? I have tried searching and reading through several items, but none of them came to the rescue. Thanks.

Update 9/9/2016 2:15 PM PST

I tried the approach above. Here is what I did:

  1. Ran the Spark cluster with one worker
  2. Submitted my Spark app as a Spring Boot uber jar, using a spark-submit command of the form ./bin/spark-submit --class org.springframework.boot.loader.JarLauncher --master spark://localhost:6066 --deploy-mode cluster <app jar>
  3. The Spark driver started successfully, launched my Spark app, and the app was set to "WAITING" state, since the only running worker was assigned to the driver
  4. I then started another worker, and the app then failed due to "java.lang.ClassNotFoundException"; the stack trace is below.

In case it is useful in some way, here is the stack I am using:

1. cqlsh 5.0.1 | Cassandra 2.2.7 | CQL spec 3.3.1
2. Spark - 2.0.0
3. Spring Boot - 1.4.0.RELEASE
4. JARs listed in Approach 1 above

Exception stack trace:

    16/09/09 14:13:24 ERROR SpringApplication: Application startup failed
    java.lang.IllegalStateException: Failed to execute ApplicationRunner
        at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:792)
        at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:779)
        at org.springframework.boot.SpringApplication.afterRefresh(SpringApplication.java:769)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:314)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:1185)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:1174)
        at com.initech.officespace.MySpringBootSparkApp.main(MySpringBootSparkApp.java:23)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
        at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
        at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:58)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58)
        at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
    Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 6, 192.168.0.30): java.lang.ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:253)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

    Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1450)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1438)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1437)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1437)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:811)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:811)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1659)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1618)
        at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1607)
        at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
        at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:632)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1871)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1884)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1897)
        at org.apache.spark.SparkContext.runJob(SparkContext.scala:1911)
        at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:875)
        at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:873)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
        at org.apache.spark.rdd.RDD.foreach(RDD.scala:873)
        at org.apache.spark.api.java.JavaRDDLike$class.foreach(JavaRDDLike.scala:350)
        at org.apache.spark.api.java.AbstractJavaRDDLike.foreach(JavaRDDLike.scala:45)
        at com.initech.officespace.cassandra.service.CassandraDataService.getMatches(CassandraDataService.java:43)
        at com.initech.officespace.processunit.MyApp.receive(MyApp.java:120)
        at com.initech.officespace.processunit.MyApp.process(MyApp.java:61)
        at com.initech.officespace.processunit.MyApp.run(MyApp.java:144)
        at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:789)
        ... 20 more
    Caused by: java.lang.ClassNotFoundException: com.datastax.spark.connector.rdd.partitioner.CassandraPartition
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:348)
        at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
        at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:253)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
    16/09/09 14:13:24 INFO AnnotationConfigApplicationContext: Closing org.springframework.context.annotation.AnnotationConfigApplicationContext@3381b4fc: startup date [Fri Sep 09 14:10:40 PDT 2016]; root of context hierarchy

Update 2, 9/9/2016 3:20 PM PST

The issue is now resolved, based on the answer provided by RussS at "Issues with datastax spark-cassandra connector".

After updating my spark-submit command to the one below, I saw that the workers were able to pick up the connector and start working on the RDDs :)

./bin/spark-submit --class org.springframework.boot.loader.JarLauncher --master spark://localhost:6066 --deploy-mode cluster  --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.0-M3  /Users/apple/Repos/Initech/Officespace/target/my-spring-spark-boot-streaming-app-0.1-SNAPSHOT.jar
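
For completeness, an alternative to --packages is to ship the jar programmatically via SparkConf.setJars, so the executors fetch it from the driver. Note that unlike --packages this does not resolve transitive dependencies (such as the Cassandra Java driver), so an assembly/shaded connector jar is usually what you want here. A sketch, with an assumed jar path:

    SparkConf sparkConf = new SparkConf()
            .setAppName("My Process Unit")
            .set("spark.cassandra.connection.host", "127.0.0.1")
            .set("spark.cassandra.connection.port", "9042")
            // Hypothetical local path to the connector assembly jar; adjust to your build.
            .setJars(new String[] {
                "/path/to/spark-cassandra-connector-assembly_2.11-2.0.0-M3.jar"
            });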

2 Answers

Stack Overflow user

Answered on 2020-02-11 10:33:37

The solution may be different in your case.

I got this exception when I was trying to run against Cassandra on my PC (as the driver) from Java.

You can add the jar of the spark-cassandra-connector to the SparkContext; in my case it looked like this:

JavaSparkContext sc = new JavaSparkContext(conf);
sc.addJar("./build/libs/spark-cassandra-connector_2.11-2.4.2.jar"); // location of the jar could be different
Votes: 0

Stack Overflow user

Answered on 2016-09-09 00:53:22

com.datastax.driver.core.KeyspaceMetadata.getMaterializedViews exists starting from version 3.0 of the driver.

Try adding this dependency to Approach 1:

<dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-core</artifactId>
    <version>3.1.0</version>
</dependency>
Votes: -1
The original content of this page was provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/39401640
