Cassandra Spark Connector - NoSuchMethodError: scala.runtime.ObjectRef.zero()Lscala/runtime/ObjectRef

I am trying to connect Spark with a Cassandra database, but I get the error below. I think there may be a version mismatch somewhere.

Code:

    SparkConf conf = new SparkConf().setAppName("kafka-sandbox").setMaster("local[2]");
    conf.set("spark.cassandra.connection.host", "192.168.34.1"); // Cassandra contact point
    JavaSparkContext sc = new JavaSparkContext(conf);
    CassandraConnector connector = CassandraConnector.apply(sc.getConf());
    final Session session = connector.openSession(); // error is thrown on this line
    final PreparedStatement prepared = session.prepare("INSERT INTO spark_test5.messages JSON ?");
Error:


    Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.ObjectRef.zero()Lscala/runtime/ObjectRef;
        at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:149)
        at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$3.apply(CassandraConnector.scala:149)
        at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
        at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
        at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:82)
pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>SparkPoc</groupId>
  <artifactId>Spark-Poc</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>2.0.0</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>2.0.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-8_2.10</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>2.0.0-M3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.1</version>
    </dependency>
  </dependencies>
<build>
    <plugins>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.3</version>
        <configuration>
            <source>1.8</source>
            <target>1.8</target>
        </configuration>
    </plugin>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.4.1</version>
        <configuration>
            <!-- get all project dependencies -->
            <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
            </descriptorRefs>
            <!-- MainClass in manifest makes an executable jar -->
            <archive>
                    <manifest>
                            <mainClass>com.nwf.Consumer</mainClass>
                    </manifest>
            </archive>
        </configuration>
        <executions>
            <execution>
                    <id>make-assembly</id>
                    <!-- bind to the packaging phase -->
                    <phase>package</phase>
                    <goals>
                            <goal>single</goal>
                    </goals>
            </execution>
    </executions>
    </plugin>
    </plugins>
</build>
</project>

Spark version: version 2.0.0

Scala version: version 2.11.8


In my pom.xml I changed the Scala version from 2.10 to 2.11.
The updated pom.xml is given below:


----------
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>SparkPoc</groupId>
  <artifactId>Spark-Poc</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.0.0</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.11</artifactId>
        <version>2.0.0-M3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.0.1</version>
    </dependency>
  </dependencies>
<build>
    <plugins>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.3</version>
        <configuration>
            <source>1.8</source>
            <target>1.8</target>
        </configuration>
    </plugin>
    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.4.1</version>
        <configuration>
            <!-- get all project dependencies -->
            <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
            </descriptorRefs>
            <!-- MainClass in manifest makes an executable jar -->
            <archive>
                    <manifest>
                            <mainClass>com.nwf.Consumer</mainClass>
                    </manifest>
            </archive>
        </configuration>
        <executions>
            <execution>
                    <id>make-assembly</id>
                    <!-- bind to the packaging phase -->
                    <phase>package</phase>
                    <goals>
                            <goal>single</goal>
                    </goals>
            </execution>
    </executions>
    </plugin>
    </plugins>
</build>
</project>

According to your pom.xml, you are mixing Scala versions across your dependencies:

  • spark-streaming_2.10
  • spark-core_2.10
  • spark-streaming-kafka-0-8_2.10
  • spark-cassandra-connector_2.11
  • spark-sql_2.11

All dependencies should use the same Scala version. Try changing everything to _2.11.
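One way to prevent this class of mismatch is to factor the Scala binary version out into a Maven property and reference it in every artifactId, so it can only be changed in one place. This is a minimal sketch; the property names `scala.binary.version` and `spark.version` are just conventions, not anything Maven requires:

```xml
<properties>
    <!-- single source of truth for the Scala binary version -->
    <scala.binary.version>2.11</scala.binary.version>
    <spark.version>2.0.0</spark.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_${scala.binary.version}</artifactId>
        <version>2.0.0-M3</version>
    </dependency>
</dependencies>
```

With this layout, switching the whole build between _2.10 and _2.11 artifacts is a one-line change, and a mixed state like the one above cannot arise.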


zero() on scala.runtime.VolatileObjectRef was introduced in Scala 2.11. You probably have a library compiled against Scala 2.11 running on a Scala 2.10 runtime.

See:

v2.10:https://github.com/scala/scala/blob/2.10.x/src/library/scala/runtime/VolatileObjectRef.java
v2.11:https://github.com/scala/scala/blob/2.11.x/src/library/scala/runtime/VolatileObjectRef.java
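To confirm which Scala runtime actually wins on your application classpath, a small reflection probe can help. This is a sketch, not part of the connector: `ScalaRuntimeCheck` is a hypothetical helper class, and it assumes you launch it with the same classpath as the failing job (e.g. `java -cp your-assembly.jar ScalaRuntimeCheck`):

```java
public class ScalaRuntimeCheck {

    // Diagnoses the Scala runtime on the classpath by probing for
    // ObjectRef.zero(), which only exists from Scala 2.11 onwards.
    static String checkScala() {
        try {
            Class<?> objectRef = Class.forName("scala.runtime.ObjectRef");
            try {
                objectRef.getMethod("zero"); // static factory added in 2.11
                return "ObjectRef.zero() present -> Scala 2.11+ runtime";
            } catch (NoSuchMethodException e) {
                return "ObjectRef.zero() missing -> Scala 2.10 runtime";
            }
        } catch (ClassNotFoundException e) {
            return "Scala runtime not on classpath";
        }
    }

    public static void main(String[] args) {
        System.out.println(checkScala());
    }
}
```

If this prints the "Scala 2.10 runtime" message while your jar contains _2.11 artifacts, something earlier on the classpath (for example a `provided` Spark distribution built for 2.10) is supplying the older scala-library.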