SPARKC-247
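
Session log from running the connector's real-test script on the SPARKC-247 branch. The commands below reproduce the run (they are taken verbatim from the transcript; judging from the script's output further down, its two arguments appear to be the target Spark version and the Scala binary version, though that reading of the arguments is an assumption):

    git fetch
    git checkout SPARKC-247
    git pull origin SPARKC-247
    dev/run-real-tests.sh 1.4.0 2.10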
ursus-major:spark-cassandra-connector jlewandowski$ git fetch
ursus-major:spark-cassandra-connector jlewandowski$ git checkout SPARKC-247
Switched to branch 'SPARKC-247'
Your branch is up-to-date with 'origin/SPARKC-247'.
ursus-major:spark-cassandra-connector jlewandowski$ git pull origin SPARKC-247
From https://github.com/datastax/spark-cassandra-connector
 * branch SPARKC-247 -> FETCH_HEAD
Already up-to-date.
ursus-major:spark-cassandra-connector jlewandowski$ dev/run-real-tests.sh 1.4.0 2.10
Compiling everything and packaging against Scala 2.10
Launching sbt from sbt/sbt-launch-0.13.8.jar
[info] Loading project definition from /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/project
[info] Compiling 2 Scala sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/project/target/scala-2.10/sbt-0.13/classes...
[warn] there were 13 feature warning(s); re-run with -feature for details
[warn] one warning found
Using releases: https://oss.sonatype.org/service/local/staging/deploy/maven2 for releases
Using snapshots: https://oss.sonatype.org/content/repositories/snapshots for snapshots
Scala: 2.10.5 [To build against Scala 2.11 use '-Dscala-2.11=true']
Scala Binary: 2.10
Java: target=1.7 user=1.7.0_79
[info] Set current project to root (in build file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/)
[success] Total time: 6 s, completed Sep 17, 2015 4:23:01 PM
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}spark-cassandra-connector-embedded...
[info] Done updating.
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}demos...
[info] Done updating.
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}spark-cassandra-connector...
[info] Done updating.
[info] Compiling 11 Scala sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/classes...
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}twitter-streaming...
[info] Done updating.
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}kafka-streaming...
[info] Done updating.
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}spark-cassandra-connector-java...
[info] Done updating.
[info] Compiling 140 Scala sources and 1 Java source to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/classes...
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}simple-demos...
[info] Done updating.
[info] Compiling 3 Scala sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.10/classes...
[info] Compiling 1 Scala source to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.10/classes...
[info] Compiling 6 Scala sources and 13 Java sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/classes...
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-test_2.10-1.4.0-RC1-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.10/root_2.10-1.4.0-RC1-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.10/demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Compiling 42 Scala sources and 8 Java sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/test-classes...
[info] Compiling 7 Scala sources and 1 Java source to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.10/classes...
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-test_2.10-1.4.0-RC1-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 6 Java sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/test-classes...
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.10/twitter-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.10/kafka-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar ...
[info] Done packaging.
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-test_2.10-1.4.0-RC1-SNAPSHOT.jar ...
[info] Done packaging.
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.10/simple-demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar ...
[info] Done packaging.
[success] Total time: 72 s, completed Sep 17, 2015 4:24:13 PM
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-it_2.10-1.4.0-RC1-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 24 Scala sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/it-classes...
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-it_2.10-1.4.0-RC1-SNAPSHOT.jar ...
[info] Done packaging.
[info] Compiling 3 Scala sources and 2 Java sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/it-classes...
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-it_2.10-1.4.0-RC1-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 24 s, completed Sep 17, 2015 4:24:37 PM
[info] Including from cache: joda-convert-1.2.jar
[info] Including from cache: cassandra-clientutil-2.1.5.jar
[info] Including from cache: metrics-core-3.0.2.jar
[info] Including from cache: slf4j-api-1.7.5.jar
[info] Including from cache: commons-lang3-3.3.2.jar
[info] Including from cache: jsr166e-1.1.0.jar
[info] Including from cache: cassandra-driver-core-2.1.5.jar
[info] Including from cache: joda-time-2.3.jar
[info] Including from cache: netty-3.9.0.Final.jar
[info] Including from cache: guava-14.0.1.jar
[info] Including from cache: slf4j-api-1.7.5.jar
[info] Including from cache: cassandra-clientutil-2.1.5.jar
[info] Including from cache: joda-convert-1.2.jar
[info] Including from cache: metrics-core-3.0.2.jar
[info] Including from cache: jsr166e-1.1.0.jar
[info] Including from cache: commons-lang3-3.3.2.jar
[info] Including from cache: cassandra-driver-core-2.1.5.jar
[info] Including from cache: joda-time-2.3.jar
[info] Including from cache: netty-3.9.0.Final.jar
[info] Including from cache: guava-14.0.1.jar
[info] OutputMetricsUpdaterSpec:
[info] OutputMetricsUpdater
[info] - should initialize task metrics properly when they are empty (341 milliseconds)
[info] - should initialize task metrics properly when they are defined (2 milliseconds)
[info] - should create updater which uses task metrics (2 milliseconds)
[info] - should create updater which does not use task metrics (8 milliseconds)
[info] - should create updater which uses Codahale metrics (27 milliseconds)
[info] - should create updater which doesn't use Codahale metrics (1 millisecond)
[info] - should work correctly with multiple threads (392 milliseconds)
[info] ReflectionUtilSpec:
[info] ReflectionUtil.findGlobalObject
[info] - should be able to find DefaultConnectionFactory (588 milliseconds)
[info] - should be able to find a global object in a multi-threaded context (13 milliseconds)
[info] - should be able to instantiate a singleton object based on Java class name (45 milliseconds)
[info] - should cache Java class instances (2 milliseconds)
[info] - should throw IllegalArgumentException when asked for a Scala object of wrong type (14 milliseconds)
[info] - should throw IllegalArgumentException when asked for class instance of wrong type (19 milliseconds)
[info] - should throw IllegalArgumentException when object does not exist (2 milliseconds)
[info] ReflectionUtil.constructorParams
[info] - should return proper constructor param names and types for a class with a single constructor (131 milliseconds)
[info] - should return main constructor's param names and types for a class with multiple constructors (3 milliseconds)
[info] ReflectionUtil.getters
[info] - should return getter names and types (29 milliseconds)
[info] ReflectionUtil.setters
[info] - should return setter names and types (6 milliseconds)
[info] ReflectionUtil.methodParamTypes
[info] - should return method param types (3 milliseconds)
[info] - should return proper method param types for generic type (4 milliseconds)
[info] - should throw IllegalArgumentException if the requested method is missing (5 milliseconds)
[info] MappedToGettableDataConverterSpec:
[info] MappedToGettableDataConverter
[info] - should be Serializable (203 milliseconds)
[info] - should convert a simple case class to a CassandraRow (16 milliseconds)
[info] - should convert a simple case class to a UDTValue (11 milliseconds)
[info] - should convert a Scala tuple to a TupleValue (56 milliseconds)
[info] - should convert nested classes (19 milliseconds)
[info] - should convert a nested UDTValue to a UDTValue (16 milliseconds)
[info] - should convert user defined types nested in collections (44 milliseconds)
[info] - should convert user defined types nested in tuples (14 milliseconds)
[info] - should convert tuples nested in user defined types (47 milliseconds)
[info] - should convert nulls to Scala Nones (10 milliseconds)
[info] - should convert using custom column aliases (10 milliseconds)
[info] - should convert a java bean to a CassandraRow (10 milliseconds)
[info] - should convert nested JavaBeans (7 milliseconds)
[info] - should convert commons-lang3 Pairs to TupleValues (21 milliseconds)
[info] - should convert commons-lang3 Triples to TupleValues (25 milliseconds)
[info] - should throw a meaningful exception when a column has an incorrect type (13 milliseconds)
[info] - should throw a meaningful exception when a tuple field has an incorrect number of components (9 milliseconds)
[info] - should work after serialization/deserialization (11 milliseconds)
[info] TableDefSpec:
[info] A TableDef#cql method
[info] should produce valid CQL
[info] - when it contains no clustering columns (4 milliseconds)
[info] - when it contains clustering columns (2 milliseconds)
[info] - when it contains compound partition key and multiple clustering columns (1 millisecond)
[info] - when it contains a column of a collection type (3 milliseconds)
[info] ConfigCheckSpec:
[info] ConfigCheck
[info] - should throw an exception when the configuration contains a invalid spark.cassandra prop (48 milliseconds)
[info] - should suggest alternatives if you have a slight misspelling (17 milliseconds)
[info] - should suggest alternatives if you miss a word (14 milliseconds)
[info] - should not throw an exception if you have a random variable not in the spark.cassandra space (2 milliseconds)
[info] - should not list all options as suggestions (5 milliseconds)
[info] - should not give suggestions when the variable is very strange (4 milliseconds)
[info] - should accept custom ConnectionFactory properties (4 milliseconds)
[info] - should accept custom AuthConfFactory properties (6 milliseconds)
[info] PredicatePushDownSpec:
[info] PredicatePushDown
[info] - should push down all equality predicates restricting partition key columns (23 milliseconds)
[info] - should not push down a partition key predicate for a part of the partition key (1 millisecond)
[info] - should not push down a range partition key predicate (1 millisecond)
[info] - should push down an IN partition key predicate on the last partition key column (1 millisecond)
[info] - should not push down an IN partition key predicate on the non-last partition key column (1 millisecond)
[info] - should push down the first clustering column predicate (2 milliseconds)
[info] - should push down the first and the second clustering column predicate (1 millisecond)
[info] - should push down restrictions on only the initial clustering columns (1 millisecond)
[info] - should push down only one range predicate restricting the first clustering column, if there are more range predicates on different clustering columns (1 millisecond)
[info] - should push down multiple range predicates for the same clustering column (2 milliseconds)
[info] - should push down clustering column predicates when the last clustering column is restricted by IN (1 millisecond)
[info] - should stop pushing down clustering column predicates on the first range predicate (1 millisecond)
[info] - should not push down IN restriction on non-last column (1 millisecond)
[info] - should not push down any clustering column predicates, if the first clustering column is missing (1 millisecond)
[info] - should push down equality predicates on regular indexed columns (1 millisecond)
[info] - should not push down range predicates on regular indexed columns (1 millisecond)
[info] - should not push down IN predicates on regular indexed columns (1 millisecond)
[info] - should push down predicates on regular non-indexed and indexed columns (2 milliseconds)
[info] - should not push down predicates on regular non-indexed columns if indexed ones are not included (1 millisecond)
[info] - should prefer to push down equality predicates over range predicates (1 millisecond)
[info] - should not push down unsupported predicates (1 millisecond)
[info] RandomPartitionerTokenFactorySpec:
[info] RandomPartitionerTokenFactory
[info] - should create a token from String (1 millisecond)
[info] - should create a String representation of a token (1 millisecond)
[info] - should calculate the distance between tokens if right > left (1 millisecond)
[info] - should calculate the distance between tokens if right <= left (1 millisecond)
[info] - should calculate ring fraction (1 millisecond)
[info] BufferedIterator2Spec:
[info] BufferedIterator
[info] - should return the same items as the standard Iterator (2 milliseconds)
[info] - should be convertible to a Seq (4 milliseconds)
[info] - should wrap an empty iterator (1 millisecond)
[info] - should offer the head element without consuming the underlying iterator (0 milliseconds)
[info] - should offer takeWhile that consumes only the elements matching the predicate (3 milliseconds)
[info] - should offer appendWhile that copies elements to ArrayBuffer and consumes only the elements matching the predicate (1 millisecond)
[info] - should throw NoSuchElementException if trying to get next() element that doesn't exist (2 milliseconds)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
[info] AnyObjectFactoryTest:
[info] AnyObjectFactory
[info] when instantiated for a bean class with a single, no-args constructor
[info] - should create an instance of that class with newInstance (2 milliseconds)
[info] - should return 0 with argCount (1 millisecond)
[info] - should return empty collection with constructorParamTypes (1 millisecond)
[info] - should return that class with javaClass (0 milliseconds)
[info] when instantiated for a bean class with multiple constructors which include no-args constructor
[info] - should create an instance of that class with newInstance (0 milliseconds)
[info] - should return that class with javaClass (0 milliseconds)
[info] when instantiated for an inner Java class
[info] - should create an instance of that class with newInstance (2 milliseconds)
[info] - should return that class with javaClass (0 milliseconds)
[info] when instantiated for a deeply nested inner Java class
[info] - should create an instance of that class with newInstance (8 milliseconds)
[info] - should return that class with javaClass (1 millisecond)
[info] when tried to be instantiated for an unsupported bean class
[info] - should throw NoSuchMethodException if class does not have suitable constructor (1 millisecond)
[info] when instantiated for a Scala case class with 2 args constructor
[info] - should create an instance of that class with newInstance (1 millisecond)
[info] - should return 2 with argCount because the only constructor of this case class has two args (1 millisecond)
[info] - should return collection of {Int, String} types with constructorParamTypes (3 milliseconds)
[info] - should return that class with javaClass (0 milliseconds)
[info] when instantiated for a Scala case class with 2 args constructor which is defined inside an object
[info] - should create an instance of that class with newInstance (1 millisecond)
[info] - should return 2 with argCount because the only constructor of this case class has two args (1 millisecond)
[info] - should return collection of {Int, String} types with constructorParamTypes (1 millisecond)
[info] - should return that class with javaClass (0 milliseconds)
[info] when instantiated for a Scala class with 2 args constructor
[info] - should create an instance of that class with newInstance (1 millisecond)
[info] - should return 2 with argCount because the only constructor of this class has 2 args (1 millisecond)
[info] - should return collection of {Int, String} types with constructorParamTypes (1 millisecond)
[info] - should return that class with javaClass (1 millisecond)
[info] when instantiated for a Scala class with 2 args constructor and without fields
[info] - should create an instance of that class with newInstance (0 milliseconds)
[info] - should return 2 with argCount because the only constructor of this class has 2 args (0 milliseconds)
[info] - should return collection of {Int, String} types with constructorParamTypes (1 millisecond)
[info] - should return that class with javaClass (0 milliseconds)
[info] when instantiated for a Scala class with multiple constructors
[info] - should create an instance of that class with newInstance (1 millisecond)
[info] - should return that class with javaClass (1 millisecond)
[info] when instantiated for an inner Scala class with 2 args constructor
[info] - should create an instance of that class with newInstance (0 milliseconds)
[info] - should return 2 with argCount (0 milliseconds)
[info] - should return collection of {Int, String} types with constructorParamTypes (1 millisecond)
[info] - should return that class with javaClass (0 milliseconds)
[info] when instantiated for a deeply nested inner Scala class
[info] - should create an instance of that class with newInstance (1 millisecond)
[info] - should return that class with javaClass (1 millisecond)
[info] when serialized
[info] - should allow to be deserialized and reused (7 milliseconds)
[info] CassandraRowTest:
[info] - basicAccessTest (4 milliseconds)
[info] - nullAccessTest (0 milliseconds)
[info] - nullToStringTest (1 millisecond)
[info] - nonExistentColumnAccessTest (0 milliseconds)
[info] - primitiveConversionTest (91 milliseconds)
[info] - collectionConversionTest (4 milliseconds)
[info] - serializationTest (5 milliseconds)
[info] ConsolidateSettingsSpec:
[info] - should consolidate Cassandra conf settings in order of table level -> keyspace -> cluster -> default (11 milliseconds)
[info] ColumnSelectorSpec:
[info] A ColumnSelector#selectFrom method
[info] - should return all columns (4 milliseconds)
[info] - should return partition key columns (2 milliseconds)
[info] - should return some columns (4 milliseconds)
[info] - should throw a NoSuchElementException when selected column name is invalid (1 millisecond)
[info] CqlWhereParserTest:
[info] CqlWhereParser
[info] - should parse 'and' operations (24 milliseconds)
[info] - should parse equality predicates (2 milliseconds)
[info] - should parse range predicates (3 milliseconds)
[info] - should parse IN predicates (5 milliseconds)
[info] - should parse quoted names (3 milliseconds)
[info] - should return lowercase names (2 milliseconds)
[info] - should parse strings (3 milliseconds)
[info] - should distinguish '?' from ? (3 milliseconds)
[info] - should accept >= (1 millisecond)
[info] - should accept ? (2 milliseconds)
[info] - should accept name with quotes and other special symbols (1 millisecond)
[info] - should accept param with quotes and other special symbols (2 milliseconds)
[info] - should accept uuid param (2 milliseconds)
[info] - should accept float param (1 millisecond)
[info] - should parse case insensitive 'aNd' operations (2 milliseconds)
[info] - should parse case insensitive 'iN' operations (2 milliseconds)
[info] - should parse case insensitive 'IN' operations ? (0 milliseconds)
[info] RetryDelayConfSpec:
[info] ConstantDelay
[info] - should return the same delay regardless of the retry number (2 milliseconds)
[info] LinearDelay
[info] - should return the calculated delay for different retry numbers (1 millisecond)
[info] ExponentialDelay
[info] - should return the calculated delay for different retry numbers (1 millisecond)
[info] RateLimiterSpec:
[info] RateLimiter
[info] - should not cause delays if rate is not exceeded (84 milliseconds)
[info] - should sleep to not exceed the target rate (2 milliseconds)
[info] - should sleep and leak properly with different Rates (20 milliseconds)
[info] WriteConfTest:
[info] WriteConf
[info] - should be configured with proper defaults (0 milliseconds)
[info] - should allow setting the rate limit as a decimal (9 milliseconds)
[info] - should allow to set consistency level (1 millisecond)
[info] - should allow to set parallelism level (1 millisecond)
[info] - should allow to set batch size in bytes (1 millisecond)
[info] - should allow to set batch size in bytes when rows are set to auto (1 millisecond)
[info] - should allow to set batch size in rows (1 millisecond)
[info] - should allow to set batch level (2 milliseconds)
[info] - should allow to set batch buffer size (1 millisecond)
[info] Murmur3TokenFactorySpec:
[info] Murmur3TokenFactory
[info] - should create a token from String (0 milliseconds)
[info] - should create a String representation of a token (0 milliseconds)
[info] - should calculate the distance between tokens if right > left (0 milliseconds)
[info] - should calculate the distance between tokens if right <= left (1 millisecond)
[info] - should calculate ring fraction (1 millisecond)
[info] CassandraConnectorConfSpec:
[info] - should be serializable (16 milliseconds)
[info] - should match a conf with the same settings (3 milliseconds)
[info] - should resolve default SSL settings correctly (5 milliseconds)
[info] - should resolve provided SSL settings correctly (1 millisecond)
[info] - should resolve default retry delay settings correctly (0 milliseconds)
[info] - should resolve constant retry delay settings (2 milliseconds)
[info] - should resolve linear retry delay settings (2 milliseconds)
[info] - should resolve exponential retry delay settings (2 milliseconds)
[info] WriteOptionTest:
[info] TTLOption
[info] - should properly create constant write option with duration in seconds (2 milliseconds)
[info] - should properly create constant write option with scala.concurrent.duration.Duration (0 milliseconds)
[info] - should properly create constant write option with scala.concurrent.duration.Duration.Infinite (0 milliseconds)
[info] - should properly create constant write option with org.joda.time.Duration (2 milliseconds)
[info] - should properly create infinite duration (0 milliseconds)
[info] - should properly create per-row duration placeholder (0 milliseconds)
[info] TimestampOption
[info] - should properly create constant write option with timestamp in microseconds (0 milliseconds)
[info] - should properly create constant write option with DateTime (0 milliseconds)
[info] - should properly create constant write option with Date (0 milliseconds)
[info] InputMetricsUpdaterSpec:
[info] InputMetricsUpdater
[info] - should initialize task metrics properly when they are empty (3 milliseconds)
[info] - should create updater which uses task metrics (23 milliseconds)
[info] - should create updater which does not use task metrics (1 millisecond)
[info] - should create updater which uses Codahale metrics (2 milliseconds)
[info] - should create updater which doesn't use Codahale metrics (1 millisecond)
[info] ColumnTypeSpec:
[info] A ColumnType companion object
[info] - should throw InvalidArgumentException if given unsupported type (16 milliseconds)
[info] should allow to obtain a proper ColumnType
[info] - when given a Boolean should return BooleanType (1 millisecond)
[info] - when given a java.lang.Boolean should return BooleanType (2 milliseconds)
[info] - when given an Int should return IntType (0 milliseconds)
[info] - when given an java.lang.Integer should return IntType (1 millisecond)
[info] - when given a Long should return BigIntType (1 millisecond)
[info] - when given a java.lang.Long should return BigIntType (0 milliseconds)
[info] - when given a Float should return FloatType (0 milliseconds)
[info] - when given a java.lang.Float should return FloatType (0 milliseconds)
[info] - when given a Double should return DoubleType (1 millisecond)
[info] - when given a java.lang.Double should return DoubleType (1 millisecond)
[info] - when given a String should return VarcharType (1 millisecond)
[info] - when given a java.util.Date should return TimestampType (1 millisecond)
[info] - when given a java.sql.Date should return TimestampType (1 millisecond)
[info] - when given a org.joda.time.DateTime should return TimestampType (1 millisecond)
[info] - when given a ByteBuffer should return BlobType (1 millisecond)
[info] - when given an Array[Byte] should return BlobType (1 millisecond)
[info] - when given an UUID should return UUIDType (1 millisecond)
[info] - when given a List[String] should return ListType(VarcharType) (8 milliseconds)
[info] - when given a Set[InetAddress] should return SetType(InetType) (24 milliseconds)
[info] - when given a Map[Int, Date] should return MapType(IntType, TimestampType) (5 milliseconds)
[info] - when given an Option[Int] should return IntType (6 milliseconds)
[info] - when given an Option[Vector[Int]] should return ListType(IntType) (12 milliseconds)
[info] SpanningIteratorSpec:
[info] SpanningIterator
[info] - should group an empty collection (2 milliseconds)
[info] - should group a sequence of elements with the same key into a single item and should preserve order (1 millisecond)
[info] - should group a sequence of elements with distinct keys the same number of groups (1 millisecond)
[info] - should group a sequence of elements with two keys into two groups (1 millisecond)
[info] - should be lazy and work with infinite streams (2 milliseconds)
[info] PriorityHashMapSpec:
[info] A PriorityHashMap
[info] - should support adding elements (simple) (1 millisecond)
[info] - should support adding elements ascending by value (30 milliseconds)
[info] - should support adding elements descending by value (19 milliseconds)
[info] - should support adding elements in random order of values (17 milliseconds)
[info] - should support adding elements in random order of values and keys (25 milliseconds)
[info] - should support removing elements in ascending order (8 milliseconds)
[info] - should support removing elements in descending order (11 milliseconds)
[info] - should support removing elements in random order from a sorted map (8 milliseconds)
[info] - should support removing elements from a randomly created map in random order (24 milliseconds)
[info] - should allow to heapsort an array of integers (19 milliseconds)
[info] - should allow to update item priority (5 milliseconds)
[info] - should be able to store multiple items with the same priority (0 milliseconds)
[info] - should return false when removing a non-existing key (1 millisecond)
[info] - should have capacity rounded up to the nearest power of two (0 milliseconds)
[info] - should throw NoSuchElement exception if requested a head of empty map (1 millisecond)
[info] - should throw NoSuchElement exception if requested a non-existing key (1 millisecond)
[info] - should throw IllegalStateException exception if trying to exceed allowed capacity (1 millisecond)
[info] GettableDataToMappedTypeConverterSpec:
[info] GettableDataToMappedTypeConverter
[info] - should be Serializable (55 milliseconds)
[info] - should convert a CassandraRow to a case class object (11 milliseconds)
[info] - should convert a CassandraRow to a case class object after being serialized/deserialized (12 milliseconds)
[info] - should convert a CassandraRow to a tuple (5 milliseconds)
[info] - should convert a CassandraRow to a tuple in reversed order (4 milliseconds)
[info] - should convert a CassandraRow to a tuple with a subset of columns (6 milliseconds)
[info] - should convert a UDTValue to a case class object (8 milliseconds)
[info] - should convert a TupleValue to a Scala tuple (5 milliseconds)
[info] - should allow for nesting UDTValues inside of TupleValues (12 milliseconds)
[info] - should allow for nesting TupleValues inside of UDTValues (12 milliseconds)
[info] - should convert nulls to Scala Nones (10 milliseconds)
[info] - should convert using custom column aliases (8 milliseconds)
[info] - should set property values with setters (7 milliseconds)
[info] - should apply proper type conversions for columns (7 milliseconds)
[info] - should convert a CassandraRow with a UDTValue into nested case class objects (16 milliseconds)
[info] - should convert a CassandraRow with a UDTValue into a case class with a nested tuple (11 milliseconds)
[info] - should convert a CassandraRow with an optional UDTValue (16 milliseconds)
[info] - should convert a CassandraRow with a list of UDTValues (16 milliseconds)
[info] - should convert a CassandraRow with a set of UDTValues (15 milliseconds)
[info] - should convert a CassandraRow with a collection of UDTValues (17 milliseconds)
[info] - should convert a CassandraRow with a collection of tuples (12 milliseconds)
[info] - should convert a CassandraRow to a JavaBean (7 milliseconds)
[info] - should convert a CassandraRow with UDTs to nested JavaBeans (11 milliseconds)
[info] - should throw a meaningful exception when a column type is not supported (5 milliseconds)
[info] - should throw a meaningful exception when a column value fails to be converted (5 milliseconds)
[info] - should throw NPE with a meaningful message when a column value is null (4 milliseconds)
[info] - should throw NPE when trying to access its targetTypeTag after serialization/deserialization (7 milliseconds)
[info] Test run started
[info] Test com.datastax.spark.connector.writer.AsyncExecutorTest.test started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.445s
[info] Test run started
[info] Test com.datastax.spark.connector.rdd.reader.ClassBasedRowReaderTest.testSerialize started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.015s
[info] Test run started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testSetters1 started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testSetters2 started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.columnNameOverrideConstructor started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testGetters1 started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testGetters2 started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testWorkWithAliasesAndHonorOverrides started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.columnNameOverrideGetters started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testNotEnoughPropertiesForWriting started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testNewTableForClassWithVars started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testNewTableForEmptyClass started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testConstructorParams1 started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testConstructorParams2 started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testNotEnoughColumnsSelectedForReading started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testImplicit started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.columnNameOverrideSetters started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testNewTableForClassWithUnsupportedPropertyType started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testWorkWithAliases started
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testNewTableForCaseClass started
[info] Test run finished: 0 failed, 0 ignored, 18 total, 0.051s
[info] Test run started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaDouble started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testInetAddress started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testSerializeMapConverter started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaInteger started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testChainedConverters started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testTreeMap started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testTreeSet started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testInt started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testMap started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testSet started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testByteArray started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testBigDecimal started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testFloat started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testDate started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testList started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testLong started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testPair started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testUUID started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testTypeAliases started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJodaTime started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testCalendar1 started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testCalendar2 started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaBigDecimal started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testBoolean started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaFloat started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testUnsupportedType started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testRegisterCustomConverterExtension started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaBigInteger started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaList started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaLong started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaBoolean started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testRegisterCustomConverter started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testSerializeCollectionConverter started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testBigInt started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testChainedConverterSerializability started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testDouble started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaHashMap started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaHashSet started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testOption started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testString started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testTriple started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testOptionToNullConverter started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaArrayList started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaMap started
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaSet started
[info] Test run finished: 0 failed, 0 ignored, 45 total, 0.091s
[info] Test run started
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testIncompleteConstructor started
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testIncompleteGetters started
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testGetters started
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testSerialize started
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testImplicit started
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testConstructor started
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testNewTable started
[info] Test run finished: 0 failed, 0 ignored, 7 total, 0.013s
[info] Test run started
[info] Test com.datastax.spark.connector.writer.PropertyExtractorTest.testSimpleExtraction started
[info] Test com.datastax.spark.connector.writer.PropertyExtractorTest.testWrongPropertyName started
[info] Test com.datastax.spark.connector.writer.PropertyExtractorTest.testAvailableProperties started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.002s
[info] Test run started
[info] Test com.datastax.spark.connector.types.CanBuildFromTest.testBuild started
[info] Test com.datastax.spark.connector.types.CanBuildFromTest.testSerializeAndBuild started
[info] Test com.datastax.spark.connector.types.CanBuildFromTest.testSerializeAndBuildWithOrdering started
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.003s
[info] Test run started
[info] Test com.datastax.spark.connector.writer.DefaultRowWriterTest.testTypeConversionsInUDTValuesAreApplied started
[info] Test com.datastax.spark.connector.writer.DefaultRowWriterTest.testTypeConversionsAreApplied started
[info] Test com.datastax.spark.connector.writer.DefaultRowWriterTest.testSerializability started
[info] Test com.datastax.spark.connector.writer.DefaultRowWriterTest.testCustomTypeConvertersAreUsed started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.019s
[info] Test run started
[info] Test com.datastax.spark.connector.rdd.partitioner.Murmur3PartitionerTokenRangeSplitterTest.testSplit started
[info] Test com.datastax.spark.connector.rdd.partitioner.Murmur3PartitionerTokenRangeSplitterTest.testZeroRows started
[info] Test com.datastax.spark.connector.rdd.partitioner.Murmur3PartitionerTokenRangeSplitterTest.testWrapAround started
[info] Test com.datastax.spark.connector.rdd.partitioner.Murmur3PartitionerTokenRangeSplitterTest.testNoSplit started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.008s
[info] Test run started
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testWorkWithAliasesAndHonorOverrides started
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testSerializeColumnMap started
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testGetters started
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testColumnNameOverrideSetters started
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testImplicit started
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testSetters started
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testWorkWithAliases started
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testColumnNameOverrideGetters started
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.008s
[info] Test run started
[info] Test com.datastax.spark.connector.types.TypeSerializationTest.testSerializationOfCollectionTypes started
[info] Test com.datastax.spark.connector.types.TypeSerializationTest.testSerializationOfPrimitiveTypes started
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.009s
[info] Test run started
[info] Test com.datastax.spark.connector.rdd.partitioner.RandomPartitionerTokenRangeSplitterTest.testSplit started
[info] Test com.datastax.spark.connector.rdd.partitioner.RandomPartitionerTokenRangeSplitterTest.testZeroRows started
[info] Test com.datastax.spark.connector.rdd.partitioner.RandomPartitionerTokenRangeSplitterTest.testWrapAround started
[info] Test com.datastax.spark.connector.rdd.partitioner.RandomPartitionerTokenRangeSplitterTest.testNoSplit started
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.005s
[info] Test run started
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testTrivialClustering started
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testMultipleEndpoints started
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testEmpty started
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testTooLargeRanges started
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testMaxClusterSize started
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testSplitByHost started
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testSplitByCount started
[info] Test run finished: 0 failed, 0 ignored, 7 total, 0.009s
15/09/17 16:24:45 INFO Utils: Shutdown hook called
[info] ScalaTest
[info] Run completed in 5 seconds, 553 milliseconds.
[info] Total number of tests run: 263
[info] Suites: completed 24, aborted 0
[info] Tests: succeeded 263, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 370, Failed 0, Errors 0, Passed 370
[info] Checking every *.class/*.jar file's SHA-1.
[info] Merging files...
[warn] Merging 'META-INF/MANIFEST.MF' with strategy 'discard'
[warn] Strategy 'discard' was applied to a file
[info] SHA-1: ab749ae20a54616bde3c3cfb63b35ab45c482dda
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar ...
[info] Done packaging.
[info] Test run started
[info] Test com.datastax.spark.connector.japi.CustomTypeConverterTest.test1 started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.512s
[info] Test run started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testWithConnector started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testWithReadConf started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testWithAscOrder started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testSelectColumnNames started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testLimit started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testWhere started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testSelectColumns started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testSelectedColumnRefs started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testSelectedColumnNames started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testWithDescOrder started
[info] Test run finished: 0 failed, 0 ignored, 10 total, 0.234s
[info] Test run started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetBytes started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetFloat started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetShort started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGet started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testToMap started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetDateTime started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetByte started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetDate started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetInet started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetList started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetLong started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetUUID started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetObjectAndApply started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetBoolean started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetDouble started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetInt started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetMap started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetSet started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetString started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetVarInt started
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetDecimal started
[info] Test run finished: 0 failed, 0 ignored, 21 total, 0.227s
[info] Test run started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJoinJavaRDDTest.testOn started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.056s
[info] Test run started
[info] Test com.datastax.spark.connector.japi.SparkContextJavaFunctionsTest.testReadConfPopulating started
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.196s
[info] Test run started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testWithConnector started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testWithReadConf started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testWithAscOrder started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testSelectColumnNames started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testLimit started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testWhere started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testSelectColumns started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testSelectedColumnRefs started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testSelectedColumnNames started
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testWithDescOrder started
[info] Test run finished: 0 failed, 0 ignored, 10 total, 0.008s
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/09/17 16:24:50 INFO Utils: Shutdown hook called
[info] ScalaTest
[info] Run completed in 2 seconds, 83 milliseconds.
[info] Total number of tests run: 0
[info] Suites: completed 0, aborted 0
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
[info] No tests were executed.
[info] Passed: Total 44, Failed 0, Errors 0, Passed 44
[info] Checking every *.class/*.jar file's SHA-1.
[info] Merging files...
[warn] Merging 'META-INF/MANIFEST.MF' with strategy 'discard'
[warn] Strategy 'discard' was applied to a file
[info] SHA-1: 7207747c6dca1469f5ca84d51b7f20d21f07942e
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar ...
[info] Done packaging.
[success] Total time: 15 s, completed Sep 17, 2015 4:24:53 PM
Spark 1.4.0 for Scala 2.10 already downloaded
Installing Spark 1.4.0 for Scala 2.10
x NOTICE
x CHANGES.txt
x python/
x python/test_support/
x python/test_support/userlibrary.py
x python/test_support/userlib-0.1.zip
x python/test_support/SimpleHTTPServer.py
x python/test_support/hello.txt
x python/docs/
x python/docs/pyspark.ml.rst
x python/docs/pyspark.streaming.rst
x python/docs/conf.py
x python/docs/pyspark.rst
x python/docs/make.bat
x python/docs/epytext.py
x python/docs/make2.bat
x python/docs/index.rst
x python/docs/pyspark.sql.rst
x python/docs/pyspark.mllib.rst
x python/docs/Makefile
x python/.gitignore
x python/pyspark/
x python/pyspark/status.py
x python/pyspark/conf.py
x python/pyspark/ml/
x python/pyspark/ml/evaluation.py
x python/pyspark/ml/util.py
x python/pyspark/ml/classification.py
x python/pyspark/ml/regression.py
x python/pyspark/ml/tests.py
x python/pyspark/ml/tuning.py
x python/pyspark/ml/pipeline.py
x python/pyspark/ml/feature.py
x python/pyspark/ml/recommendation.py
x python/pyspark/ml/__init__.py
x python/pyspark/ml/wrapper.py
x python/pyspark/ml/param/
x python/pyspark/ml/param/_shared_params_code_gen.py
x python/pyspark/ml/param/shared.py
x python/pyspark/ml/param/__init__.py
x python/pyspark/statcounter.py
x python/pyspark/profiler.py
x python/pyspark/serializers.py
x python/pyspark/traceback_utils.py
x python/pyspark/shell.py
x python/pyspark/sql/
x python/pyspark/sql/window.py
x python/pyspark/sql/tests.py
x python/pyspark/sql/group.py
x python/pyspark/sql/types.py
x python/pyspark/sql/context.py
x python/pyspark/sql/dataframe.py
x python/pyspark/sql/column.py
x python/pyspark/sql/__init__.py
x python/pyspark/sql/readwriter.py
x python/pyspark/sql/functions.py
x python/pyspark/daemon.py
x python/pyspark/tests.py
x python/pyspark/resultiterable.py
x python/pyspark/heapq3.py
x python/pyspark/broadcast.py
x python/pyspark/shuffle.py
x python/pyspark/cloudpickle.py
x python/pyspark/accumulators.py
x python/pyspark/java_gateway.py
x python/pyspark/streaming/
x python/pyspark/streaming/util.py
x python/pyspark/streaming/tests.py
x python/pyspark/streaming/kafka.py
x python/pyspark/streaming/dstream.py
x python/pyspark/streaming/context.py
x python/pyspark/streaming/__init__.py
x python/pyspark/context.py
x python/pyspark/storagelevel.py
x python/pyspark/__init__.py
x python/pyspark/join.py
x python/pyspark/mllib/
x python/pyspark/mllib/tree.py
x python/pyspark/mllib/linalg.py
x python/pyspark/mllib/evaluation.py
x python/pyspark/mllib/util.py
x python/pyspark/mllib/classification.py
x python/pyspark/mllib/regression.py
x python/pyspark/mllib/tests.py
x python/pyspark/mllib/common.py
x python/pyspark/mllib/feature.py
x python/pyspark/mllib/clustering.py
x python/pyspark/mllib/recommendation.py
x python/pyspark/mllib/stat/
x python/pyspark/mllib/stat/__init__.py
x python/pyspark/mllib/stat/_statistics.py
x python/pyspark/mllib/stat/test.py
x python/pyspark/mllib/stat/distribution.py
x python/pyspark/mllib/random.py
x python/pyspark/mllib/__init__.py
x python/pyspark/mllib/fpm.py
x python/pyspark/rdd.py
x python/pyspark/rddsampler.py
x python/pyspark/worker.py
x python/pyspark/files.py
x python/run-tests
x python/lib/
x python/lib/py4j-0.8.2.1-src.zip
x python/lib/pyspark.zip
x python/lib/PY4J_LICENSE.txt
x RELEASE
x sbin/
x sbin/start-mesos-dispatcher.sh
x sbin/spark-daemon.sh
x sbin/stop-slaves.sh
x sbin/stop-thriftserver.sh
x sbin/stop-shuffle-service.sh
x sbin/stop-history-server.sh
x sbin/spark-config.sh
x sbin/start-history-server.sh
x sbin/start-thriftserver.sh
x sbin/start-shuffle-service.sh
x sbin/spark-daemons.sh
x sbin/start-all.sh
x sbin/stop-master.sh
x sbin/stop-mesos-dispatcher.sh
x sbin/stop-slave.sh
x sbin/start-slave.sh
x sbin/start-slaves.sh
x sbin/stop-all.sh
x sbin/slaves.sh
x sbin/start-master.sh
x examples/
x examples/src/
x examples/src/main/
x examples/src/main/r/
x examples/src/main/r/dataframe.R
x examples/src/main/python/
x examples/src/main/python/status_api_demo.py
x examples/src/main/python/ml/
x examples/src/main/python/ml/simple_text_classification_pipeline.py
x examples/src/main/python/ml/random_forest_example.py
x examples/src/main/python/ml/gradient_boosted_trees.py
x examples/src/main/python/ml/simple_params_example.py
x examples/src/main/python/pagerank.py
x examples/src/main/python/wordcount.py
x examples/src/main/python/pi.py
x examples/src/main/python/hbase_inputformat.py
x examples/src/main/python/logistic_regression.py
x examples/src/main/python/cassandra_outputformat.py
x examples/src/main/python/streaming/
x examples/src/main/python/streaming/sql_network_wordcount.py
x examples/src/main/python/streaming/network_wordcount.py
x examples/src/main/python/streaming/kafka_wordcount.py
x examples/src/main/python/streaming/stateful_network_wordcount.py
x examples/src/main/python/streaming/direct_kafka_wordcount.py
x examples/src/main/python/streaming/hdfs_wordcount.py
x examples/src/main/python/streaming/recoverable_network_wordcount.py
x examples/src/main/python/transitive_closure.py
x examples/src/main/python/kmeans.py
x examples/src/main/python/avro_inputformat.py
x examples/src/main/python/mllib/
x examples/src/main/python/mllib/sampled_rdds.py
x examples/src/main/python/mllib/gaussian_mixture_model.py
x examples/src/main/python/mllib/logistic_regression.py
x examples/src/main/python/mllib/random_forest_example.py
x examples/src/main/python/mllib/dataset_example.py
x examples/src/main/python/mllib/word2vec.py
x examples/src/main/python/mllib/decision_tree_runner.py
x examples/src/main/python/mllib/kmeans.py
x examples/src/main/python/mllib/random_rdd_generation.py
x examples/src/main/python/mllib/gradient_boosted_trees.py
x examples/src/main/python/mllib/correlations.py
x examples/src/main/python/parquet_inputformat.py
x examples/src/main/python/hbase_outputformat.py
x examples/src/main/python/als.py
x examples/src/main/python/sql.py
x examples/src/main/python/sort.py
x examples/src/main/python/cassandra_inputformat.py
x examples/src/main/java/
x examples/src/main/java/org/
x examples/src/main/java/org/apache/
x examples/src/main/java/org/apache/spark/
x examples/src/main/java/org/apache/spark/examples/
x examples/src/main/java/org/apache/spark/examples/ml/
x examples/src/main/java/org/apache/spark/examples/ml/JavaSimpleTextClassificationPipeline.java
x examples/src/main/java/org/apache/spark/examples/ml/JavaSimpleParamsExample.java
x examples/src/main/java/org/apache/spark/examples/ml/JavaDeveloperApiExample.java
x examples/src/main/java/org/apache/spark/examples/ml/JavaCrossValidatorExample.java
x examples/src/main/java/org/apache/spark/examples/ml/JavaOneVsRestExample.java
x examples/src/main/java/org/apache/spark/examples/JavaSparkPi.java
x examples/src/main/java/org/apache/spark/examples/sql/
x examples/src/main/java/org/apache/spark/examples/sql/JavaSparkSQL.java
x examples/src/main/java/org/apache/spark/examples/JavaLogQuery.java
x examples/src/main/java/org/apache/spark/examples/JavaTC.java
x examples/src/main/java/org/apache/spark/examples/JavaStatusTrackerDemo.java
x examples/src/main/java/org/apache/spark/examples/streaming/
x examples/src/main/java/org/apache/spark/examples/streaming/JavaRecord.java
x examples/src/main/java/org/apache/spark/examples/streaming/JavaFlumeEventCount.java
x examples/src/main/java/org/apache/spark/examples/streaming/JavaDirectKafkaWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaNetworkWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaSqlNetworkWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaRecoverableNetworkWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaStatefulNetworkWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaCustomReceiver.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaQueueStream.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaKafkaWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/JavaHdfsLR.java | |
x examples/src/main/java/org/apache/spark/examples/JavaPageRank.java | |
x examples/src/main/java/org/apache/spark/examples/JavaWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/ | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaRandomForestExample.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaLDAExample.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaDecisionTree.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaPowerIterationClusteringExample.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaALS.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaFPGrowthExample.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaGradientBoostedTreesRunner.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaLR.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaKMeans.java | |
x examples/src/main/scala/ | |
x examples/src/main/scala/org/ | |
x examples/src/main/scala/org/apache/ | |
x examples/src/main/scala/org/apache/spark/ | |
x examples/src/main/scala/org/apache/spark/examples/ | |
x examples/src/main/scala/org/apache/spark/examples/GroupByTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/ | |
x examples/src/main/scala/org/apache/spark/examples/ml/RandomForestExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/CrossValidatorExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/GBTExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/DeveloperApiExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/DecisionTreeExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/MovieLensALS.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/OneVsRestExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/SimpleTextClassificationPipeline.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/SimpleParamsExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkKMeans.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkTachyonPi.scala | |
x examples/src/main/scala/org/apache/spark/examples/MultiBroadcastTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/sql/ | |
x examples/src/main/scala/org/apache/spark/examples/sql/hive/ | |
x examples/src/main/scala/org/apache/spark/examples/sql/hive/HiveFromSpark.scala | |
x examples/src/main/scala/org/apache/spark/examples/sql/RDDRelation.scala | |
x examples/src/main/scala/org/apache/spark/examples/pythonconverters/ | |
x examples/src/main/scala/org/apache/spark/examples/pythonconverters/HBaseConverters.scala | |
x examples/src/main/scala/org/apache/spark/examples/pythonconverters/CassandraConverters.scala | |
x examples/src/main/scala/org/apache/spark/examples/pythonconverters/AvroConverters.scala | |
x examples/src/main/scala/org/apache/spark/examples/LocalLR.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkTC.scala | |
x examples/src/main/scala/org/apache/spark/examples/BroadcastTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/ExceptionHandlingTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/LocalKMeans.scala | |
x examples/src/main/scala/org/apache/spark/examples/graphx/ | |
x examples/src/main/scala/org/apache/spark/examples/graphx/Analytics.scala | |
x examples/src/main/scala/org/apache/spark/examples/graphx/SynthBenchmark.scala | |
x examples/src/main/scala/org/apache/spark/examples/graphx/LiveJournalPageRank.scala | |
x examples/src/main/scala/org/apache/spark/examples/HdfsTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/SimpleSkewedGroupByTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkPageRank.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkTachyonHdfsLR.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/ | |
x examples/src/main/scala/org/apache/spark/examples/streaming/StatefulNetworkWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/TwitterAlgebirdCMS.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/HdfsWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/DirectKafkaWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/QueueStream.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/TwitterPopularTags.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/FlumePollingEventCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/SqlNetworkWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/FlumeEventCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/ZeroMQWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/RecoverableNetworkWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/clickstream/ | |
x examples/src/main/scala/org/apache/spark/examples/streaming/clickstream/PageViewStream.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/clickstream/PageViewGenerator.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/ActorWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/MQTTWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/TwitterAlgebirdHLL.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/StreamingExamples.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/NetworkWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/CustomReceiver.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/KafkaWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/RawNetworkGrep.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkPi.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkALS.scala | |
x examples/src/main/scala/org/apache/spark/examples/LocalFileLR.scala | |
x examples/src/main/scala/org/apache/spark/examples/DriverSubmissionTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/LogQuery.scala | |
x examples/src/main/scala/org/apache/spark/examples/HBaseTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/ | |
x examples/src/main/scala/org/apache/spark/examples/mllib/AbstractParams.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/StreamingKMeansExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/TallSkinnyPCA.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/StreamingLinearRegression.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/DecisionTreeRunner.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/TallSkinnySVD.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/DenseGaussianMixture.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/DatasetExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/Correlations.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/MovieLensALS.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/CosineSimilarity.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/StreamingLogisticRegression.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/SampledRDDs.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/RandomRDDGeneration.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/SparseNaiveBayes.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/BinaryClassification.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/PowerIterationClusteringExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/DenseKMeans.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/MultivariateSummarizer.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/FPGrowthExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/GradientBoostedTreesRunner.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/LinearRegression.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/LDAExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkLR.scala | |
x examples/src/main/scala/org/apache/spark/examples/LocalPi.scala | |
x examples/src/main/scala/org/apache/spark/examples/CassandraCQLTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkHdfsLR.scala | |
x examples/src/main/scala/org/apache/spark/examples/SkewedGroupByTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/CassandraTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/LocalALS.scala | |
x examples/src/main/resources/ | |
x examples/src/main/resources/people.json | |
x examples/src/main/resources/people.txt | |
x examples/src/main/resources/full_user.avsc | |
x examples/src/main/resources/kv1.txt | |
x examples/src/main/resources/users.parquet | |
x examples/src/main/resources/user.avsc | |
x examples/src/main/resources/users.avro | |
x data/ | |
x data/mllib/ | |
x data/mllib/pagerank_data.txt | |
x data/mllib/kmeans_data.txt | |
x data/mllib/als/ | |
x data/mllib/als/sample_movielens_movies.txt | |
x data/mllib/als/test.data | |
x data/mllib/als/sample_movielens_ratings.txt | |
x data/mllib/lr-data/ | |
x data/mllib/lr-data/random.data | |
x data/mllib/sample_naive_bayes_data.txt | |
x data/mllib/sample_tree_data.csv | |
x data/mllib/sample_fpgrowth.txt | |
x data/mllib/sample_libsvm_data.txt | |
x data/mllib/ridge-data/ | |
x data/mllib/ridge-data/lpsa.data | |
x data/mllib/sample_multiclass_classification_data.txt | |
x data/mllib/sample_linear_regression_data.txt | |
x data/mllib/sample_isotonic_regression_data.txt | |
x data/mllib/sample_binary_classification_data.txt | |
x data/mllib/sample_lda_data.txt | |
x data/mllib/sample_movielens_data.txt | |
x data/mllib/sample_svm_data.txt | |
x data/mllib/lr_data.txt | |
x data/mllib/gmm_data.txt | |
x R/ | |
x R/lib/ | |
x R/lib/SparkR/ | |
x R/lib/SparkR/html/ | |
x R/lib/SparkR/html/groupBy.html | |
x R/lib/SparkR/html/sql.html | |
x R/lib/SparkR/html/DataFrame.html | |
x R/lib/SparkR/html/hashCode.html | |
x R/lib/SparkR/html/distinct.html | |
x R/lib/SparkR/html/print.jobj.html | |
x R/lib/SparkR/html/saveAsParquetFile.html | |
x R/lib/SparkR/html/sparkRHive.init.html | |
x R/lib/SparkR/html/registerTempTable.html | |
x R/lib/SparkR/html/tables.html | |
x R/lib/SparkR/html/structType.html | |
x R/lib/SparkR/html/parquetFile.html | |
x R/lib/SparkR/html/isLocal.html | |
x R/lib/SparkR/html/tableNames.html | |
x R/lib/SparkR/html/createDataFrame.html | |
x R/lib/SparkR/html/except.html | |
x R/lib/SparkR/html/withColumn.html | |
x R/lib/SparkR/html/print.structType.html | |
x R/lib/SparkR/html/count.html | |
x R/lib/SparkR/html/saveAsTable.html | |
x R/lib/SparkR/html/describe.html | |
x R/lib/SparkR/html/persist.html | |
x R/lib/SparkR/html/selectExpr.html | |
x R/lib/SparkR/html/jsonFile.html | |
x R/lib/SparkR/html/insertInto.html | |
x R/lib/SparkR/html/unionAll.html | |
x R/lib/SparkR/html/showDF.html | |
x R/lib/SparkR/html/schema.html | |
x R/lib/SparkR/html/filter.html | |
x R/lib/SparkR/html/cache-methods.html | |
x R/lib/SparkR/html/table.html | |
x R/lib/SparkR/html/head.html | |
x R/lib/SparkR/html/limit.html | |
x R/lib/SparkR/html/structField.html | |
x R/lib/SparkR/html/cacheTable.html | |
x R/lib/SparkR/html/dtypes.html | |
x R/lib/SparkR/html/columns.html | |
x R/lib/SparkR/html/unpersist-methods.html | |
x R/lib/SparkR/html/arrange.html | |
x R/lib/SparkR/html/collect-methods.html | |
x R/lib/SparkR/html/uncacheTable.html | |
x R/lib/SparkR/html/infer_type.html | |
x R/lib/SparkR/html/sparkR.stop.html | |
x R/lib/SparkR/html/explain.html | |
x R/lib/SparkR/html/R.css | |
x R/lib/SparkR/html/take.html | |
x R/lib/SparkR/html/column.html | |
x R/lib/SparkR/html/show.html | |
x R/lib/SparkR/html/printSchema.html | |
x R/lib/SparkR/html/createExternalTable.html | |
x R/lib/SparkR/html/sparkR.init.html | |
x R/lib/SparkR/html/agg.html | |
x R/lib/SparkR/html/00Index.html | |
x R/lib/SparkR/html/print.structField.html | |
x R/lib/SparkR/html/write.df.html | |
x R/lib/SparkR/html/intersect.html | |
x R/lib/SparkR/html/sparkRSQL.init.html | |
x R/lib/SparkR/html/select.html | |
x R/lib/SparkR/html/dropTempTable.html | |
x R/lib/SparkR/html/sample.html | |
x R/lib/SparkR/html/repartition.html | |
x R/lib/SparkR/html/first.html | |
x R/lib/SparkR/html/read.df.html | |
x R/lib/SparkR/html/withColumnRenamed.html | |
x R/lib/SparkR/html/clearCache.html | |
x R/lib/SparkR/html/nafunctions.html | |
x R/lib/SparkR/html/GroupedData.html | |
x R/lib/SparkR/html/join.html | |
x R/lib/SparkR/INDEX | |
x R/lib/SparkR/R/ | |
x R/lib/SparkR/R/SparkR.rdx | |
x R/lib/SparkR/R/SparkR | |
x R/lib/SparkR/R/SparkR.rdb | |
x R/lib/SparkR/help/ | |
x R/lib/SparkR/help/aliases.rds | |
x R/lib/SparkR/help/paths.rds | |
x R/lib/SparkR/help/SparkR.rdx | |
x R/lib/SparkR/help/AnIndex | |
x R/lib/SparkR/help/SparkR.rdb | |
x R/lib/SparkR/DESCRIPTION | |
x R/lib/SparkR/Meta/ | |
x R/lib/SparkR/Meta/hsearch.rds | |
x R/lib/SparkR/Meta/nsInfo.rds | |
x R/lib/SparkR/Meta/package.rds | |
x R/lib/SparkR/Meta/links.rds | |
x R/lib/SparkR/Meta/Rd.rds | |
x R/lib/SparkR/profile/ | |
x R/lib/SparkR/profile/general.R | |
x R/lib/SparkR/profile/shell.R | |
x R/lib/SparkR/worker/ | |
x R/lib/SparkR/worker/daemon.R | |
x R/lib/SparkR/worker/worker.R | |
x R/lib/SparkR/NAMESPACE | |
x R/lib/SparkR/tests/ | |
x R/lib/SparkR/tests/test_textFile.R | |
x R/lib/SparkR/tests/test_broadcast.R | |
x R/lib/SparkR/tests/test_binaryFile.R | |
x R/lib/SparkR/tests/test_rdd.R | |
x R/lib/SparkR/tests/test_sparkSQL.R | |
x R/lib/SparkR/tests/test_parallelize_collect.R | |
x R/lib/SparkR/tests/test_includePackage.R | |
x R/lib/SparkR/tests/test_shuffle.R | |
x R/lib/SparkR/tests/test_binary_function.R | |
x R/lib/SparkR/tests/test_context.R | |
x R/lib/SparkR/tests/test_take.R | |
x R/lib/SparkR/tests/test_utils.R | |
x ec2/ | |
x ec2/spark_ec2.py | |
x ec2/README | |
x ec2/spark-ec2 | |
x ec2/deploy.generic/ | |
x ec2/deploy.generic/root/ | |
x ec2/deploy.generic/root/spark-ec2/ | |
x ec2/deploy.generic/root/spark-ec2/ec2-variables.sh | |
x conf/ | |
x conf/fairscheduler.xml.template | |
x conf/metrics.properties.template | |
x conf/spark-env.sh.template | |
x conf/log4j.properties.template | |
x conf/docker.properties.template | |
x conf/slaves.template | |
x conf/spark-defaults.conf.template | |
x LICENSE | |
x bin/ | |
x bin/spark-shell | |
x bin/spark-submit.cmd | |
x bin/spark-shell2.cmd | |
x bin/pyspark | |
x bin/sparkR.cmd | |
x bin/spark-class2.cmd | |
x bin/run-example.cmd | |
x bin/spark-submit2.cmd | |
x bin/spark-class | |
x bin/spark-submit | |
x bin/spark-sql | |
x bin/run-example | |
x bin/beeline | |
x bin/pyspark2.cmd | |
x bin/spark-shell.cmd | |
x bin/spark-class.cmd | |
x bin/pyspark.cmd | |
x bin/sparkR | |
x bin/beeline.cmd | |
x bin/sparkR2.cmd | |
x bin/run-example2.cmd | |
x bin/load-spark-env.sh | |
x bin/load-spark-env.cmd | |
x lib/ | |
x lib/datanucleus-core-3.2.10.jar | |
x lib/datanucleus-api-jdo-3.2.6.jar | |
x lib/spark-examples-1.4.0-hadoop1.0.4.jar | |
x lib/datanucleus-rdbms-3.2.9.jar | |
x lib/spark-assembly-1.4.0-hadoop1.0.4.jar | |
x README.md | |
Running Spark cluster | |
starting org.apache.spark.deploy.master.Master, logging to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/log/spark-jlewandowski-org.apache.spark.deploy.master.Master-1-ursus-major.out | |
starting org.apache.spark.deploy.worker.Worker, logging to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/log/spark-jlewandowski-org.apache.spark.deploy.worker.Worker-2-ursus-major.out | |
Running tests for Spark 1.4.0 and Scala 2.10 | |
Launching sbt from sbt/sbt-launch-0.13.8.jar | |
[info] Loading project definition from /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/project | |
Using releases: https://oss.sonatype.org/service/local/staging/deploy/maven2 for releases | |
Using snapshots: https://oss.sonatype.org/content/repositories/snapshots for snapshots | |
Scala: 2.10.5 [To build against Scala 2.11 use '-Dscala-2.11=true'] | |
Scala Binary: 2.10 | |
Java: target=1.7 user=1.7.0_79 | |
[info] Set current project to root (in build file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/) | |
objc[54044]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:25:21,703 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native method will be unavailable. | |
WARN 16:25:21,705 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
objc[54046]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:25:25,406 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native method will be unavailable. | |
WARN 16:25:25,408 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.master=spark://127.0.0.1:7777 | |
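For reference, the configuration block printed above maps onto a plain SparkConf. A minimal sketch in Scala, assuming an ordinary SparkContext (the actual test harness driven by dev/run-real-tests.sh may assemble it differently):

  import org.apache.spark.{SparkConf, SparkContext}

  // Mirrors the settings printed above; the values are the ones this run used.
  val conf = new SparkConf()
    .setAppName("Test")
    .setMaster("spark://127.0.0.1:7777")                  // standalone master started above
    .set("spark.cassandra.connection.host", "127.0.0.1")  // embedded Cassandra node
    .set("spark.cassandra.connection.port", "9042")       // native protocol port
    .set("spark.cleaner.ttl", "3600")
  val sc = new SparkContext(conf)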
WARN 16:25:28,161 org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable | |
WARN 16:25:28,251 org.apache.spark.Logging$class (Logging.scala:71) - Your hostname, ursus-major resolves to a loopback address: 127.0.0.1; using 192.168.1.105 instead (on interface en0) | |
WARN 16:25:28,251 org.apache.spark.Logging$class (Logging.scala:71) - Set SPARK_LOCAL_IP if you need to bind to another address | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.10/kafka-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.10/simple-demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.10/demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.10/twitter-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.10/root_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.10/bundles/scalactic_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/javassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.10/bundles/scalatest_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.10/jars/kafka_2.10-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.10/jars/spark-streaming_2.10-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.10/jars/akka-testkit_2.10-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.10/jars/scalamock-core_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.10/jars/scalamock-scalatest-support_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraSQLClusterLevelSpec: | |
INFO 16:25:33,008 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:33,015 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:33,215 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:33,216 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to join tables from different clusters (6 seconds, 574 milliseconds) | |
INFO 16:25:38,135 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:38,135 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:38,992 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:38,993 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to write data to another cluster (1 second, 460 milliseconds) | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.10/kafka-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.10/simple-demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.10/demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.10/twitter-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.10/root_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.10/bundles/scalactic_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/javassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.10/bundles/scalatest_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.10/jars/kafka_2.10-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.10/jars/spark-streaming_2.10-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.10/jars/akka-testkit_2.10-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.10/jars/scalamock-core_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.10/jars/scalamock-scalatest-support_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraSQLSpec: | |
INFO 16:25:40,643 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:40,645 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select all rows (3 seconds, 468 milliseconds) | |
INFO 16:25:44,119 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(g,2) | |
INFO 16:25:44,121 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(EqualTo(g,2)) | |
[info] - should allow to select rows with index columns (696 milliseconds) | |
INFO 16:25:44,816 org.apache.spark.Logging$class (Logging.scala:59) - filters: GreaterThanOrEqual(b,2) | |
INFO 16:25:44,817 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with >= clause (477 milliseconds) | |
INFO 16:25:45,286 org.apache.spark.Logging$class (Logging.scala:59) - filters: GreaterThan(b,2) | |
INFO 16:25:45,287 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with > clause (471 milliseconds) | |
INFO 16:25:45,755 org.apache.spark.Logging$class (Logging.scala:59) - filters: LessThan(b,2) | |
INFO 16:25:45,756 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with < clause (437 milliseconds) | |
INFO 16:25:46,193 org.apache.spark.Logging$class (Logging.scala:59) - filters: LessThanOrEqual(b,2) | |
INFO 16:25:46,193 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with <= clause (395 milliseconds) | |
INFO 16:25:46,599 org.apache.spark.Logging$class (Logging.scala:59) - filters: In(b,[Ljava.lang.Object;@2cc00afc) | |
INFO 16:25:46,599 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with in clause (398 milliseconds) | |
INFO 16:25:47,017 org.apache.spark.Logging$class (Logging.scala:59) - filters: In(a,[Ljava.lang.Object;@10e2d29d) | |
INFO 16:25:47,017 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(In(a,[Ljava.lang.Object;@10e2d29d)) | |
[info] - should allow to select rows with in clause pushed down (175 milliseconds) | |
INFO 16:25:47,172 org.apache.spark.Logging$class (Logging.scala:59) - filters: Or(EqualTo(b,2),EqualTo(b,1)) | |
INFO 16:25:47,173 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with or clause (394 milliseconds) | |
INFO 16:25:47,555 org.apache.spark.Logging$class (Logging.scala:59) - filters: Not(EqualTo(b,2)) | |
INFO 16:25:47,556 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with != clause (341 milliseconds) | |
INFO 16:25:47,892 org.apache.spark.Logging$class (Logging.scala:59) - filters: Not(EqualTo(b,2)) | |
INFO 16:25:47,893 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with <> clause (343 milliseconds) | |
INFO 16:25:48,237 org.apache.spark.Logging$class (Logging.scala:59) - filters: Not(In(b,[Ljava.lang.Object;@11d5e909)) | |
INFO 16:25:48,238 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with not in clause (281 milliseconds) | |
INFO 16:25:48,525 org.apache.spark.Logging$class (Logging.scala:59) - filters: IsNotNull(b) | |
INFO 16:25:48,526 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with is not null clause (291 milliseconds) | |
INFO 16:25:48,814 org.apache.spark.Logging$class (Logging.scala:59) - filters: StringEndsWith(name,om) | |
INFO 16:25:48,815 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with like clause (320 milliseconds) | |
INFO 16:25:49,122 org.apache.spark.Logging$class (Logging.scala:59) - filters: GreaterThanOrEqual(a,1), LessThanOrEqual(a,2) | |
INFO 16:25:49,123 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with between clause (241 milliseconds) | |
INFO 16:25:49,366 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:49,367 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with alias (256 milliseconds) | |
INFO 16:25:49,618 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:49,619 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with distinct column (989 milliseconds) | |
INFO 16:25:50,614 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:50,614 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with limit clause (239 milliseconds) | |
INFO 16:25:50,857 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:50,858 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with order by clause (712 milliseconds) | |
INFO 16:25:51,575 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:51,576 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with group by clause (731 milliseconds) | |
INFO 16:25:52,306 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:52,306 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:52,310 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:52,310 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with union clause (775 milliseconds) | |
INFO 16:25:53,075 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:53,076 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:53,079 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:53,080 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with union distinct clause (612 milliseconds) | |
INFO 16:25:53,686 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:53,687 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:53,690 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:53,690 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with union all clause (275 milliseconds) | |
INFO 16:25:53,992 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:53,993 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with having clause (581 milliseconds) | |
INFO 16:25:54,543 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(a,1), EqualTo(b,1), EqualTo(c,1) | |
INFO 16:25:54,543 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(EqualTo(a,1), EqualTo(b,1), EqualTo(c,1)) | |
[info] - should allow to select rows with partition column clause (107 milliseconds) | |
INFO 16:25:54,650 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(a,1), EqualTo(b,1), EqualTo(c,1), EqualTo(d,1), EqualTo(e,1) | |
INFO 16:25:54,651 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(EqualTo(e,1), EqualTo(d,1), EqualTo(b,1), EqualTo(c,1), EqualTo(a,1)) | |
[info] - should allow to select rows with partition column and cluster column clause (76 milliseconds) | |
INFO 16:25:54,725 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:54,725 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:54,954 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:54,954 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to insert into another table (420 milliseconds) | |
INFO 16:25:55,149 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:55,149 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:55,331 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:55,331 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to insert into another table in different keyspace (369 milliseconds) | |
INFO 16:25:55,557 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:55,558 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:55,561 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:55,561 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to join two tables (610 milliseconds) | |
INFO 16:25:56,181 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:56,181 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:56,185 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:56,186 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to join two tables from different keyspaces (701 milliseconds) | |
INFO 16:25:56,854 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:56,854 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:56,857 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:56,857 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to inner join two tables (644 milliseconds) | |
INFO 16:25:57,524 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:57,525 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:57,528 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:57,529 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to left join two tables (659 milliseconds) | |
INFO 16:25:58,164 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:58,164 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:58,168 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:58,168 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to left outer join two tables (595 milliseconds) | |
INFO 16:25:58,756 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:58,756 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:58,759 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:58,760 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to right join two tables (557 milliseconds) | |
INFO 16:25:59,307 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:59,307 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:59,312 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:59,312 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to right outer join two tables (546 milliseconds) | |
INFO 16:25:59,854 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:59,855 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:25:59,857 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:25:59,858 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to full join two tables (497 milliseconds) | |
INFO 16:26:00,329 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:26:00,329 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows for collection columns (139 milliseconds) | |
WARN 16:26:00,447 org.apache.spark.Logging$class (Logging.scala:71) - VarIntType is mapped to catalystTypes.DecimalType with unlimited values. | |
INFO 16:26:00,451 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:26:00,452 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows for data types of ASCII, INT, FLOAT, DOUBLE, BIGINT, BOOLEAN, DECIMAL, INET, TEXT, TIMESTAMP, UUID, VARINT (152 milliseconds) | |
WARN 16:26:00,599 org.apache.spark.Logging$class (Logging.scala:71) - VarIntType is mapped to catalystTypes.DecimalType with unlimited values. | |
INFO 16:26:00,604 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:26:00,605 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:26:00,773 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:26:00,774 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to insert rows for data types of ASCII, INT, FLOAT, DOUBLE, BIGINT, BOOLEAN, DECIMAL, INET, TEXT, TIMESTAMP, UUID, VARINT (294 milliseconds) | |
INFO 16:26:00,902 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:26:00,902 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select specified non-UDT columns from a table containing some UDT columns (110 milliseconds) | |
INFO 16:26:01,151 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:26:01,151 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select UDT collection column and nested UDT column (267 milliseconds) | |
INFO 16:26:01,332 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(meterid,4317) | |
INFO 16:26:01,332 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(EqualTo(meterid,4317)) | |
[info] - should allow to restrict a clustering timestamp column value (100 milliseconds) | |
INFO 16:26:01,444 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:26:01,445 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to min/max timestamp column (440 milliseconds) | |
INFO 16:26:01,908 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:26:01,908 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:26:02,049 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:26:02,050 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should use InetAddressType and UUIDType (333 milliseconds) | |
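The "filters" / "pushdown filters" pairs logged by CassandraSQLSpec show which predicates the connector hands to Cassandra: equality restrictions on partition-key, clustering, and indexed columns end up in the pushdown list (e.g. EqualTo(a,1), EqualTo(b,1), EqualTo(c,1) above), while range, NOT, IN-on-regular-column, and LIKE predicates stay with Spark (their pushdown list remains an empty ArrayBuffer). A minimal sketch of the two query shapes, assuming the sc from the sketch above and a table whose partition key is (a, b, c) with clustering columns (d, e); the keyspace and table names are illustrative, not taken from the log:

  import org.apache.spark.sql.cassandra.CassandraSQLContext

  val cc = new CassandraSQLContext(sc)
  // All five equality predicates appear in "pushdown filters", so Cassandra
  // performs the filtering:
  cc.sql("SELECT * FROM sql_test.test1 WHERE a = 1 AND b = 1 AND c = 1 AND d = 1 AND e = 1")
  // Range predicates on regular columns (like the ">= clause" test above) are
  // not pushed down and are applied by Spark after the scan; "v" stands for
  // an assumed regular column:
  cc.sql("SELECT * FROM sql_test.test1 WHERE v >= 2")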
[info] CassandraRDDPartitionerSpec: | |
[info] CassandraRDDPartitioner | |
[info] - should create 1 partition per node if splitCount == 1 (12 milliseconds) | |
[info] - should create about 10000 partitions when splitCount == 10000 (155 milliseconds) | |
[info] - should create multiple partitions if the amount of data is big enough (32 seconds, 689 milliseconds) | |
[info] RDDStreamingSpec: | |
[info] RDDStream | |
[info] - should write from the stream to cassandra table: demo.streaming_wordcount (819 milliseconds) | |
[info] - should be able to utilize joinWithCassandra during transforms (995 milliseconds) | |
[info] - should be able to utilize joinWithCassandra and repartitionByCassandraTable on a Dstream (898 milliseconds) | |
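RDDStreamingSpec exercises the streaming write and join paths named in the three results above. A minimal sketch of those operations, assuming a StreamingContext, a DStream[(String, Int)] named wordCounts, and a demo.streaming_wordcount table whose first partition-key column holds the word (the column layout is assumed, not shown in the log):

  import com.datastax.spark.connector._
  import com.datastax.spark.connector.streaming._

  // Write each micro-batch to the table (first test above):
  wordCounts.saveToCassandra("demo", "streaming_wordcount")

  // Join each batch back against the table by partition key inside a
  // transform (the joinWithCassandra tests above):
  val joined = wordCounts.transform { rdd =>
    rdd.map { case (word, _) => Tuple1(word) }
      .joinWithCassandraTable("demo", "streaming_wordcount")
  }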
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.10/kafka-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.10/simple-demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.10/demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.10/twitter-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.10/root_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.10/bundles/scalactic_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/javassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.10/bundles/scalatest_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.10/jars/kafka_2.10-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.10/jars/spark-streaming_2.10-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.10/jars/akka-testkit_2.10-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.10/jars/scalamock-core_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.10/jars/scalamock-scalatest-support_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraDataSourceSpec:
INFO 16:26:38,416 org.apache.spark.Logging$class (Logging.scala:59) - filters:
INFO 16:26:38,416 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer()
[info] - should allow to select all rows (4 seconds, 256 milliseconds)
INFO 16:26:42,699 org.apache.spark.Logging$class (Logging.scala:59) - filters:
INFO 16:26:42,699 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer()
[info] - should allow to register as a temp table (311 milliseconds)
INFO 16:26:43,009 org.apache.spark.Logging$class (Logging.scala:59) - filters:
INFO 16:26:43,009 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer()
INFO 16:26:43,290 org.apache.spark.Logging$class (Logging.scala:59) - filters:
INFO 16:26:43,290 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer()
INFO 16:26:43,612 org.apache.spark.Logging$class (Logging.scala:59) - filters:
INFO 16:26:43,612 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer()
[info] - should allow to insert data into a cassandra table (864 milliseconds)
INFO 16:26:44,111 org.apache.spark.Logging$class (Logging.scala:59) - filters:
INFO 16:26:44,111 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer()
INFO 16:26:44,317 org.apache.spark.Logging$class (Logging.scala:59) - filters:
INFO 16:26:44,318 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer()
[info] - should allow to save data to a cassandra table (770 milliseconds)
INFO 16:26:44,671 org.apache.spark.Logging$class (Logging.scala:59) - filters:
INFO 16:26:44,672 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer()
INFO 16:26:44,872 org.apache.spark.Logging$class (Logging.scala:59) - filters:
INFO 16:26:44,872 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer()
[info] - should allow to overwrite a cassandra table (473 milliseconds)
INFO 16:26:45,102 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(a,1), EqualTo(b,2), EqualTo(c,1), EqualTo(e,1)
INFO 16:26:45,103 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(EqualTo(a,1), EqualTo(b,2), EqualTo(c,1))
[info] - should allow to filter a table (103 milliseconds)
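
The two INFO lines above show predicate pushdown at work: the data source receives four EqualTo filters but pushes only those on columns a, b and c down to Cassandra, while the filter on e is applied by Spark after the scan. A minimal sketch of such a filtered read through the connector's Spark SQL integration (keyspace, table and column names are hypothetical; sc is the already-running SparkContext):

    import org.apache.spark.sql.SQLContext

    val sqlContext = new SQLContext(sc)
    val df = sqlContext.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test", "table" -> "kv"))
      .load()
    // Only predicates the source can execute in CQL are pushed down;
    // the remainder is filtered by Spark after the rows arrive.
    val filtered = df.filter(df("a") === 1 && df("b") === 2 && df("e") === 1)
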
INFO 16:26:45,229 org.apache.spark.Logging$class (Logging.scala:59) - filters:
INFO 16:26:45,230 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer()
[info] - should allow to filter a table with a function for a column alias (247 milliseconds)
INFO 16:26:45,484 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(a,1), EqualTo(b,2), EqualTo(c,1), EqualTo(e,1)
INFO 16:26:45,485 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(EqualTo(a,1), EqualTo(b,2), EqualTo(c,1))
[info] - should allow to filter a table with alias (113 milliseconds)
INFO 16:26:45,838 org.apache.spark.Logging$class (Logging.scala:59) - filters:
INFO 16:26:45,839 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer()
[info] - should be able to save DF with reversed order columns to a Cassandra table (471 milliseconds)
INFO 16:26:46,111 org.apache.spark.Logging$class (Logging.scala:59) - filters:
INFO 16:26:46,111 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer()
[info] - should be able to save DF with partial columns to a Cassandra table (266 milliseconds)
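
The insert/save/overwrite tests above go through the DataFrame write path. A minimal sketch under the same assumptions (hypothetical keyspace test and table kv, which must already exist; df is the Cassandra-backed DataFrame from the previous sketch):

    import org.apache.spark.sql.SaveMode

    df.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test", "table" -> "kv"))
      .mode(SaveMode.Overwrite)   // SaveMode.Append would insert without truncating first
      .save()
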
Starting SparkContext with the following configuration:
spark.app.name=Test
spark.cassandra.connection.host=127.0.0.1
spark.cassandra.connection.port=9042
spark.cleaner.ttl=3600
spark.jars=<identical jar list to the first "Starting SparkContext" configuration above>
spark.master=spark://127.0.0.1:7777
[info] TableWriterSpec:
[info] A TableWriter
[info] - should write RDD of tuples to an existing table (2 seconds, 369 milliseconds)
[info] - should write RDD of tuples to a new table (119 milliseconds)
[info] - should write RDD of tuples applying proper data type conversions (75 milliseconds)
[info] - should write RDD of case class objects (66 milliseconds)
[info] - should write RDD of case class objects to a new table using auto mapping (120 milliseconds)
[info] - should write RDD of case class objects applying proper data type conversions (63 milliseconds)
[info] - should write RDD of CassandraRow objects (52 milliseconds)
[info] - should write RDD of CassandraRow objects applying proper data type conversions (46 milliseconds)
[info] - should write RDD of tuples to a table with camel case column names (40 milliseconds)
[info] - should write empty values (45 milliseconds)
[info] - should write null values (39 milliseconds)
[info] - should write only specific column data if ColumnNames is passed as 'columnNames' (47 milliseconds)
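
TableWriterSpec exercises the RDD write path, saveToCassandra. A minimal sketch of a full-row write and of a write restricted to named columns (keyspace, table and column names are hypothetical):

    import com.datastax.spark.connector._

    val rows = sc.parallelize(Seq((1, "first"), (2, "second")))
    rows.saveToCassandra("test", "kv")                               // all columns, in table order
    rows.saveToCassandra("test", "kv", SomeColumns("key", "value"))  // only the named columns
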
[info] - should distinguish (deprecated) implicit `seqToSomeColumns` (44 milliseconds)
[info] - should write collections (109 milliseconds)
[info] - should write blobs (38 milliseconds)
[info] - should increment and decrement counters (73 milliseconds)
[info] - should increment and decrement counters in batches (2 seconds, 80 milliseconds)
[info] - should write values of user-defined classes (78 milliseconds)
[info] - should write values of user-defined-types in Cassandra (62 milliseconds)
[info] - should write values of TupleValue type (58 milliseconds)
[info] - should write column values of tuple type given as Scala tuples (41 milliseconds)
[info] - should write Scala tuples nested in UDTValues (48 milliseconds)
[info] - should convert components in nested Scala tuples to proper types (31 milliseconds)
[info] - should write to single-column tables (32 milliseconds)
[info] - should throw IOException if table is not found (4 milliseconds)
[info] - should write RDD of case class objects with default TTL (46 milliseconds)
[info] - should write RDD of case class objects with default timestamp (33 milliseconds)
[info] - should write RDD of case class objects with per-row TTL (40 milliseconds)
[info] - should write RDD of case class objects with per-row timestamp (39 milliseconds)
[info] - should write RDD of case class objects with per-row TTL with custom mapping (33 milliseconds)
[info] - should write RDD of case class objects with per-row timestamp with custom mapping (38 milliseconds)
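
The TTL and timestamp tests drive the writer through WriteConf. A sketch of the options they exercise, assuming the 1.x writer API (TTLOption/TimestampOption) and a hypothetical table; for per-row values, the extra field is kept out of the column selector:

    import com.datastax.spark.connector._
    import com.datastax.spark.connector.writer.{TTLOption, TimestampOption, WriteConf}

    // one TTL / write timestamp for the whole write
    sc.parallelize(Seq((1, "first")))
      .saveToCassandra("test", "kv",
        writeConf = WriteConf(
          ttl = TTLOption.constant(3600),
          timestamp = TimestampOption.constant(System.currentTimeMillis() * 1000L)))

    // per-row TTL, taken from a field of the saved objects
    case class KVWithTtl(key: Int, value: String, ttl: Int)
    sc.parallelize(Seq(KVWithTtl(1, "first", 60)))
      .saveToCassandra("test", "kv", SomeColumns("key", "value"),
        writeConf = WriteConf(ttl = TTLOption.perRow("ttl")))
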
[info] - should write RDD of case class objects applying proper data type conversions and aliases (39 milliseconds)
[info] - should write an RDD of tuples mapped to different ordering of fields (32 milliseconds)
[info] - should write an RDD of tuples with only some fields aliased (32 milliseconds)
[info] - should throw an exception if you try to alias tuple fields which don't exist (2 milliseconds)
[info] - should throw an exception when aliasing some tuple fields explicitly and others implicitly (3 milliseconds)
[info] - should write RDD of objects with inherited fields (34 milliseconds)
[info] - should write RDD of case class objects with transient fields (43 milliseconds)
[info] - should be able to append and prepend elements to a C* list (138 milliseconds)
[info] - should be able to remove elements from a C* list (108 milliseconds)
[info] - should be able to add elements to a C* set (87 milliseconds)
[info] - should be able to remove elements from a C* set (95 milliseconds)
[info] - should be able to add key value pairs to a C* map (62 milliseconds)
[info] - should throw an exception if you try to apply a collection behavior to a normal column (6 milliseconds)
[info] - should throw an exception if you try to remove values from a map (5 milliseconds)
[info] - should throw an exception if you prepend anything but a list (6 milliseconds)
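
The last few tests modify CQL collections in place rather than overwriting them. A sketch of the collection behaviors, assuming a hypothetical table collections_mod(key int PRIMARY KEY, lcol list<text>, scol set<text>):

    import com.datastax.spark.connector._

    sc.parallelize(Seq((1, Vector("a")), (1, Vector("b"))))
      .saveToCassandra("test", "collections_mod", SomeColumns("key", "lcol" append))  // or prepend
    sc.parallelize(Seq((1, Set("x"))))
      .saveToCassandra("test", "collections_mod", SomeColumns("key", "scol" add))     // or remove
    // Appending/prepending to anything but a list, or removing from a map,
    // is rejected - matching the exception tests above.
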
Starting SparkContext with the following configuration:
spark.app.name=Test
spark.cassandra.connection.host=127.0.0.1
spark.cassandra.connection.port=9042
spark.cleaner.ttl=3600
spark.jars=<identical jar list to the first "Starting SparkContext" configuration above>
spark.master=spark://127.0.0.1:7777
[info] CassandraRDDSpec:
[info] A CassandraRDD
[info] - should allow to read a Cassandra table as Array of CassandraRow (2 seconds, 307 milliseconds)
[info] - should allow to read a Cassandra table as Array of pairs of primitives (316 milliseconds)
[info] - should allow to read a Cassandra table as Array of tuples (259 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined case class objects (263 milliseconds)
[info] A CassandraRDD
[info] - should allow to read a Cassandra table as Array of user-defined objects with inherited fields (232 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined class objects (221 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined class (with multiple constructors) objects (210 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined class (with no fields) objects (199 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined case class (nested) objects (199 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined case class (deeply nested) objects (213 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined case class (nested in object) objects (178 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined mutable objects (172 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined case class objects with custom mapping specified by aliases (186 milliseconds)
[info] - should allow to read a Cassandra table into CassandraRow objects with custom mapping specified by aliases (167 milliseconds)
[info] - should apply proper data type conversions for tuples (157 milliseconds)
[info] - should apply proper data type conversions for user-defined case class objects (154 milliseconds)
[info] - should apply proper data type conversions for user-defined mutable objects (161 milliseconds)
[info] - should map columns to objects using user-defined function (163 milliseconds)
[info] - should map columns to objects using user-defined function with type conversion (150 milliseconds)
[info] - should allow for selecting a subset of columns (151 milliseconds)
[info] - should allow for selecting a subset of rows (143 milliseconds)
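
CassandraRDDSpec covers the RDD read path. A minimal sketch of the main target types and of column/row selection (keyspace, table and column names are hypothetical):

    import com.datastax.spark.connector._

    case class KV(key: Int, value: String)

    val asRows   = sc.cassandraTable("test", "kv")                 // CassandraRow objects
    val asTuples = sc.cassandraTable[(Int, String)]("test", "kv")  // tuples
    val asKVs    = sc.cassandraTable[KV]("test", "kv")             // case class objects
    val subset   = sc.cassandraTable("test", "kv")
      .select("key", "value")      // column subset
      .where("key = ?", 1)         // row subset, pushed to CQL
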
[info] - should use a single partition per node for a tiny table (11 milliseconds)
[info] - should allow for reading collections (152 milliseconds)
[info] - should allow for reading blobs (117 milliseconds)
[info] - should allow for converting fields to custom types by user-defined TypeConverter (132 milliseconds)
[info] - should allow for reading tables with composite partitioning key (144 milliseconds)
[info] - should convert values passed to where to correct types (String -> Timestamp) (224 milliseconds)
[info] - should convert values passed to where to correct types (DateTime -> Timestamp) (201 milliseconds)
[info] - should convert values passed to where to correct types (Date -> Timestamp) (131 milliseconds)
[info] - should convert values passed to where to correct types (String -> Timestamp) (double limit) (192 milliseconds)
[info] - should convert values passed to where to correct types (DateTime -> Timestamp) (double limit) (140 milliseconds)
[info] - should convert values passed to where to correct types (Date -> Timestamp) (double limit) (128 milliseconds)
[info] - should accept partitioning key in where (38 milliseconds)
[info] - should accept partitioning key and clustering column predicate in where (35 milliseconds)
[info] - should accept composite partitioning key in where (38 milliseconds)
[info] - should allow to fetch columns from a table with user defined Cassandra type (UDT) (119 milliseconds)
[info] - should allow to fetch UDT columns as UDTValue objects (128 milliseconds)
[info] - should allow to fetch UDT columns as objects of case classes (124 milliseconds)
[info] - should allow to fetch tuple columns as TupleValue objects (124 milliseconds)
[info] - should allow to fetch tuple columns as Scala tuples (124 milliseconds)
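
The UDT and tuple tests read non-trivial CQL types either generically or as Scala types. A sketch with hypothetical table and column names (addr is assumed to be a UDT column with a city field; pair a tuple<int, text> column):

    import com.datastax.spark.connector._

    val cities = sc.cassandraTable("test", "people")
      .map(row => row.getUDTValue("addr").getString("city"))      // generic UDTValue accessor
    val pairs = sc.cassandraTable[(Int, (Int, String))]("test", "pairs")  // tuple column as a Scala tuple
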
[info] - should throw appropriate IOException when the table was not found at the computation time (15 milliseconds)
[info] - should be lazy and must not throw IOException if the table was not found at the RDD initialization time (0 milliseconds)
Start thread count: 88
End thread count: 87
Threads created:
[info] - should not leak threads (13 seconds, 239 milliseconds)
[info] - should allow to read Cassandra table as Array of KV tuples of two pairs (136 milliseconds)
[info] - should allow to read Cassandra table as Array of KV tuples of a pair and a case class (86 milliseconds)
[info] - should allow to read Cassandra table as Array of KV tuples of a case class and a tuple (79 milliseconds)
[info] - should allow to read Cassandra table as Array of KV tuples of a case class and a tuple grouped by partition key (129 milliseconds)
[info] - should allow to read Cassandra table as Array of tuples of two case classes (78 milliseconds)
[info] - should allow to read Cassandra table as Array of String values (74 milliseconds)
[info] - should allow to read Cassandra table as Array of Int values (76 milliseconds)
[info] - should allow to read Cassandra table as Array of java.lang.Integer values (71 milliseconds)
[info] - should allow to read Cassandra table as Array of List of values (82 milliseconds)
[info] - should allow to read Cassandra table as Array of Set of values (78 milliseconds)
[info] - should allow to count a high number of rows (266 milliseconds)
[info] - should allow to fetch write time of a specified column as a tuple element (111 milliseconds)
[info] - should allow to fetch ttl of a specified column as a tuple element (98 milliseconds)
[info] - should allow to fetch both write time and ttl of a specified column as tuple elements (93 milliseconds)
[info] - should allow to fetch write time of two different columns as tuple elements (94 milliseconds)
[info] - should allow to fetch ttl of two different columns as tuple elements (92 milliseconds)
[info] - should allow to fetch writetime of a specified column and map it to a class field with custom mapping (99 milliseconds)
[info] - should allow to fetch ttl of a specified column and map it to a class field with custom mapping (97 milliseconds)
[info] - should allow to fetch writetime of a specified column and map it to a class field with aliases (89 milliseconds)
[info] - should allow to fetch ttl of a specified column and map it to a class field with aliases (89 milliseconds)
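
The writetime/ttl tests select Cassandra cell metadata next to the data itself. A sketch using the connector's column selection helpers (keyspace, table, column and alias names are hypothetical):

    import com.datastax.spark.connector._

    val withMeta = sc.cassandraTable[(Int, String, Long, Int)]("test", "kv")
      .select("key", "value",
        "value".writeTime as "value_wt",   // WRITETIME(value)
        "value".ttl as "value_ttl")        // TTL(value)
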
[info] - should allow to specify ascending ordering (31 milliseconds)
[info] - should allow to specify descending ordering (29 milliseconds)
[info] - should allow to specify rows number limit (27 milliseconds)
[info] - should allow to specify rows number with take (25 milliseconds)
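
Ordering and limits are evaluated in CQL, per Cassandra partition. A sketch, assuming a hypothetical events table with a clustering column:

    import com.datastax.spark.connector._

    val latest = sc.cassandraTable("test", "events")
      .where("key = ?", 1)
      .withDescOrder   // ORDER BY the clustering column, descending
      .limit(10)       // CQL LIMIT, applied per Spark partition
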
[info] - should count the CassandraRDD items (292 milliseconds)
[info] - should count the CassandraRDD items with where predicate (25 milliseconds)
[info] - should allow to use empty RDD on undefined table (4 milliseconds)
[info] - should allow to use empty RDD on defined table (2 milliseconds)
[info] - should suggest similar tables if table doesn't exist but keyspace does (4 milliseconds)
[info] - should suggest possible keyspace and table matches if the keyspace and table do not exist (3 milliseconds)
[info] - should suggest possible keyspaces if the table exists but in a different keyspace (1 millisecond)
[info] - should suggest possible keyspaces and tables if the table has a fuzzy match but the keyspace does not (2 milliseconds)
[info] - should handle upper case characters in UDT fields (149 milliseconds)
Starting SparkContext with the following configuration:
spark.app.name=Test
spark.cassandra.connection.host=127.0.0.1
spark.cassandra.connection.port=9042
spark.cleaner.ttl=3600
spark.jars=<identical jar list to the first "Starting SparkContext" configuration above>
spark.master=spark://127.0.0.1:7777
[info] CassandraPartitionKeyWhereSpec:
[info] A CassandraRDD
[info] - should allow partition key eq in where (3 seconds, 369 milliseconds)
[info] - should allow partition key 'in' in where (59 milliseconds)
[info] - should allow cluster key 'in' in where (457 milliseconds)
[info] - should work with composite keys in (57 milliseconds)
[info] - should work with composite keys eq (35 milliseconds)
[info] - should work with composite keys in2 (35 milliseconds)
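
CassandraPartitionKeyWhereSpec checks that predicates on partition and clustering keys are passed through to CQL. A sketch with hypothetical key columns key (partition key) and group (clustering column):

    import com.datastax.spark.connector._

    sc.cassandraTable("test", "kv").where("key = ?", 1)                          // partition key eq
    sc.cassandraTable("test", "kv").where("key in (?, ?)", 1, 2)                 // partition key in
    sc.cassandraTable("test", "kv").where("key = ? and group in (?, ?)", 1, 10, 20)  // clustering 'in'
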
[info] RoutingKeyGeneratorSpec:
[info] RoutingKeyGenerator
[info] - should generate proper routing keys when there is one partition key column (22 milliseconds)
[info] RoutingKeyGenerator
[info] - should generate proper routing keys when there are more partition key columns (7 milliseconds)
Starting SparkContext with the following configuration:
spark.app.name=Test
spark.cassandra.connection.host=127.0.0.1
spark.cassandra.connection.port=9042
spark.cleaner.ttl=3600
spark.jars=<identical jar list to the first "Starting SparkContext" configuration above>
spark.master=spark://127.0.0.1:7777
[info] TableWriterColumnNamesSpec:
[info] TableWriter
[info] - must distinguish `AllColumns` (2 milliseconds)
[info] - must distinguish and use only specified column names if provided (2 milliseconds)
[info] - must distinguish and use only specified column names if provided, when aliases are specified (1 millisecond)
[info] - must fail in the RowWriter if the specified column names do not include primary keys (5 milliseconds)
[info] - must not use TTL when it is not specified (3 milliseconds)
[info] - must use static TTL if it is specified (2 milliseconds)
[info] - must use static timestamp if it is specified (2 milliseconds)
[info] - must use both static TTL and static timestamp when they are specified (2 milliseconds)
[info] - must use per-row TTL and timestamp when the row writer provides them (2 milliseconds)
[info] - must use per-row TTL and static timestamp (3 milliseconds)
[info] - must use per-row timestamp and static TTL (2 milliseconds)
[info] - must use per-row TTL (1 millisecond)
[info] - must use per-row timestamp (2 milliseconds)
[info] GroupingBatchBuilderSpec:
[info] GroupingBatchBuilder in fixed batch key mode
[info] - should make bound statements when batch size is specified as RowsInBatch(1) (25 milliseconds)
[info] - should make bound statements when batch size is specified as BytesInBatch(0) (4 milliseconds)
[info] - should make a batch and bound statements according to the number of statements in a group (4 milliseconds)
[info] - should make equal batches when batch size is specified in rows (3 milliseconds)
[info] - should make batches of size not greater than the size specified in bytes (5 milliseconds)
[info] - should produce empty stream when no data is available and batch size is specified in rows (1 millisecond)
[info] - should produce empty stream when no data is available and batch size is specified in bytes (1 millisecond)
[info] GroupingBatchBuilder in dynamic batch key mode
[info] - should make bound statements when batch size is specified as RowsInBatch(1) (2 milliseconds)
[info] - should make bound statements when batch size is specified as BytesInBatch(0) (2 milliseconds)
[info] - should make a batch and bound statements according to the number of statements in a group and a batch key (2 milliseconds)
[info] - should make bound statements if batches cannot be made due to imposed limits (2 milliseconds)
[info] - should make equal batches when batch size is specified in rows and batch buffer is enough (2 milliseconds)
[info] - should make batches of size not greater than the size specified in bytes (4 milliseconds)
[info] - should produce empty stream when no data is available and batch size is specified in rows (1 millisecond)
[info] - should produce empty stream when no data is available and batch size is specified in bytes (2 milliseconds)
[info] - should work with random data (746 milliseconds)
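
GroupingBatchBuilder packs bound statements into batches according to WriteConf. A sketch of the knobs those tests exercise, assuming the 1.x writer API; the sizes are arbitrary:

    import com.datastax.spark.connector._
    import com.datastax.spark.connector.writer.{BatchGroupingKey, BytesInBatch, RowsInBatch, WriteConf}

    val rows = sc.parallelize(Seq((1, "first")))
    rows.saveToCassandra("test", "kv",
      writeConf = WriteConf(batchSize = RowsInBatch(64)))          // fixed number of rows per batch
    rows.saveToCassandra("test", "kv",
      writeConf = WriteConf(
        batchSize = BytesInBatch(1024),                            // cap batches by size in bytes
        batchGroupingKey = BatchGroupingKey.Partition))            // group statements by partition key
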
[info] DataSizeEstimatesSpec:
[info] DataSizeEstimates
[info] - should fetch data size estimates for a known table !!! IGNORED !!!
[info] - should return zeroes for an empty table (28 milliseconds)
[info] - should return zeroes for a non-existing table (1 millisecond)
Starting SparkContext with the following configuration:
spark.app.name=Test
spark.cassandra.connection.host=127.0.0.1
spark.cassandra.connection.port=9042
spark.cleaner.ttl=3600
spark.jars=<identical jar list to the first "Starting SparkContext" configuration above>
spark.master=spark://127.0.0.1:7777
[info] CassandraRDDReplSpec: | |
[info] - should allow reading a Cassandra table as an Array of Scala class objects in REPL !!! IGNORED !!! | |
[info] - should allow reading a Cassandra table as an Array of Scala case class objects in REPL !!! IGNORED !!! | |
[info] - should allow reading a Cassandra table as an Array of ordinary Scala class objects in REPL !!! IGNORED !!! | |
[info] - should allow reading a Cassandra table as an Array of objects of a Scala class without fields in REPL !!! IGNORED !!! | |
[info] - should allow reading a Cassandra table as an Array of objects of a Scala class with multiple constructors in REPL !!! IGNORED !!! | |
[info] - should allow reading a Cassandra table as an Array of inner Scala case class objects in REPL !!! IGNORED !!! | |
[info] - should allow reading a Cassandra table as an Array of deeply nested inner Scala case class objects in REPL !!! IGNORED !!! | |
[info] - should allow reading a Cassandra table as an Array of nested Scala case class objects in REPL !!! IGNORED !!! | |
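For reference, a minimal sketch of what these ignored REPL cases would exercise, assuming a hypothetical test.words table with columns word (text) and count (int); sc is the SparkContext the REPL provides:

    import com.datastax.spark.connector._

    // Rows are mapped to case class instances by column name, then the
    // whole table is collected into a local Array.
    case class WordCount(word: String, count: Int)
    val words: Array[WordCount] =
      sc.cassandraTable[WordCount]("test", "words").collect()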
[info] CassandraConnectorSpec: | |
[info] A CassandraConnector | |
[info] - should connect to Cassandra with native protocol (21 milliseconds) | |
[info] - should give access to cluster metadata (1 millisecond) | |
[info] - should run queries (55 milliseconds) | |
[info] - should cache PreparedStatements (29 milliseconds) | |
[info] - should disconnect from the cluster after use (501 milliseconds) | |
[info] - should share internal Cluster and Session object between multiple logical sessions (31 milliseconds) | |
[info] - should share internal Cluster object between multiple logical sessions created by different connectors to the same cluster (1 millisecond) | |
[info] - should cache session objects for reuse (2 milliseconds) | |
[info] - should not make multiple clusters when writing multiple RDDs (3 seconds, 14 milliseconds) | |
[info] - should be configurable from SparkConf (1 millisecond) | |
WARN 16:27:49,633 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] - should accept multiple hostnames in spark.cassandra.connection.host property (5 seconds, 35 milliseconds) | |
[info] - should use compression when configured (33 milliseconds) | |
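The cases above all go through the CassandraConnector API; a minimal sketch, assuming the same contact points this run uses (one of which is deliberately unreachable, hence the repeated contact-point warnings in the log):

    import org.apache.spark.SparkConf
    import com.datastax.spark.connector.cql.CassandraConnector

    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "192.168.254.254,127.0.0.1") // several hostnames are accepted
      .set("spark.cassandra.connection.port", "9042")

    // The connector is configured straight from SparkConf. withSessionDo
    // borrows a cached Session (Cluster and Session objects are shared
    // between logical sessions) and releases it when the block returns.
    CassandraConnector(conf).withSessionDo { session =>
      session.execute("SELECT * FROM system.local")
    }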
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=192.168.254.254,127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=(same jar list as in the first configuration dump above, elided)
spark.master=spark://127.0.0.1:7777 | |
WARN 16:27:54,991 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] RDDSpec: | |
[info] A Tuple RDD specifying partition keys | |
[info] - should be joinable with Cassandra (2 seconds, 429 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
WARN 16:28:05,125 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] - should be retrievable as a tuple from Cassandra (5 seconds, 277 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be retrievable as a case class from Cassandra (146 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be repartitionable (306 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] A case-class RDD specifying partition keys | |
[info] - should be retrievable from Cassandra (147 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be retrievable as a tuple from Cassandra (132 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be retrievable as a case class from Cassandra (142 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be repartitionable (190 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] - should throw a meaningful exception if partition column is null when joining with a Cassandra table (138 milliseconds) | |
[info] - should throw a meaningful exception if partition column is null when repartitioning by replica (90 milliseconds) | |
[info] - should throw a meaningful exception if partition column is null when saving (167 milliseconds) | |
[info] A Tuple RDD specifying partitioning keys and clustering keys | |
WARN 16:28:06,852 org.apache.spark.Logging$class (Logging.scala:71) - Lost task 0.3 in stage 19.0 (TID 222, 192.168.1.105): TaskKilled (killed intentionally) | |
WARN 16:28:06,857 org.apache.spark.Logging$class (Logging.scala:71) - Lost task 6.3 in stage 19.0 (TID 223, 192.168.1.105): TaskKilled (killed intentionally) | |
WARN 16:28:06,858 org.apache.spark.Logging$class (Logging.scala:71) - Lost task 1.2 in stage 19.0 (TID 213, 192.168.1.105): TaskKilled (killed intentionally) | |
WARN 16:28:06,860 org.apache.spark.Logging$class (Logging.scala:71) - Lost task 4.3 in stage 19.0 (TID 221, 192.168.1.105): TaskKilled (killed intentionally) | |
[info] - should be retrievable from Cassandra (122 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be retrievable as a tuple from Cassandra (108 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be retrievable as a case class from Cassandra (116 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be repartitionable (163 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] - should be joinable on both partitioning key and clustering key (115 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be joinable on both partitioning key and clustering key using on (107 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be able to be limited (42 milliseconds) | |
[info] - should be able to be counted (77 milliseconds) | |
[info] A CassandraRDD | |
[info] - should be joinable with Cassandra (627 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
WARN 16:28:13,344 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] - should be retrievable as a tuple from Cassandra (5 seconds, 520 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
WARN 16:28:18,863 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] - should be retrievable as a case class from Cassandra (5 seconds, 529 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
WARN 16:28:24,393 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] - should be retrievable without repartitioning (5 seconds, 319 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] - should be repartitionable (254 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] A Joined CassandraRDD | |
[info] - should support select clauses (221 milliseconds) | |
[info] - should support where clauses (44 milliseconds) | |
[info] - should support parametrized where clauses (41 milliseconds) | |
[info] - should throw an exception if using a where on a column that is specified by the join (6 milliseconds) | |
[info] - should throw an exception if using a where on a column that is a part of the Partition key (6 milliseconds) | |
[info] - should throw an exception if you don't have all Partition Keys available (8 milliseconds) | |
[info] - should throw an exception if you try to join on later clustering columns without earlier ones (10 milliseconds) | |
[info] - should throw an exception if you try to join on later clustering columns without earlier ones even when out of order (10 milliseconds) | |
[info] - should throw an exception if you try to join on later clustering columns without earlier ones even when reversed (11 milliseconds) | |
[info] - should throw an exception if you try to join with a data column (11 milliseconds) | |
[info] - should allow using an empty RDD on an undefined table (6 milliseconds) | |
[info] - should allow using an empty RDD on a defined table (4 milliseconds) | |
[info] - should be lazy and not throw an exception if the table is not found at initialization time (3 milliseconds) | |
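A minimal sketch of the join API these RDDSpec cases exercise, assuming a hypothetical test.kv table whose partition key is key:

    import com.datastax.spark.connector._

    val keys = sc.parallelize(1 to 100).map(Tuple1(_))

    // Fetches only the Cassandra rows that match the keys on the left side.
    val joined = keys.joinWithCassandraTable("test", "kv")

    // Grouping the keys by the replicas that own them first makes the join
    // node-local; .on(...) restricts the join to the named columns.
    val local = keys
      .repartitionByCassandraReplica("test", "kv")
      .joinWithCassandraTable("test", "kv")
      .on(SomeColumns("key"))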
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=192.168.254.254,127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=(same jar list as in the first configuration dump above, elided)
spark.master=spark://127.0.0.1:7777 | |
WARN 16:28:30,635 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] CassandraPrunedScanSpec: | |
INFO 16:28:30,708 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:28:30,708 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow selecting all rows (2 seconds, 364 milliseconds) | |
WARN 16:28:38,092 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
INFO 16:28:38,120 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:28:38,120 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow registering as a temp table (5 seconds, 421 milliseconds) | |
WARN 16:28:43,561 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
INFO 16:28:43,587 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:28:43,894 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:28:44,242 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
[info] - should allow inserting data into a Cassandra table (5 seconds, 974 milliseconds) | |
WARN 16:28:49,526 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
INFO 16:28:50,097 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
WARN 16:28:55,366 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
INFO 16:28:55,383 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:28:55,383 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow saving data to a Cassandra table (11 seconds, 234 milliseconds) | |
INFO 16:28:55,749 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:28:55,944 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
[info] - should allow overwriting a Cassandra table (418 milliseconds) | |
INFO 16:28:56,135 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(a,1), EqualTo(b,2), EqualTo(c,1), EqualTo(e,1) | |
[info] - should allow filtering a table (233 milliseconds) | |
INFO 16:28:56,384 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
[info] - should allow filtering a table with a function for a column alias (231 milliseconds) | |
INFO 16:28:56,627 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(a,1), EqualTo(b,2), EqualTo(c,1), EqualTo(e,1) | |
[info] - should allow filtering a table with an alias (223 milliseconds) | |
INFO 16:28:56,916 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:28:56,916 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should be able to save DF with reversed order columns to a Cassandra table (271 milliseconds) | |
INFO 16:28:57,159 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:28:57,160 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should be able to save DF with partial columns to a Cassandra table (232 milliseconds) | |
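A minimal sketch of the DataFrame path CassandraPrunedScanSpec covers (Spark 1.4 API), with hypothetical keyspace and table names; the write target is assumed to exist already:

    import org.apache.spark.sql.{SQLContext, SaveMode}

    val sqlContext = new SQLContext(sc)

    // Filters such as the EqualTo(...) entries logged above can be pushed
    // down to Cassandra by the data source.
    val df = sqlContext.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test", "table" -> "kv"))
      .load()

    df.registerTempTable("kv")
    sqlContext.sql("SELECT * FROM kv WHERE key = 1").show()

    df.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test", "table" -> "kv_copy"))
      .mode(SaveMode.Overwrite)   // the "overwrite a Cassandra table" case
      .save()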
[info] CheckpointStreamSpec: | |
WARN 16:29:06,553 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] Spark Streaming + Checkpointing | |
[Stage 0:> (0 + 2) / 2]Exception in thread "pool-900-thread-1" java.lang.Error: java.lang.InterruptedException | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1151) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) | |
at java.lang.Thread.run(Thread.java:745) | |
Caused by: java.lang.InterruptedException | |
at java.lang.Object.wait(Native Method) | |
at java.lang.Object.wait(Object.java:503) | |
at org.apache.spark.scheduler.JobWaiter.awaitResult(JobWaiter.scala:73) | |
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:530) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1732) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1750) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1765) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1779) | |
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:885) | |
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148) | |
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109) | |
at org.apache.spark.rdd.RDD.withScope(RDD.scala:286) | |
at org.apache.spark.rdd.RDD.collect(RDD.scala:884) | |
at org.apache.spark.streaming.TestOutputStreamWithPartitions$$anonfun$$init$$2.apply(TestSuiteBase.scala:123) | |
at org.apache.spark.streaming.TestOutputStreamWithPartitions$$anonfun$$init$$2.apply(TestSuiteBase.scala:122) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:42) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:40) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:40) | |
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:399) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:40) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:40) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:40) | |
at scala.util.Try$.apply(Try.scala:161) | |
at org.apache.spark.streaming.scheduler.Job.run(Job.scala:34) | |
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:193) | |
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:193) | |
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:193) | |
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57) | |
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:192) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) | |
... 2 more | |
WARN 16:29:11,696 org.apache.spark.Logging$class (Logging.scala:71) - isTimeValid called with 500 ms whereas last valid time is 1000 ms | |
[Stage 0:> (0 + 2) / 2]Exception in thread "pool-913-thread-1" java.lang.Error: java.lang.InterruptedException | |
        ... (remaining stack trace identical to the pool-900-thread-1 error above)
[info] - should work with JWCTable and RPCassandra Replica (8 seconds, 781 milliseconds) | |
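A minimal sketch of what that case exercises (JWCTable and RPCassandra Replica presumably standing for joinWithCassandraTable and repartitionByCassandraReplica): a checkpointed streaming job whose batches are joined with a Cassandra table. The paths, host and table names are hypothetical:

    import com.datastax.spark.connector._
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    def createContext(): StreamingContext = {
      val ssc = new StreamingContext(sc, Seconds(1))
      ssc.checkpoint("/tmp/demo-checkpoint")        // hypothetical path

      ssc.socketTextStream("localhost", 9999)       // hypothetical source
        .map(Tuple1(_))
        .transform(_.joinWithCassandraTable("test", "kv"))
        .print()
      ssc
    }

    // Recovers from the checkpoint if one exists, otherwise builds anew.
    val ssc = StreamingContext.getOrCreate("/tmp/demo-checkpoint", createContext _)
    ssc.start()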
objc[54749]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:29:19,077 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native methods will be unavailable. | |
WARN 16:29:19,079 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
[info] CassandraAuthenticatedConnectorSpec: | |
[info] A CassandraConnector | |
WARN 16:29:27,835 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] - should authenticate with username and password when using native protocol (5 seconds, 274 milliseconds) | |
[info] - should pick up user and password from SparkConf (2 milliseconds) | |
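A minimal sketch of the credential pickup verified above; the cassandra/cassandra pair is the stock superuser used by the test fixture and should be substituted in real deployments:

    import org.apache.spark.SparkConf
    import com.datastax.spark.connector.cql.CassandraConnector

    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", "127.0.0.1")
      .set("spark.cassandra.auth.username", "cassandra")
      .set("spark.cassandra.auth.password", "cassandra")

    // Credentials are taken from the SparkConf entries above.
    CassandraConnector(conf).withSessionDo(_.execute("SELECT * FROM system.local"))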
objc[54751]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:29:31,517 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native methods will be unavailable. | |
WARN 16:29:31,519 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.auth.password=cassandra | |
spark.cassandra.auth.username=cassandra | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=(same jar list as in the first configuration dump above, elided)
spark.master=spark://127.0.0.1:7777 | |
[info] MultiThreadedSpec: | |
[info] A Spark Context | |
[info] - should be able to read a Cassandra table in different threads (4 seconds, 801 milliseconds) | |
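A minimal sketch of the multi-threaded read: a single SparkContext accepts jobs from several threads concurrently (table names hypothetical):

    import com.datastax.spark.connector._

    val threads = (1 to 4).map { i =>
      new Thread(new Runnable {
        // Each thread submits its own Spark job against the shared context.
        def run(): Unit =
          println(s"thread $i: " + sc.cassandraTable("test", "kv").count())
      })
    }
    threads.foreach(_.start())
    threads.foreach(_.join())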
[info] CassandraConnectorSourceSpec: | |
[info] CassandraConnectorSource | |
org.apache.spark.metrics.CassandraConnectorSource | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.auth.password=cassandra | |
spark.cassandra.auth.username=cassandra | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=(same jar list as in the first configuration dump above, elided)
spark.master=spark://127.0.0.1:7777 | |
[info] - should be initialized when it was specified in metrics properties (473 milliseconds) | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.auth.password=cassandra | |
spark.cassandra.auth.username=cassandra | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=(same jar list as in the first configuration dump above, elided)
spark.master=spark://127.0.0.1:7777 | |
[info] - should not be initialized when it wasn't specified in metrics properties (265 milliseconds) | |
[info] - should be able to create a new instance in the executor environment only once (2 milliseconds) | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.auth.password=cassandra | |
spark.cassandra.auth.username=cassandra | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.10/kafka-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.10/simple-demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.10/demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.10/twitter-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.10/root_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.10/bundles/scalactic_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/j
avassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.10/bundles/scalatest_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.10/jars/kafka_2.10-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.10/jars/spark-streaming_2.10-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.10/jars/akka-testkit_2.10-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.10/jars/scalamock-core_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.10/jars/scalamock-scalatest-support_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
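Each suite boots a fresh SparkContext from a configuration dump like the one above. As a rough illustration, the same settings could be assembled by hand as follows (a minimal sketch; the long spark.jars assembly/test jar list is elided and the values simply mirror the log):

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal sketch of the harness configuration printed above;
    // spark.jars (the assembly and test jars) is omitted for brevity.
    val conf = new SparkConf()
      .setAppName("Test")
      .setMaster("spark://127.0.0.1:7777")
      .set("spark.cassandra.connection.host", "127.0.0.1")
      .set("spark.cassandra.connection.port", "9042")
      .set("spark.cassandra.auth.username", "cassandra")
      .set("spark.cassandra.auth.password", "cassandra")
      .set("spark.cleaner.ttl", "3600")
    val sc = new SparkContext(conf)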
[info] CassandraDataFrameSpec: | |
[info] A DataFrame | |
INFO 16:29:45,095 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:29:45,095 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should be able to be created programmatically (3 seconds, 784 milliseconds) | |
INFO 16:29:49,220 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:29:49,220 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:29:49,967 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:29:49,967 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should be able to be saved programmatically (1 second, 539 milliseconds) | |
[info] - should error out with a sensible message when a table can't be found (20 milliseconds) | |
[info] - should provide useful suggestions if a table can't be found but a close match exists (2 milliseconds) | |
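The CassandraDataFrameSpec above exercises the connector's Spark SQL data source. A usage sketch, assuming the SparkContext sc from the earlier sketch and a hypothetical keyspace "test" with tables "words" and "words_copy":

    import org.apache.spark.sql.SQLContext

    val sqlContext = new SQLContext(sc)

    // Read a Cassandra table as a DataFrame ("test"/"words" are
    // placeholder names, not taken from the test run).
    val df = sqlContext.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test", "table" -> "words"))
      .load()

    // Save it back to another, pre-created table.
    df.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test", "table" -> "words_copy"))
      .save()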
objc[54888]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:29:54,449 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native methods will be unavailable. | |
WARN 16:29:54,451 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
[info] CassandraSSLConnectorSpec: | |
[info] A CassandraConnector | |
[info] - should be able to use a secure connection when using native protocol (321 milliseconds) | |
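The SSL test above corresponds to the connector's spark.cassandra.connection.ssl.* settings. A hedged sketch (property names as documented for this connector line; the trust store path and password are placeholders):

    import org.apache.spark.SparkConf

    // Sketch: enabling SSL for the native-protocol connection.
    val sslConf = new SparkConf()
      .set("spark.cassandra.connection.host", "127.0.0.1")
      .set("spark.cassandra.connection.ssl.enabled", "true")
      .set("spark.cassandra.connection.ssl.trustStore.path", "/path/to/truststore.jks") // placeholder
      .set("spark.cassandra.connection.ssl.trustStore.password", "changeit")            // placeholder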
objc[54890]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:30:01,872 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native methods will be unavailable. | |
WARN 16:30:01,874 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
[info] SchemaSpec: | |
[info] A Schema | |
[info] - should allow to get a list of keyspaces (1 millisecond) | |
[info] - should allow to look up a keyspace by name (0 milliseconds) | |
[info] A KeyspaceDef | |
[info] - should allow to get a list of tables in the given keyspace (1 millisecond) | |
[info] - should allow to look up a table by name (0 milliseconds) | |
[info] A TableDef | |
[info] - should allow to read column definitions by name (0 milliseconds) | |
[info] - should allow to read primary key column definitions (8 milliseconds) | |
[info] - should allow to read partitioning key column definitions (2 milliseconds) | |
[info] - should allow to read regular column definitions (1 millisecond) | |
[info] - should allow to read proper types of columns (0 milliseconds) | |
[info] - should allow to list fields of a user defined type (1 millisecond) | |
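SchemaSpec covers the schema metadata API in the cql package. A sketch of the lookups it verifies (keyspace "test" and table "kv" are placeholders; member names are my reading of this connector line):

    import com.datastax.spark.connector.cql.{CassandraConnector, Schema}

    val connector = CassandraConnector(sc.getConf)
    val schema = Schema.fromCassandra(connector)
    val ks    = schema.keyspaceByName("test")          // look up a keyspace by name
    val table = ks.tableByName("kv")                   // look up a table by name
    println(table.partitionKey.map(_.columnName))      // partitioning key columns
    println(table.clusteringColumns.map(_.columnName)) // clustering columns
    println(table.regularColumns.map(_.columnName))    // regular columns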
[info] ScalaTest | |
[info] Run completed in 4 minutes, 50 seconds. | |
[info] Total number of tests run: 304 | |
[info] Suites: completed 23, aborted 0 | |
[info] Tests: succeeded 304, failed 0, canceled 0, ignored 9, pending 0 | |
[info] All tests passed. | |
[info] Passed: Total 304, Failed 0, Errors 0, Passed 304, Ignored 9 | |
objc[54894]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:30:11,828 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native methods will be unavailable. | |
WARN 16:30:11,830 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.master=spark://127.0.0.1:7777 | |
WARN 16:30:14,548 org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable | |
WARN 16:30:14,648 org.apache.spark.Logging$class (Logging.scala:71) - Your hostname, ursus-major resolves to a loopback address: 127.0.0.1; using 192.168.1.105 instead (on interface en0) | |
WARN 16:30:14,648 org.apache.spark.Logging$class (Logging.scala:71) - Set SPARK_LOCAL_IP if you need to bind to another address | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.10/kafka-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.10/simple-demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.10/demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.10/twitter-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.10/root_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.10/bundles/scalactic_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/j
avassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.10/bundles/scalatest_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.10/jars/kafka_2.10-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.10/jars/spark-streaming_2.10-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.10/jars/akka-testkit_2.10-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.10/jars/scalamock-core_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.10/jars/scalamock-scalatest-support_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraJavaRDDSpec: | |
[info] CassandraJavaRDD | |
[info] - should allow to read data as CassandraRows (3 seconds, 264 milliseconds) | |
[info] - should allow to read data as Java beans (1 second, 260 milliseconds) | |
[info] - should allow to read data as Java beans with inherited fields (551 milliseconds) | |
[info] - should allow to read data as Java beans with custom mapping defined by aliases (507 milliseconds) | |
[info] - should allow to read data as Java beans (with multiple constructors) (429 milliseconds) | |
[info] - should throw NoSuchMethodException when trying to read data as Java beans (without no-args constructor) (34 milliseconds) | |
[info] - should allow to read data as nested Java beans (385 milliseconds) | |
[info] - should allow to read data as deeply nested Java beans (413 milliseconds) | |
[info] - should allow to select a subset of columns (358 milliseconds) | |
[info] - should return selected columns (8 milliseconds) | |
[info] - should allow to use where clause to filter records (521 milliseconds) | |
[info] - should allow to read rows as an array of a single-column type supported by TypeConverter (1 second, 53 milliseconds) | |
[info] - should allow to read rows as an array of a single-column list (315 milliseconds) | |
[info] - should allow to read rows as an array of a single-column set (248 milliseconds) | |
[info] - should allow to read rows as an array of a single-column map (245 milliseconds) | |
[info] - should allow to read rows as an array of multi-column type (228 milliseconds) | |
[info] - should allow to read rows as an array of multi-column type with explicit column name mapping (229 milliseconds) | |
[info] - should allow to transform rows into KV pairs of two single-column types (235 milliseconds) | |
[info] - should allow to transform rows into KV pairs of a single-column type and a multi-column type (243 milliseconds) | |
[info] - should allow to transform rows into KV pairs of a multi-column type and a single-column type (227 milliseconds) | |
[info] - should allow to transform rows into KV pairs of multi-column types (173 milliseconds) | |
[info] - should allow to read Cassandra data as array of Integer (162 milliseconds) | |
[info] - should allow to change the default Cassandra Connector to a custom one (173 milliseconds) | |
[info] - should allow to read null columns (108 milliseconds) | |
[info] - should allow to fetch UDT columns (173 milliseconds) | |
[info] - should allow to fetch tuple columns (178 milliseconds) | |
[info] - should allow to read Cassandra table as Array of KV tuples of a case class and a tuple grouped by partition key (215 milliseconds) | |
[info] - should allow to set limit (76 milliseconds) | |
[info] - should allow to set ascending ordering (51 milliseconds) | |
[info] - should allow to set descending ordering (49 milliseconds) | |
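The CassandraJavaRDDSpec above drives the Java API; the equivalent Scala calls for the main features it verifies (projection, filtering, limits, ordering) look roughly like this, with keyspace "test" and table "kv" assumed:

    import com.datastax.spark.connector._

    val rows = sc.cassandraTable("test", "kv")
      .select("key", "value") // read only a subset of columns
      .where("key = ?", 1)    // filter evaluated by Cassandra, not Spark
      .limit(3)               // CQL LIMIT applied to each underlying query
      .withAscOrder           // order by clustering columns, ascending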
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.10/kafka-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.10/simple-demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.10/demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.10/twitter-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.10/root_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.10/bundles/scalactic_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/j
avassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.10/bundles/scalatest_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.10/jars/kafka_2.10-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.10/jars/spark-streaming_2.10-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.10/jars/akka-testkit_2.10-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.10/jars/scalamock-core_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.10/jars/scalamock-scalatest-support_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraJavaUtilSpec: | |
[info] CassandraJavaUtil | |
[info] - should allow to save beans (with multiple constructors) to Cassandra (3 seconds, 335 milliseconds) | |
[info] - should allow to save beans to Cassandra (95 milliseconds) | |
[info] - should allow to save beans with transient fields to Cassandra (81 milliseconds) | |
[info] - should allow to save beans with inherited fields to Cassandra (83 milliseconds) | |
[info] - should allow to save nested beans to Cassandra (77 milliseconds) | |
[info] - should allow to read rows as Tuple1 (499 milliseconds) | |
[info] - should allow to read rows as Tuple2 (383 milliseconds) | |
[info] - should allow to read rows as Tuple3 (314 milliseconds) | |
[info] - should allow to read rows as Tuple4 (295 milliseconds) | |
[info] - should allow to read rows as Tuple5 (280 milliseconds) | |
[info] - should allow to read rows as Tuple6 (271 milliseconds) | |
[info] - should allow to read rows as Tuple7 (287 milliseconds) | |
[info] - should allow to read rows as Tuple8 (252 milliseconds) | |
[info] - should allow to read rows as Tuple9 (266 milliseconds) | |
[info] - should allow to read rows as Tuple10 (273 milliseconds) | |
[info] - should allow to read rows as Tuple11 (245 milliseconds) | |
[info] - should allow to read rows as Tuple12 (242 milliseconds) | |
[info] - should allow to read rows as Tuple13 (233 milliseconds) | |
[info] - should allow to read rows as Tuple14 (243 milliseconds) | |
[info] - should allow to read rows as Tuple15 (215 milliseconds) | |
[info] - should allow to read rows as Tuple16 (181 milliseconds) | |
[info] - should allow to read rows as Tuple17 (174 milliseconds) | |
[info] - should allow to read rows as Tuple18 (178 milliseconds) | |
[info] - should allow to read rows as Tuple19 (185 milliseconds) | |
[info] - should allow to read rows as Tuple20 (186 milliseconds) | |
[info] - should allow to read rows as Tuple21 (163 milliseconds) | |
[info] - should allow to read rows as Tuple22 (146 milliseconds) | |
[info] - should allow to write Tuple1 to Cassandra (59 milliseconds) | |
[info] - should allow to write Tuple2 to Cassandra (52 milliseconds) | |
[info] - should allow to write Tuple3 to Cassandra (55 milliseconds) | |
[info] - should allow to write Tuple4 to Cassandra (53 milliseconds) | |
[info] - should allow to write Tuple5 to Cassandra (52 milliseconds) | |
[info] - should allow to write Tuple6 to Cassandra (54 milliseconds) | |
[info] - should allow to write Tuple7 to Cassandra (61 milliseconds) | |
[info] - should allow to write Tuple8 to Cassandra (57 milliseconds) | |
[info] - should allow to write Tuple9 to Cassandra (59 milliseconds) | |
[info] - should allow to write Tuple10 to Cassandra (64 milliseconds) | |
[info] - should allow to write Tuple11 to Cassandra (60 milliseconds) | |
[info] - should allow to write Tuple12 to Cassandra (55 milliseconds) | |
[info] - should allow to write Tuple13 to Cassandra (58 milliseconds) | |
[info] - should allow to write Tuple14 to Cassandra (54 milliseconds) | |
[info] - should allow to write Tuple15 to Cassandra (54 milliseconds) | |
[info] - should allow to write Tuple16 to Cassandra (55 milliseconds) | |
[info] - should allow to write Tuple17 to Cassandra (55 milliseconds) | |
[info] - should allow to write Tuple18 to Cassandra (52 milliseconds) | |
[info] - should allow to write Tuple19 to Cassandra (83 milliseconds) | |
[info] - should allow to write Tuple20 to Cassandra (54 milliseconds) | |
[info] - should allow to write Tuple21 to Cassandra (56 milliseconds) | |
[info] - should allow to write Tuple22 to Cassandra (59 milliseconds) | |
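The Tuple1..Tuple22 round trips above have a direct Scala analogue; a sketch against a hypothetical test.kv table with columns key and value:

    import com.datastax.spark.connector._

    // Read rows directly as tuples...
    val kv = sc.cassandraTable[(Int, String)]("test", "kv").select("key", "value")

    // ...and write tuples back, mapping positions to named columns.
    sc.parallelize(Seq((1, "one"), (2, "two")))
      .saveToCassandra("test", "kv", SomeColumns("key", "value"))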
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/spark-cassandra-connector-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.10/kafka-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.10/simple-demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.10/demos_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.10/twitter-streaming_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/spark-cassandra-connector-embedded-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-it_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.10/spark-cassandra-connector-java-test_2.10-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.10/root_2.10-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.10/bundles/scalactic_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/j
avassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.10/bundles/scalatest_2.10-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.10/jars/kafka_2.10-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.10/jars/spark-streaming_2.10-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.10/jars/akka-testkit_2.10-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.10/jars/scalamock-core_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.10/jars/scalamock-scalatest-support_2.10-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraJavaPairRDDSpec: | |
[info] CassandraJavaPairRDD | |
[info] - should allow to reduce by key (3 seconds, 624 milliseconds) | |
[info] - should allow to use spanBy method (369 milliseconds) | |
[info] - should allow to use spanByKey method (258 milliseconds) | |
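spanBy and spanByKey, verified above, group physically adjacent rows (rows sharing a partition key arrive consecutively) without a shuffle. A sketch with an assumed test.events table:

    import com.datastax.spark.connector._

    // One (key, Iterable[row]) pair per Cassandra partition, no shuffle.
    val byKey = sc.cassandraTable("test", "events")
      .spanBy(row => row.getInt("key"))

    // spanByKey groups runs of consecutive equal keys in a pair RDD.
    val grouped = sc.parallelize(Seq((1, "a"), (1, "b"), (2, "c"))).spanByKey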
[info] Test run started | |
[info] Test com.datastax.spark.connector.CassandraStreamingJavaUtilTest.testJavaFunctions2 started | |
[info] Test com.datastax.spark.connector.CassandraStreamingJavaUtilTest.testJavaFunctions3 started | |
[info] Test com.datastax.spark.connector.CassandraStreamingJavaUtilTest.testJavaFunctions6 started | |
[info] Test com.datastax.spark.connector.CassandraStreamingJavaUtilTest.testJavaFunctions7 started | |
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.201s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapColumnToListOf started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testJavaFunctions started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapColumnToMapOf started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapColumnToSetOf started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapColumnTo1 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapColumnTo2 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapColumnTo3 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeConverter1 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeConverter2 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeConverter3 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeConverter4 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeTag1 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeTag2 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeTag3 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testJavaFunctions1 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testJavaFunctions4 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testJavaFunctions5 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapRowTo1 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapRowTo2 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapRowTo3 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapRowTo4 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testConvertToMap started | |
[info] Test run finished: 0 failed, 0 ignored, 22 total, 0.117s | |
[info] ScalaTest | |
[info] Run completed in 39 seconds, 16 milliseconds. | |
[info] Total number of tests run: 82 | |
[info] Suites: completed 3, aborted 0 | |
[info] Tests: succeeded 82, failed 0, canceled 0, ignored 0, pending 0 | |
[info] All tests passed. | |
[info] Passed: Total 108, Failed 0, Errors 0, Passed 108 | |
[success] Total time: 332 s, completed Sep 17, 2015 4:30:48 PM | |
Tests succeeded | |
stopping org.apache.spark.deploy.worker.Worker | |
stopping org.apache.spark.deploy.master.Master | |
ursus-major:spark-cassandra-connector jlewandowski$ |
ursus-major:spark-cassandra-connector jlewandowski$ dev/run-real-tests.sh 1.4.0 2.11 | |
Compiling everything and packaging against Scala 2.11 | |
Launching sbt from sbt/sbt-launch-0.13.8.jar | |
[info] Loading project definition from /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/project | |
Using releases: https://oss.sonatype.org/service/local/staging/deploy/maven2 for releases | |
Using snapshots: https://oss.sonatype.org/content/repositories/snapshots for snapshots | |
Scala: 2.11.6 | |
Scala Binary: 2.11 | |
Java: target=1.7 user=1.7.0_79 | |
[info] Set current project to root (in build file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/) | |
[success] Total time: 6 s, completed Sep 17, 2015 4:42:53 PM | |
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}spark-cassandra-connector-embedded... | |
[info] Done updating. | |
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}demos... | |
[info] Done updating. | |
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}spark-cassandra-connector... | |
[info] Done updating. | |
[info] Compiling 11 Scala sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/classes... | |
[warn] /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/src/main/scala/com/datastax/spark/connector/embedded/EmbeddedCassandra.scala:190: a type was inferred to be `Any`; this may indicate a programming error. | |
[warn] private val jmxPort = props.getOrElse("jmx_port", DefaultJmxPort) | |
[warn] ^ | |
[warn] one warning found | |
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}twitter-streaming... | |
[info] Done updating. | |
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}spark-cassandra-connector-java... | |
[info] Done updating. | |
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}kafka-streaming... | |
[info] Done updating. | |
[info] Compiling 140 Scala sources and 1 Java source to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/classes... | |
[warn] /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/src/main/scala/com/datastax/spark/connector/util/ReflectionUtil.scala:93: method newTermName in trait Names is deprecated: Use TermName instead | |
[warn] val member = tpe.member(newTermName(methodName)) | |
[warn] ^ | |
[warn] one warning found | |
[info] Updating {file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/}simple-demos... | |
[info] Done updating. | |
[info] Compiling 3 Scala sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.11/classes... | |
[info] Compiling 6 Scala sources and 13 Java sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/classes... | |
[info] Compiling 1 Scala source to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.11/classes... | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-test_2.11-1.4.0-RC1-SNAPSHOT.jar ... | |
[info] Done packaging. | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.11/root_2.11-1.4.0-RC1-SNAPSHOT-tests.jar ... | |
[info] Done packaging. | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.11/demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar ... | |
[info] Done packaging. | |
[info] Compiling 42 Scala sources and 8 Java sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/test-classes... | |
[info] Compiling 7 Scala sources and 1 Java source to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.11/classes... | |
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-test_2.11-1.4.0-RC1-SNAPSHOT.jar ... | |
[info] Done packaging. | |
[info] Compiling 6 Java sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/test-classes... | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.11/twitter-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar ... | |
[info] Done packaging. | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.11/kafka-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar ... | |
[info] Done packaging. | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-test_2.11-1.4.0-RC1-SNAPSHOT.jar ... | |
[info] Done packaging. | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.11/simple-demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar ... | |
[info] Done packaging. | |
[success] Total time: 71 s, completed Sep 17, 2015 4:44:05 PM | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-it_2.11-1.4.0-RC1-SNAPSHOT.jar ... | |
[info] Done packaging. | |
[info] Compiling 24 Scala sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/it-classes... | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-it_2.11-1.4.0-RC1-SNAPSHOT.jar ... | |
[info] Done packaging. | |
[info] Compiling 3 Scala sources and 2 Java sources to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/it-classes... | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-it_2.11-1.4.0-RC1-SNAPSHOT.jar ... | |
[info] Done packaging. | |
[success] Total time: 24 s, completed Sep 17, 2015 4:44:28 PM | |
[info] Including from cache: cassandra-clientutil-2.1.5.jar | |
[info] Including from cache: joda-convert-1.2.jar | |
[info] Including from cache: metrics-core-3.0.2.jar | |
[info] Including from cache: slf4j-api-1.7.5.jar | |
[info] Including from cache: jsr166e-1.1.0.jar | |
[info] Including from cache: commons-lang3-3.3.2.jar | |
[info] Including from cache: joda-time-2.3.jar | |
[info] Including from cache: cassandra-driver-core-2.1.5.jar | |
[info] Including from cache: netty-3.9.0.Final.jar | |
[info] Including from cache: guava-14.0.1.jar | |
[info] Including from cache: joda-convert-1.2.jar | |
[info] Including from cache: slf4j-api-1.7.5.jar | |
[info] Including from cache: metrics-core-3.0.2.jar | |
[info] Including from cache: jsr166e-1.1.0.jar | |
[info] Including from cache: cassandra-clientutil-2.1.5.jar | |
[info] Including from cache: commons-lang3-3.3.2.jar | |
[info] Including from cache: joda-time-2.3.jar | |
[info] Including from cache: cassandra-driver-core-2.1.5.jar | |
[info] Including from cache: netty-3.9.0.Final.jar | |
[info] Including from cache: guava-14.0.1.jar | |
[info] OutputMetricsUpdaterSpec: | |
[info] OutputMetricsUpdater | |
[info] - should initialize task metrics properly when they are empty (474 milliseconds) | |
[info] - should initialize task metrics properly when they are defined (1 millisecond) | |
[info] - should create updater which uses task metrics (2 milliseconds) | |
[info] - should create updater which does not use task metrics (9 milliseconds) | |
[info] - should create updater which uses Codahale metrics (28 milliseconds) | |
[info] - should create updater which doesn't use Codahale metrics (1 millisecond) | |
[info] - should work correctly with multiple threads (276 milliseconds) | |
[info] ReflectionUtilSpec: | |
[info] ReflectionUtil.findGlobalObject | |
[info] - should be able to find DefaultConnectionFactory (1 second, 275 milliseconds) | |
[info] - should be able to find a global object in a multi-threaded context (23 milliseconds) | |
[info] - should be able to instantiate a singleton object based on Java class name (6 milliseconds) | |
[info] - should cache Java class instances (3 milliseconds) | |
[info] - should throw IllegalArgumentException when asked for a Scala object of wrong type (22 milliseconds) | |
[info] - should throw IllegalArgumentException when asked for class instance of wrong type (16 milliseconds) | |
[info] - should throw IllegalArgumentException when object does not exist (3 milliseconds) | |
[info] ReflectionUtil.constructorParams | |
[info] - should return proper constructor param names and types for a class with a single constructor (35 milliseconds) | |
[info] - should return main constructor's param names and types for a class with multiple constructors (2 milliseconds) | |
[info] ReflectionUtil.getters | |
[info] - should return getter names and types (13 milliseconds) | |
[info] ReflectionUtil.setters | |
[info] - should return setter names and types (4 milliseconds) | |
[info] ReflectionUtil.methodParamTypes | |
[info] - should return method param types (2 milliseconds) | |
[info] - should return proper method param types for generic type (2 milliseconds) | |
[info] - should throw IllegalArgumentException if the requested method is missing (4 milliseconds) | |
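ReflectionUtil.findGlobalObject, tested above, resolves a Scala singleton (or instantiates a Java class) by its fully qualified name. A hedged sketch, using the DefaultConnectionFactory lookup the spec itself names:

    import com.datastax.spark.connector.cql.CassandraConnectionFactory
    import com.datastax.spark.connector.util.ReflectionUtil

    // Resolves the named object and checks it against the expected type;
    // a wrong type or a missing object yields IllegalArgumentException.
    val factory = ReflectionUtil.findGlobalObject[CassandraConnectionFactory](
      "com.datastax.spark.connector.cql.DefaultConnectionFactory")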
[info] MappedToGettableDataConverterSpec: | |
[info] MappedToGettableDataConverter | |
[info] - should be Serializable (142 milliseconds) | |
[info] - should convert a simple case class to a CassandraRow (13 milliseconds) | |
[info] - should convert a simple case class to a UDTValue (10 milliseconds) | |
[info] - should convert a Scala tuple to a TupleValue (21 milliseconds) | |
[info] - should convert nested classes (16 milliseconds) | |
[info] - should convert a nested UDTValue to a UDTValue (20 milliseconds) | |
[info] - should convert user defined types nested in collections (22 milliseconds) | |
[info] - should convert user defined types nested in tuples (13 milliseconds) | |
[info] - should convert tuples nested in user defined types (33 milliseconds) | |
[info] - should convert nulls to Scala Nones (7 milliseconds) | |
[info] - should convert using custom column aliases (7 milliseconds) | |
[info] - should convert a java bean to a CassandraRow (7 milliseconds) | |
[info] - should convert nested JavaBeans (6 milliseconds) | |
[info] - should convert commons-lang3 Pairs to TupleValues (16 milliseconds) | |
[info] - should convert commons-lang3 Triples to TupleValues (19 milliseconds) | |
[info] - should throw a meaningful exception when a column has an incorrect type (10 milliseconds) | |
[info] - should throw a meaningful exception when a tuple field has an incorrect number of components (8 milliseconds) | |
[info] - should work after serialization/deserialization (8 milliseconds) | |
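MappedToGettableDataConverter is the machinery that lets case classes (including nested tuples and UDT values) be written as Cassandra rows. From the user's side that surfaces as a plain saveToCassandra call; the table and columns below are made up:

    import com.datastax.spark.connector._

    case class WordCount(word: String, count: Option[Int])

    // Options become NULLs; field names are matched to column names.
    sc.parallelize(Seq(WordCount("foo", Some(2)), WordCount("bar", None)))
      .saveToCassandra("test", "words")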
[info] TableDefSpec: | |
[info] A TableDef#cql method | |
[info] should produce valid CQL | |
[info] - when it contains no clustering columns (4 milliseconds) | |
[info] - when it contains clustering columns (1 millisecond) | |
[info] - when it contains compound partition key and multiple clustering columns (1 millisecond) | |
[info] - when it contains a column of a collection type (2 milliseconds) | |
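TableDef#cql renders a table definition back into a CREATE TABLE statement. A sketch of building one by hand (constructor shapes are my reading of this connector line's cql and types packages):

    import com.datastax.spark.connector.cql._
    import com.datastax.spark.connector.types.{IntType, VarCharType}

    val key   = ColumnDef("key", PartitionKeyColumn, IntType)
    val seq   = ColumnDef("seq", ClusteringColumn(0), IntType)
    val value = ColumnDef("value", RegularColumn, VarCharType)
    val table = TableDef("test", "kv", Seq(key), Seq(seq), Seq(value))
    println(table.cql) // prints a CREATE TABLE "test"."kv" (...) statement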
[info] ConfigCheckSpec: | |
[info] ConfigCheck | |
[info] - should throw an exception when the configuration contains an invalid spark.cassandra prop (41 milliseconds) | |
[info] - should suggest alternatives if you have a slight misspelling (12 milliseconds) | |
[info] - should suggest alternatives if you miss a word (10 milliseconds) | |
[info] - should not throw an exception if you have a random variable not in the spark.cassandra space (1 millisecond) | |
[info] - should not list all options as suggestions (3 milliseconds) | |
[info] - should not give suggestions when the variable is very strange (3 milliseconds) | |
[info] - should accept custom ConnectionFactory properties (4 milliseconds) | |
[info] - should accept custom AuthConfFactory properties (4 milliseconds) | |
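ConfigCheck guards against typos in spark.cassandra.* keys, as the suite above shows. A sketch, assuming ConfigCheck.checkConfig is the entry point (name taken from the connector source) and a deliberately misspelled key:

    import org.apache.spark.SparkConf
    import com.datastax.spark.connector.util.ConfigCheck

    val conf = new SparkConf()
      .set("spark.cassandra.connection.hosts", "127.0.0.1") // misspelled on purpose

    // Expected to throw, naming the unknown key and suggesting the close
    // match "spark.cassandra.connection.host".
    ConfigCheck.checkConfig(conf)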
[info] PredicatePushDownSpec: | |
[info] PredicatePushDown | |
[info] - should push down all equality predicates restricting partition key columns (16 milliseconds) | |
[info] - should not push down a partition key predicate for a part of the partition key (1 millisecond) | |
[info] - should not push down a range partition key predicate (1 millisecond) | |
[info] - should push down an IN partition key predicate on the last partition key column (1 millisecond) | |
[info] - should not push down an IN partition key predicate on the non-last partition key column (0 milliseconds) | |
[info] - should push down the first clustering column predicate (1 millisecond) | |
[info] - should push down the first and the second clustering column predicate (1 millisecond) | |
[info] - should push down restrictions on only the initial clustering columns (1 millisecond) | |
[info] - should push down only one range predicate restricting the first clustering column, if there are more range predicates on different clustering columns (0 milliseconds) | |
[info] - should push down multiple range predicates for the same clustering column (0 milliseconds) | |
[info] - should push down clustering column predicates when the last clustering column is restricted by IN (1 millisecond) | |
[info] - should stop pushing down clustering column predicates on the first range predicate (1 millisecond) | |
[info] - should not push down IN restriction on non-last column (1 millisecond) | |
[info] - should not push down any clustering column predicates, if the first clustering column is missing (1 millisecond) | |
[info] - should push down equality predicates on regular indexed columns (1 millisecond) | |
[info] - should not push down range predicates on regular indexed columns (1 millisecond) | |
[info] - should not push down IN predicates on regular indexed columns (1 millisecond) | |
[info] - should push down predicates on regular non-indexed and indexed columns (1 millisecond) | |
[info] - should not push down predicates on regular non-indexed columns if indexed ones are not included (1 millisecond) | |
[info] - should prefer to push down equality predicates over range predicates (1 millisecond) | |
[info] - should not push down unsupported predicates (1 millisecond) | |
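These rules decide which where() predicates are executed by Cassandra and which fall back to Spark-side filtering. A public-API sketch against a hypothetical table with PRIMARY KEY ((pk1, pk2), cc1, cc2), assuming an existing SparkContext sc:

    import com.datastax.spark.connector._

    val rdd = sc.cassandraTable("test", "events")
      // Full partition key restricted by equality: pushed down.
      // cc1 equality plus a cc2 range fits the "initial clustering columns"
      // rule above, so both go down too; anything after a range would not.
      .where("pk1 = ? AND pk2 = ? AND cc1 = ? AND cc2 > ?", 1, 2, 3, 0)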
[info] RandomPartitionerTokenFactorySpec: | |
[info] RandomPartitionerTokenFactory | |
[info] - should create a token from String (1 millisecond) | |
[info] - should create a String representation of a token (0 milliseconds) | |
[info] - should calculate the distance between tokens if right > left (0 milliseconds) | |
[info] - should calculate the distance between tokens if right <= left (0 milliseconds) | |
[info] - should calculate ring fraction (0 milliseconds) | |
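The RandomPartitioner ring is the integer interval [0, 2^127), so token distances wrap around; Murmur3TokenFactory (tested further below) does the same over the signed 64-bit range. A self-contained sketch of the arithmetic, not the connector's own classes:

    val ringSize = BigInt(2).pow(127)

    def distance(left: BigInt, right: BigInt): BigInt =
      if (right > left) right - left   // plain distance
      else ringSize - left + right     // wrap around the ring

    def ringFraction(left: BigInt, right: BigInt): Double =
      (BigDecimal(distance(left, right)) / BigDecimal(ringSize)).toDouble

    println(distance(BigInt(10), BigInt(5)))       // 2^127 - 5
    println(ringFraction(BigInt(0), ringSize / 2)) // 0.5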
[info] BufferedIterator2Spec: | |
[info] BufferedIterator | |
[info] - should return the same items as the standard Iterator (1 millisecond) | |
[info] - should be convertible to a Seq (3 milliseconds) | |
[info] - should wrap an empty iterator (0 milliseconds) | |
[info] - should offer the head element without consuming the underlying iterator (0 milliseconds) | |
[info] - should offer takeWhile that consumes only the elements matching the predicate (2 milliseconds) | |
[info] - should offer appendWhile that copies elements to ArrayBuffer and consumes only the elements matching the predicate (1 millisecond) | |
[info] - should throw NoSuchElementException if trying to get next() element that doesn't exist (2 milliseconds) | |
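BufferedIterator's key property is peeking via head without consuming, which takeWhile/appendWhile build on. The standard-library analogue, self-contained:

    val it = Iterator(1, 2, 3, 10, 4).buffered
    println(it.head)   // 1; nothing consumed yet

    // Consume only the leading elements matching a predicate, in the
    // spirit of the appendWhile test above.
    val small = scala.collection.mutable.ArrayBuffer.empty[Int]
    while (it.hasNext && it.head < 10) small += it.next()

    println(small.toList) // List(1, 2, 3)
    println(it.next())    // 10 is still available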
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties | |
[info] AnyObjectFactoryTest: | |
[info] AnyObjectFactory | |
[info] when instantiated for a bean class with a single, no-args constructor | |
[info] - should create an instance of that class with newInstance (1 millisecond) | |
[info] - should return 0 with argCount (0 milliseconds) | |
[info] - should return empty collection with constructorParamTypes (1 millisecond) | |
[info] - should return that class with javaClass (0 milliseconds) | |
[info] when instantiated for a bean class with multiple constructors which include no-args constructor | |
[info] - should create an instance of that class with newInstance (0 milliseconds) | |
[info] - should return that class with javaClass (0 milliseconds) | |
[info] when instantiated for an inner Java class | |
[info] - should create an instance of that class with newInstance (2 milliseconds) | |
[info] - should return that class with javaClass (0 milliseconds) | |
[info] when instantiated for a deeply nested inner Java class | |
[info] - should create an instance of that class with newInstance (5 milliseconds) | |
[info] - should return that class with javaClass (0 milliseconds) | |
[info] when tried to be instantiated for an unsupported bean class | |
[info] - should throw NoSuchMethodException if class does not have suitable constructor (2 milliseconds) | |
[info] when instantiated for a Scala case class with 2 args constructor | |
[info] - should create an instance of that class with newInstance (0 milliseconds) | |
[info] - should return 2 with argCount because the only constructor of this case class has two args (1 millisecond) | |
[info] - should return collection of {Int, String} types with constructorParamTypes (2 milliseconds) | |
[info] - should return that class with javaClass (0 milliseconds) | |
[info] when instantiated for a Scala case class with 2 args constructor which is defined inside an object | |
[info] - should create an instance of that class with newInstance (0 milliseconds) | |
[info] - should return 2 with argCount because the only constructor of this case class has two args (1 millisecond) | |
[info] - should return collection of {Int, String} types with constructorParamTypes (1 millisecond) | |
[info] - should return that class with javaClass (0 milliseconds) | |
[info] when instantiated for a Scala class with 2 args constructor | |
[info] - should create an instance of that class with newInstance (0 milliseconds) | |
[info] - should return 2 with argCount because the only constructor of this class has 2 args (1 millisecond) | |
[info] - should return collection of {Int, String} types with constructorParamTypes (1 millisecond) | |
[info] - should return that class with javaClass (0 milliseconds) | |
[info] when instantiated for a Scala class with 2 args constructor and without fields | |
[info] - should create an instance of that class with newInstance (0 milliseconds) | |
[info] - should return 2 with argCount because the only constructor of this class has 2 args (1 millisecond) | |
[info] - should return collection of {Int, String} types with constructorParamTypes (1 millisecond) | |
[info] - should return that class with javaClass (1 millisecond) | |
[info] when instantiated for a Scala class with multiple constructors | |
[info] - should create an instance of that class with newInstance (0 milliseconds) | |
[info] - should return that class with javaClass (0 milliseconds) | |
[info] when instantiated for an inner Scala class with 2 args constructor | |
[info] - should create an instance of that class with newInstance (1 millisecond) | |
[info] - should return 2 with argCount (0 milliseconds) | |
[info] - should return collection of {Int, String} types with constructorParamTypes (2 milliseconds) | |
[info] - should return that class with javaClass (1 millisecond) | |
[info] when instantiated for a deeply nested inner Scala class | |
[info] - should create an instance of that class with newInstance (1 millisecond) | |
[info] - should return that class with javaClass (0 milliseconds) | |
[info] when serialized | |
[info] - should allow to be deserialized and reused (5 milliseconds) | |
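AnyObjectFactory instantiates target classes reflectively. A bare-bones, self-contained sketch of the underlying technique (the real factory also handles inner classes and Scala-specific constructors):

    case class Point(x: Int, y: String)

    val ctor = classOf[Point].getConstructors.head
    println(ctor.getParameterTypes.toList) // List(int, class java.lang.String)

    val p = ctor.newInstance(Int.box(1), "one").asInstanceOf[Point]
    println(p) // Point(1,one)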
[info] CassandraRowTest: | |
[info] - basicAccessTest (4 milliseconds) | |
[info] - nullAccessTest (0 milliseconds) | |
[info] - nullToStringTest (0 milliseconds) | |
[info] - nonExistentColumnAccessTest (0 milliseconds) | |
[info] - primitiveConversionTest (60 milliseconds) | |
[info] - collectionConversionTest (4 milliseconds) | |
[info] - serializationTest (4 milliseconds) | |
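For reference, the accessors exercised above, assuming the CassandraRow.fromMap helper is available in this connector line and a hypothetical row (id int, name text, score double):

    import com.datastax.spark.connector.CassandraRow

    val row = CassandraRow.fromMap(Map("id" -> 1, "name" -> "anna", "score" -> 3.5))

    row.getInt("id")              // typed access with conversion
    row.getString("name")
    row.getDoubleOption("score")  // Option-based access for nullable columns
    row.toMap                     // the whole row as a Map[String, Any]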
[info] ConsolidateSettingsSpec: | |
[info] - should consolidate Cassandra conf settings in order of table level -> keyspace -> cluster -> default (7 milliseconds) | |
[info] ColumnSelectorSpec: | |
[info] A ColumnSelector#selectFrom method | |
[info] - should return all columns (3 milliseconds) | |
[info] - should return partition key columns (2 milliseconds) | |
[info] - should return some columns (3 milliseconds) | |
[info] - should throw a NoSuchElementException when selected column name is invalid (1 millisecond) | |
[info] CqlWhereParserTest: | |
[info] CqlWhereParser | |
[info] - should parse 'and' operations (24 milliseconds) | |
[info] - should parse equality predicates (2 milliseconds) | |
[info] - should parse range predicates (2 milliseconds) | |
[info] - should parse IN predicates (5 milliseconds) | |
[info] - should parse quoted names (2 milliseconds) | |
[info] - should return lowercase names (1 millisecond) | |
[info] - should parse strings (2 milliseconds) | |
[info] - should distinguish '?' from ? (2 milliseconds) | |
[info] - should accept >= (0 milliseconds) | |
[info] - should accept ? (1 millisecond) | |
[info] - should accept name with quotes and other special symbols (0 milliseconds) | |
[info] - should accept param with quotes and other special symbols (1 millisecond) | |
[info] - should accept uuid param (1 millisecond) | |
[info] - should accept float param (1 millisecond) | |
[info] - should parse case insensitive 'aNd' operations (2 milliseconds) | |
[info] - should parse case insensitive 'iN' operations (1 millisecond) | |
[info] - should parse case insensitive 'IN' operations ? (0 milliseconds) | |
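CqlWhereParser normalises user-supplied where() strings into predicate objects. A sketch, assuming CqlWhereParser.parse is the entry point (location and result names approximate):

    import com.datastax.spark.connector.cql.CqlWhereParser

    val predicates = CqlWhereParser.parse(
      """pk = ? AND ts > ? AND "Name" IN (?, ?)""")
    // Expected: an equality, a range and an IN predicate, with the quoted
    // name preserved and unquoted names lower-cased, as tested above.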
[info] RetryDelayConfSpec: | |
[info] ConstantDelay | |
[info] - should return the same delay regardless of the retry number (1 millisecond) | |
[info] LinearDelay | |
[info] - should return the calculated delay for different retry numbers (1 millisecond) | |
[info] ExponentialDelay | |
[info] - should return the calculated delay for different retry numbers (1 millisecond) | |
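The three policies compute the delay for the n-th retry. Self-contained versions of the formulas (parameter names here are illustrative):

    def constantDelay(base: Long)(retry: Int): Long = base
    def linearDelay(base: Long, increment: Long)(retry: Int): Long =
      base + increment * (retry - 1)
    def exponentialDelay(base: Long, factor: Double)(retry: Int): Long =
      (base * math.pow(factor, retry - 1)).toLong

    println((1 to 4).map(linearDelay(100, 50)))       // Vector(100, 150, 200, 250)
    println((1 to 4).map(exponentialDelay(100, 2.0))) // Vector(100, 200, 400, 800)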
[info] RateLimiterSpec: | |
[info] RateLimiter | |
[info] - should not cause delays if rate is not exceeded (68 milliseconds) | |
[info] - should sleep to not exceed the target rate (2 milliseconds) | |
[info] - should sleep and leak properly with different Rates (16 milliseconds) | |
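"Sleep and leak" refers to a leaky-bucket scheme: permits leak out at the target rate, and callers sleep once the bucket overflows. A minimal self-contained stand-in (not the connector's implementation):

    class SimpleRateLimiter(ratePerSec: Double, bucketSize: Long) {
      private var level = 0L
      private var lastLeak = System.nanoTime()

      def maybeSleep(permits: Long): Unit = synchronized {
        val now = System.nanoTime()
        val leaked = ((now - lastLeak) / 1e9 * ratePerSec).toLong
        level = math.max(0L, level - leaked)  // permits leak out over time
        lastLeak = now
        level += permits
        if (level > bucketSize)               // over capacity: stall the caller
          Thread.sleep(((level - bucketSize) / ratePerSec * 1000).toLong)
      }
    }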
[info] WriteConfTest: | |
[info] WriteConf | |
[info] - should be configured with proper defaults (0 milliseconds) | |
[info] - should allow setting the rate limit as a decimal (7 milliseconds) | |
[info] - should allow to set consistency level (0 milliseconds) | |
[info] - should allow to set parallelism level (0 milliseconds) | |
[info] - should allow to set batch size in bytes (0 milliseconds) | |
[info] - should allow to set batch size in bytes when rows are set to auto (0 milliseconds) | |
[info] - should allow to set batch size in rows (1 millisecond) | |
[info] - should allow to set batch level (2 milliseconds) | |
[info] - should allow to set batch buffer size (1 millisecond) | |
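WriteConf is populated from the spark.cassandra.output.* properties. A sketch using the 1.4-era key names (quoted from memory of the documented settings; treat them as approximate):

    import org.apache.spark.SparkConf
    import com.datastax.spark.connector.writer.WriteConf

    val conf = new SparkConf()
      .set("spark.cassandra.output.consistency.level", "LOCAL_QUORUM")
      .set("spark.cassandra.output.concurrent.writes", "8")
      .set("spark.cassandra.output.batch.size.rows", "64")
      .set("spark.cassandra.output.throughput_mb_per_sec", "2.5") // decimal rate limit

    val writeConf = WriteConf.fromSparkConf(conf)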
[info] Murmur3TokenFactorySpec: | |
[info] Murmur3TokenFactory | |
[info] - should create a token from String (1 millisecond) | |
[info] - should create a String representation of a token (0 milliseconds) | |
[info] - should calculate the distance between tokens if right > left (0 milliseconds) | |
[info] - should calculate the distance between tokens if right <= left (0 milliseconds) | |
[info] - should calculate ring fraction (1 millisecond) | |
[info] CassandraConnectorConfSpec: | |
[info] - should be serializable (18 milliseconds) | |
[info] - should match a conf with the same settings (3 milliseconds) | |
[info] - should resolve default SSL settings correctly (7 milliseconds) | |
[info] - should resolve provided SSL settings correctly (1 millisecond) | |
[info] - should resolve default retry delay settings correctly (0 milliseconds) | |
[info] - should resolve constant retry delay settings (2 milliseconds) | |
[info] - should resolve linear retry delay settings (2 milliseconds) | |
[info] - should resolve exponential retry delay settings (2 milliseconds) | |
[info] WriteOptionTest: | |
[info] TTLOption | |
[info] - should properly create constant write option with duration in seconds (2 milliseconds) | |
[info] - should properly create constant write option with scala.concurrent.duration.Duration (0 milliseconds) | |
[info] - should properly create constant write option with scala.concurrent.duration.Duration.Infinite (0 milliseconds) | |
[info] - should properly create constant write option with org.joda.time.Duration (2 milliseconds) | |
[info] - should properly create infinite duration (0 milliseconds) | |
[info] - should properly create per-row duration placeholder (0 milliseconds) | |
[info] TimestampOption | |
[info] - should properly create constant write option with timestamp in microseconds (0 milliseconds) | |
[info] - should properly create constant write option with DateTime (0 milliseconds) | |
[info] - should properly create constant write option with Date (0 milliseconds) | |
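TTLOption and TimestampOption feed into WriteConf when saving. A hypothetical save with a constant TTL (seconds) and a constant write timestamp (microseconds), assuming an existing RDD rdd of mapped objects:

    import com.datastax.spark.connector._
    import com.datastax.spark.connector.writer.{TTLOption, TimestampOption, WriteConf}

    val writeConf = WriteConf(
      ttl = TTLOption.constant(3600),
      timestamp = TimestampOption.constant(System.currentTimeMillis() * 1000L))

    rdd.saveToCassandra("test", "events", writeConf = writeConf)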
[info] InputMetricsUpdaterSpec: | |
[info] InputMetricsUpdater | |
[info] - should initialize task metrics properly when they are empty (4 milliseconds) | |
[info] - should create updater which uses task metrics (28 milliseconds) | |
[info] - should create updater which does not use task metrics (1 millisecond) | |
[info] - should create updater which uses Codahale metrics (3 milliseconds) | |
[info] - should create updater which doesn't use Codahale metrics (1 millisecond) | |
[info] ColumnTypeSpec: | |
[info] A ColumnType companion object | |
[info] - should throw InvalidArgumentException if given unsupported type (20 milliseconds) | |
[info] should allow to obtain a proper ColumnType | |
[info] - when given a Boolean should return BooleanType (2 milliseconds) | |
[info] - when given a java.lang.Boolean should return BooleanType (1 millisecond) | |
[info] - when given an Int should return IntType (0 milliseconds) | |
[info] - when given an java.lang.Integer should return IntType (1 millisecond) | |
[info] - when given a Long should return BigIntType (0 milliseconds) | |
[info] - when given a java.lang.Long should return BigIntType (0 milliseconds) | |
[info] - when given a Float should return FloatType (1 millisecond) | |
[info] - when given a java.lang.Float should return FloatType (1 millisecond) | |
[info] - when given a Double should return DoubleType (1 millisecond) | |
[info] - when given a java.lang.Double should return DoubleType (1 millisecond) | |
[info] - when given a String should return VarcharType (1 millisecond) | |
[info] - when given a java.util.Date should return TimestampType (1 millisecond) | |
[info] - when given a java.sql.Date should return TimestampType (1 millisecond) | |
[info] - when given a org.joda.time.DateTime should return TimestampType (2 milliseconds) | |
[info] - when given a ByteBuffer should return BlobType (2 milliseconds) | |
[info] - when given an Array[Byte] should return BlobType (3 milliseconds) | |
[info] - when given an UUID should return UUIDType (2 milliseconds) | |
[info] - when given a List[String] should return ListType(VarcharType) (4 milliseconds) | |
[info] - when given a Set[InetAddress] should return SetType(InetType) (5 milliseconds) | |
[info] - when given a Map[Int, Date] should return MapType(IntType, TimestampType) (4 milliseconds) | |
[info] - when given an Option[Int] should return IntType (2 milliseconds) | |
[info] - when given an Option[Vector[Int]] should return ListType(IntType) (4 milliseconds) | |
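The companion object maps Scala and Java types to Cassandra column types as listed above (Int -> IntType, Long -> BigIntType, collections recursively, Option[T] to the mapping of T). A sketch, assuming fromScalaType is the lookup these tests exercise:

    import scala.reflect.runtime.universe._
    import com.datastax.spark.connector.types._

    val t = ColumnType.fromScalaType(typeOf[Map[Int, java.util.Date]])
    println(t) // MapType(IntType, TimestampType)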
[info] SpanningIteratorSpec: | |
[info] SpanningIterator | |
[info] - should group an empty collection (2 milliseconds) | |
[info] - should group a sequence of elements with the same key into a single item and should preserve order (1 millisecond) | |
[info] - should group a sequence of elements with distinct keys into the same number of groups (2 milliseconds) | |
[info] - should group a sequence of elements with two keys into two groups (1 millisecond) | |
[info] - should be lazy and work with infinite streams (2 milliseconds) | |
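SpanningIterator groups consecutive elements sharing a key without materialising the whole input. A self-contained sketch of the technique:

    def spanBy[K, T](it: Iterator[T])(key: T => K): Iterator[(K, List[T])] =
      new Iterator[(K, List[T])] {
        private val buf = it.buffered
        def hasNext = buf.hasNext
        def next() = {
          val k = key(buf.head)
          val group = scala.collection.mutable.ListBuffer.empty[T]
          // Consume only while the key stays the same; laziness preserved.
          while (buf.hasNext && key(buf.head) == k) group += buf.next()
          (k, group.toList)
        }
      }

    println(spanBy(Iterator(1, 1, 2, 2, 2, 1))(identity).toList)
    // List((1,List(1, 1)), (2,List(2, 2, 2)), (1,List(1)))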
[info] PriorityHashMapSpec: | |
[info] A PriorityHashMap | |
[info] - should support adding elements (simple) (2 milliseconds) | |
[info] - should support adding elements ascending by value (31 milliseconds) | |
[info] - should support adding elements descending by value (20 milliseconds) | |
[info] - should support adding elements in random order of values (18 milliseconds) | |
[info] - should support adding elements in random order of values and keys (31 milliseconds) | |
[info] - should support removing elements in ascending order (8 milliseconds) | |
[info] - should support removing elements in descending order (11 milliseconds) | |
[info] - should support removing elements in random order from a sorted map (7 milliseconds) | |
[info] - should support removing elements from a randomly created map in random order (21 milliseconds) | |
[info] - should allow to heapsort an array of integers (18 milliseconds) | |
[info] - should allow to update item priority (6 milliseconds) | |
[info] - should be able to store multiple items with the same priority (0 milliseconds) | |
[info] - should return false when removing a non-existing key (0 milliseconds) | |
[info] - should have capacity rounded up to the nearest power of two (0 milliseconds) | |
[info] - should throw NoSuchElement exception if requested a head of empty map (1 millisecond) | |
[info] - should throw NoSuchElement exception if requested a non-existing key (1 millisecond) | |
[info] - should throw IllegalStateException exception if trying to exceed allowed capacity (1 millisecond) | |
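PriorityHashMap pairs a binary heap with a hash index so both the top-priority entry and by-key lookups are cheap. A rough stand-in built from standard collections (it lacks the in-place priority updates and the capacity limit tested above):

    import scala.collection.mutable

    val byKey = mutable.HashMap.empty[String, Int]
    val heap  = mutable.PriorityQueue.empty[(Int, String)](
      Ordering.by[(Int, String), Int](_._1))

    def put(key: String, priority: Int): Unit = {
      byKey(key) = priority
      heap.enqueue((priority, key))
    }

    put("a", 3); put("b", 7)
    println(heap.head)  // (7,b): highest-priority entry
    println(byKey("a")) // 3: O(1) lookup by key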
[info] GettableDataToMappedTypeConverterSpec: | |
[info] GettableDataToMappedTypeConverter | |
[info] - should be Serializable (63 milliseconds) | |
[info] - should convert a CassandraRow to a case class object (12 milliseconds) | |
[info] - should convert a CassandraRow to a case class object after being serialized/deserialized (12 milliseconds) | |
[info] - should convert a CassandraRow to a tuple (6 milliseconds) | |
[info] - should convert a CassandraRow to a tuple in reversed order (5 milliseconds) | |
[info] - should convert a CassandraRow to a tuple with a subset of columns (5 milliseconds) | |
[info] - should convert a UDTValue to a case class object (8 milliseconds) | |
[info] - should convert a TupleValue to a Scala tuple (6 milliseconds) | |
[info] - should allow for nesting UDTValues inside of TupleValues (13 milliseconds) | |
[info] - should allow for nesting TupleValues inside of UDTValues (16 milliseconds) | |
[info] - should convert nulls to Scala Nones (12 milliseconds) | |
[info] - should convert using custom column aliases (9 milliseconds) | |
[info] - should set property values with setters (8 milliseconds) | |
[info] - should apply proper type conversions for columns (9 milliseconds) | |
[info] - should convert a CassandraRow with a UDTValue into nested case class objects (17 milliseconds) | |
[info] - should convert a CassandraRow with a UDTValue into a case class with a nested tuple (12 milliseconds) | |
[info] - should convert a CassandraRow with an optional UDTValue (19 milliseconds) | |
[info] - should convert a CassandraRow with a list of UDTValues (40 milliseconds) | |
[info] - should convert a CassandraRow with a set of UDTValues (14 milliseconds) | |
[info] - should convert a CassandraRow with a collection of UDTValues (21 milliseconds) | |
[info] - should convert a CassandraRow with a collection of tuples (14 milliseconds) | |
[info] - should convert a CassandraRow to a JavaBean (7 milliseconds) | |
[info] - should convert a CassandraRow with UDTs to nested JavaBeans (12 milliseconds) | |
[info] - should throw a meaningful exception when a column type is not supported (5 milliseconds) | |
[info] - should throw a meaningful exception when a column value fails to be converted (5 milliseconds) | |
[info] - should throw NPE with a meaningful message when a column value is null (5 milliseconds) | |
[info] - should throw NPE when trying to access its targetTypeTag after serialization/deserialization (7 milliseconds) | |
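This converter is the read-side counterpart of the write-side mapping above: sc.cassandraTable[T] applies it to turn each row into a T. A sketch with the same hypothetical test.users table and an existing SparkContext sc:

    import com.datastax.spark.connector._

    case class User(id: Int, name: Option[String]) // a NULL name becomes None

    val users = sc.cassandraTable[User]("test", "users")
    users.collect().foreach(println)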
[info] Test run started | |
[info] Test com.datastax.spark.connector.writer.AsyncExecutorTest.test started | |
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.425s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.rdd.reader.ClassBasedRowReaderTest.testSerialize started | |
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.017s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testSetters1 started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testSetters2 started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.columnNameOverrideConstructor started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testGetters1 started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testGetters2 started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testWorkWithAliasesAndHonorOverrides started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.columnNameOverrideGetters started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testNotEnoughPropertiesForWriting started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testNewTableForClassWithVars started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testNewTableForEmptyClass started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testConstructorParams1 started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testConstructorParams2 started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testNotEnoughColumnsSelectedForReading started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testImplicit started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.columnNameOverrideSetters started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testNewTableForClassWithUnsupportedPropertyType started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testWorkWithAliases started | |
[info] Test com.datastax.spark.connector.mapper.DefaultColumnMapperTest.testNewTableForCaseClass started | |
[info] Test run finished: 0 failed, 0 ignored, 18 total, 0.062s | |
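DefaultColumnMapper resolves case-class properties to column names, with optional overrides. A sketch, assuming the columnNameOverride constructor argument seen in these test names:

    import com.datastax.spark.connector.mapper.DefaultColumnMapper

    case class Person(firstName: String, lastName: String)

    // Map the firstName property to a column that does not follow the
    // default camelCase -> snake_case convention.
    implicit val mapper =
      new DefaultColumnMapper[Person](Map("firstName" -> "fname"))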
[info] Test run started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaDouble started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testInetAddress started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testSerializeMapConverter started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaInteger started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testChainedConverters started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testTreeMap started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testTreeSet started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testInt started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testMap started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testSet started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testByteArray started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testBigDecimal started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testFloat started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testDate started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testList started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testLong started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testPair started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testUUID started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testTypeAliases started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJodaTime started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testCalendar1 started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testCalendar2 started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaBigDecimal started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testBoolean started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaFloat started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testUnsupportedType started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testRegisterCustomConverterExtension started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaBigInteger started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaList started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaLong started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaBoolean started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testRegisterCustomConverter started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testSerializeCollectionConverter started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testBigInt started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testChainedConverterSerializability started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testDouble started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaHashMap started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaHashSet started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testOption started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testString started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testTriple started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testOptionToNullConverter started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaArrayList started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaMap started | |
[info] Test com.datastax.spark.connector.types.TypeConverterTest.testJavaSet started | |
[info] Test run finished: 0 failed, 0 ignored, 45 total, 0.104s | |
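TypeConverter instances accept loosely-typed input and produce the requested Scala type; forType is the registry lookup (name from the connector source). A sketch:

    import com.datastax.spark.connector.types.TypeConverter

    val toInt = TypeConverter.forType[Int]
    println(toInt.convert("123"))                      // 123 (string parsed)
    println(toInt.convert(java.lang.Long.valueOf(7L))) // 7   (numeric narrowing)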
[info] Test run started | |
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testIncompleteConstructor started | |
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testIncompleteGetters started | |
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testGetters started | |
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testSerialize started | |
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testImplicit started | |
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testConstructor started | |
[info] Test com.datastax.spark.connector.mapper.TupleColumnMapperTest.testNewTable started | |
[info] Test run finished: 0 failed, 0 ignored, 7 total, 0.013s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.writer.PropertyExtractorTest.testSimpleExtraction started | |
[info] Test com.datastax.spark.connector.writer.PropertyExtractorTest.testWrongPropertyName started | |
[info] Test com.datastax.spark.connector.writer.PropertyExtractorTest.testAvailableProperties started | |
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.003s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.types.CanBuildFromTest.testBuild started | |
[info] Test com.datastax.spark.connector.types.CanBuildFromTest.testSerializeAndBuild started | |
[info] Test com.datastax.spark.connector.types.CanBuildFromTest.testSerializeAndBuildWithOrdering started | |
[info] Test run finished: 0 failed, 0 ignored, 3 total, 0.003s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.writer.DefaultRowWriterTest.testTypeConversionsInUDTValuesAreApplied started | |
[info] Test com.datastax.spark.connector.writer.DefaultRowWriterTest.testTypeConversionsAreApplied started | |
[info] Test com.datastax.spark.connector.writer.DefaultRowWriterTest.testSerializability started | |
[info] Test com.datastax.spark.connector.writer.DefaultRowWriterTest.testCustomTypeConvertersAreUsed started | |
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.024s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.Murmur3PartitionerTokenRangeSplitterTest.testSplit started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.Murmur3PartitionerTokenRangeSplitterTest.testZeroRows started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.Murmur3PartitionerTokenRangeSplitterTest.testWrapAround started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.Murmur3PartitionerTokenRangeSplitterTest.testNoSplit started | |
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.009s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testWorkWithAliasesAndHonorOverrides started | |
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testSerializeColumnMap started | |
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testGetters started | |
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testColumnNameOverrideSetters started | |
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testImplicit started | |
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testSetters started | |
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testWorkWithAliases started | |
[info] Test com.datastax.spark.connector.mapper.JavaBeanColumnMapperTest.testColumnNameOverrideGetters started | |
[info] Test run finished: 0 failed, 0 ignored, 8 total, 0.008s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.types.TypeSerializationTest.testSerializationOfCollectionTypes started | |
[info] Test com.datastax.spark.connector.types.TypeSerializationTest.testSerializationOfPrimitiveTypes started | |
[info] Test run finished: 0 failed, 0 ignored, 2 total, 0.01s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.RandomPartitionerTokenRangeSplitterTest.testSplit started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.RandomPartitionerTokenRangeSplitterTest.testZeroRows started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.RandomPartitionerTokenRangeSplitterTest.testWrapAround started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.RandomPartitionerTokenRangeSplitterTest.testNoSplit started | |
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.005s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testTrivialClustering started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testMultipleEndpoints started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testEmpty started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testTooLargeRanges started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testMaxClusterSize started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testSplitByHost started | |
[info] Test com.datastax.spark.connector.rdd.partitioner.TokenRangeClustererTest.testSplitByCount started | |
[info] Test run finished: 0 failed, 0 ignored, 7 total, 0.01s | |
15/09/17 16:44:36 INFO Utils: Shutdown hook called | |
[info] ScalaTest | |
[info] Run completed in 5 seconds, 887 milliseconds. | |
[info] Total number of tests run: 263 | |
[info] Suites: completed 24, aborted 0 | |
[info] Tests: succeeded 263, failed 0, canceled 0, ignored 0, pending 0 | |
[info] All tests passed. | |
[info] Passed: Total 370, Failed 0, Errors 0, Passed 370 | |
[info] Checking every *.class/*.jar file's SHA-1. | |
[info] Merging files... | |
[warn] Merging 'META-INF/MANIFEST.MF' with strategy 'discard' | |
[warn] Strategy 'discard' was applied to a file | |
[info] SHA-1: 9e02cf66722c3639b394484f5a030fa5ad45e0c4 | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar ... | |
[info] Done packaging. | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.japi.CustomTypeConverterTest.test1 started | |
[info] Test run finished: 0 failed, 0 ignored, 1 total, 1.282s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testWithConnector started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testWithReadConf started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testWithAscOrder started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testSelectColumnNames started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testLimit started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testWhere started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testSelectColumns started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testSelectedColumnRefs started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testSelectedColumnNames started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaRDDTest.testWithDescOrder started | |
[info] Test run finished: 0 failed, 0 ignored, 10 total, 0.258s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetBytes started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetFloat started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetShort started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGet started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testToMap started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetDateTime started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetByte started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetDate started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetInet started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetList started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetLong started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetUUID started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetObjectAndApply started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetBoolean started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetDouble started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetInt started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetMap started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetSet started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetString started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetVarInt started | |
[info] Test com.datastax.spark.connector.japi.CassandraRowTest.testGetDecimal started | |
[info] Test run finished: 0 failed, 0 ignored, 21 total, 0.193s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJoinJavaRDDTest.testOn started | |
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.061s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.japi.SparkContextJavaFunctionsTest.testReadConfPopulating started | |
[info] Test run finished: 0 failed, 0 ignored, 1 total, 0.193s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testWithConnector started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testWithReadConf started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testWithAscOrder started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testSelectColumnNames started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testLimit started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testWhere started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testSelectColumns started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testSelectedColumnRefs started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testSelectedColumnNames started | |
[info] Test com.datastax.spark.connector.japi.rdd.CassandraJavaPairRDDTest.testWithDescOrder started | |
[info] Test run finished: 0 failed, 0 ignored, 10 total, 0.007s | |
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties | |
15/09/17 16:44:42 INFO Utils: Shutdown hook called | |
[info] ScalaTest | |
[info] Run completed in 2 seconds, 872 milliseconds. | |
[info] Total number of tests run: 0 | |
[info] Suites: completed 0, aborted 0 | |
[info] Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0 | |
[info] No tests were executed. | |
[info] Passed: Total 44, Failed 0, Errors 0, Passed 44 | |
[info] Checking every *.class/*.jar file's SHA-1. | |
[info] Merging files... | |
[warn] Merging 'META-INF/MANIFEST.MF' with strategy 'discard' | |
[warn] Strategy 'discard' was applied to a file | |
[info] SHA-1: 3e2a2ce78bd801326781e9e2ebb006552d38df65 | |
[info] Packaging /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar ... | |
[info] Done packaging. | |
[success] Total time: 16 s, completed Sep 17, 2015 4:44:45 PM | |
Spark 1.4.0 for Scala 2.11 already downloaded | |
Installing Spark 1.4.0 for Scala 2.11 | |
x NOTICE | |
x CHANGES.txt | |
x python/ | |
x python/test_support/ | |
x python/test_support/userlibrary.py | |
x python/test_support/userlib-0.1.zip | |
x python/test_support/SimpleHTTPServer.py | |
x python/test_support/hello.txt | |
x python/docs/ | |
x python/docs/pyspark.ml.rst | |
x python/docs/pyspark.streaming.rst | |
x python/docs/conf.py | |
x python/docs/pyspark.rst | |
x python/docs/make.bat | |
x python/docs/epytext.py | |
x python/docs/make2.bat | |
x python/docs/index.rst | |
x python/docs/pyspark.sql.rst | |
x python/docs/pyspark.mllib.rst | |
x python/docs/Makefile | |
x python/.gitignore | |
x python/pyspark/ | |
x python/pyspark/status.py | |
x python/pyspark/conf.py | |
x python/pyspark/ml/ | |
x python/pyspark/ml/evaluation.py | |
x python/pyspark/ml/util.py | |
x python/pyspark/ml/classification.py | |
x python/pyspark/ml/regression.py | |
x python/pyspark/ml/tests.py | |
x python/pyspark/ml/tuning.py | |
x python/pyspark/ml/pipeline.py | |
x python/pyspark/ml/feature.py | |
x python/pyspark/ml/recommendation.py | |
x python/pyspark/ml/__init__.py | |
x python/pyspark/ml/wrapper.py | |
x python/pyspark/ml/param/ | |
x python/pyspark/ml/param/_shared_params_code_gen.py | |
x python/pyspark/ml/param/shared.py | |
x python/pyspark/ml/param/__init__.py | |
x python/pyspark/statcounter.py | |
x python/pyspark/profiler.py | |
x python/pyspark/serializers.py | |
x python/pyspark/traceback_utils.py | |
x python/pyspark/shell.py | |
x python/pyspark/sql/ | |
x python/pyspark/sql/window.py | |
x python/pyspark/sql/tests.py | |
x python/pyspark/sql/group.py | |
x python/pyspark/sql/types.py | |
x python/pyspark/sql/context.py | |
x python/pyspark/sql/dataframe.py | |
x python/pyspark/sql/column.py | |
x python/pyspark/sql/__init__.py | |
x python/pyspark/sql/readwriter.py | |
x python/pyspark/sql/functions.py | |
x python/pyspark/daemon.py | |
x python/pyspark/tests.py | |
x python/pyspark/resultiterable.py | |
x python/pyspark/heapq3.py | |
x python/pyspark/broadcast.py | |
x python/pyspark/shuffle.py | |
x python/pyspark/cloudpickle.py | |
x python/pyspark/accumulators.py | |
x python/pyspark/java_gateway.py | |
x python/pyspark/streaming/ | |
x python/pyspark/streaming/util.py | |
x python/pyspark/streaming/tests.py | |
x python/pyspark/streaming/kafka.py | |
x python/pyspark/streaming/dstream.py | |
x python/pyspark/streaming/context.py | |
x python/pyspark/streaming/__init__.py | |
x python/pyspark/context.py | |
x python/pyspark/storagelevel.py | |
x python/pyspark/__init__.py | |
x python/pyspark/join.py | |
x python/pyspark/mllib/ | |
x python/pyspark/mllib/tree.py | |
x python/pyspark/mllib/linalg.py | |
x python/pyspark/mllib/evaluation.py | |
x python/pyspark/mllib/util.py | |
x python/pyspark/mllib/classification.py | |
x python/pyspark/mllib/regression.py | |
x python/pyspark/mllib/tests.py | |
x python/pyspark/mllib/common.py | |
x python/pyspark/mllib/feature.py | |
x python/pyspark/mllib/clustering.py | |
x python/pyspark/mllib/recommendation.py | |
x python/pyspark/mllib/stat/ | |
x python/pyspark/mllib/stat/__init__.py | |
x python/pyspark/mllib/stat/_statistics.py | |
x python/pyspark/mllib/stat/test.py | |
x python/pyspark/mllib/stat/distribution.py | |
x python/pyspark/mllib/random.py | |
x python/pyspark/mllib/__init__.py | |
x python/pyspark/mllib/fpm.py | |
x python/pyspark/rdd.py | |
x python/pyspark/rddsampler.py | |
x python/pyspark/worker.py | |
x python/pyspark/files.py | |
x python/run-tests | |
x python/lib/ | |
x python/lib/py4j-0.8.2.1-src.zip | |
x python/lib/pyspark.zip | |
x python/lib/PY4J_LICENSE.txt | |
x RELEASE | |
x sbin/ | |
x sbin/start-mesos-dispatcher.sh | |
x sbin/spark-daemon.sh | |
x sbin/stop-slaves.sh | |
x sbin/stop-thriftserver.sh | |
x sbin/stop-shuffle-service.sh | |
x sbin/stop-history-server.sh | |
x sbin/spark-config.sh | |
x sbin/start-history-server.sh | |
x sbin/start-thriftserver.sh | |
x sbin/start-shuffle-service.sh | |
x sbin/spark-daemons.sh | |
x sbin/start-all.sh | |
x sbin/stop-master.sh | |
x sbin/stop-mesos-dispatcher.sh | |
x sbin/stop-slave.sh | |
x sbin/start-slave.sh | |
x sbin/start-slaves.sh | |
x sbin/stop-all.sh | |
x sbin/slaves.sh | |
x sbin/start-master.sh | |
x examples/ | |
x examples/src/ | |
x examples/src/main/ | |
x examples/src/main/r/ | |
x examples/src/main/r/dataframe.R | |
x examples/src/main/python/ | |
x examples/src/main/python/status_api_demo.py | |
x examples/src/main/python/ml/ | |
x examples/src/main/python/ml/simple_text_classification_pipeline.py | |
x examples/src/main/python/ml/random_forest_example.py | |
x examples/src/main/python/ml/gradient_boosted_trees.py | |
x examples/src/main/python/ml/simple_params_example.py | |
x examples/src/main/python/pagerank.py | |
x examples/src/main/python/wordcount.py | |
x examples/src/main/python/pi.py | |
x examples/src/main/python/hbase_inputformat.py | |
x examples/src/main/python/logistic_regression.py | |
x examples/src/main/python/cassandra_outputformat.py | |
x examples/src/main/python/streaming/ | |
x examples/src/main/python/streaming/sql_network_wordcount.py | |
x examples/src/main/python/streaming/network_wordcount.py | |
x examples/src/main/python/streaming/kafka_wordcount.py | |
x examples/src/main/python/streaming/stateful_network_wordcount.py | |
x examples/src/main/python/streaming/direct_kafka_wordcount.py | |
x examples/src/main/python/streaming/hdfs_wordcount.py | |
x examples/src/main/python/streaming/recoverable_network_wordcount.py | |
x examples/src/main/python/transitive_closure.py | |
x examples/src/main/python/kmeans.py | |
x examples/src/main/python/avro_inputformat.py | |
x examples/src/main/python/mllib/ | |
x examples/src/main/python/mllib/sampled_rdds.py | |
x examples/src/main/python/mllib/gaussian_mixture_model.py | |
x examples/src/main/python/mllib/logistic_regression.py | |
x examples/src/main/python/mllib/random_forest_example.py | |
x examples/src/main/python/mllib/dataset_example.py | |
x examples/src/main/python/mllib/word2vec.py | |
x examples/src/main/python/mllib/decision_tree_runner.py | |
x examples/src/main/python/mllib/kmeans.py | |
x examples/src/main/python/mllib/random_rdd_generation.py | |
x examples/src/main/python/mllib/gradient_boosted_trees.py | |
x examples/src/main/python/mllib/correlations.py | |
x examples/src/main/python/parquet_inputformat.py | |
x examples/src/main/python/hbase_outputformat.py | |
x examples/src/main/python/als.py | |
x examples/src/main/python/sql.py | |
x examples/src/main/python/sort.py | |
x examples/src/main/python/cassandra_inputformat.py | |
x examples/src/main/java/ | |
x examples/src/main/java/org/ | |
x examples/src/main/java/org/apache/ | |
x examples/src/main/java/org/apache/spark/ | |
x examples/src/main/java/org/apache/spark/examples/ | |
x examples/src/main/java/org/apache/spark/examples/ml/ | |
x examples/src/main/java/org/apache/spark/examples/ml/JavaSimpleTextClassificationPipeline.java | |
x examples/src/main/java/org/apache/spark/examples/ml/JavaSimpleParamsExample.java | |
x examples/src/main/java/org/apache/spark/examples/ml/JavaDeveloperApiExample.java | |
x examples/src/main/java/org/apache/spark/examples/ml/JavaCrossValidatorExample.java | |
x examples/src/main/java/org/apache/spark/examples/ml/JavaOneVsRestExample.java | |
x examples/src/main/java/org/apache/spark/examples/JavaSparkPi.java | |
x examples/src/main/java/org/apache/spark/examples/sql/ | |
x examples/src/main/java/org/apache/spark/examples/sql/JavaSparkSQL.java | |
x examples/src/main/java/org/apache/spark/examples/JavaLogQuery.java | |
x examples/src/main/java/org/apache/spark/examples/JavaTC.java | |
x examples/src/main/java/org/apache/spark/examples/JavaStatusTrackerDemo.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/ | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaRecord.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaFlumeEventCount.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaDirectKafkaWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaNetworkWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaSqlNetworkWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaRecoverableNetworkWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaStatefulNetworkWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaCustomReceiver.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaQueueStream.java | |
x examples/src/main/java/org/apache/spark/examples/streaming/JavaKafkaWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/JavaHdfsLR.java | |
x examples/src/main/java/org/apache/spark/examples/JavaPageRank.java | |
x examples/src/main/java/org/apache/spark/examples/JavaWordCount.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/ | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaRandomForestExample.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaLDAExample.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaDecisionTree.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaPowerIterationClusteringExample.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaALS.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaFPGrowthExample.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaGradientBoostedTreesRunner.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaLR.java | |
x examples/src/main/java/org/apache/spark/examples/mllib/JavaKMeans.java | |
x examples/src/main/scala/ | |
x examples/src/main/scala/org/ | |
x examples/src/main/scala/org/apache/ | |
x examples/src/main/scala/org/apache/spark/ | |
x examples/src/main/scala/org/apache/spark/examples/ | |
x examples/src/main/scala/org/apache/spark/examples/GroupByTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/ | |
x examples/src/main/scala/org/apache/spark/examples/ml/RandomForestExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/CrossValidatorExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/GBTExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/DeveloperApiExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/DecisionTreeExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/MovieLensALS.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/OneVsRestExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/SimpleTextClassificationPipeline.scala | |
x examples/src/main/scala/org/apache/spark/examples/ml/SimpleParamsExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkKMeans.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkTachyonPi.scala | |
x examples/src/main/scala/org/apache/spark/examples/MultiBroadcastTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/sql/ | |
x examples/src/main/scala/org/apache/spark/examples/sql/hive/ | |
x examples/src/main/scala/org/apache/spark/examples/sql/hive/HiveFromSpark.scala | |
x examples/src/main/scala/org/apache/spark/examples/sql/RDDRelation.scala | |
x examples/src/main/scala/org/apache/spark/examples/pythonconverters/ | |
x examples/src/main/scala/org/apache/spark/examples/pythonconverters/HBaseConverters.scala | |
x examples/src/main/scala/org/apache/spark/examples/pythonconverters/CassandraConverters.scala | |
x examples/src/main/scala/org/apache/spark/examples/pythonconverters/AvroConverters.scala | |
x examples/src/main/scala/org/apache/spark/examples/LocalLR.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkTC.scala | |
x examples/src/main/scala/org/apache/spark/examples/BroadcastTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/ExceptionHandlingTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/LocalKMeans.scala | |
x examples/src/main/scala/org/apache/spark/examples/graphx/ | |
x examples/src/main/scala/org/apache/spark/examples/graphx/Analytics.scala | |
x examples/src/main/scala/org/apache/spark/examples/graphx/SynthBenchmark.scala | |
x examples/src/main/scala/org/apache/spark/examples/graphx/LiveJournalPageRank.scala | |
x examples/src/main/scala/org/apache/spark/examples/HdfsTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/SimpleSkewedGroupByTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkPageRank.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkTachyonHdfsLR.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/ | |
x examples/src/main/scala/org/apache/spark/examples/streaming/StatefulNetworkWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/TwitterAlgebirdCMS.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/HdfsWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/DirectKafkaWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/QueueStream.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/TwitterPopularTags.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/FlumePollingEventCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/SqlNetworkWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/FlumeEventCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/ZeroMQWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/RecoverableNetworkWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/clickstream/ | |
x examples/src/main/scala/org/apache/spark/examples/streaming/clickstream/PageViewStream.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/clickstream/PageViewGenerator.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/ActorWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/MQTTWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/TwitterAlgebirdHLL.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/StreamingExamples.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/NetworkWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/CustomReceiver.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/KafkaWordCount.scala | |
x examples/src/main/scala/org/apache/spark/examples/streaming/RawNetworkGrep.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkPi.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkALS.scala | |
x examples/src/main/scala/org/apache/spark/examples/LocalFileLR.scala | |
x examples/src/main/scala/org/apache/spark/examples/DriverSubmissionTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/LogQuery.scala | |
x examples/src/main/scala/org/apache/spark/examples/HBaseTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/ | |
x examples/src/main/scala/org/apache/spark/examples/mllib/AbstractParams.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/StreamingKMeansExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/TallSkinnyPCA.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/StreamingLinearRegression.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/DecisionTreeRunner.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/TallSkinnySVD.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/DenseGaussianMixture.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/DatasetExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/Correlations.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/MovieLensALS.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/CosineSimilarity.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/StreamingLogisticRegression.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/SampledRDDs.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/RandomRDDGeneration.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/SparseNaiveBayes.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/BinaryClassification.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/PowerIterationClusteringExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/DenseKMeans.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/MultivariateSummarizer.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/FPGrowthExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/GradientBoostedTreesRunner.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/LinearRegression.scala | |
x examples/src/main/scala/org/apache/spark/examples/mllib/LDAExample.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkLR.scala | |
x examples/src/main/scala/org/apache/spark/examples/LocalPi.scala | |
x examples/src/main/scala/org/apache/spark/examples/CassandraCQLTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/SparkHdfsLR.scala | |
x examples/src/main/scala/org/apache/spark/examples/SkewedGroupByTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/CassandraTest.scala | |
x examples/src/main/scala/org/apache/spark/examples/LocalALS.scala | |
x examples/src/main/resources/ | |
x examples/src/main/resources/people.json | |
x examples/src/main/resources/people.txt | |
x examples/src/main/resources/full_user.avsc | |
x examples/src/main/resources/kv1.txt | |
x examples/src/main/resources/users.parquet | |
x examples/src/main/resources/user.avsc | |
x examples/src/main/resources/users.avro | |
x data/ | |
x data/mllib/ | |
x data/mllib/pagerank_data.txt | |
x data/mllib/kmeans_data.txt | |
x data/mllib/als/ | |
x data/mllib/als/sample_movielens_movies.txt | |
x data/mllib/als/test.data | |
x data/mllib/als/sample_movielens_ratings.txt | |
x data/mllib/lr-data/ | |
x data/mllib/lr-data/random.data | |
x data/mllib/sample_naive_bayes_data.txt | |
x data/mllib/sample_tree_data.csv | |
x data/mllib/sample_fpgrowth.txt | |
x data/mllib/sample_libsvm_data.txt | |
x data/mllib/ridge-data/ | |
x data/mllib/ridge-data/lpsa.data | |
x data/mllib/sample_multiclass_classification_data.txt | |
x data/mllib/sample_linear_regression_data.txt | |
x data/mllib/sample_isotonic_regression_data.txt | |
x data/mllib/sample_binary_classification_data.txt | |
x data/mllib/sample_lda_data.txt | |
x data/mllib/sample_movielens_data.txt | |
x data/mllib/sample_svm_data.txt | |
x data/mllib/lr_data.txt | |
x data/mllib/gmm_data.txt | |
x R/ | |
x R/lib/ | |
x R/lib/SparkR/ | |
x R/lib/SparkR/html/ | |
x R/lib/SparkR/html/groupBy.html | |
x R/lib/SparkR/html/sql.html | |
x R/lib/SparkR/html/DataFrame.html | |
x R/lib/SparkR/html/hashCode.html | |
x R/lib/SparkR/html/distinct.html | |
x R/lib/SparkR/html/print.jobj.html | |
x R/lib/SparkR/html/saveAsParquetFile.html | |
x R/lib/SparkR/html/sparkRHive.init.html | |
x R/lib/SparkR/html/registerTempTable.html | |
x R/lib/SparkR/html/tables.html | |
x R/lib/SparkR/html/structType.html | |
x R/lib/SparkR/html/parquetFile.html | |
x R/lib/SparkR/html/isLocal.html | |
x R/lib/SparkR/html/tableNames.html | |
x R/lib/SparkR/html/createDataFrame.html | |
x R/lib/SparkR/html/except.html | |
x R/lib/SparkR/html/withColumn.html | |
x R/lib/SparkR/html/print.structType.html | |
x R/lib/SparkR/html/count.html | |
x R/lib/SparkR/html/saveAsTable.html | |
x R/lib/SparkR/html/describe.html | |
x R/lib/SparkR/html/persist.html | |
x R/lib/SparkR/html/selectExpr.html | |
x R/lib/SparkR/html/jsonFile.html | |
x R/lib/SparkR/html/insertInto.html | |
x R/lib/SparkR/html/unionAll.html | |
x R/lib/SparkR/html/showDF.html | |
x R/lib/SparkR/html/schema.html | |
x R/lib/SparkR/html/filter.html | |
x R/lib/SparkR/html/cache-methods.html | |
x R/lib/SparkR/html/table.html | |
x R/lib/SparkR/html/head.html | |
x R/lib/SparkR/html/limit.html | |
x R/lib/SparkR/html/structField.html | |
x R/lib/SparkR/html/cacheTable.html | |
x R/lib/SparkR/html/dtypes.html | |
x R/lib/SparkR/html/columns.html | |
x R/lib/SparkR/html/unpersist-methods.html | |
x R/lib/SparkR/html/arrange.html | |
x R/lib/SparkR/html/collect-methods.html | |
x R/lib/SparkR/html/uncacheTable.html | |
x R/lib/SparkR/html/infer_type.html | |
x R/lib/SparkR/html/sparkR.stop.html | |
x R/lib/SparkR/html/explain.html | |
x R/lib/SparkR/html/R.css | |
x R/lib/SparkR/html/take.html | |
x R/lib/SparkR/html/column.html | |
x R/lib/SparkR/html/show.html | |
x R/lib/SparkR/html/printSchema.html | |
x R/lib/SparkR/html/createExternalTable.html | |
x R/lib/SparkR/html/sparkR.init.html | |
x R/lib/SparkR/html/agg.html | |
x R/lib/SparkR/html/00Index.html | |
x R/lib/SparkR/html/print.structField.html | |
x R/lib/SparkR/html/write.df.html | |
x R/lib/SparkR/html/intersect.html | |
x R/lib/SparkR/html/sparkRSQL.init.html | |
x R/lib/SparkR/html/select.html | |
x R/lib/SparkR/html/dropTempTable.html | |
x R/lib/SparkR/html/sample.html | |
x R/lib/SparkR/html/repartition.html | |
x R/lib/SparkR/html/first.html | |
x R/lib/SparkR/html/read.df.html | |
x R/lib/SparkR/html/withColumnRenamed.html | |
x R/lib/SparkR/html/clearCache.html | |
x R/lib/SparkR/html/nafunctions.html | |
x R/lib/SparkR/html/GroupedData.html | |
x R/lib/SparkR/html/join.html | |
x R/lib/SparkR/INDEX | |
x R/lib/SparkR/R/ | |
x R/lib/SparkR/R/SparkR.rdx | |
x R/lib/SparkR/R/SparkR | |
x R/lib/SparkR/R/SparkR.rdb | |
x R/lib/SparkR/help/ | |
x R/lib/SparkR/help/aliases.rds | |
x R/lib/SparkR/help/paths.rds | |
x R/lib/SparkR/help/SparkR.rdx | |
x R/lib/SparkR/help/AnIndex | |
x R/lib/SparkR/help/SparkR.rdb | |
x R/lib/SparkR/DESCRIPTION | |
x R/lib/SparkR/Meta/ | |
x R/lib/SparkR/Meta/hsearch.rds | |
x R/lib/SparkR/Meta/nsInfo.rds | |
x R/lib/SparkR/Meta/package.rds | |
x R/lib/SparkR/Meta/links.rds | |
x R/lib/SparkR/Meta/Rd.rds | |
x R/lib/SparkR/profile/ | |
x R/lib/SparkR/profile/general.R | |
x R/lib/SparkR/profile/shell.R | |
x R/lib/SparkR/worker/ | |
x R/lib/SparkR/worker/daemon.R | |
x R/lib/SparkR/worker/worker.R | |
x R/lib/SparkR/NAMESPACE | |
x R/lib/SparkR/tests/ | |
x R/lib/SparkR/tests/test_textFile.R | |
x R/lib/SparkR/tests/test_broadcast.R | |
x R/lib/SparkR/tests/test_binaryFile.R | |
x R/lib/SparkR/tests/test_rdd.R | |
x R/lib/SparkR/tests/test_sparkSQL.R | |
x R/lib/SparkR/tests/test_parallelize_collect.R | |
x R/lib/SparkR/tests/test_includePackage.R | |
x R/lib/SparkR/tests/test_shuffle.R | |
x R/lib/SparkR/tests/test_binary_function.R | |
x R/lib/SparkR/tests/test_context.R | |
x R/lib/SparkR/tests/test_take.R | |
x R/lib/SparkR/tests/test_utils.R | |
x ec2/ | |
x ec2/spark_ec2.py | |
x ec2/README | |
x ec2/spark-ec2 | |
x ec2/deploy.generic/ | |
x ec2/deploy.generic/root/ | |
x ec2/deploy.generic/root/spark-ec2/ | |
x ec2/deploy.generic/root/spark-ec2/ec2-variables.sh | |
x conf/ | |
x conf/fairscheduler.xml.template | |
x conf/metrics.properties.template | |
x conf/spark-env.sh.template | |
x conf/log4j.properties.template | |
x conf/docker.properties.template | |
x conf/slaves.template | |
x conf/spark-defaults.conf.template | |
x LICENSE | |
x bin/ | |
x bin/spark-shell | |
x bin/spark-submit.cmd | |
x bin/spark-shell2.cmd | |
x bin/pyspark | |
x bin/sparkR.cmd | |
x bin/spark-class2.cmd | |
x bin/run-example.cmd | |
x bin/spark-submit2.cmd | |
x bin/spark-class | |
x bin/spark-submit | |
x bin/spark-sql | |
x bin/run-example | |
x bin/beeline | |
x bin/pyspark2.cmd | |
x bin/spark-shell.cmd | |
x bin/spark-class.cmd | |
x bin/pyspark.cmd | |
x bin/sparkR | |
x bin/beeline.cmd | |
x bin/sparkR2.cmd | |
x bin/run-example2.cmd | |
x bin/load-spark-env.sh | |
x bin/load-spark-env.cmd | |
x lib/ | |
x lib/datanucleus-core-3.2.10.jar | |
x lib/datanucleus-api-jdo-3.2.6.jar | |
x lib/spark-examples-1.4.0-hadoop1.0.4.jar | |
x lib/datanucleus-rdbms-3.2.9.jar | |
x lib/spark-assembly-1.4.0-hadoop1.0.4.jar | |
x README.md | |
Running Spark cluster | |
starting org.apache.spark.deploy.master.Master, logging to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/log/spark-jlewandowski-org.apache.spark.deploy.master.Master-1-ursus-major.out | |
starting org.apache.spark.deploy.worker.Worker, logging to /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/log/spark-jlewandowski-org.apache.spark.deploy.worker.Worker-2-ursus-major.out | |
Running tests for Spark 1.4.0 and Scala 2.11 | |
Launching sbt from sbt/sbt-launch-0.13.8.jar | |
[info] Loading project definition from /Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/project | |
Using releases: https://oss.sonatype.org/service/local/staging/deploy/maven2 for releases | |
Using snapshots: https://oss.sonatype.org/content/repositories/snapshots for snapshots | |
Scala: 2.11.6 | |
Scala Binary: 2.11 | |
Java: target=1.7 user=1.7.0_79 | |
[info] Set current project to root (in build file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/) | |
objc[55379]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:45:12,932 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native method will be unavailable. | |
WARN 16:45:12,934 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
objc[55381]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:45:16,420 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native method will be unavailable. | |
WARN 16:45:16,423 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.master=spark://127.0.0.1:7777 | |
WARN 16:45:19,249 org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable | |
WARN 16:45:19,354 org.apache.spark.Logging$class (Logging.scala:71) - Your hostname, ursus-major resolves to a loopback address: 127.0.0.1; using 192.168.1.105 instead (on interface en0) | |
WARN 16:45:19,355 org.apache.spark.Logging$class (Logging.scala:71) - Set SPARK_LOCAL_IP if you need to bind to another address | |
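The configuration printed above is assembled by the test harness before each suite. A minimal sketch of building an equivalent context in Scala (only the printed property values are taken from the log; the object and variable names are illustrative):

    // Sketch only: reconstructs the context whose settings the harness prints above.
    import org.apache.spark.{SparkConf, SparkContext}

    object TestContextSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("Test")
          .setMaster("spark://127.0.0.1:7777")
          .set("spark.cassandra.connection.host", "127.0.0.1")
          .set("spark.cassandra.connection.port", "9042")
          .set("spark.cleaner.ttl", "3600")
        val sc = new SparkContext(conf) // the connector reads its settings from this conf
        try {
          // ... run jobs against Cassandra here ...
        } finally {
          sc.stop()
        }
      }
    }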
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.11/kafka-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.11/simple-demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.11/demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.11/twitter-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.11/root_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/javassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.11/jars/kafka_2.11-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.11/jars/scalamock-scalatest-support_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.11/jars/akka-testkit_2.11-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.11/bundles/scalactic_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.11/jars/spark-streaming_2.11-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.11/bundles/scalatest_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.11/jars/scalamock-core_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraSQLClusterLevelSpec: | |
INFO 16:45:24,838 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:24,845 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:25,077 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:25,077 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to join tables from different clusters (8 seconds, 137 milliseconds) | |
INFO 16:45:30,876 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:30,877 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:31,765 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:31,765 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to write data to another cluster (1 second, 520 milliseconds) | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.11/kafka-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.11/simple-demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.11/demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.11/twitter-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.11/root_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/javassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.11/jars/kafka_2.11-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.11/jars/scalamock-scalatest-support_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.11/jars/akka-testkit_2.11-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.11/bundles/scalactic_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.11/jars/spark-streaming_2.11-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.11/bundles/scalatest_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.11/jars/scalamock-core_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraSQLSpec: | |
INFO 16:45:33,423 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:33,424 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select all rows (4 seconds, 64 milliseconds) | |
INFO 16:45:37,494 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(g,2) | |
INFO 16:45:37,496 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(EqualTo(g,2)) | |
[info] - should allow to select rows with index columns (706 milliseconds) | |
INFO 16:45:38,196 org.apache.spark.Logging$class (Logging.scala:59) - filters: GreaterThanOrEqual(b,2) | |
INFO 16:45:38,197 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with >= clause (490 milliseconds) | |
INFO 16:45:38,683 org.apache.spark.Logging$class (Logging.scala:59) - filters: GreaterThan(b,2) | |
INFO 16:45:38,684 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with > clause (477 milliseconds) | |
INFO 16:45:39,159 org.apache.spark.Logging$class (Logging.scala:59) - filters: LessThan(b,2) | |
INFO 16:45:39,159 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with < clause (474 milliseconds) | |
INFO 16:45:39,637 org.apache.spark.Logging$class (Logging.scala:59) - filters: LessThanOrEqual(b,2) | |
INFO 16:45:39,637 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with <= clause (481 milliseconds) | |
INFO 16:45:40,130 org.apache.spark.Logging$class (Logging.scala:59) - filters: In(b,[Ljava.lang.Object;@586c4d72) | |
INFO 16:45:40,130 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with in clause (417 milliseconds) | |
INFO 16:45:40,563 org.apache.spark.Logging$class (Logging.scala:59) - filters: In(a,[Ljava.lang.Object;@11e60ff3) | |
INFO 16:45:40,564 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(In(a,[Ljava.lang.Object;@11e60ff3)) | |
[info] - should allow to select rows with in clause pushed down (158 milliseconds) | |
INFO 16:45:40,698 org.apache.spark.Logging$class (Logging.scala:59) - filters: Or(EqualTo(b,2),EqualTo(b,1)) | |
INFO 16:45:40,698 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with or clause (380 milliseconds) | |
INFO 16:45:41,069 org.apache.spark.Logging$class (Logging.scala:59) - filters: Not(EqualTo(b,2)) | |
INFO 16:45:41,069 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with != clause (339 milliseconds) | |
INFO 16:45:41,410 org.apache.spark.Logging$class (Logging.scala:59) - filters: Not(EqualTo(b,2)) | |
INFO 16:45:41,411 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with <> clause (371 milliseconds) | |
INFO 16:45:41,782 org.apache.spark.Logging$class (Logging.scala:59) - filters: Not(In(b,[Ljava.lang.Object;@2f8d4a53)) | |
INFO 16:45:41,782 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with not in clause (322 milliseconds) | |
INFO 16:45:42,159 org.apache.spark.Logging$class (Logging.scala:59) - filters: IsNotNull(b) | |
INFO 16:45:42,160 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with is not null clause (358 milliseconds) | |
INFO 16:45:42,459 org.apache.spark.Logging$class (Logging.scala:59) - filters: StringEndsWith(name,om) | |
INFO 16:45:42,459 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with like clause (320 milliseconds) | |
INFO 16:45:42,771 org.apache.spark.Logging$class (Logging.scala:59) - filters: GreaterThanOrEqual(a,1), LessThanOrEqual(a,2) | |
INFO 16:45:42,771 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with between clause (259 milliseconds) | |
INFO 16:45:43,032 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:43,032 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with alias (279 milliseconds) | |
INFO 16:45:43,305 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:43,306 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with distinct column (1 second, 174 milliseconds) | |
INFO 16:45:44,488 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:44,488 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with limit clause (400 milliseconds) | |
INFO 16:45:44,890 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:44,890 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with order by clause (659 milliseconds) | |
INFO 16:45:45,561 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:45,561 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with group by clause (958 milliseconds) | |
INFO 16:45:46,518 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:46,518 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:46,522 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:46,522 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with union clause (817 milliseconds) | |
INFO 16:45:47,339 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:47,340 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:47,345 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:47,346 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with union distinct clause (673 milliseconds) | |
INFO 16:45:48,002 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:48,002 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:48,006 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:48,006 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with union all clause (269 milliseconds) | |
INFO 16:45:48,307 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:48,308 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows with having clause (623 milliseconds) | |
INFO 16:45:48,892 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(a,1), EqualTo(b,1), EqualTo(c,1) | |
INFO 16:45:48,892 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(EqualTo(a,1), EqualTo(b,1), EqualTo(c,1)) | |
[info] - should allow to select rows with partition column clause (122 milliseconds) | |
INFO 16:45:49,017 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(a,1), EqualTo(b,1), EqualTo(c,1), EqualTo(d,1), EqualTo(e,1) | |
INFO 16:45:49,017 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(EqualTo(e,1), EqualTo(d,1), EqualTo(b,1), EqualTo(c,1), EqualTo(a,1)) | |
[info] - should allow to select rows with partition column and cluster column clause (79 milliseconds) | |
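The paired "filters" / "pushdown filters" lines above show which predicates the connector hands to Cassandra: equality on partition-key, clustering and indexed columns ends up in the pushdown ArrayBuffer, while range, negation and non-key predicates are left for Spark to apply after the scan. A minimal sketch of triggering both cases, assuming a hypothetical keyspace sql_test and a table test1 with PRIMARY KEY ((a, b, c), d, e):

    // Sketch only: keyspace, table and schema are assumed, not taken from the log.
    import org.apache.spark.SparkContext
    import org.apache.spark.sql.cassandra.CassandraSQLContext

    def pushdownDemo(sc: SparkContext): Unit = {
      val cc = new CassandraSQLContext(sc)
      cc.setKeyspace("sql_test")
      // Equality on every partition-key column is pushed to Cassandra,
      // matching: pushdown filters: ArrayBuffer(EqualTo(a,1), EqualTo(b,1), EqualTo(c,1))
      cc.sql("SELECT * FROM test1 WHERE a = 1 AND b = 1 AND c = 1").show()
      // A range predicate is not pushed down; Spark applies it after the scan,
      // matching: pushdown filters: ArrayBuffer()
      cc.sql("SELECT * FROM test1 WHERE b >= 2").show()
    }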
INFO 16:45:49,095 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:49,096 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:49,361 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:49,362 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to insert into another table (483 milliseconds) | |
INFO 16:45:49,580 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:49,581 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:49,797 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:49,798 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to insert into another table in different keyspace (424 milliseconds) | |
INFO 16:45:50,041 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:50,041 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:50,044 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:50,045 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to join two tables (709 milliseconds) | |
INFO 16:45:50,768 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:50,768 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:50,772 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:50,772 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to join two tables from different keyspaces (740 milliseconds) | |
INFO 16:45:51,479 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:51,479 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:51,483 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:51,483 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to inner join two tables (748 milliseconds) | |
INFO 16:45:52,256 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:52,257 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:52,259 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:52,260 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to left join two tables (663 milliseconds) | |
INFO 16:45:52,908 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:52,908 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:52,911 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:52,911 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to left outer join two tables (668 milliseconds) | |
INFO 16:45:53,563 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:53,564 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:53,566 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:53,567 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to right join two tables (625 milliseconds) | |
INFO 16:45:54,176 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:54,177 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:54,180 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:54,180 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to right outer join two tables (516 milliseconds) | |
INFO 16:45:54,696 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:54,697 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:54,700 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:54,700 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to full join two tables (552 milliseconds) | |
INFO 16:45:55,240 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:55,240 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows for collection columns (202 milliseconds) | |
WARN 16:45:55,413 org.apache.spark.Logging$class (Logging.scala:71) - VarIntType is mapped to catalystTypes.DecimalType with unlimited values. | |
INFO 16:45:55,417 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:55,417 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select rows for data types of ASCII, INT, FLOAT, DOUBLE, BIGINT, BOOLEAN, DECIMAL, INET, TEXT, TIMESTAMP, UUID, VARINT (206 milliseconds) | |
WARN 16:45:55,621 org.apache.spark.Logging$class (Logging.scala:71) - VarIntType is mapped to catalystTypes.DecimalType with unlimited values. | |
INFO 16:45:55,627 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:55,628 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:55,823 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:55,824 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to insert rows for data types of ASCII, INT, FLOAT, DOUBLE, BIGINT, BOOLEAN, DECIMAL, INET, TEXT, TIMESTAMP, UUID, VARINT (339 milliseconds) | |
INFO 16:45:55,967 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:55,967 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select specified non-UDT columns from a table containing some UDT columns (137 milliseconds) | |
INFO 16:45:56,255 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:56,255 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select UDT collection column and nested UDT column (326 milliseconds) | |
INFO 16:45:56,484 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(meterid,4317) | |
INFO 16:45:56,484 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(EqualTo(meterid,4317)) | |
[info] - should allow to restrict a clustering timestamp column value (104 milliseconds) | |
INFO 16:45:56,605 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:56,605 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to min/max timestamp column (480 milliseconds) | |
INFO 16:45:57,110 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:57,110 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:45:57,278 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:45:57,278 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should use InetAddressType and UUIDType (397 milliseconds) | |
[info] CassandraRDDPartitionerSpec: | |
[info] CassandraRDDPartitioner | |
[info] - should create 1 partition per node if splitCount == 1 (7 milliseconds) | |
[info] - should create about 10000 partitions when splitCount == 10000 (151 milliseconds) | |
[info] - should create multiple partitions if the amount of data is big enough (36 seconds, 383 milliseconds) | |
[info] RDDStreamingSpec: | |
[info] RDDStream | |
[info] - should write from the stream to cassandra table: demo.streaming_wordcount (667 milliseconds) | |
[info] - should be able to utilize joinWithCassandra during transforms (1 second, 202 milliseconds) | |
[info] - should be able to utilize joinWithCassandra and repartitionByCassandraTable on a Dstream (884 milliseconds) | |
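RDDStreamingSpec exercises the streaming integration against the demo.streaming_wordcount table named above. A minimal sketch of joining each micro-batch of a DStream with that table by partition key (the socket source, batch interval and single-column key are illustrative):

    // Sketch only: assumes demo.streaming_wordcount is keyed by a single text column.
    import com.datastax.spark.connector._
    import org.apache.spark.SparkContext
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    def streamingJoinDemo(sc: SparkContext): Unit = {
      val ssc = new StreamingContext(sc, Seconds(1))
      val words = ssc.socketTextStream("localhost", 9999).flatMap(_.split("\\s+"))
      // Join every micro-batch against the Cassandra table inside transform
      val joined = words.transform { rdd =>
        rdd.map(Tuple1(_)).joinWithCassandraTable("demo", "streaming_wordcount")
      }
      joined.print()
      ssc.start()
      ssc.awaitTermination()
    }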
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.11/kafka-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.11/simple-demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.11/demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.11/twitter-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.11/root_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/javassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.11/jars/kafka_2.11-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.11/jars/scalamock-scalatest-support_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.11/jars/akka-testkit_2.11-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.11/bundles/scalactic_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.11/jars/spark-streaming_2.11-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.11/bundles/scalatest_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.11/jars/scalamock-core_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraDataSourceSpec: | |
INFO 16:46:37,401 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:46:37,401 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select all rows (4 seconds, 857 milliseconds) | |
INFO 16:46:42,277 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:46:42,277 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to register as a temp table (317 milliseconds) | |
INFO 16:46:42,624 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:46:42,624 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:46:42,930 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:46:42,930 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:46:43,283 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:46:43,283 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to insert data into a cassandra table (985 milliseconds) | |
INFO 16:46:43,849 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:46:43,849 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:46:44,095 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:46:44,096 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to save data to a cassandra table (915 milliseconds) | |
INFO 16:46:44,526 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:46:44,527 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:46:44,762 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:46:44,762 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to overwrite a cassandra table (504 milliseconds) | |
INFO 16:46:44,988 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(a,1), EqualTo(b,2), EqualTo(c,1), EqualTo(e,1) | |
INFO 16:46:44,989 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(EqualTo(a,1), EqualTo(b,2), EqualTo(c,1)) | |
[info] - should allow to filter a table (103 milliseconds) | |
INFO 16:46:45,118 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:46:45,119 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to filter a table with a function for a column alias (274 milliseconds) | |
INFO 16:46:45,391 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(a,1), EqualTo(b,2), EqualTo(c,1), EqualTo(e,1) | |
INFO 16:46:45,392 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer(EqualTo(a,1), EqualTo(b,2), EqualTo(c,1)) | |
[info] - should allow to filter a table with alias (101 milliseconds) | |
INFO 16:46:45,957 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:46:45,957 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should be able to save DF with reversed order columns to a Cassandra table (755 milliseconds) | |
INFO 16:46:46,309 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:46:46,309 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should be able to save DF with partial columns to a Cassandra table (312 milliseconds) | |
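CassandraDataSourceSpec drives the Spark SQL data source registered as org.apache.spark.sql.cassandra. A minimal sketch of the read, filter and write paths it covers (keyspace and table names are illustrative):

    // Sketch only: sql_test.test1 and sql_test.test1_copy are assumed names.
    import org.apache.spark.SparkContext
    import org.apache.spark.sql.{SQLContext, SaveMode}

    def dataSourceDemo(sc: SparkContext): Unit = {
      val sqlContext = new SQLContext(sc)
      val df = sqlContext.read
        .format("org.apache.spark.sql.cassandra")
        .options(Map("keyspace" -> "sql_test", "table" -> "test1"))
        .load()
      df.filter(df("a") === 1).show() // an equality like EqualTo(a,1) shows up in the pushdown log
      df.write
        .format("org.apache.spark.sql.cassandra")
        .options(Map("keyspace" -> "sql_test", "table" -> "test1_copy"))
        .mode(SaveMode.Append)
        .save()
    }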
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.11/kafka-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.11/simple-demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.11/demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.11/twitter-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.11/root_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/javassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.11/jars/kafka_2.11-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.11/jars/scalamock-scalatest-support_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.11/jars/akka-testkit_2.11-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.11/bundles/scalactic_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.11/jars/spark-streaming_2.11-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.11/bundles/scalatest_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.11/jars/scalamock-core_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777
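
A configuration dump like the one above corresponds to a SparkConf assembled by the test harness. A minimal sketch of building the same settings (property values are taken from the log; everything else is illustrative, not the harness's actual code):

  import org.apache.spark.{SparkConf, SparkContext}

  // Sketch only: the real harness builds this list programmatically
  val conf = new SparkConf()
    .setAppName("Test")
    .setMaster("spark://127.0.0.1:7777")
    .set("spark.cassandra.connection.host", "127.0.0.1")
    .set("spark.cassandra.connection.port", "9042")
    .set("spark.cleaner.ttl", "3600")
    .setJars(Seq("target/scala-2.11/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar")) // plus the remaining jars listed above

  val sc = new SparkContext(conf)
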
[info] TableWriterSpec:
[info] A TableWriter
[info] - should write RDD of tuples to an existing table (3 seconds, 25 milliseconds)
[info] - should write RDD of tuples to a new table (130 milliseconds)
[info] - should write RDD of tuples applying proper data type conversions (65 milliseconds)
[info] - should write RDD of case class objects (71 milliseconds)
[info] - should write RDD of case class objects to a new table using auto mapping (140 milliseconds)
[info] - should write RDD of case class objects applying proper data type conversions (64 milliseconds)
[info] - should write RDD of CassandraRow objects (59 milliseconds)
[info] - should write RDD of CassandraRow objects applying proper data type conversions (49 milliseconds)
[info] - should write RDD of tuples to a table with camel case column names (68 milliseconds)
[info] - should write empty values (48 milliseconds)
[info] - should write null values (42 milliseconds)
[info] - should write only specific column data if ColumnNames is passed as 'columnNames' (39 milliseconds)
[info] - should distinguish (deprecated) implicit `seqToSomeColumns` (41 milliseconds)
[info] - should write collections (117 milliseconds)
[info] - should write blobs (45 milliseconds)
[info] - should increment and decrement counters (91 milliseconds)
[info] - should increment and decrement counters in batches (2 seconds, 266 milliseconds)
[info] - should write values of user-defined classes (73 milliseconds)
[info] - should write values of user-defined types in Cassandra (62 milliseconds)
[info] - should write values of TupleValue type (55 milliseconds)
[info] - should write column values of tuple type given as Scala tuples (56 milliseconds)
[info] - should write Scala tuples nested in UDTValues (36 milliseconds)
[info] - should convert components in nested Scala tuples to proper types (38 milliseconds)
[info] - should write to single-column tables (30 milliseconds)
[info] - should throw IOException if table is not found (3 milliseconds)
[info] - should write RDD of case class objects with default TTL (45 milliseconds)
[info] - should write RDD of case class objects with default timestamp (32 milliseconds)
[info] - should write RDD of case class objects with per-row TTL (42 milliseconds)
[info] - should write RDD of case class objects with per-row timestamp (41 milliseconds)
[info] - should write RDD of case class objects with per-row TTL with custom mapping (38 milliseconds)
[info] - should write RDD of case class objects with per-row timestamp with custom mapping (37 milliseconds)
[info] - should write RDD of case class objects applying proper data type conversions and aliases (46 milliseconds)
[info] - should write an RDD of tuples mapped to different ordering of fields (35 milliseconds)
[info] - should write an RDD of tuples with only some fields aliased (35 milliseconds)
[info] - should throw an exception if you try to alias tuple fields which don't exist (1 millisecond)
[info] - should throw an exception when aliasing some tuple fields explicitly and others implicitly (2 milliseconds)
[info] - should write RDD of objects with inherited fields (38 milliseconds)
[info] - should write RDD of case class objects with transient fields (38 milliseconds)
[info] - should be able to append and prepend elements to a C* list (130 milliseconds)
[info] - should be able to remove elements from a C* list (109 milliseconds)
[info] - should be able to add elements to a C* set (71 milliseconds)
[info] - should be able to remove elements from a C* set (95 milliseconds)
[info] - should be able to add key value pairs to a C* map (78 milliseconds)
[info] - should throw an exception if you try to apply a collection behavior to a normal column (10 milliseconds)
[info] - should throw an exception if you try to remove values from a map (8 milliseconds)
[info] - should throw an exception if you prepend anything but a list (9 milliseconds)
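
The write paths exercised above all go through the connector's saveToCassandra call. A minimal sketch of the two most common forms (keyspace, table, and column names are illustrative, assuming an existing SparkContext sc):

  import com.datastax.spark.connector._

  // Tuples: elements are matched by position to the listed column names
  sc.parallelize(Seq((1, "one"), (2, "two")))
    .saveToCassandra("test_ks", "kv", SomeColumns("key", "value"))

  // Case classes: fields are mapped to columns by name
  case class KV(key: Int, value: String)
  sc.parallelize(Seq(KV(1, "one")))
    .saveToCassandra("test_ks", "kv")
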
Starting SparkContext with the following configuration:
spark.app.name=Test
spark.cassandra.connection.host=127.0.0.1
spark.cassandra.connection.port=9042
spark.cleaner.ttl=3600
spark.master=spark://127.0.0.1:7777
[info] CassandraRDDSpec:
[info] A CassandraRDD
[info] - should allow to read a Cassandra table as Array of CassandraRow (2 seconds, 967 milliseconds)
[info] - should allow to read a Cassandra table as Array of pairs of primitives (337 milliseconds)
[info] - should allow to read a Cassandra table as Array of tuples (306 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined case class objects (274 milliseconds)
[info] A CassandraRDD
[info] - should allow to read a Cassandra table as Array of user-defined objects with inherited fields (260 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined class objects (265 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined class (with multiple constructors) objects (226 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined class (with no fields) objects (214 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined case class (nested) objects (220 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined case class (deeply nested) objects (233 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined case class (nested in object) objects (232 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined mutable objects (191 milliseconds)
[info] - should allow to read a Cassandra table as Array of user-defined case class objects with custom mapping specified by aliases (194 milliseconds)
[info] - should allow to read a Cassandra table into CassandraRow objects with custom mapping specified by aliases (172 milliseconds)
[info] - should apply proper data type conversions for tuples (178 milliseconds)
[info] - should apply proper data type conversions for user-defined case class objects (173 milliseconds)
[info] - should apply proper data type conversions for user-defined mutable objects (173 milliseconds)
[info] - should map columns to objects using user-defined function (164 milliseconds)
[info] - should map columns to objects using user-defined function with type conversion (173 milliseconds)
[info] - should allow for selecting a subset of columns (187 milliseconds)
[info] - should allow for selecting a subset of rows (173 milliseconds)
[info] - should use a single partition per node for a tiny table (7 milliseconds)
[info] - should allow for reading collections (170 milliseconds)
[info] - should allow for reading blobs (137 milliseconds)
[info] - should allow for converting fields to custom types by user-defined TypeConverter (148 milliseconds)
[info] - should allow for reading tables with composite partitioning key (164 milliseconds)
[info] - should convert values passed to where to correct types (String -> Timestamp) (257 milliseconds)
[info] - should convert values passed to where to correct types (DateTime -> Timestamp) (231 milliseconds)
[info] - should convert values passed to where to correct types (Date -> Timestamp) (165 milliseconds)
[info] - should convert values passed to where to correct types (String -> Timestamp) (double limit) (221 milliseconds)
[info] - should convert values passed to where to correct types (DateTime -> Timestamp) (double limit) (191 milliseconds)
[info] - should convert values passed to where to correct types (Date -> Timestamp) (double limit) (155 milliseconds)
[info] - should accept partitioning key in where (37 milliseconds)
[info] - should accept partitioning key and clustering column predicate in where (34 milliseconds)
[info] - should accept composite partitioning key in where (36 milliseconds)
[info] - should allow to fetch columns from a table with user defined Cassandra type (UDT) (138 milliseconds)
[info] - should allow to fetch UDT columns as UDTValue objects (136 milliseconds)
[info] - should allow to fetch UDT columns as objects of case classes (148 milliseconds)
[info] - should allow to fetch tuple columns as TupleValue objects (143 milliseconds)
[info] - should allow to fetch tuple columns as Scala tuples (140 milliseconds)
[info] - should throw appropriate IOException when the table was not found at the computation time (13 milliseconds)
[info] - should be lazy and must not throw IOException if the table was not found at the RDD initialization time (1 millisecond)
Start thread count: 88
End thread count: 86
Threads created:
ForkJoinPool-3-worker-23
[info] - should not leak threads (20 seconds, 519 milliseconds)
[info] - should allow to read Cassandra table as Array of KV tuples of two pairs (149 milliseconds)
[info] - should allow to read Cassandra table as Array of KV tuples of a pair and a case class (102 milliseconds)
[info] - should allow to read Cassandra table as Array of KV tuples of a case class and a tuple (100 milliseconds)
[info] - should allow to read Cassandra table as Array of KV tuples of a case class and a tuple grouped by partition key (149 milliseconds)
[info] - should allow to read Cassandra table as Array of tuples of two case classes (96 milliseconds)
[info] - should allow to read Cassandra table as Array of String values (91 milliseconds)
[info] - should allow to read Cassandra table as Array of Int values (96 milliseconds)
[info] - should allow to read Cassandra table as Array of java.lang.Integer values (104 milliseconds)
[info] - should allow to read Cassandra table as Array of List of values (102 milliseconds)
[info] - should allow to read Cassandra table as Array of Set of values (91 milliseconds)
[info] - should allow to count a high number of rows (284 milliseconds)
[info] - should allow to fetch write time of a specified column as a tuple element (114 milliseconds)
[info] - should allow to fetch ttl of a specified column as a tuple element (118 milliseconds)
[info] - should allow to fetch both write time and ttl of a specified column as tuple elements (121 milliseconds)
[info] - should allow to fetch write time of two different columns as tuple elements (113 milliseconds)
[info] - should allow to fetch ttl of two different columns as tuple elements (112 milliseconds)
[info] - should allow to fetch writetime of a specified column and map it to a class field with custom mapping (122 milliseconds)
[info] - should allow to fetch ttl of a specified column and map it to a class field with custom mapping (115 milliseconds)
[info] - should allow to fetch writetime of a specified column and map it to a class field with aliases (109 milliseconds)
[info] - should allow to fetch ttl of a specified column and map it to a class field with aliases (120 milliseconds)
[info] - should allow to specify ascending ordering (33 milliseconds)
[info] - should allow to specify descending ordering (30 milliseconds)
[info] - should allow to specify rows number limit (27 milliseconds)
[info] - should allow to specify rows number with take (29 milliseconds)
[info] - should count the CassandraRDD items (340 milliseconds)
[info] - should count the CassandraRDD items with where predicate (21 milliseconds)
[info] - should allow to use empty RDD on undefined table (4 milliseconds)
[info] - should allow to use empty RDD on defined table (3 milliseconds)
[info] - should suggest similar tables if table doesn't exist but keyspace does (4 milliseconds)
[info] - should suggest possible keyspace and table matches if the keyspace and table do not exist (4 milliseconds)
[info] - should suggest possible keyspaces if the table exists but in a different keyspace (2 milliseconds)
[info] - should suggest possible keyspaces and tables if the table has a fuzzy match but the keyspace does not (2 milliseconds)
[info] - should handle upper case characters in UDT fields (158 milliseconds)
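
The read paths above are variations of sc.cassandraTable. A minimal sketch (keyspace, table, and column names are illustrative):

  import com.datastax.spark.connector._

  // Rows come back as CassandraRow by default...
  val rows = sc.cassandraTable("test_ks", "kv")

  // ...or can be mapped straight to tuples and case classes
  case class KV(key: Int, value: String)
  val tuples = sc.cassandraTable[(Int, String)]("test_ks", "kv").select("key", "value")
  val kvs    = sc.cassandraTable[KV]("test_ks", "kv")

  // select projects columns; where pushes the predicate down to Cassandra
  val one = sc.cassandraTable("test_ks", "kv").select("value").where("key = ?", 1)
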
Starting SparkContext with the following configuration:
spark.app.name=Test
spark.cassandra.connection.host=127.0.0.1
spark.cassandra.connection.port=9042
spark.cleaner.ttl=3600
spark.master=spark://127.0.0.1:7777
[info] CassandraPartitionKeyWhereSpec:
[info] A CassandraRDD
[info] - should allow partition key eq in where (4 seconds, 91 milliseconds)
[info] - should allow partition key 'in' in where (60 milliseconds)
[info] - should allow cluster key 'in' in where (509 milliseconds)
[info] - should work with composite keys in (56 milliseconds)
[info] - should work with composite keys eq (35 milliseconds)
[info] - should work with composite keys in2 (34 milliseconds)
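
The where clauses tested here are pushed down to Cassandra as CQL predicates on the partition and clustering keys; roughly (illustrative table and column names):

  import com.datastax.spark.connector._

  sc.cassandraTable("test_ks", "events").where("pkey = ?", "a")            // partition key eq
  sc.cassandraTable("test_ks", "events").where("pkey in (?, ?)", "a", "b") // partition key in
  sc.cassandraTable("test_ks", "events").where("pkey = ? and ckey in (?, ?)", "a", 1, 2) // plus clustering key
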
[info] RoutingKeyGeneratorSpec:
[info] RoutingKeyGenerator
[info] - should generate proper routing keys when there is one partition key column (23 milliseconds)
[info] RoutingKeyGenerator
[info] - should generate proper routing keys when there are more partition key columns (8 milliseconds)
Starting SparkContext with the following configuration:
spark.app.name=Test
spark.cassandra.connection.host=127.0.0.1
spark.cassandra.connection.port=9042
spark.cleaner.ttl=3600
spark.master=spark://127.0.0.1:7777
[info] TableWriterColumnNamesSpec:
[info] TableWriter
[info] - must distinguish `AllColumns` (3 milliseconds)
[info] - must distinguish and use only specified column names if provided (1 millisecond)
[info] - must distinguish and use only specified column names if provided, when aliases are specified (1 millisecond)
[info] - must fail in the RowWriter if the specified column names do not include primary keys (5 milliseconds)
[info] - must not use TTL when it is not specified (4 milliseconds)
[info] - must use static TTL if it is specified (1 millisecond)
[info] - must use static timestamp if it is specified (2 milliseconds)
[info] - must use both static TTL and static timestamp when they are specified (3 milliseconds)
[info] - must use per-row TTL and timestamp when the row writer provides them (2 milliseconds)
[info] - must use per-row TTL and static timestamp (2 milliseconds)
[info] - must use per-row timestamp and static TTL (2 milliseconds)
[info] - must use per-row TTL (3 milliseconds)
[info] - must use per-row timestamp (2 milliseconds)
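
Static and per-row TTL/timestamp, as tested above, are selected through WriteConf. A sketch assuming rdd is an RDD of objects writable to test_ks.kv (names illustrative; the constant timestamp is given in microseconds here, which may need checking against the connector version):

  import com.datastax.spark.connector._
  import com.datastax.spark.connector.writer.{WriteConf, TTLOption, TimestampOption}

  // Static TTL and write timestamp applied to every row
  rdd.saveToCassandra("test_ks", "kv",
    writeConf = WriteConf(
      ttl = TTLOption.constant(3600),
      timestamp = TimestampOption.constant(System.currentTimeMillis() * 1000L)))

  // The per-row variants take the value from a property of each object instead,
  // e.g. TTLOption.perRow("ttl") reads each row's TTL from a field named "ttl"
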
[info] GroupingBatchBuilderSpec:
[info] GroupingBatchBuilder in fixed batch key mode
[info] - should make bound statements when batch size is specified as RowsInBatch(1) (35 milliseconds)
[info] - should make bound statements when batch size is specified as BytesInBatch(0) (5 milliseconds)
[info] - should make a batch and bound statements according to the number of statements in a group (5 milliseconds)
[info] - should make equal batches when batch size is specified in rows (4 milliseconds)
[info] - should make batches of size not greater than the size specified in bytes (6 milliseconds)
[info] - should produce an empty stream when no data is available and batch size is specified in rows (2 milliseconds)
[info] - should produce an empty stream when no data is available and batch size is specified in bytes (2 milliseconds)
[info] GroupingBatchBuilder in dynamic batch key mode
[info] - should make bound statements when batch size is specified as RowsInBatch(1) (4 milliseconds)
[info] - should make bound statements when batch size is specified as BytesInBatch(0) (4 milliseconds)
[info] - should make a batch and bound statements according to the number of statements in a group and a batch key (5 milliseconds)
[info] - should make bound statements if batches cannot be made due to imposed limits (4 milliseconds)
[info] - should make equal batches when batch size is specified in rows and the batch buffer is large enough (4 milliseconds)
[info] - should make batches of size not greater than the size specified in bytes (7 milliseconds)
[info] - should produce an empty stream when no data is available and batch size is specified in rows (3 milliseconds)
[info] - should produce an empty stream when no data is available and batch size is specified in bytes (3 milliseconds)
[info] - should work with random data (612 milliseconds)
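
GroupingBatchBuilder packs bound statements into batches according to the configured batch size; callers choose between a row-count and a byte-size cap through WriteConf (a sketch, names illustrative):

  import com.datastax.spark.connector._
  import com.datastax.spark.connector.writer.{WriteConf, RowsInBatch, BytesInBatch}

  // At most 10 statements per batch...
  rdd.saveToCassandra("test_ks", "kv", writeConf = WriteConf(batchSize = RowsInBatch(10)))

  // ...or batches capped at roughly 1 kB of data
  rdd.saveToCassandra("test_ks", "kv", writeConf = WriteConf(batchSize = BytesInBatch(1024)))
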
[info] DataSizeEstimatesSpec:
[info] DataSizeEstimates
[info] - should fetch data size estimates for a known table !!! IGNORED !!!
[info] - should return zeroes for an empty table (30 milliseconds)
[info] - should return zeroes for a non-existing table (1 millisecond)
Starting SparkContext with the following configuration:
spark.app.name=Test
spark.cassandra.connection.host=127.0.0.1
spark.cassandra.connection.port=9042
spark.cleaner.ttl=3600
spark.master=spark://127.0.0.1:7777
[info] CassandraRDDReplSpec:
[info] - should allow to read a Cassandra table as Array of Scala class objects in REPL !!! IGNORED !!!
[info] - should allow to read a Cassandra table as Array of Scala case class objects in REPL !!! IGNORED !!!
[info] - should allow to read a Cassandra table as Array of ordinary Scala class objects in REPL !!! IGNORED !!!
[info] - should allow to read a Cassandra table as Array of Scala class without fields objects in REPL !!! IGNORED !!!
[info] - should allow to read a Cassandra table as Array of Scala class with multiple constructors objects in REPL !!! IGNORED !!!
[info] - should allow to read a Cassandra table as Array of inner Scala case class objects in REPL !!! IGNORED !!!
[info] - should allow to read a Cassandra table as Array of deeply nested inner Scala case class objects in REPL !!! IGNORED !!!
[info] - should allow to read a Cassandra table as Array of nested Scala case class objects in REPL !!! IGNORED !!!
[info] CassandraConnectorSpec:
[info] A CassandraConnector
[info] - should connect to Cassandra with native protocol (22 milliseconds)
[info] - should give access to cluster metadata (1 millisecond)
[info] - should run queries (98 milliseconds)
[info] - should cache PreparedStatements (25 milliseconds)
[info] - should disconnect from the cluster after use (501 milliseconds)
[info] - should share internal Cluster and Session object between multiple logical sessions (20 milliseconds)
[info] - should share internal Cluster object between multiple logical sessions created by different connectors to the same cluster (0 milliseconds)
[info] - should cache session objects for reuse (3 milliseconds)
[info] - should not make multiple clusters when writing multiple RDDs (3 seconds, 665 milliseconds)
[info] - should be configurable from SparkConf (2 milliseconds)
WARN 16:48:01,776 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup
[info] - should accept multiple hostnames in spark.cassandra.connection.host property (5 seconds, 35 milliseconds)
[info] - should use compression when configured (58 milliseconds)
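
CassandraConnector is the piece under test here: it caches Cluster and Session objects so that separate logical sessions share one physical connection. Typical usage looks roughly like this (a sketch, assuming an existing SparkContext sc):

  import com.datastax.spark.connector.cql.CassandraConnector

  val connector = CassandraConnector(sc.getConf)

  // withSessionDo borrows a shared session and releases it afterwards;
  // repeated calls reuse the cached Cluster/Session underneath
  val version = connector.withSessionDo { session =>
    session.execute("SELECT release_version FROM system.local").one().getString(0)
  }
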
Starting SparkContext with the following configuration:
spark.app.name=Test
spark.cassandra.connection.host=192.168.254.254,127.0.0.1
spark.cassandra.connection.port=9042
spark.cleaner.ttl=3600
spark.master=spark://127.0.0.1:7777 | |
WARN 16:48:07,166 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] RDDSpec: | |
[info] A Tuple RDD specifying partition keys | |
[info] - should be joinable with Cassandra (3 seconds, 89 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
WARN 16:48:18,014 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] - should be retreivable as a tuple from Cassandra (5 seconds, 311 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
WARN 16:48:23,335 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] - should be retreivable as a case class from cassandra (5 seconds, 272 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be repartitionable (331 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] A case-class RDD specifying partition keys | |
[info] - should be retrievable from Cassandra (166 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be retreivable as a tuple from Cassandra (141 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be retreivable as a case class from cassandra (125 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be repartitionable (254 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] - should throw a meaningful exception if partition column is null when joining with Cassandra table (142 milliseconds) | |
[info] - should throw a meaningful exception if partition column is null when repartitioning by replica (88 milliseconds) | |
[info] - should throw a meaningful exception if partition column is null when saving (182 milliseconds) | |
[info] A Tuple RDD specifying partitioning keys and clustering keys | |
WARN 16:48:25,019 org.apache.spark.Logging$class (Logging.scala:71) - Lost task 2.3 in stage 19.0 (TID 215, 192.168.1.105): TaskKilled (killed intentionally) | |
WARN 16:48:25,020 org.apache.spark.Logging$class (Logging.scala:71) - Lost task 0.2 in stage 19.0 (TID 209, 192.168.1.105): TaskKilled (killed intentionally) | |
WARN 16:48:25,024 org.apache.spark.Logging$class (Logging.scala:71) - Lost task 7.3 in stage 19.0 (TID 220, 192.168.1.105): TaskKilled (killed intentionally) | |
WARN 16:48:25,025 org.apache.spark.Logging$class (Logging.scala:71) - Lost task 3.3 in stage 19.0 (TID 217, 192.168.1.105): TaskKilled (killed intentionally) | |
WARN 16:48:25,029 org.apache.spark.Logging$class (Logging.scala:71) - Lost task 6.2 in stage 19.0 (TID 213, 192.168.1.105): TaskKilled (killed intentionally) | |
WARN 16:48:25,034 org.apache.spark.Logging$class (Logging.scala:71) - Lost task 5.3 in stage 19.0 (TID 218, 192.168.1.105): TaskKilled (killed intentionally) | |
WARN 16:48:25,036 org.apache.spark.Logging$class (Logging.scala:71) - Lost task 4.3 in stage 19.0 (TID 219, 192.168.1.105): TaskKilled (killed intentionally) | |
[info] - should be retrievable from Cassandra (111 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be retrievable as a tuple from Cassandra (117 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be retrievable as a case class from Cassandra (129 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be repartitionable (233 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] - should be joinable on both partitioning key and clustering key (155 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be joinable on both partitioning key and clustering key using on (132 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
[info] - should be able to be limited (40 milliseconds) | |
[info] - should be able to be counted (70 milliseconds) | |
[info] A CassandraRDD | |
[info] - should be joinable with Cassandra (732 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
WARN 16:48:31,756 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] - should be retrievable as a tuple from Cassandra (5 seconds, 597 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
WARN 16:48:37,357 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] - should be retrievable as a case class from Cassandra (5 seconds, 655 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] + Checking LeftSide | |
WARN 16:48:43,011 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] - should be retrievable without repartitioning (5 seconds, 350 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] - should be repartitionable (276 milliseconds) | |
[info] + Checking RightSide Join Results | |
[info] A Joined CassandraRDD | |
[info] - should support select clauses (255 milliseconds) | |
[info] - should support where clauses (42 milliseconds) | |
[info] - should support parametrized where clauses (51 milliseconds) | |
[info] - should throw an exception if using a where on a column that is specified by the join (6 milliseconds) | |
[info] - should throw an exception if using a where on a column that is a part of the Partition key (7 milliseconds) | |
[info] - should throw an exception if you don't have all Partition Keys available (9 milliseconds) | |
[info] - should throw an exception if you try to join on later clustering columns without earlier ones (12 milliseconds) | |
[info] - should throw an exception if you try to join on later clustering columns without earlier ones even when out of order (14 milliseconds) | |
[info] - should throw an exception if you try to join on later clustering columns without earlier ones even when reversed (12 milliseconds) | |
[info] - should throw an exception if you try to join with a data column (13 milliseconds) | |
[info] - should allow to use empty RDD on undefined table (7 milliseconds) | |
[info] - should allow to use empty RDD on defined table (5 milliseconds) | |
[info] - should be lazy and not throw an exception if the table is not found at initialization time (5 milliseconds) | |
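The joins and repartitioning verified above come from the connector's RDD extensions. A minimal sketch, assuming an invented keyspace "test" with a table kv (key int PRIMARY KEY, value int):

    import com.datastax.spark.connector._

    // Left side: an RDD of tuples whose first element matches the partition key.
    val left = sc.parallelize(1 to 100).map(k => Tuple1(k))

    // Join each element with the matching Cassandra rows (the RightSide results above).
    val joined = left.joinWithCassandraTable("test", "kv")

    // Restrict the join columns explicitly with .on(...).
    val joinedOn = left.joinWithCassandraTable("test", "kv").on(SomeColumns("key"))

    // Group partitions so they line up with replica nodes before joining.
    val local = left.repartitionByCassandraReplica("test", "kv")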
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=192.168.254.254,127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.11/kafka-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.11/simple-demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.11/demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.11/twitter-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.11/root_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/javassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafk
a_2.11/jars/kafka_2.11-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.11/jars/scalamock-scalatest-support_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.11/jars/akka-testkit_2.11-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.11/bundles/scalactic_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.11/jars/spark-streaming_2.11-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.11/bundles/scalatest_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.11/jars/scalamock-core_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
WARN 16:48:54,545 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] CassandraPrunedScanSpec: | |
INFO 16:48:54,614 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:48:54,614 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to select all rows (3 seconds, 559 milliseconds) | |
WARN 16:49:03,186 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
INFO 16:49:03,214 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:49:03,215 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to register as a temp table (5 seconds, 412 milliseconds) | |
WARN 16:49:09,012 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
INFO 16:49:09,037 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:49:09,401 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:49:09,764 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
[info] - should allow to insert data into a Cassandra table (6 seconds, 426 milliseconds) | |
WARN 16:49:15,070 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
INFO 16:49:15,364 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:49:15,599 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:49:15,599 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should allow to save data to a Cassandra table (5 seconds, 893 milliseconds) | |
INFO 16:49:15,957 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:49:16,168 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
[info] - should allow to overwrite a Cassandra table (466 milliseconds) | |
INFO 16:49:16,389 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(a,1), EqualTo(b,2), EqualTo(c,1), EqualTo(e,1) | |
[info] - should allow to filter a table (265 milliseconds) | |
INFO 16:49:16,682 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
[info] - should allow to filter a table with a function for a column alias (292 milliseconds) | |
INFO 16:49:16,976 org.apache.spark.Logging$class (Logging.scala:59) - filters: EqualTo(a,1), EqualTo(b,2), EqualTo(c,1), EqualTo(e,1) | |
[info] - should allow to filter a table with alias (251 milliseconds) | |
WARN 16:49:22,238 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
INFO 16:49:22,434 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:49:22,434 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should be able to save DF with reversed order columns to a Cassandra table (5 seconds, 493 milliseconds) | |
INFO 16:49:22,760 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:49:22,760 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should be able to save DF with partial columns to a Cassandra table (256 milliseconds) | |
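The PrunedScan tests above go through the Spark SQL data source, which pushes eligible predicates down to Cassandra (the "pushdown filters:" lines). A hedged sketch against Spark 1.4, with keyspace and table names invented:

    import org.apache.spark.sql.SQLContext

    val sqlContext = new SQLContext(sc)
    val df = sqlContext.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test", "table" -> "kv"))
      .load()

    // Register as a temp table and filter; EqualTo predicates on key or indexed
    // columns can be pushed down to Cassandra rather than evaluated in Spark.
    df.registerTempTable("kv_temp")
    sqlContext.sql("SELECT * FROM kv_temp WHERE key = 1").collect()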
[info] CheckpointStreamSpec: | |
WARN 16:49:33,413 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] Spark Streaming + Checkpointing | |
Exception in thread "pool-945-thread-1" java.lang.Error: java.lang.InterruptedException | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1151) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) | |
at java.lang.Thread.run(Thread.java:745) | |
Caused by: java.lang.InterruptedException | |
at java.lang.Object.wait(Native Method) | |
at java.lang.Object.wait(Object.java:503) | |
at org.apache.spark.scheduler.JobWaiter.awaitResult(JobWaiter.scala:73) | |
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:530) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1732) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1750) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1765) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1779) | |
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:885) | |
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148) | |
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109) | |
at org.apache.spark.rdd.RDD.withScope(RDD.scala:286) | |
at org.apache.spark.rdd.RDD.collect(RDD.scala:884) | |
at org.apache.spark.streaming.TestOutputStreamWithPartitions$$anonfun$$lessinit$greater$2.apply(TestSuiteBase.scala:123) | |
at org.apache.spark.streaming.TestOutputStreamWithPartitions$$anonfun$$lessinit$greater$2.apply(TestSuiteBase.scala:122) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:42) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:40) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:40) | |
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:399) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:40) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:40) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:40) | |
at scala.util.Try$.apply(Try.scala:191) | |
at org.apache.spark.streaming.scheduler.Job.run(Job.scala:34) | |
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:193) | |
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:193) | |
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:193) | |
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58) | |
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:192) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) | |
... 2 more | |
WARN 16:49:38,660 org.apache.spark.Logging$class (Logging.scala:71) - isTimeValid called with 500 ms where as last valid time is 1000 ms | |
Exception in thread "pool-958-thread-1" java.lang.Error: java.lang.InterruptedException | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1151) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) | |
at java.lang.Thread.run(Thread.java:745) | |
Caused by: java.lang.InterruptedException | |
at java.lang.Object.wait(Native Method) | |
at java.lang.Object.wait(Object.java:503) | |
at org.apache.spark.scheduler.JobWaiter.awaitResult(JobWaiter.scala:73) | |
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:530) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1732) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1750) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1765) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1779) | |
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:885) | |
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148) | |
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109) | |
at org.apache.spark.rdd.RDD.withScope(RDD.scala:286) | |
at org.apache.spark.rdd.RDD.collect(RDD.scala:884) | |
at org.apache.spark.streaming.TestOutputStreamWithPartitions$$anonfun$$lessinit$greater$2.apply(TestSuiteBase.scala:123) | |
at org.apache.spark.streaming.TestOutputStreamWithPartitions$$anonfun$$lessinit$greater$2.apply(TestSuiteBase.scala:122) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:42) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:40) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:40) | |
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:399) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:40) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:40) | |
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:40) | |
at scala.util.Try$.apply(Try.scala:191) | |
at org.apache.spark.streaming.scheduler.Job.run(Job.scala:34) | |
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:193) | |
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:193) | |
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:193) | |
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58) | |
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:192) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) | |
... 2 more | |
[info] - should work with JWCTable and RPCassandra Replica (8 seconds, 873 milliseconds) | |
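The checkpointing test above drives a DStream through joinWithCassandraTable and repartitionByCassandraReplica (the "JWCTable and RPCassandra Replica" in its name) while recovering from a checkpoint. A rough sketch of the same shape; the checkpoint path is made up and the queue-based input is purely illustrative (it uses only the plain RDD join inside foreachRDD):

    import scala.collection.mutable
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import com.datastax.spark.connector._

    val checkpointDir = "/tmp/checkpoint-dir"   // made-up path

    def createContext(): StreamingContext = {
      val ssc = new StreamingContext(sc, Seconds(1))
      ssc.checkpoint(checkpointDir)
      val rddQueue = mutable.Queue(sc.parallelize(1 to 10))
      val stream = ssc.queueStream(rddQueue)
      // Per-batch join against Cassandra via the RDD extension.
      stream.foreachRDD { rdd =>
        rdd.map(Tuple1(_)).joinWithCassandraTable("test", "kv").count()
      }
      ssc
    }

    // On restart, rebuild the pipeline from the checkpoint if one exists.
    val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)
    ssc.start()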
objc[56111]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:49:46,316 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native method will be unavailable. | |
WARN 16:49:46,319 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
[info] CassandraAuthenticatedConnectorSpec: | |
[info] A CassandraConnector | |
WARN 16:49:55,382 com.datastax.driver.core.Cluster$Manager (Cluster.java:1919) - You listed /192.168.254.254:9042 in your contact points, but it could not be reached at startup | |
[info] - should authenticate with username and password when using native protocol (5 seconds, 270 milliseconds) | |
[info] - should pick up user and password from SparkConf (3 milliseconds) | |
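Both assertions above come down to the auth properties visible in the config dumps below. A minimal sketch of authenticating through SparkConf (the credentials are the test defaults):

    import org.apache.spark.SparkConf
    import com.datastax.spark.connector.cql.CassandraConnector

    val conf = new SparkConf()
      .set("spark.cassandra.connection.host", "127.0.0.1")
      .set("spark.cassandra.auth.username", "cassandra")
      .set("spark.cassandra.auth.password", "cassandra")

    // CassandraConnector picks the credentials up from the conf.
    CassandraConnector(conf).withSessionDo { session =>
      session.execute("SELECT release_version FROM system.local")
    }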
objc[56115]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:49:59,135 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native method will be unavailable. | |
WARN 16:49:59,137 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.auth.password=cassandra | |
spark.cassandra.auth.username=cassandra | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=<same jar list as above> | |
spark.master=spark://127.0.0.1:7777 | |
[info] MultiThreadedSpec: | |
[info] A Spark Context | |
[info] - should be able to read a Cassandra table in different threads (5 seconds, 711 milliseconds) | |
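The multi-threaded test reads the same table concurrently from a single SparkContext, which is thread-safe for job submission. A sketch along these lines (table names invented) is enough to exercise it:

    import scala.concurrent.{Await, Future}
    import scala.concurrent.ExecutionContext.Implicits.global
    import scala.concurrent.duration._
    import com.datastax.spark.connector._

    // Submit several read jobs in parallel from different threads.
    val counts = (1 to 4).map { _ =>
      Future(sc.cassandraTable("test", "kv").count())
    }
    counts.foreach(f => Await.result(f, 1.minute))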
[info] CassandraConnectorSourceSpec: | |
[info] CassandraConnectorSource | |
org.apache.spark.metrics.CassandraConnectorSource | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.auth.password=cassandra | |
spark.cassandra.auth.username=cassandra | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=<same jar list as above> | |
spark.master=spark://127.0.0.1:7777 | |
[info] - should be initialized when it was specified in metrics properties (450 milliseconds) | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.auth.password=cassandra | |
spark.cassandra.auth.username=cassandra | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=<same jar list as above> | |
spark.master=spark://127.0.0.1:7777 | |
[info] - should not be initialized when it wasn't specified in metrics properties (265 milliseconds) | |
[info] - should be able to create a new instance in the executor environment only once (2 milliseconds) | |
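CassandraConnectorSource is only registered when Spark's metrics configuration names it, which is what the first two tests toggle. A sketch of the relevant metrics.properties entries; the "cassandra-connector" source name is an assumption, while the class name is the one printed above:

    # metrics.properties (instance.source.<name>.class pattern)
    driver.source.cassandra-connector.class=org.apache.spark.metrics.CassandraConnectorSource
    executor.source.cassandra-connector.class=org.apache.spark.metrics.CassandraConnectorSource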
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.auth.password=cassandra | |
spark.cassandra.auth.username=cassandra | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=<same jar list as above> | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraDataFrameSpec: | |
[info] A DataFrame | |
INFO 16:50:13,670 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:50:13,671 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should be able to be created programmatically (4 seconds, 883 milliseconds) | |
INFO 16:50:19,031 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:50:19,031 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
INFO 16:50:20,007 org.apache.spark.Logging$class (Logging.scala:59) - filters: | |
INFO 16:50:20,007 org.apache.spark.Logging$class (Logging.scala:59) - pushdown filters: ArrayBuffer() | |
[info] - should be able to be saved programmatically (1 second, 937 milliseconds) | |
[info] - should error out with a sensible message when a table can't be found (20 milliseconds) | |
[info] - should provide useful suggestions if a table can't be found but a close match exists (2 milliseconds) | |
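The DataFrame round trip above uses the same data source programmatically. A hedged Spark 1.4 sketch; keyspace and table names are invented and the target table must already exist:

    import org.apache.spark.sql.{SQLContext, SaveMode}

    val sqlContext = new SQLContext(sc)
    val df = sqlContext.read
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test", "table" -> "kv"))
      .load()

    // Write the rows back out to another (pre-created) Cassandra table.
    df.write
      .format("org.apache.spark.sql.cassandra")
      .options(Map("keyspace" -> "test", "table" -> "kv_copy"))
      .mode(SaveMode.Append)
      .save()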
objc[56243]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:50:24,430 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native method will be unavailable. | |
WARN 16:50:24,433 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
[info] CassandraSSLConnectorSpec: | |
[info] A CassandraConnector | |
[info] - should be able to use a secure connection when using native protocol (329 milliseconds) | |
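SSL, like auth, is driven purely by connection properties. A sketch with the property names as used by connector 1.4 (the paths and password are invented; verify the exact keys against CassandraConnectorConf):

    import org.apache.spark.SparkConf
    import com.datastax.spark.connector.cql.CassandraConnector

    val conf = new SparkConf()
      .set("spark.cassandra.connection.host", "127.0.0.1")
      .set("spark.cassandra.connection.ssl.enabled", "true")
      .set("spark.cassandra.connection.ssl.trustStore.path", "/path/to/truststore.jks")
      .set("spark.cassandra.connection.ssl.trustStore.password", "secret")

    CassandraConnector(conf).withSessionDo { session =>
      session.execute("SELECT release_version FROM system.local")
    }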
objc[56245]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:50:32,097 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native method will be unavailable. | |
WARN 16:50:32,099 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
[info] SchemaSpec: | |
[info] A Schema | |
[info] - should allow to get a list of keyspaces (1 millisecond) | |
[info] - should allow to look up a keyspace by name (0 milliseconds) | |
[info] A KeyspaceDef | |
[info] - should allow to get a list of tables in the given keyspace (1 millisecond) | |
[info] - should allow to look up a table by name (1 millisecond) | |
[info] A TableDef | |
[info] - should allow to read column definitions by name (0 milliseconds) | |
[info] - should allow to read primary key column definitions (7 milliseconds) | |
[info] - should allow to read partitioning key column definitions (2 milliseconds) | |
[info] - should allow to read regular column definitions (1 millisecond) | |
[info] - should allow to read proper types of columns (0 milliseconds) | |
[info] - should allow to list fields of a user defined type (0 milliseconds) | |
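SchemaSpec exercises the connector's schema model. A sketch of walking it; the member names below follow the spec wording above and should be treated as approximate rather than authoritative:

    import com.datastax.spark.connector.cql.{CassandraConnector, Schema}

    val schema = Schema.fromCassandra(CassandraConnector(sc.getConf))

    val keyspace = schema.keyspaceByName("test")   // look a keyspace up by name
    val table    = keyspace.tableByName("kv")      // and a table within it
    table.partitionKey                             // partitioning key column definitions
    table.primaryKey                               // partition key plus clustering columns
    table.regularColumns                           // everything else
    table.columnByName("value")                    // single column lookup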
[info] ScalaTest | |
[info] Run completed in 5 minutes, 29 seconds. | |
[info] Total number of tests run: 304 | |
[info] Suites: completed 23, aborted 0 | |
[info] Tests: succeeded 304, failed 0, canceled 0, ignored 9, pending 0 | |
[info] All tests passed. | |
[info] Passed: Total 304, Failed 0, Errors 0, Passed 304, Ignored 9 | |
objc[56248]: Class JavaLaunchHelper is implemented in both /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/bin/java and /Library/Java/JavaVirtualMachines/jdk1.7.0_79.jdk/Contents/Home/jre/lib/libinstrument.dylib. One of the two will be used. Which one is undefined. | |
WARN 16:50:42,637 org.apache.cassandra.utils.CLibrary (CLibrary.java:70) - JNA link failure, one or more native method will be unavailable. | |
WARN 16:50:42,639 org.apache.cassandra.service.CassandraDaemon (CassandraDaemon.java:81) - JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info. | |
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.master=spark://127.0.0.1:7777 | |
WARN 16:50:45,615 org.apache.hadoop.util.NativeCodeLoader (NativeCodeLoader.java:62) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable | |
WARN 16:50:45,718 org.apache.spark.Logging$class (Logging.scala:71) - Your hostname, ursus-major resolves to a loopback address: 127.0.0.1; using 192.168.1.105 instead (on interface en0) | |
WARN 16:50:45,719 org.apache.spark.Logging$class (Logging.scala:71) - Set SPARK_LOCAL_IP if you need to bind to another address | |
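
Each "Starting SparkContext" block dumps the SparkConf used by the test harness. Reproducing the first dump is straightforward (a sketch; values match the lines above):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("Test")
      .setMaster("spark://127.0.0.1:7777")
      .set("spark.cassandra.connection.host", "127.0.0.1")
      .set("spark.cassandra.connection.port", "9042")
      .set("spark.cleaner.ttl", "3600")   // periodic metadata cleanup, in seconds
    val sc = new SparkContext(conf)

The SPARK_LOCAL_IP warning above is harmless here: the hostname resolves to a loopback address, so Spark binds to the LAN interface instead.
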
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.11/kafka-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.11/simple-demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.11/demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.11/twitter-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.11/root_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/javassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.11/jars/kafka_2.11-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.11/jars/scalamock-scalatest-support_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.11/jars/akka-testkit_2.11-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.11/bundles/scalactic_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.11/jars/spark-streaming_2.11-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.11/bundles/scalatest_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.11/jars/scalamock-core_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraJavaRDDSpec: | |
[info] CassandraJavaRDD | |
[info] - should allow to read data as CassandraRows (4 seconds, 308 milliseconds) | |
[info] - should allow to read data as Java beans (2 seconds, 126 milliseconds) | |
[info] - should allow to read data as Java beans with inherited fields (585 milliseconds) | |
[info] - should allow to read data as Java beans with custom mapping defined by aliases (555 milliseconds) | |
[info] - should allow to read data as Java beans (with multiple constructors) (459 milliseconds) | |
[info] - should throw NoSuchMethodException when trying to read data as Java beans (without no-args constructor) (35 milliseconds) | |
[info] - should allow to read data as nested Java beans (433 milliseconds) | |
[info] - should allow to read data as deeply nested Java beans (442 milliseconds) | |
[info] - should allow to select a subset of columns (370 milliseconds) | |
[info] - should return selected columns (8 milliseconds) | |
[info] - should allow to use where clause to filter records (555 milliseconds) | |
[info] - should allow to read rows as an array of a single-column type supported by TypeConverter (1 second, 134 milliseconds) | |
[info] - should allow to read rows as an array of a single-column list (341 milliseconds) | |
[info] - should allow to read rows as an array of a single-column set (298 milliseconds) | |
[info] - should allow to read rows as an array of a single-column map (268 milliseconds) | |
[info] - should allow to read rows as an array of multi-column type (249 milliseconds) | |
[info] - should allow to read rows as an array of multi-column type with explicit column name mapping (251 milliseconds) | |
[info] - should allow to transform rows into KV pairs of two single-column types (279 milliseconds) | |
[info] - should allow to transform rows into KV pairs of a single-column type and a multi-column type (270 milliseconds) | |
[info] - should allow to transform rows into KV pairs of a multi-column type and a single-column type (302 milliseconds) | |
[info] - should allow to transform rows into KV pairs of multi-column types (205 milliseconds) | |
[info] - should allow to read Cassandra data as array of Integer (189 milliseconds) | |
[info] - should allow to change the default Cassandra Connector to a custom one (191 milliseconds) | |
[info] - should allow to read null columns (116 milliseconds) | |
[info] - should allow to fetch UDT columns (208 milliseconds) | |
[info] - should allow to fetch tuple columns (197 milliseconds) | |
[info] - should allow to read Cassandra table as Array of KV tuples of a case class and a tuple grouped by partition key (225 milliseconds) | |
[info] - should allow to set limit (87 milliseconds) | |
[info] - should allow to set ascending ordering (51 milliseconds) | |
[info] - should allow to set descending ordering (49 milliseconds) | |
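
CassandraJavaRDDSpec exercises the Java API wrapper around CassandraRDD. A minimal sketch of the bean-mapping read path, written in Scala against the japi classes (the Person bean and column names are hypothetical; as the NoSuchMethodException test above confirms, the bean needs a no-args constructor):

    import com.datastax.spark.connector.japi.CassandraJavaUtil.{javaFunctions, mapRowTo}

    // Hypothetical JavaBean with the no-args constructor the mapper requires.
    class Person {
      @scala.beans.BeanProperty var id: Integer = _
      @scala.beans.BeanProperty var name: String = _
    }

    // Read rows of test.people into Person beans, restricted to two columns
    // and filtered server-side by a where clause.
    val people = javaFunctions(sc)
      .cassandraTable("test", "people", mapRowTo(classOf[Person]))
      .select("id", "name")
      .where("name = ?", "Anna")
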
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.11/kafka-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.11/simple-demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.11/demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.11/twitter-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.11/root_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/javassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.11/jars/kafka_2.11-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.11/jars/scalamock-scalatest-support_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.11/jars/akka-testkit_2.11-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.11/bundles/scalactic_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.11/jars/spark-streaming_2.11-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.11/bundles/scalatest_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.11/jars/scalamock-core_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraJavaUtilSpec: | |
[info] CassandraJavaUtil | |
[info] - should allow to save beans (with multiple constructors) to Cassandra (4 seconds, 44 milliseconds) | |
[info] - should allow to save beans to Cassandra (103 milliseconds) | |
[info] - should allow to save beans with transient fields to Cassandra (116 milliseconds) | |
[info] - should allow to save beans with inherited fields to Cassandra (92 milliseconds) | |
[info] - should allow to save nested beans to Cassandra (89 milliseconds) | |
[info] - should allow to read rows as Tuple1 (572 milliseconds) | |
[info] - should allow to read rows as Tuple2 (390 milliseconds) | |
[info] - should allow to read rows as Tuple3 (338 milliseconds) | |
[info] - should allow to read rows as Tuple4 (355 milliseconds) | |
[info] - should allow to read rows as Tuple5 (300 milliseconds) | |
[info] - should allow to read rows as Tuple6 (306 milliseconds) | |
[info] - should allow to read rows as Tuple7 (285 milliseconds) | |
[info] - should allow to read rows as Tuple8 (283 milliseconds) | |
[info] - should allow to read rows as Tuple9 (317 milliseconds) | |
[info] - should allow to read rows as Tuple10 (350 milliseconds) | |
[info] - should allow to read rows as Tuple11 (376 milliseconds) | |
[info] - should allow to read rows as Tuple12 (309 milliseconds) | |
[info] - should allow to read rows as Tuple13 (303 milliseconds) | |
[info] - should allow to read rows as Tuple14 (225 milliseconds) | |
[info] - should allow to read rows as Tuple15 (225 milliseconds) | |
[info] - should allow to read rows as Tuple16 (210 milliseconds) | |
[info] - should allow to read rows as Tuple17 (210 milliseconds) | |
[info] - should allow to read rows as Tuple18 (244 milliseconds) | |
[info] - should allow to read rows as Tuple19 (228 milliseconds) | |
[info] - should allow to read rows as Tuple20 (220 milliseconds) | |
[info] - should allow to read rows as Tuple21 (179 milliseconds) | |
[info] - should allow to read rows as Tuple22 (186 milliseconds) | |
[info] - should allow to write Tuple1 to Cassandra (65 milliseconds) | |
[info] - should allow to write Tuple2 to Cassandra (67 milliseconds) | |
[info] - should allow to write Tuple3 to Cassandra (65 milliseconds) | |
[info] - should allow to write Tuple4 to Cassandra (74 milliseconds) | |
[info] - should allow to write Tuple5 to Cassandra (70 milliseconds) | |
[info] - should allow to write Tuple6 to Cassandra (100 milliseconds) | |
[info] - should allow to write Tuple7 to Cassandra (72 milliseconds) | |
[info] - should allow to write Tuple8 to Cassandra (69 milliseconds) | |
[info] - should allow to write Tuple9 to Cassandra (73 milliseconds) | |
[info] - should allow to write Tuple10 to Cassandra (75 milliseconds) | |
[info] - should allow to write Tuple11 to Cassandra (80 milliseconds) | |
[info] - should allow to write Tuple12 to Cassandra (78 milliseconds) | |
[info] - should allow to write Tuple13 to Cassandra (73 milliseconds) | |
[info] - should allow to write Tuple14 to Cassandra (79 milliseconds) | |
[info] - should allow to write Tuple15 to Cassandra (74 milliseconds) | |
[info] - should allow to write Tuple16 to Cassandra (74 milliseconds) | |
[info] - should allow to write Tuple17 to Cassandra (71 milliseconds) | |
[info] - should allow to write Tuple18 to Cassandra (76 milliseconds) | |
[info] - should allow to write Tuple19 to Cassandra (75 milliseconds) | |
[info] - should allow to write Tuple20 to Cassandra (76 milliseconds) | |
[info] - should allow to write Tuple21 to Cassandra (80 milliseconds) | |
[info] - should allow to write Tuple22 to Cassandra (91 milliseconds) | |
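
The Tuple1 through Tuple22 cases above use the tuple-based row mappers from CassandraJavaUtil. A minimal round-trip sketch for the Tuple2 case (table and column layout are hypothetical; the selected column order must match the tuple arity):

    import com.datastax.spark.connector.japi.CassandraJavaUtil._

    // Read (text, int) rows of test.kv as Tuple2 values.
    val pairs = javaFunctions(sc).cassandraTable("test", "kv",
      mapRowToTuple(classOf[String], classOf[Integer]))

    // Write them to a second table with the matching tuple-to-row mapper.
    javaFunctions(pairs)
      .writerBuilder("test", "kv_copy", mapTupleToRow(classOf[String], classOf[Integer]))
      .saveToCassandra()
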
Starting SparkContext with the following configuration: | |
spark.app.name=Test | |
spark.cassandra.connection.host=127.0.0.1 | |
spark.cassandra.connection.port=9042 | |
spark.cleaner.ttl=3600 | |
spark.jars=file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.11/spark-cassandra-connector-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/kafka-streaming/target/scala-2.11/kafka-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/simple-demos/target/scala-2.11/simple-demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/target/scala-2.11/demos_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-demos/twitter-streaming/target/scala-2.11/twitter-streaming_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.11/spark-cassandra-connector-embedded-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-assembly-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-it_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/spark-cassandra-connector-java/target/scala-2.11/spark-cassandra-connector-java-test_2.11-1.4.0-RC1-SNAPSHOT.jar,file:/Users/jlewandowski/Projects/OpenSource/spark-cassandra-connector/target/scala-2.11/root_2.11-1.4.0-RC1-SNAPSHOT-tests.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-all/jars/cassandra-all-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/net.java.dev.jna/jna/jars/jna-4.0.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.thinkaurelius.thrift/thrift-server/jars/thrift-server-0.3.7.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr-runtime/jars/antlr-runtime-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.addthis.metrics/reporter-config/jars/reporter-config-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.json-simple/json-simple/jars/json-simple-1.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-mockito/jars/powermock-api-mockito-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/javax.validation/validation-api/jars/validation-api-1.0.0.GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.hamcrest/hamcrest-core/jars/hamcrest-core-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/org.hibernate/hibernate-validator/jars/hibernate-validator-4.3.0.Final.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-reflect/jars/powermock-reflect-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.javassist/javassist/bundles/javassist-3.19.0-GA.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka_2.11/jars/kafka_2.11-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-scalatest-support_2.11/jars/scalamock-scalatest-support_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.yaml/snakeyaml/bundles/snakeyaml-1.11.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.supercsv/super-csv/jars/super-csv-2.1.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.typesafe.akka/akka-testkit_2.11/jars/akka-testkit_2.11-2.3.4.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalactic/scalactic_2.11/bundles/scalactic_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit/jars/junit-4.12.jar,file:/Users/jlewandowski/.ivy2/cache/com.101tec/zkclient/jars/zkclient-0.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.google.guava/guava/bundles/guava-16.0.jar,file:/Users/jlewandowski/.ivy2/cache/com.boundary/high-scale-lib/jars/high-scale-lib-1.0.6.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.cassandra/cassandra-thrift/jars/cassandra-thrift-2.1.5.jar,file:/Users/jlewandowski/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-api-support/jars/powermock-api-support-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.thrift/libthrift/jars/libthrift-0.9.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scala-tools.testing/test-interface/jars/test-interface-0.5.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.zookeeper/zookeeper/jars/zookeeper-3.4.6.jar,file:/Users/jlewandowski/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.3.jar,file:/Users/jlewandowski/.ivy2/cache/com.lmax/disruptor/jars/disruptor-3.0.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.spark/spark-streaming_2.11/jars/spark-streaming_2.11-1.4.0-tests.jar,file:/Users/jlewandowski/.ivy2/cache/net.sf.jopt-simple/jopt-simple/jars/jopt-simple-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4-common/jars/powermock-module-junit4-common-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/jline/jline/jars/jline-1.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.8.2.1.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-core/jars/powermock-core-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.jboss.logging/jboss-logging/jars/jboss-logging-3.1.0.CR2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/antlr/jars/antlr-3.5.2.jar,file:/Users/jlewandowski/.ivy2/cache/com.github.jbellis/jamm/jars/jamm-0.3.0.jar,file:/Users/jlewandowski/.ivy2/cache/org.mockito/mockito-all/jars/mockito-all-1.10.19.jar,file:/Users/jlewandowski/.ivy2/cache/org.mindrot/jbcrypt/jars/jbcrypt-0.3m.jar,file:/Users/jlewandowski/.ivy2/cache/org.powermock/powermock-module-junit4/jars/powermock-module-junit4-1.6.2.jar,file:/Users/jlewandowski/.ivy2/cache/junit/junit-dep/jars/junit-dep-4.10.jar,file:/Users/jlewandowski/.ivy2/cache/com.novocode/junit-interface/jars/junit-interface-0.10.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalatest/scalatest_2.11/bundles/scalatest_2.11-2.2.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.scalamock/scalamock-core_2.11/jars/scalamock-core_2.11-3.2.jar,file:/Users/jlewandowski/.ivy2/cache/org.antlr/ST4/jars/ST4-4.0.8.jar | |
spark.master=spark://127.0.0.1:7777 | |
[info] CassandraJavaPairRDDSpec: | |
[info] CassandraJavaPairRDD | |
[info] - should allow to reduce by key (4 seconds, 167 milliseconds) | |
[info] - should allow to use spanBy method (380 milliseconds) | |
[info] - should allow to use spanByKey method (314 milliseconds) | |
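
CassandraJavaPairRDD adds key/value-oriented operations such as reduceByKey, spanBy, and spanByKey; spanByKey groups physically consecutive rows sharing the same key, which is cheap when the key is the partition key and rows arrive ordered by clustering column. A sketch (keyspace, table, and column names are hypothetical, assuming the two-RowReaderFactory cassandraTable overload):

    import com.datastax.spark.connector.japi.CassandraJavaUtil._

    // Key/value read of test.events, then group consecutive rows by key.
    val spanned = javaFunctions(sc)
      .cassandraTable("test", "events",
        mapColumnTo(classOf[Integer]), mapColumnTo(classOf[String]))
      .select("key", "value")
      .spanByKey()   // JavaPairRDD[Integer, java.util.Collection[String]]
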
[info] Test run started | |
[info] Test com.datastax.spark.connector.CassandraStreamingJavaUtilTest.testJavaFunctions2 started | |
[info] Test com.datastax.spark.connector.CassandraStreamingJavaUtilTest.testJavaFunctions3 started | |
[info] Test com.datastax.spark.connector.CassandraStreamingJavaUtilTest.testJavaFunctions6 started | |
[info] Test com.datastax.spark.connector.CassandraStreamingJavaUtilTest.testJavaFunctions7 started | |
[info] Test run finished: 0 failed, 0 ignored, 4 total, 0.241s | |
[info] Test run started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapColumnToListOf started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testJavaFunctions started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapColumnToMapOf started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapColumnToSetOf started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapColumnTo1 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapColumnTo2 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapColumnTo3 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeConverter1 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeConverter2 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeConverter3 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeConverter4 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeTag1 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeTag2 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testTypeTag3 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testJavaFunctions1 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testJavaFunctions4 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testJavaFunctions5 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapRowTo1 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapRowTo2 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapRowTo3 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testMapRowTo4 started | |
[info] Test com.datastax.spark.connector.CassandraJavaUtilTest.testConvertToMap started | |
[info] Test run finished: 0 failed, 0 ignored, 22 total, 0.137s | |
[info] ScalaTest | |
[info] Run completed in 46 seconds, 57 milliseconds. | |
[info] Total number of tests run: 82 | |
[info] Suites: completed 3, aborted 0 | |
[info] Tests: succeeded 82, failed 0, canceled 0, ignored 0, pending 0 | |
[info] All tests passed. | |
[info] Passed: Total 108, Failed 0, Errors 0, Passed 108 | |
[success] Total time: 379 s, completed Sep 17, 2015 4:51:26 PM | |
Tests succeeded | |
stopping org.apache.spark.deploy.worker.Worker | |
stopping org.apache.spark.deploy.master.Master | |
ursus-major:spark-cassandra-connector jlewandowski$ |