@jayunit100
Created August 27, 2014 00:41
Using as default JAVA_HOME.
Note, this will be overridden by -java-home if it is set.
[info] Loading project definition from /home/jay/Development/spark/project/project
[info] Loading project definition from /home/jay/.sbt/0.13/staging/ec3aa8f39111944cc5f2/sbt-pom-reader/project
[warn] Multiple resolvers having different access mechanism configured with same name 'sbt-plugin-releases'. To avoid conflict, Remove duplicate project resolvers (`resolvers`) or rename publishing resolver (`publishTo`).
[info] Loading project definition from /home/jay/Development/spark/project
[info] Set current project to spark-parent (in build file:/home/jay/Development/spark/)
[info] TriangleCountSuite:
[info] - Count a single triangle
[info] - Count two triangles
[info] - Count two triangles with bi-directed edges
[info] - Count a single triangle with duplicate edges
[info] GraphOpsSuite:
[info] - joinVertices
[info] - collectNeighborIds
[info] - filter
[info] - collectEdgesCycleDirectionOut
[info] - collectEdgesCycleDirectionIn
[info] - collectEdgesCycleDirectionEither
[info] - collectEdgesChainDirectionOut
[info] - collectEdgesChainDirectionIn
[info] - collectEdgesChainDirectionEither
[info] LabelPropagationSuite:
[info] - Label Propagation
[info] PageRankSuite:
[info] - Star PageRank
[info] - Grid PageRank
[info] - Chain PageRank
[info] ShortestPathsSuite:
[info] - Shortest Path Computations
[info] GraphSuite:
[info] - Graph.fromEdgeTuples
[info] - Graph.fromEdges
[info] - Graph.apply
[info] - triplets
[info] - partitionBy
[info] - mapVertices
[info] - mapVertices changing type with same erased type
[info] - mapEdges
[info] - mapTriplets
[info] - reverse
[info] - reverse with join elimination
[info] - subgraph
[info] - mask
[info] - groupEdges
[info] - mapReduceTriplets
[info] - outerJoinVertices
[info] - more edge partitions than vertex partitions
[info] StronglyConnectedComponentsSuite:
[info] - Island Strongly Connected Components
[info] - Cycle Strongly Connected Components
[info] - 2 Cycle Strongly Connected Components
[info] EdgeSuite:
[info] - compare
[info] SerializerSuite:
[info] - IntAggMsgSerializer
[info] - LongAggMsgSerializer
[info] - DoubleAggMsgSerializer
[info] - variable long encoding
[info] EdgeTripletIteratorSuite:
[info] - iterator.toList
[info] PregelSuite:
[info] - 1 iteration
[info] - chain propagation
[info] VertexRDDSuite:
[info] - filter
[info] - mapValues
[info] - diff
[info] - leftJoin
[info] - innerJoin
[info] - aggregateUsingIndex
[info] VertexPartitionSuite:
[info] - isDefined, filter
[info] - map
[info] - diff
[info] - leftJoin
[info] - innerJoin
[info] - createUsingIndex
[info] - innerJoinKeepLeft
[info] - aggregateUsingIndex
[info] - reindex
[info] - serialization
[info] ConnectedComponentsSuite:
[info] - Grid Connected Components
[info] - Reverse Grid Connected Components
[info] - Chain Connected Components
[info] - Reverse Chain Connected Components
[info] - Connected Components on a Toy Connected Graph
[info] SVDPlusPlusSuite:
[info] - Test SVD++ with mean square error on training set
[info] EdgePartitionSuite:
[info] - reverse
[info] - map
[info] - filter
[info] - groupEdges
[info] - upgradeIterator
[info] - indexIterator
[info] - innerJoin
[info] - isActive, numActives, replaceActives
[info] - serialization
[info] BytecodeUtilsSuite:
[info] - closure invokes a method
[info] - closure inside a closure invokes a method
[info] - closure inside a closure inside a closure invokes a method
[info] - closure calling a function that invokes a method
[info] - closure calling a function that invokes a method which uses another closure
[info] - nested closure
[info] ScalaTest
[info] Run completed in 1 minute, 37 seconds.
[info] Total number of tests run: 83
[info] Suites: completed 17, aborted 0
[info] Tests: succeeded 83, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 83, Failed 0, Errors 0, Passed 83
[info] BagelSuite:
[info] - halting by voting
[info] - halting by message silence
[info] - large number of iterations
[info] - using non-default persistence level
[info] ScalaTest
[info] Run completed in 1 minute, 50 seconds.
[info] Total number of tests run: 4
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 4, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 4, Failed 0, Errors 0, Passed 4
[info] GeneratedEvaluationSuite:
[info] - literals
[info] - 3VL Not
[info] - 3VL AND
[info] - 3VL OR
[info] - 3VL =
[info] - IN
[info] - LIKE literal Regular Expression
[info] - LIKE Non-literal Regular Expression
[info] - RLIKE literal Regular Expression
[info] - RLIKE Non-literal Regular Expression
[info] - data type casting
[info] - timestamp
[info] - timestamp casting
[info] - null checking
[info] - case when
[info] - complex type
[info] - arithmetic
[info] - BinaryComparison
[info] - StringComparison
[info] - Substring
[info] - multithreaded eval
[info] AnalysisSuite:
[info] - analyze project
[info] - resolve relations
[info] - throw errors for unresolved attributes during analysis
[info] HiveTypeCoercionSuite:
[info] - tightest common bound for numeric and boolean types
[info] FilterPushdownSuite:
[info] - eliminate subqueries
[info] - simple push down
[info] - can't push without rewrite
[info] - filters: combines filters
[info] - joins: push to either side
[info] - joins: push to one side
[info] - joins: rewrite filter to push to either side
[info] - joins: push down left outer join #1
[info] - joins: push down right outer join #1
[info] - joins: push down left outer join #2
[info] - joins: push down right outer join #2
[info] - joins: push down left outer join #3
[info] - joins: push down right outer join #3
[info] - joins: push down left outer join #4
[info] - joins: push down right outer join #4
[info] - joins: push down left outer join #5
[info] - joins: push down right outer join #5
[info] - joins: can't push down
[info] - joins: conjunctive predicates
[info] - joins: conjunctive predicates #2
[info] - joins: conjunctive predicates #3
[info] CombiningLimitsSuite:
[info] - limits: combines two limits
[info] - limits: combines three limits
[info] DistributionSuite:
[info] - HashPartitioning is the output partitioning
[info] - RangePartitioning is the output partitioning
[info] LikeSimplificationSuite:
[info] - simplify Like into StartsWith
[info] - simplify Like into EndsWith
[info] - simplify Like into Contains
[info] - simplify Like into EqualTo
[info] TreeNodeSuite:
[info] - top node changed
[info] - one child changed
[info] - no change
[info] - collect
[info] - pre-order transform
[info] - post-order transform
[info] - transform works on nodes with Option children
[info] ConstantFoldingSuite:
[info] - eliminate subqueries
[info] - Constant folding test: expressions only have literals
[info] - Constant folding test: expressions have attribute references and literals in arithmetic operations
[info] - Constant folding test: expressions have attribute references and literals in predicates
[info] - Constant folding test: expressions have foldable functions
[info] - Constant folding test: expressions have nonfoldable functions
[info] - Constant folding test: expressions have null literals
[info] GeneratedMutableEvaluationSuite:
[info] - literals
[info] - 3VL Not
[info] - 3VL AND
[info] - 3VL OR
[info] - 3VL =
[info] - IN
[info] - LIKE literal Regular Expression
[info] - LIKE Non-literal Regular Expression
[info] - RLIKE literal Regular Expression
[info] - RLIKE Non-literal Regular Expression
[info] - data type casting
[info] - timestamp
[info] - timestamp casting
[info] - null checking
[info] - case when
[info] - complex type
[info] - arithmetic
[info] - BinaryComparison
[info] - StringComparison
[info] - Substring
[info] ExpressionEvaluationSuite:
[info] - literals
[info] - 3VL Not
[info] - 3VL AND
[info] - 3VL OR
[info] - 3VL =
[info] - IN
[info] - LIKE literal Regular Expression
[info] - LIKE Non-literal Regular Expression
[info] - RLIKE literal Regular Expression
[info] - RLIKE Non-literal Regular Expression
[info] - data type casting
[info] - timestamp
[info] - timestamp casting
[info] - null checking
[info] - case when
[info] - complex type
[info] - arithmetic
[info] - BinaryComparison
[info] - StringComparison
[info] - Substring
[info] SimplifyCaseConversionExpressionsSuite:
[info] - simplify UPPER(UPPER(str))
[info] - simplify UPPER(LOWER(str))
[info] - simplify LOWER(UPPER(str))
[info] - simplify LOWER(LOWER(str))
[info] ExpressionOptimizationSuite:
[info] - literals
[info] - 3VL Not
[info] - 3VL AND
[info] - 3VL OR
[info] - 3VL =
[info] - IN
[info] - LIKE literal Regular Expression
[info] - LIKE Non-literal Regular Expression
[info] - RLIKE literal Regular Expression
[info] - RLIKE Non-literal Regular Expression
[info] - data type casting
[info] - timestamp
[info] - timestamp casting
[info] - null checking
[info] - case when
[info] - complex type
[info] - arithmetic
[info] - BinaryComparison
[info] - StringComparison
[info] - Substring
[info] RuleExecutorSuite:
[info] - only once
[info] - to fixed point
[info] - to maxIterations
[info] PlanTest:
[info] ScalaReflectionSuite:
[info] - primitive data
[info] - nullable data
[info] - optinal data
[info] - complex data
[info] - generic data
[info] - tuple data
[info] - get data type of a value
[info] ScalaTest
[info] Run completed in 2 minutes, 59 seconds.
[info] Total number of tests run: 142
[info] Suites: completed 16, aborted 0
[info] Tests: succeeded 142, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 142, Failed 0, Errors 0, Passed 142
[info] FailureSuite:
[info] - multiple failures with map
[info] - multiple failures with updateStateByKey
[info] StreamingListenerSuite:
[info] - batch info reporting
[info] - receiver info reporting
[info] NetworkReceiverSuite:
[info] - network receiver life cycle
[info] - block generator
[info] - block generator throttling
[info] UISuite:
[info] - streaming tab in spark UI !!! IGNORED !!!
[info] RateLimitedOutputStreamSuite:
[info] - write
[info] StreamingContextSuite:
[info] - from no conf constructor
[info] - from no conf + spark home
[info] - from no conf + spark home + env
[info] - from conf with settings
[info] - from existing SparkContext
[info] - from existing SparkContext with settings
[info] - from checkpoint
[info] - start and stop state check
[info] - start multiple times
[info] - stop multiple times
[info] - stop before start and start after stop
[info] - stop only streaming context
[info] - stop gracefully
[info] - awaitTermination
[info] - awaitTermination after stop
[info] - awaitTermination with error in task
[info] - awaitTermination with error in job generation
[info] WindowOperationsSuite:
[info] - window - basic window
[info] - window - tumbling window
[info] - window - larger window
[info] - window - non-overlapping window
[info] - window - persistence level
[info] - reduceByKeyAndWindow - basic reduction
[info] - reduceByKeyAndWindow - key already in window and new value added into window
[info] - reduceByKeyAndWindow - new key added into window
[info] - reduceByKeyAndWindow - key removed from window
[info] - reduceByKeyAndWindow - larger slide time
[info] - reduceByKeyAndWindow - big test
[info] - reduceByKeyAndWindow with inverse function - basic reduction
[info] - reduceByKeyAndWindow with inverse function - key already in window and new value added into window
[info] - reduceByKeyAndWindow with inverse function - new key added into window
[info] - reduceByKeyAndWindow with inverse function - key removed from window
[info] - reduceByKeyAndWindow with inverse function - larger slide time
[info] - reduceByKeyAndWindow with inverse function - big test
[info] - reduceByKeyAndWindow with inverse and filter functions - big test
[info] - groupByKeyAndWindow
[info] - countByWindow
[info] - countByValueAndWindow
[info] BasicOperationsSuite:
[info] - map
[info] - flatMap
[info] - filter
[info] - glom
[info] - mapPartitions
[info] - repartition (more partitions)
[info] - repartition (fewer partitions)
[info] - groupByKey
[info] - reduceByKey
[info] - reduce
[info] - count
[info] - countByValue
[info] - mapValues
[info] - flatMapValues
[info] - union
[info] - StreamingContext.union
[info] - transform
[info] - transformWith
[info] - StreamingContext.transform
[info] - cogroup
[info] - join
[info] - leftOuterJoin
[info] - rightOuterJoin
[info] - updateStateByKey
[info] - updateStateByKey - object lifecycle
[info] - slice
[info] - slice - has not been initialized
[info] - rdd cleanup - map and window
[info] - rdd cleanup - updateStateByKey
[info] - rdd cleanup - input blocks and persisted RDDs
[info] CheckpointSuite:
[info] - basic rdd checkpoints + dstream graph checkpoint recovery
[info] - persistence of conf through checkpoints
[info] - recovery with map and reduceByKey operations
[info] - recovery with invertible reduceByKeyAndWindow operation
[info] - recovery with updateStateByKey operation
[info] - recovery with file input stream
[info] InputStreamsSuite:
[info] - socket input stream
[info] - file input stream
[info] - actor input stream !!! IGNORED !!!
[info] - multi-thread receiver
[info] - queue input stream - oneAtATime=true
[info] - queue input stream - oneAtATime=false
[info] Test run started
[info] Test org.apache.spark.streaming.JavaAPISuite.testMap started
[info] Test org.apache.spark.streaming.JavaAPISuite.testUpdateStateByKey started
[info] Test org.apache.spark.streaming.JavaAPISuite.testWindowWithSlideDuration started
[info] Test org.apache.spark.streaming.JavaAPISuite.testFilter started
[info] Test org.apache.spark.streaming.JavaAPISuite.testRepartitionMorePartitions started
[info] Test org.apache.spark.streaming.JavaAPISuite.testRepartitionFewerPartitions started
[info] Test org.apache.spark.streaming.JavaAPISuite.testGlom started
[info] Test org.apache.spark.streaming.JavaAPISuite.testTransformWith started
[info] Test org.apache.spark.streaming.JavaAPISuite.testVariousTransformWith started
[info] Test org.apache.spark.streaming.JavaAPISuite.testStreamingContextTransform started
[info] Test org.apache.spark.streaming.JavaAPISuite.testFlatMap started
[info] Test org.apache.spark.streaming.JavaAPISuite.testPairFlatMap started
[info] Test org.apache.spark.streaming.JavaAPISuite.testUnion started
[info] Test org.apache.spark.streaming.JavaAPISuite.testPairMap started
[info] Test org.apache.spark.streaming.JavaAPISuite.testInitialization started
[info] Test org.apache.spark.streaming.JavaAPISuite.testCombineByKey started
[info] Test org.apache.spark.streaming.JavaAPISuite.testCountByValue started
[info] Test org.apache.spark.streaming.JavaAPISuite.testGroupByKeyAndWindow started
[info] Test org.apache.spark.streaming.JavaAPISuite.testCountByValueAndWindow started
[info] Test org.apache.spark.streaming.JavaAPISuite.testPairTransform started
[info] Test org.apache.spark.streaming.JavaAPISuite.testPairToNormalRDDTransform started
[info] Test org.apache.spark.streaming.JavaAPISuite.testMapValues started
[info] Test org.apache.spark.streaming.JavaAPISuite.testFlatMapValues started
[info] Test org.apache.spark.streaming.JavaAPISuite.testCoGroup started
[info] Test org.apache.spark.streaming.JavaAPISuite.testJoin started
[info] Test org.apache.spark.streaming.JavaAPISuite.testLeftOuterJoin started
[info] Test org.apache.spark.streaming.JavaAPISuite.testCheckpointMasterRecovery started
[info] Test org.apache.spark.streaming.JavaAPISuite.testSocketTextStream started
[info] Test org.apache.spark.streaming.JavaAPISuite.testSocketString started
[info] Test org.apache.spark.streaming.JavaAPISuite.testTextFileStream started
[error] Test org.apache.spark.streaming.JavaAPISuite.testTextFileStream failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at org.apache.spark.ContextCleaner.start(ContextCleaner.scala:90)
[error]  at org.apache.spark.SparkContext$$anonfun$22.apply(SparkContext.scala:332)
[error]  at org.apache.spark.SparkContext$$anonfun$22.apply(SparkContext.scala:332)
[error]  at scala.Option.foreach(Option.scala:236)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:332)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testTextFileStream failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testRawSocketStream started
[error] Test org.apache.spark.streaming.JavaAPISuite.testRawSocketStream failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testRawSocketStream failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testMapPartitions started
[error] Test org.apache.spark.streaming.JavaAPISuite.testMapPartitions failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testMapPartitions failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testReduce started
[error] Test org.apache.spark.streaming.JavaAPISuite.testReduce failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
[error]  at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
[error]  at scala.concurrent.forkjoin.ForkJoinPool.externalPush(ForkJoinPool.java:1829)
[error]  at scala.concurrent.forkjoin.ForkJoinPool.execute(ForkJoinPool.java:2955)
[error]  at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinPool.execute(AbstractDispatcher.scala:374)
[error]  at akka.dispatch.ExecutorServiceDelegate$class.execute(ThreadPoolBuilder.scala:212)
[error]  at akka.dispatch.Dispatcher$LazyExecutorServiceDelegate.execute(Dispatcher.scala:43)
[error]  at akka.dispatch.Dispatcher.registerForExecution(Dispatcher.scala:118)
[error]  at akka.dispatch.Dispatcher.dispatch(Dispatcher.scala:59)
[error]  at akka.actor.dungeon.Dispatch$class.sendMessage(Dispatch.scala:120)
[error]  at akka.actor.ActorCell.sendMessage(ActorCell.scala:338)
[error]  at akka.actor.Cell$class.sendMessage(ActorCell.scala:259)
[error]  at akka.actor.ActorCell.sendMessage(ActorCell.scala:338)
[error]  at akka.actor.RepointableActorRef.$bang(RepointableActorRef.scala:157)
[error]  at akka.event.EventStream.publish(EventStream.scala:40)
[error]  at akka.event.EventStream.publish(EventStream.scala:26)
[error]  at akka.event.SubchannelClassification$$anonfun$publish$1.apply(EventBus.scala:168)
[error]  at akka.event.SubchannelClassification$$anonfun$publish$1.apply(EventBus.scala:168)
[error]  at scala.collection.immutable.Set$Set1.foreach(Set.scala:74)
[error]  at akka.event.SubchannelClassification$class.publish(EventBus.scala:168)
[error]  at akka.event.EventStream.publish(EventStream.scala:26)
[error]  at akka.event.BusLogging.notifyInfo(Logging.scala:1035)
[error]  at akka.event.LoggingAdapter$class.info(Logging.scala:908)
[error]  at akka.event.BusLogging.info(Logging.scala:1023)
[error]  at akka.remote.Remoting.start(Remoting.scala:163)
[error]  at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
[error]  at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
[error]  at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
[error]  at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testReduce failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testReduceByWindow started
[ERROR] [08/26/2014 11:23:10.316] [spark-akka.actor.default-dispatcher-3] [ActorSystem(spark)] Uncaught fatal error from thread [spark-akka.actor.default-dispatcher-3] shutting down ActorSystem [spark]
java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
at scala.concurrent.forkjoin.ForkJoinPool.deregisterWorker(ForkJoinPool.java:1795)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:117)
[error] Test org.apache.spark.streaming.JavaAPISuite.testReduceByWindow failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testReduceByWindow failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testQueueStream started
[error] Test org.apache.spark.streaming.JavaAPISuite.testQueueStream failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testQueueStream failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
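The paired `NullPointerException` at `LocalJavaStreamingContext.tearDown` is a secondary failure: `setUp` threw before the streaming context field was assigned, so `tearDown` dereferences null. A null-guarded fixture pattern avoids the cascade (a hypothetical sketch mirroring the names in the trace, not the actual `LocalJavaStreamingContext` source):

```java
// Hypothetical fixture sketch: the field stands in for the
// JavaStreamingContext seen in the stack traces above.
public class LocalStreamingFixture {
    private AutoCloseable ssc;

    public void setUp() throws Exception {
        // Context creation may throw (as in the OOM traces above)
        // before ssc is ever assigned, leaving it null.
    }

    public void tearDown() {
        if (ssc != null) {  // guard prevents the follow-on NPE
            try {
                ssc.close();
            } catch (Exception ignored) {
                // best-effort cleanup; the primary failure is reported elsewhere
            }
            ssc = null;
        }
    }

    public static void main(String[] args) {
        // tearDown must be safe even when setUp never completed.
        new LocalStreamingFixture().tearDown();
        System.out.println("tearDown is null-safe");
    }
}
```

With a guard like this, each test would report only the root-cause OOM instead of a second spurious NPE per test.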
[info] Test org.apache.spark.streaming.JavaAPISuite.testTransform started
[error] Test org.apache.spark.streaming.JavaAPISuite.testTransform failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testTransform failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testVariousTransform started
[error] Test org.apache.spark.streaming.JavaAPISuite.testVariousTransform failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testVariousTransform failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testPairFilter started
[error] Test org.apache.spark.streaming.JavaAPISuite.testPairFilter failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testPairFilter failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testPairMapPartitions started
[error] Test org.apache.spark.streaming.JavaAPISuite.testPairMapPartitions failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testPairMapPartitions failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testPairMap2 started
[error] Test org.apache.spark.streaming.JavaAPISuite.testPairMap2 failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
[error]  at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
[error]  at scala.concurrent.forkjoin.ForkJoinPool.fullExternalPush(ForkJoinPool.java:1905)
[error]  at scala.concurrent.forkjoin.ForkJoinPool.externalPush(ForkJoinPool.java:1834)
[error]  at scala.concurrent.forkjoin.ForkJoinPool.execute(ForkJoinPool.java:2955)
[error]  at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinPool.execute(AbstractDispatcher.scala:374)
[error]  at akka.dispatch.ExecutorServiceDelegate$class.execute(ThreadPoolBuilder.scala:212)
[error]  at akka.dispatch.Dispatcher$LazyExecutorServiceDelegate.execute(Dispatcher.scala:43)
[error]  at akka.dispatch.Dispatcher.registerForExecution(Dispatcher.scala:118)
[error]  at akka.dispatch.MessageDispatcher.attach(AbstractDispatcher.scala:134)
[error]  at akka.actor.dungeon.Dispatch$class.start(Dispatch.scala:84)
[error]  at akka.actor.ActorCell.start(ActorCell.scala:338)
[error]  at akka.actor.LocalActorRef.start(ActorRef.scala:321)
[error]  at akka.actor.LocalActorRefProvider.init(ActorRefProvider.scala:619)
[error]  at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:157)
[error]  at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
[error]  at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
[error]  at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testPairMap2 failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testPairToPairFlatMapWithChangingTypes started
[error] Test org.apache.spark.streaming.JavaAPISuite.testPairToPairFlatMapWithChangingTypes failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testPairToPairFlatMapWithChangingTypes failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testPairGroupByKey started
[error] Test org.apache.spark.streaming.JavaAPISuite.testPairGroupByKey failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testPairGroupByKey failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testPairReduceByKey started
[error] Test org.apache.spark.streaming.JavaAPISuite.testPairReduceByKey failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testPairReduceByKey failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testCount started
[error] Test org.apache.spark.streaming.JavaAPISuite.testCount failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testCount failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindowWithInverse started
[error] Test org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindowWithInverse failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindowWithInverse failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testWindow started
[error] Test org.apache.spark.streaming.JavaAPISuite.testWindow failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testWindow failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindow started
[error] Test org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindow failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[error]  at sun.reflect.GeneratedConstructorAccessor21.newInstance(Unknown Source)
[error]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[error]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[error]  at scala.util.Try$.apply(Try.scala:161)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[error]  at scala.util.Success.flatMap(Try.scala:200)
[error]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
[error]  at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
[error]  at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
[error]  at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
[error]  at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
[error]  at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
[error]  at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
[error]  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
[error]  at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
[error]  at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
[error]  at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
[error]  at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:555)
[error]  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:567)
[error]  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:91)
[error]  at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:61)
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.setUp(LocalJavaStreamingContext.java:31)
[error]  ...
[error] Test org.apache.spark.streaming.JavaAPISuite.testReduceByKeyAndWindow failed: java.lang.NullPointerException: null
[error]  at org.apache.spark.streaming.LocalJavaStreamingContext.tearDown(LocalJavaStreamingContext.java:37)
[error]  ...
[info] Test run finished: 36 failed, 0 ignored, 47 total, 38.262s
[info] Test run started
[info] Test org.apache.spark.streaming.JavaReceiverAPISuite.testReceiver started
[error] Test org.apache.spark.streaming.JavaReceiverAPISuite.testReceiver failed: java.lang.OutOfMemoryError: unable to create new native thread
[error]  at java.lang.Thread.start0(Native Method)
[error]  at java.lang.Thread.start(Thread.java:714)
[error]  at org.apache.spark.streaming.TestServer.start(InputStreamsSuite.scala:372)
[error]  at org.apache.spark.streaming.JavaReceiverAPISuite.testReceiver(JavaReceiverAPISuite.java:57)
[error]  ...
[info] Test run finished: 1 failed, 0 ignored, 1 total, 0.001s
[info] ScalaTest
[info] Run completed in 8 minutes, 18 seconds.
[info] Total number of tests run: 87
[info] Suites: completed 10, aborted 0
[info] Tests: succeeded 87, failed 0, canceled 0, ignored 2, pending 0
[info] All tests passed.
[error] Failed: Total 135, Failed 19, Errors 0, Passed 116, Ignored 2
[error] Failed tests:
[error]  org.apache.spark.streaming.JavaAPISuite
[error]  org.apache.spark.streaming.JavaReceiverAPISuite
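Editor's note: every failure in the two suites above is the same `java.lang.OutOfMemoryError: unable to create new native thread`, thrown while Akka spins up its scheduler thread. Despite the `OutOfMemoryError` name, this almost always means the OS refused to create another native thread (per-user task limit or thread-stack exhaustion), not that the JVM heap is full. A minimal diagnostic sketch for Linux (exact limits and counts vary by distribution):

```shell
# Per-user limit on processes; Linux counts every thread as a task,
# so a parallel sbt test run can exhaust this quickly.
ulimit -u

# How many tasks (processes + threads) the current user holds right now.
ps -eLf --no-headers | wc -l

# System-wide ceiling on threads.
cat /proc/sys/kernel/threads-max
```

If the second number is close to the first, new `Thread.start()` calls will fail exactly as in the traces above.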
[info] TimeStampedHashMapSuite:
[info] - HashMap - basic test
[info] - TimeStampedHashMap - basic test
[info] - TimeStampedHashMap - threading safety test
[info] - TimeStampedWeakValueHashMap - basic test
[info] - TimeStampedWeakValueHashMap - threading safety test
[info] - TimeStampedHashMap - clearing by timestamp
[info] - TimeStampedWeakValueHashMap - clearing by timestamp
[info] - TimeStampedWeakValueHashMap - clearing weak references
[info] TaskResultGetterSuite:
[info] - handling results smaller than Akka frame size
[info] - handling results larger than Akka frame size
[info] - task retried if result missing from block manager
[info] TaskSchedulerImplSuite:
[info] - FIFO Scheduler Test
[info] - Fair Scheduler Test
[info] - Nested Pool Test
[info] - Scheduler does not always schedule tasks on the same workers
[info] - Scheduler correctly accounts for multiple CPUs per task
[info] AppendOnlyMapSuite:
[info] - initialization
[info] - object keys and values
[info] - primitive keys and values
[info] - null keys
[info] - null values
[info] - changeValue
[info] - inserting in capacity-1 map
[info] - destructive sort
[info] MapOutputTrackerSuite:
[info] - compressSize
[info] - decompressSize
[info] - master start and stop
[info] - master register shuffle and fetch
[info] - master register and unregister shuffle
[info] - master register shuffle and unregister map output and fetch
[info] - remote fetch
[info] - remote fetch below akka frame size
[INFO] [08/26/2014 11:23:30.915] [test-akka.actor.default-dispatcher-4] [akka://test/deadLetters] Message [[B] from TestActor[akka://test/user/$$a] to Actor[akka://test/deadLetters] was not delivered. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
[info] - remote fetch exceeds akka frame size
[info] FileAppenderSuite:
[info] - basic file appender
[info] - rolling file appender - time-based rolling
[info] - rolling file appender - size-based rolling
[info] - rolling file appender - cleaning
[info] - file appender selection
[info] WholeTextFileRecordReaderSuite:
Local disk address is /tmp/1409066618815-0.
[info] - Correctness of WholeTextFileRecordReader.
[info] LocalDirsSuite:
[info] - Utils.getLocalDir() returns a valid directory, even if some local dirs are missing
[info] - SPARK_LOCAL_DIRS override also affects driver
[info] ContextCleanerSuite:
[info] - cleanup RDD
[info] - cleanup shuffle
[info] - cleanup broadcast
[info] - automatically cleanup RDD
[info] - automatically cleanup shuffle
[info] - automatically cleanup broadcast
[info] - automatically cleanup RDD + shuffle + broadcast
[info] - automatically cleanup RDD + shuffle + broadcast in distributed mode
[info] SortShuffleContextCleanerSuite:
[info] - cleanup shuffle
[info] - automatically cleanup shuffle
[info] - automatically cleanup RDD + shuffle + broadcast in distributed mode
[info] BroadcastSuite:
[info] - Using HttpBroadcast locally
[info] - Accessing HttpBroadcast variables from multiple threads
[info] - Accessing HttpBroadcast variables in a local cluster
[info] - Using TorrentBroadcast locally
[info] - Accessing TorrentBroadcast variables from multiple threads
[info] - Accessing TorrentBroadcast variables in a local cluster
[info] - Unpersisting HttpBroadcast on executors only in local mode
[info] - Unpersisting HttpBroadcast on executors and driver in local mode
[info] - Unpersisting HttpBroadcast on executors only in distributed mode
[info] - Unpersisting HttpBroadcast on executors and driver in distributed mode
[info] - Unpersisting TorrentBroadcast on executors only in local mode
[info] - Unpersisting TorrentBroadcast on executors and driver in local mode
[info] - Unpersisting TorrentBroadcast on executors only in distributed mode
[info] - Unpersisting TorrentBroadcast on executors and driver in distributed mode
[info] ClosureCleanerSuite:
[info] - closures inside an object
[info] - closures inside a class
[info] - closures inside a class with no default constructor
[info] - closures that don't use fields of the outer class
[info] - nested closures inside an object
[info] - nested closures inside a class
[info] - toplevel return statements in closures are identified at cleaning time
[info] - return statements from named functions nested in closures don't raise exceptions
[info] KryoSerializerDistributedSuite:
[info] - kryo objects are serialised consistently in different processes
[info] ZippedPartitionsSuite:
[info] - print sizes
[info] StorageStatusListenerSuite:
[info] - block manager added/removed
[info] - task end without updated blocks
[info] - task end with updated blocks
[info] - unpersist RDD
[info] UISuite:
[info] - basic ui visibility !!! IGNORED !!!
[info] - visibility at localhost:4040 !!! IGNORED !!!
[info] - attaching a new tab !!! IGNORED !!!
[info] - jetty selects different port under contention
[info] - jetty binds to port 0 correctly
[info] - verify appUIAddress contains the scheme
[info] - verify appUIAddress contains the port
[info] ShuffleSuite:
[info] - groupByKey without compression
[info] - shuffle non-zero block size
[info] - shuffle serializer
[info] - zero sized blocks
[info] - zero sized blocks without kryo
[info] - shuffle on mutable pairs
[info] - sorting on mutable pairs
[info] - cogroup using mutable pairs
[info] - subtract mutable pairs
[info] - sort with Java non serializable class - Kryo
[info] - sort with Java non serializable class - Java
[info] BitSetSuite:
[info] - basic set and get
[info] - 100% full bit set
[info] - nextSetBit
[info] - xor len(bitsetX) < len(bitsetY)
[info] - xor len(bitsetX) > len(bitsetY)
[info] - andNot len(bitsetX) < len(bitsetY)
[info] - andNot len(bitsetX) > len(bitsetY)
[info] OpenHashMapSuite:
[info] - size for specialized, primitive value (int)
[info] - initialization
[info] - primitive value
[info] - non-primitive value
[info] - null keys
[info] - null values
[info] - changeValue
[info] - inserting in capacity-1 map
[info] FailureSuite:
[info] - failure in a single-stage job
[info] - failure in a two-stage job
[info] - failure in a map stage
[info] - failure because task results are not serializable
[info] - failure because task closure is not serializable
[info] DistributedSuite:
[info] - task throws not serializable exception
[info] - local-cluster format
[info] - simple groupByKey
[info] - groupByKey where map output sizes exceed maxMbInFlight
[info] - accumulators
[info] - broadcast variables
[info] - repeatedly failing task
Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 4, localhost): ExecutorLostFailure (executor lost)
Driver stacktrace:
[info] - repeatedly failing task that crashes JVM
[info] - caching
[info] - caching on disk
[info] - caching in memory, replicated
[info] - caching in memory, serialized, replicated
[info] - caching on disk, replicated
[info] - caching in memory and disk, replicated
[info] - caching in memory and disk, serialized, replicated
[info] - compute without caching when no partitions fit in memory
[info] - compute when only some partitions fit in memory
[info] - passing environment variables to cluster
[info] - recover from node failures
[info] - recover from repeated node failures during shuffle-map
[info] - recover from repeated node failures during shuffle-reduce
[info] - recover from node failures with replication
[info] - unpersist RDDs
[info] ServerClientIntegrationSuite:
[info] - fetch a ByteBuffer block
[info] - fetch a FileSegment block via zero-copy send
[info] - fetch a non-existent block
[info] - fetch both ByteBuffer block and FileSegment block
[info] - fetch both ByteBuffer block and a non-existent block
[info] AkkaUtilsSuite:
[info] - remote fetch security bad password
[info] - remote fetch security off
[info] - remote fetch security pass
[info] - remote fetch security off client
[info] ExternalSorterSuite:
[info] - empty data stream
[info] - few elements per partition
[info] - empty partitions with spilling
[info] - empty partitions with spilling, bypass merge-sort
[info] - spilling in local cluster
[info] - spilling in local cluster with many reduce tasks
[info] - cleanup of intermediate files in sorter
[info] - cleanup of intermediate files in sorter, bypass merge-sort
[info] - cleanup of intermediate files in sorter if there are errors
[info] - cleanup of intermediate files in sorter if there are errors, bypass merge-sort
[info] - cleanup of intermediate files in shuffle
[info] - cleanup of intermediate files in shuffle with errors
[info] - no partial aggregation or sorting
[info] - partial aggregation without spill
[info] - partial aggregation with spill, no ordering
[info] - partial aggregation with spill, with ordering
[info] - sorting without aggregation, no spill
[info] - sorting without aggregation, with spill
[info] - spilling with hash collisions
[info] - spilling with many hash collisions
[info] - spilling with hash collisions using the Int.MaxValue key
[info] - spilling with null keys and values
[info] - conditions for bypassing merge-sort
[info] BlockServerHandlerSuite:
[info] - ByteBuffer block
[info] - FileSegment block via zero-copy
[info] - pipeline exception propagation
[info] BlockManagerSuite:
[info] - StorageLevel object caching
[info] - BlockManagerId object caching
[info] - master + 1 manager interaction
[info] - master + 2 managers interaction
[info] - removing block
[info] - removing rdd
[info] - removing broadcast
[info] - reregistration on heart beat
[info] - reregistration on block update
[info] - reregistration doesn't dead lock
Some(org.apache.spark.storage.BlockResult@40020e3)
[info] - correct BlockResult returned from get() calls
[info] - in-memory LRU storage
[info] - in-memory LRU storage with serialization
[info] - in-memory LRU for partitions of same RDD
[info] - in-memory LRU for partitions of multiple RDDs
[info] - tachyon storage
[info]  + tachyon storage test disabled. 
[info] - on-disk storage
[info] - disk and memory storage
[info] - disk and memory storage with getLocalBytes
[info] - disk and memory storage with serialization
[info] - disk and memory storage with serialization and getLocalBytes
[info] - LRU with mixed storage levels
[info] - in-memory LRU with streams
[info] - LRU with mixed storage levels and streams
[info] - negative byte values in ByteBufferInputStream
[info] - overly large block
[info] - block compression
[info] - block store put failure
[info] - reads of memory-mapped and non memory-mapped files are equivalent
[info] - updated block statuses
[info] - query block statuses
[info] - get matching blocks
[info] - SPARK-1194 regression: fix the same-RDD rule for cache replacement
[info] - return error message when error occurred in BlockManagerWorker#onBlockMessageReceive
[info] - return ack message when no error occurred in BlocManagerWorker#onBlockMessageReceive
[info] - reserve/release unroll memory
[info] - safely unroll blocks
[info] - safely unroll blocks through putIterator
[info] - safely unroll blocks through putIterator (disk)
[info] - multiple unrolls by the same thread
[info] DiskBlockManagerSuite:
Created root dirs: /tmp/1409067066245-0,/tmp/1409067066245-1
[info] - basic block creation
[info] - enumerating blocks
[info] - block appending
[info] - block remapping
[info] - consolidated shuffle can write to shuffle group without messing existing offsets/lengths
[info] JdbcRDDSuite:
[info] - basic functionality
[info] CheckpointSuite:
[info] - basic checkpointing
[info] - RDDs with one-to-one dependencies
[info] - ParallelCollection
[info] - BlockRDD
[info] - ShuffledRDD
[info] - UnionRDD
[info] - CartesianRDD
[info] - CoalescedRDD
[info] - CoGroupedRDD
[info] - ZippedPartitionsRDD
[info] - PartitionerAwareUnionRDD
[info] - CheckpointRDD with zero partitions
[info] FileSuite:
[info] - text files
[info] - text files (compressed)
[info] - SequenceFiles
[info] - SequenceFile (compressed)
[info] - SequenceFile with writable key
[info] - SequenceFile with writable value
[info] - SequenceFile with writable key and value
[info] - implicit conversions in reading SequenceFiles
[info] - object files of ints
[info] - object files of complex types
[info] - object files of classes from a JAR
[info] - write SequenceFile using new Hadoop API
[info] - read SequenceFile using new Hadoop API
[info] - file caching
[info] - prevent user from overwriting the empty directory (old Hadoop API)
[info] - prevent user from overwriting the non-empty directory (old Hadoop API)
[info] - allow user to disable the output directory existence checking (old Hadoop API
[info] - prevent user from overwriting the empty directory (new Hadoop API)
[info] - prevent user from overwriting the non-empty directory (new Hadoop API)
[info] - allow user to disable the output directory existence checking (new Hadoop API
[info] - save Hadoop Dataset through old Hadoop API
[info] - save Hadoop Dataset through new Hadoop API
[info] - Get input files via old Hadoop API
[info] - Get input files via new Hadoop API
[info] PartitioningSuite:
[info] - HashPartitioner equality
[info] - RangePartitioner equality
[info] - RangePartitioner getPartition
[info] - RangePartitioner for keys that are not Comparable (but with Ordering)
[info] - RangPartitioner.sketch
[info] - RangePartitioner.determineBounds
[info] - RangePartitioner should run only one job if data is roughly balanced
[info] - RangePartitioner should work well on unbalanced data
[info] - RangePartitioner should return a single partition for empty RDDs
[info] - HashPartitioner not equal to RangePartitioner
[info] - partitioner preservation
[info] - partitioning Java arrays should fail
[info] - zero-length partitions should be correctly handled
[info] BlockIdSuite:
[info] - test-bad-deserialization
[info] - rdd
[info] - shuffle
[info] - broadcast
[info] - taskresult
[info] - stream
[info] - test
[info] StorageSuite:
[info] - storage status add non-RDD blocks
[info] - storage status update non-RDD blocks
[info] - storage status remove non-RDD blocks
[info] - storage status add RDD blocks
[info] - storage status update RDD blocks
[info] - storage status remove RDD blocks
[info] - storage status containsBlock
[info] - storage status getBlock
[info] - storage status num[Rdd]Blocks
[info] - storage status memUsed, diskUsed, tachyonUsed
[info] - StorageUtils.updateRddInfo
[info] - StorageUtils.getRddBlockLocations
[info] - StorageUtils.getRddBlockLocations with multiple locations
[info] PairRDDFunctionsSuite:
[info] - aggregateByKey
[info] - groupByKey
[info] - groupByKey with duplicates
[info] - groupByKey with negative key hash codes
[info] - groupByKey with many output partitions
[info] - sampleByKey
[info] - sampleByKeyExact
[info] - reduceByKey
[info] - reduceByKey with collectAsMap
[info] - reduceByKey with many output partitons
[info] - reduceByKey with partitioner
[info] - countApproxDistinctByKey
[info] - join
[info] - join all-to-all
[info] - leftOuterJoin
[info] - rightOuterJoin
[info] - join with no matches
[info] - join with many output partitions
[info] - groupWith
[info] - groupWith3
[info] - groupWith4
[info] - zero-partition RDD
[info] - keys and values
[info] - default partitioner uses partition size
[info] - default partitioner uses largest partitioner
[info] - subtract
[info] - subtract with narrow dependency
[info] - subtractByKey
[info] - subtractByKey with narrow dependency
[info] - foldByKey
[info] - foldByKey with mutable result type
[info] - saveNewAPIHadoopFile should call setConf if format is configurable
[info] - lookup
[info] - lookup with partitioner
[info] - lookup with bad partitioner
[info] UnpersistSuite:
[info] - unpersist RDD
[info] ProactiveClosureSerializationSuite:
[info] - throws expected serialization exceptions on actions
[info] - mapPartitions transformations throw proactive serialization exceptions
[info] - map transformations throw proactive serialization exceptions
[info] - mapPartitionsWithContext transformations throw proactive serialization exceptions
[info] - filter transformations throw proactive serialization exceptions
[info] - flatMap transformations throw proactive serialization exceptions
[info] - mapPartitionsWithIndex transformations throw proactive serialization exceptions
[info] TaskContextSuite:
[info] - Calls executeOnCompleteCallbacks after failure
[info] SorterSuite:
[info] - equivalent to Arrays.sort
[info] - KVArraySorter
[info] - Sorter benchmark !!! IGNORED !!!
[info] EventLoggingListenerSuite:
[info] - Parse names of special files
[info] - Verify special files exist
[info] - Verify special files exist with compression
[info] - Parse event logging info
[info] - Parse event logging info with compression
[info] - Basic event logging
[info] - Basic event logging with compression
[info] - End-to-end event logging
[info] - End-to-end event logging with compression
[info] KryoSerializerResizableOutputSuite:
[info] - kryo without resizable output buffer should fail on large array
[info] - kryo with resizable output buffer should succeed on large array
[info] JsonProtocolSuite:
[info] - SparkListenerEvent
[info] - Dependent Classes
[info] - StageInfo backward compatibility
[info] - InputMetrics backward compatibility
[info] SizeEstimatorSuite:
[info] - simple classes
[info] - strings
[info] - primitive arrays
[info] - object arrays
[info] - 32-bit arch
[info] - 64-bit arch with no compressed oops
[info] JobProgressListenerSuite:
[info] - test LRU eviction of stages
[info] - test executor id to summary
[info] - test task success vs failure counting for different task end reasons
[info] - test update metrics
[info] PipedRDDSuite:
[info] - basic pipe
[info] - advanced pipe
[info] - pipe with env variable
[info] - pipe with non-zero exit status
[info] - basic pipe with separate working directory
[info] - test pipe exports map_input_file
[info] - test pipe exports mapreduce_map_input_file
[info] ShuffleNettySuite:
[info] - groupByKey without compression
[info] - shuffle non-zero block size
[info] - shuffle serializer
[info] - zero sized blocks
[info] - zero sized blocks without kryo
[info] - shuffle on mutable pairs
[info] - sorting on mutable pairs
[info] - cogroup using mutable pairs
[info] - subtract mutable pairs
[info] - sort with Java non serializable class - Kryo
[info] - sort with Java non serializable class - Java
[info] SparkContextInfoSuite:
[info] - getPersistentRDDs only returns RDDs that are marked as cached
[info] - getPersistentRDDs returns an immutable map
[info] - getRDDStorageInfo only reports on RDDs that actually persist data
[info] - call sites report correct locations
[info] KryoSerializerSuite:
[info] - basic types
[info] - pairs
[info] - Scala data structures
[info] - ranges
[info] - asJavaIterable
[info] - custom registrator
[info] - kryo with collect
[info] - kryo with parallelize
[info] - kryo with parallelize for specialized tuples
[info] - kryo with parallelize for primitive arrays
[info] - kryo with collect for specialized tuples
[info] - kryo with SerializableHyperLogLog
[info] - kryo with reduce
[info] - kryo with fold !!! IGNORED !!!
[info] - kryo with nonexistent custom registrator should fail
[info] - default class loader can be set by a different thread
[info] MetricsConfigSuite:
[info] - MetricsConfig with default properties
[info] - MetricsConfig with properties set
[info] - MetricsConfig with subProperties
[info] SparkListenerSuite:
[info] - basic creation and shutdown of LiveListenerBus
[info] - bus.stop() waits for the event queue to completely drain
[info] - basic creation of StageInfo
[info] - basic creation of StageInfo with shuffle
[info] - StageInfo with fewer tasks than partitions
[info] - local metrics
[info] - onTaskGettingResult() called when result fetched remotely
[info] - onTaskGettingResult() not called when result sent directly
[info] - onTaskEnd() should be called for all started tasks, even after job has been killed
[info] - SparkListener moves on if a listener throws an exception
[info] BlockFetcherIteratorSuite:
[info] - block fetch from local fails using BasicBlockFetcherIterator
[info] - block fetch from local succeed using BasicBlockFetcherIterator
[info] - block fetch from remote fails using BasicBlockFetcherIterator
[info] - block fetch from remote succeed using BasicBlockFetcherIterator
[info] SamplingUtilsSuite:
[info] - reservoirSampleAndCount
[info] - computeFraction
[info] BlockHeaderEncoderSuite:
[info] - encode normal block data
[info] - encode error message
[info] StorageTabSuite:
[info] - stage submitted / completed
[info] - unpersist
[info] - task end
[info] RandomSamplerSuite:
[info] - BernoulliSamplerWithRange
[info] - BernoulliSamplerWithRangeInverse
[info] - BernoulliSamplerWithRatio
[info] - BernoulliSamplerWithComplement
[info] - BernoulliSamplerSetSeed
[info] - PoissonSampler
[info] PythonRDDSuite:
[info] - Writing large strings to the worker
[info] VectorSuite:
[info] - random with default random number generator
[info] - random with given random number generator
[info] ShuffleMemoryManagerSuite:
[info] - single thread requesting memory
[info] - two threads requesting full memory
[info] - threads cannot grow past 1 / N
[info] - threads can block to get at least 1 / 2N memory
[info] - releaseMemoryForThisThread
[info] CompressionCodecSuite:
[info] - default compression codec
[info] - lz4 compression codec
[info] - lz4 compression codec short form
[info] - lzf compression codec
[info] - lzf compression codec short form
[info] - snappy compression codec
[info] - snappy compression codec short form
[info] ClientSuite:
[info] - correctly validates driver jar URL's
[info] SparkContextSchedulerCreationSuite:
[info] - bad-master
[info] - local
[info] - local-*
[info] - local-n
[info] - local-*-n-failures
[info] - local-n-failures
[info] - bad-local-n
[info] - bad-local-n-failures
[info] - local-default-parallelism
[info] - simr
[info] - local-cluster
[info] - yarn-cluster *** FAILED ***
[info]  unable to create new native thread (SparkContextSchedulerCreationSuite.scala:134)
[info] - yarn-standalone *** FAILED ***
[info]  unable to create new native thread (SparkContextSchedulerCreationSuite.scala:134)
[info] - yarn-client *** FAILED ***
[info]  unable to create new native thread (SparkContextSchedulerCreationSuite.scala:134)
[info] - mesos fine-grained *** FAILED ***
[info]  unable to create new native thread (SparkContextSchedulerCreationSuite.scala:158)
[info] - mesos coarse-grained *** FAILED ***
[info]  unable to create new native thread (SparkContextSchedulerCreationSuite.scala:158)
[info] - mesos with zookeeper *** FAILED ***
[info]  unable to create new native thread (SparkContextSchedulerCreationSuite.scala:158)
[info] AsyncRDDActionsSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.rdd.AsyncRDDActionsSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] PrimitiveKeyOpenHashMapSuite:
[info] - size for specialized, primitive key, value (int, int)
[info] - initialization
[info] - basic operations
[info] - null values
[info] - changeValue
[info] - inserting in capacity-1 map
[info] RDDSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.rdd.RDDSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] CacheManagerSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.CacheManagerSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] WorkerWatcherSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.deploy.worker.WorkerWatcherSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] CoarseGrainedSchedulerBackendSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.scheduler.CoarseGrainedSchedulerBackendSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] SortShuffleSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.SortShuffleSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] JsonProtocolSuite:
[info] - writeApplicationInfo
[info] - writeWorkerInfo
[info] - writeApplicationDescription
[info] - writeExecutorRunner
[info] - writeDriverInfo
[info] - writeMasterState
[info] - writeWorkerState
[info] ReplayListenerSuite:
[info] - Simple replay
[info] - Simple replay with compression
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.scheduler.ReplayListenerSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
[info]  at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
[info]  at scala.concurrent.forkjoin.ForkJoinPool.fullExternalPush(ForkJoinPool.java:1905)
[info]  at scala.concurrent.forkjoin.ForkJoinPool.externalPush(ForkJoinPool.java:1834)
[info]  at scala.concurrent.forkjoin.ForkJoinPool.execute(ForkJoinPool.java:2955)
[info]  at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinPool.execute(AbstractDispatcher.scala:374)
[info]  at akka.dispatch.ExecutorServiceDelegate$class.execute(ThreadPoolBuilder.scala:212)
[info]  at akka.dispatch.Dispatcher$LazyExecutorServiceDelegate.execute(Dispatcher.scala:43)
[info]  ...
[info] DistributionSuite:
[info] - summary
[info] ParallelCollectionSplitSuite:
[info] - one element per slice
[info] - one slice
[info] - equal slices
[info] - non-equal slices
[info] - splitting exclusive range
[info] - splitting inclusive range
[info] - empty data
[info] - zero slices
[info] - negative number of slices
[info] - exclusive ranges sliced into ranges
[info] - inclusive ranges sliced into ranges
[info] - identical slice sizes between Range and NumericRange
[info] - identical slice sizes between List and NumericRange
[info] - large ranges don't overflow
[info] - random array tests
[info] - random exclusive range tests
[info] - random inclusive range tests
[info] - exclusive ranges of longs
[info] - inclusive ranges of longs
[info] - exclusive ranges of doubles
[info] - inclusive ranges of doubles
[info] DriverRunnerTest:
[info] - Process succeeds instantly
[info] - Process failing several times and then succeeding
[info] - Process doesn't restart if not supervised
[info] - Process doesn't restart if killed
[info] - Reset of backoff counter
[info] ConnectionManagerSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.network.ConnectionManagerSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at java.util.Timer.<init>(Timer.java:176)
[info]  at org.apache.spark.network.ConnectionManager.<init>(ConnectionManager.scala:71)
[info]  at org.apache.spark.network.ConnectionManagerSuite$$anonfun$1.apply$mcV$sp(ConnectionManagerSuite.scala:44)
[info]  at org.apache.spark.network.ConnectionManagerSuite$$anonfun$1.apply(ConnectionManagerSuite.scala:41)
[info]  at org.apache.spark.network.ConnectionManagerSuite$$anonfun$1.apply(ConnectionManagerSuite.scala:41)
[info]  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
[info]  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
[info]  at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
[info]  ...
[info] AccumulatorSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.AccumulatorSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] ExecutorRunnerTest:
[info] - command includes appId *** FAILED ***
[info]  java.io.IOException: Cannot run program "/home/jay/Development/spark/bin/compute-classpath.sh" (in directory "."): error=11, Resource temporarily unavailable
[info]  at java.lang.ProcessBuilder.start(ProcessBuilder.java:1041)
[info]  at org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:852)
[info]  at org.apache.spark.deploy.worker.CommandUtils$.buildJavaOpts(CommandUtils.scala:71)
[info]  at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:37)
[info]  at org.apache.spark.deploy.worker.ExecutorRunner.getCommandSeq(ExecutorRunner.scala:125)
[info]  at org.apache.spark.deploy.worker.ExecutorRunnerTest$$anonfun$1.apply$mcV$sp(ExecutorRunnerTest.scala:37)
[info]  at org.apache.spark.deploy.worker.ExecutorRunnerTest$$anonfun$1.apply(ExecutorRunnerTest.scala:28)
[info]  at org.apache.spark.deploy.worker.ExecutorRunnerTest$$anonfun$1.apply(ExecutorRunnerTest.scala:28)
[info]  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
[info]  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
[info]  ...
[info]  Cause: java.io.IOException: error=11, Resource temporarily unavailable
[info]  at java.lang.UNIXProcess.forkAndExec(Native Method)
[info]  at java.lang.UNIXProcess.<init>(UNIXProcess.java:186)
[info]  at java.lang.ProcessImpl.start(ProcessImpl.java:130)
[info]  at java.lang.ProcessBuilder.start(ProcessBuilder.java:1022)
[info]  at org.apache.spark.util.Utils$.executeAndGetOutput(Utils.scala:852)
[info]  at org.apache.spark.deploy.worker.CommandUtils$.buildJavaOpts(CommandUtils.scala:71)
[info]  at org.apache.spark.deploy.worker.CommandUtils$.buildCommandSeq(CommandUtils.scala:37)
[info]  at org.apache.spark.deploy.worker.ExecutorRunner.getCommandSeq(ExecutorRunner.scala:125)
[info]  at org.apache.spark.deploy.worker.ExecutorRunnerTest$$anonfun$1.apply$mcV$sp(ExecutorRunnerTest.scala:37)
[info]  at org.apache.spark.deploy.worker.ExecutorRunnerTest$$anonfun$1.apply(ExecutorRunnerTest.scala:28)
[info]  ...
[info] ThreadingSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.ThreadingSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] BlockFetchingClientHandlerSuite:
[info] - handling block data (successful fetch)
[info] - handling error message (failed fetch)
[info] ExternalAppendOnlyMapSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.util.collection.ExternalAppendOnlyMapSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] JobCancellationSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.JobCancellationSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] OpenHashSetSuite:
[info] - size for specialized, primitive int
[info] - primitive int
[info] - primitive long
[info] - non-primitive
[info] - non-primitive set growth
[info] - primitive set growth
[info] XORShiftRandomSuite:
[info] - XORShift generates valid random numbers
[info] - XORShift with zero seed
[info] PrimitiveVectorSuite:
[info] - primitive value
[info] - non-primitive value
[info] - ideal growth
[info] - ideal size
[info] - resizing
[info] CompactBufferSuite:
[info] - empty buffer
[info] - basic inserts
[info] - adding sequences
[info] - adding the same buffer to itself
[info] FlatmapIteratorSuite:
[ERROR] [08/26/2014 11:33:59.983] [pool-1-thread-1-ScalaTest-running-FlatmapIteratorSuite] [Remoting] Remoting error: [Startup timed out] [
akka.remote.RemoteTransportException: Startup timed out
at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
at akka.remote.Remoting.start(Remoting.scala:191)
at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
at org.apache.spark.storage.FlatmapIteratorSuite$$anonfun$1.apply$mcV$sp(FlatmapIteratorSuite.scala:37)
at org.apache.spark.storage.FlatmapIteratorSuite$$anonfun$1.apply(FlatmapIteratorSuite.scala:35)
at org.apache.spark.storage.FlatmapIteratorSuite$$anonfun$1.apply(FlatmapIteratorSuite.scala:35)
at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
at org.scalatest.Transformer.apply(Transformer.scala:22)
at org.scalatest.Transformer.apply(Transformer.scala:20)
at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:158)
at org.scalatest.Suite$class.withFixture(Suite.scala:1121)
at org.scalatest.FunSuite.withFixture(FunSuite.scala:1559)
at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:155)
at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:167)
at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:167)
at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:167)
at org.apache.spark.storage.FlatmapIteratorSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(FlatmapIteratorSuite.scala:23)
at org.scalatest.BeforeAndAfterEach$class.runTest(BeforeAndAfterEach.scala:255)
at org.apache.spark.storage.FlatmapIteratorSuite.runTest(FlatmapIteratorSuite.scala:23)
at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:200)
at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:200)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:200)
at org.scalatest.FunSuite.runTests(FunSuite.scala:1559)
at org.scalatest.Suite$class.run(Suite.scala:1423)
at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1559)
at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:204)
at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:204)
at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:204)
at org.apache.spark.storage.FlatmapIteratorSuite.org$scalatest$BeforeAndAfterAll$$super$run(FlatmapIteratorSuite.scala:23)
at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
at org.apache.spark.storage.FlatmapIteratorSuite.run(FlatmapIteratorSuite.scala:23)
at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:444)
at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:651)
at sbt.ForkMain$Run$2.call(ForkMain.java:294)
at sbt.ForkMain$Run$2.call(ForkMain.java:284)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.Await$.result(package.scala:107)
at akka.remote.Remoting.start(Remoting.scala:173)
... 63 more
]
[info] - Flatmap Iterator to Disk *** FAILED ***
[info]  java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
[info]  at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
[info]  at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
[info]  at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
[info]  at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
[info]  at scala.concurrent.Await$.result(package.scala:107)
[info]  at akka.remote.Remoting.start(Remoting.scala:173)
[info]  at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
[info]  at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
[info]  at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
[info]  at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
[info]  ...
[ERROR] [08/26/2014 11:34:09.995] [pool-1-thread-1-ScalaTest-running-FlatmapIteratorSuite] [Remoting] Remoting error: [Startup timed out] [
akka.remote.RemoteTransportException: Startup timed out
at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:129)
at akka.remote.Remoting.start(Remoting.scala:191)
at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:150)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
at org.apache.spark.storage.FlatmapIteratorSuite$$anonfun$3.apply$mcV$sp(FlatmapIteratorSuite.scala:48)
at org.apache.spark.storage.FlatmapIteratorSuite$$anonfun$3.apply(FlatmapIteratorSuite.scala:46)
at org.apache.spark.storage.FlatmapIteratorSuite$$anonfun$3.apply(FlatmapIteratorSuite.scala:46)
at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
at org.scalatest.Transformer.apply(Transformer.scala:22)
at org.scalatest.Transformer.apply(Transformer.scala:20)
at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:158)
at org.scalatest.Suite$class.withFixture(Suite.scala:1121)
at org.scalatest.FunSuite.withFixture(FunSuite.scala:1559)
at org.scalatest.FunSuiteLike$class.invokeWithFixture$1(FunSuiteLike.scala:155)
at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:167)
at org.scalatest.FunSuiteLike$$anonfun$runTest$1.apply(FunSuiteLike.scala:167)
at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
at org.scalatest.FunSuiteLike$class.runTest(FunSuiteLike.scala:167)
at org.apache.spark.storage.FlatmapIteratorSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(FlatmapIteratorSuite.scala:23)
at org.scalatest.BeforeAndAfterEach$class.runTest(BeforeAndAfterEach.scala:255)
at org.apache.spark.storage.FlatmapIteratorSuite.runTest(FlatmapIteratorSuite.scala:23)
at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:200)
at org.scalatest.FunSuiteLike$$anonfun$runTests$1.apply(FunSuiteLike.scala:200)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
at org.scalatest.FunSuiteLike$class.runTests(FunSuiteLike.scala:200)
at org.scalatest.FunSuite.runTests(FunSuite.scala:1559)
at org.scalatest.Suite$class.run(Suite.scala:1423)
at org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1559)
at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:204)
at org.scalatest.FunSuiteLike$$anonfun$run$1.apply(FunSuiteLike.scala:204)
at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
at org.scalatest.FunSuiteLike$class.run(FunSuiteLike.scala:204)
at org.apache.spark.storage.FlatmapIteratorSuite.org$scalatest$BeforeAndAfterAll$$super$run(FlatmapIteratorSuite.scala:23)
at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
at org.apache.spark.storage.FlatmapIteratorSuite.run(FlatmapIteratorSuite.scala:23)
at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:444)
at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:651)
at sbt.ForkMain$Run$2.call(ForkMain.java:294)
at sbt.ForkMain$Run$2.call(ForkMain.java:284)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
at scala.concurrent.Await$.result(package.scala:107)
at akka.remote.Remoting.start(Remoting.scala:173)
... 63 more
]
[info] - Flatmap Iterator to Memory *** FAILED ***
[info]  java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
[info]  at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
[info]  at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
[info]  at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
[info]  at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
[info]  at scala.concurrent.Await$.result(package.scala:107)
[info]  at akka.remote.Remoting.start(Remoting.scala:173)
[info]  at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
[info]  at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
[info]  at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
[info]  at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
[info]  ...
[info] - Serializer Reset *** FAILED ***
[info]  java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
[info]  at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
[info]  at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
[info]  at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
[info]  at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
[info]  at scala.concurrent.Await$.result(package.scala:107)
[info]  at akka.remote.Remoting.start(Remoting.scala:173)
[info]  at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
[info]  at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
[info]  at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
[info]  at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
[info]  ...
[info] DriverSuite:
[info] - driver should exit after finishing *** FAILED ***
[info]  OutOfMemoryError was thrown during property evaluation. (DriverSuite.scala:40)
[info]  Message: unable to create new native thread
[info]  Occurred at table row 0 (zero based, not counting headings), which had values (
[info]  master = local
[info]  )
[info] NextIteratorSuite:
[info] - one iteration
[info] - two iterations
[info] - empty iteration
[info] - close is called once for empty iterations
[info] - close is called once for non-empty iterations
[info] UtilsSuite:
[info] - bytesToString
[info] - copyStream
[info] - memoryStringToMb
[info] - splitCommandString
[info] - string formatting of time durations
[info] - reading offset bytes of a file
[info] - reading offset bytes across multiple files
[info] - deserialize long value
[info] - get iterator size
[info] - findOldFiles
[info] - resolveURI
[info] - nonLocalPaths
[info] - isBindCollision
[info] SizeTrackerSuite:
[info] - vector fixed size insertions
[info] - vector variable size insertions
[info] - map fixed size insertions
[info] - map variable size insertions
[info] - map updates
[info] ExecutorURLClassLoaderSuite:
[info] - child first
[info] - parent first
[info] - child first can fall back
[info] - child first can fail
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.executor.ExecutorURLClassLoaderSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] ImplicitOrderingSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.ImplicitOrderingSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[error] Uncaught exception when running org.apache.spark.scheduler.DAGSchedulerSuite: java.lang.OutOfMemoryError: unable to create new native thread
sbt.ForkMain$ForkError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
at scala.util.Try$.apply(Try.scala:161)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
at scala.util.Success.flatMap(Try.scala:200)
at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
at akka.actor.ActorSystemImpl.createScheduler(ActorSystem.scala:618)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:541)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:93)
at org.apache.spark.scheduler.DAGSchedulerSuite.<init>(DAGSchedulerSuite.scala:68)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:374)
at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:621)
at sbt.ForkMain$Run$2.call(ForkMain.java:294)
at sbt.ForkMain$Run$2.call(ForkMain.java:284)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
[info] SecurityManagerSuite:
[info] - set security with conf
[info] - set security with api
[info] - set security modify acls
[info] - set security admin acls
[info] SparkConfSuite:
[info] - loading from system properties
[info] - initializing without loading defaults
[info] - named set methods
[info] - basic get and set
[info] - creating SparkContext without master and app name
[info] - creating SparkContext without master
[info] - creating SparkContext without app name
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.SparkConfSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] SortingSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.rdd.SortingSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] PartitionwiseSampledRDDSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.rdd.PartitionwiseSampledRDDSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] SparkSubmitSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.deploy.SparkSubmitSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at org.apache.spark.deploy.SparkSubmitSuite.testPrematureExit(SparkSubmitSuite.scala:64)
[info]  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$1.apply$mcV$sp(SparkSubmitSuite.scala:73)
[info]  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$1.apply(SparkSubmitSuite.scala:73)
[info]  at org.apache.spark.deploy.SparkSubmitSuite$$anonfun$1.apply(SparkSubmitSuite.scala:73)
[info]  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
[info]  at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
[info]  at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
[info]  at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info]  ...
[info] MetricsSystemSuite:
[info] - MetricsSystem with default config
[info] - MetricsSystem with sources add
[info] PartitionPruningRDDSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.rdd.PartitionPruningRDDSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] FileServerSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.FileServerSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] BlockObjectWriterSuite:
[info] - verify write metrics
[info] - verify write metrics on revert
[info] TaskSetManagerSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.scheduler.TaskSetManagerSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] DoubleRDDSuite:
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.rdd.DoubleRDDSuite *** ABORTED ***
[info]  java.lang.OutOfMemoryError: unable to create new native thread
[info]  at java.lang.Thread.start0(Native Method)
[info]  at java.lang.Thread.start(Thread.java:714)
[info]  at akka.actor.LightArrayRevolverScheduler.<init>(Scheduler.scala:425)
[info]  at sun.reflect.GeneratedConstructorAccessor9.newInstance(Unknown Source)
[info]  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
[info]  at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
[info]  at scala.util.Try$.apply(Try.scala:161)
[info]  at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
[info]  at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
[info]  ...
[info] PythonRunnerSuite:
[info] - format path
[info] - format paths
[info] FileLoggerSuite:
[info] - Simple logging
[info] - Simple logging with compression
[info] - Logging multiple files
[info] - Logging multiple files with compression
[info] - Logging when directory already exists
[info] ScalaTest
[info] Run completed in 19 minutes, 32 seconds.
[info] Total number of tests run: 574
[info] Suites: completed 81, aborted 22
[info] Tests: succeeded 563, failed 11, canceled 0, ignored 5, pending 0
[info] *** 22 SUITES ABORTED ***
[info] *** 11 TESTS FAILED ***
[error] Error: Total 597, Failed 11, Errors 23, Passed 563, Ignored 5
[error] Failed tests:
[error]  org.apache.spark.deploy.worker.ExecutorRunnerTest
[error]  org.apache.spark.SparkContextSchedulerCreationSuite
[error]  org.apache.spark.DriverSuite
[error]  org.apache.spark.storage.FlatmapIteratorSuite
[error] Error during tests:
[error]  org.apache.spark.scheduler.DAGSchedulerSuite
[error]  org.apache.spark.rdd.RDDSuite
[error]  org.apache.spark.rdd.SortingSuite
[error]  org.apache.spark.executor.ExecutorURLClassLoaderSuite
[error]  org.apache.spark.ImplicitOrderingSuite
[error]  org.apache.spark.scheduler.TaskSetManagerSuite
[error]  org.apache.spark.CacheManagerSuite
[error]  org.apache.spark.SparkConfSuite
[error]  org.apache.spark.rdd.AsyncRDDActionsSuite
[error]  org.apache.spark.JobCancellationSuite
[error]  org.apache.spark.SortShuffleSuite
[error]  org.apache.spark.scheduler.CoarseGrainedSchedulerBackendSuite
[error]  org.apache.spark.AccumulatorSuite
[error]  org.apache.spark.deploy.SparkSubmitSuite
[error]  org.apache.spark.rdd.PartitionwiseSampledRDDSuite
[error]  org.apache.spark.ThreadingSuite
[error]  org.apache.spark.rdd.PartitionPruningRDDSuite
[error]  org.apache.spark.FileServerSuite
[error]  org.apache.spark.deploy.worker.WorkerWatcherSuite
[error]  org.apache.spark.network.ConnectionManagerSuite
[error]  org.apache.spark.rdd.DoubleRDDSuite
[error]  org.apache.spark.scheduler.ReplayListenerSuite
[error]  org.apache.spark.util.collection.ExternalAppendOnlyMapSuite
[info] ZeroMQStreamSuite: