Created February 8, 2016 03:03
Gist: ssimeonov/69cb0b41750be7777776
[info] spark-streaming: found 30 potential binary incompatibilities (filtered 8)
[error] * method delaySeconds()Int in class org.apache.spark.streaming.Checkpoint does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.streaming.Checkpoint.delaySeconds")
[error] * class org.apache.spark.streaming.receiver.ActorSupervisorStrategy does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.ActorSupervisorStrategy")
[error] * object org.apache.spark.streaming.receiver.IteratorData does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.IteratorData$")
[error] * class org.apache.spark.streaming.receiver.ByteBufferData does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.ByteBufferData")
[error] * object org.apache.spark.streaming.receiver.ActorSupervisorStrategy does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.ActorSupervisorStrategy$")
[error] * object org.apache.spark.streaming.receiver.Statistics does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.Statistics$")
[error] * class org.apache.spark.streaming.receiver.Statistics does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.Statistics")
[error] * class org.apache.spark.streaming.receiver.IteratorData does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.IteratorData")
[error] * interface org.apache.spark.streaming.receiver.ActorReceiverData does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.ActorReceiverData")
[error] * class org.apache.spark.streaming.receiver.ActorReceiver does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.ActorReceiver")
[error] * class org.apache.spark.streaming.receiver.SingleItemData does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.SingleItemData")
[error] * object org.apache.spark.streaming.receiver.SingleItemData does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.SingleItemData$")
[error] * trait org.apache.spark.streaming.receiver.ActorHelper does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.ActorHelper")
[error] * object org.apache.spark.streaming.receiver.ByteBufferData does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.streaming.receiver.ByteBufferData$")
[error] * synthetic method org$apache$spark$streaming$util$BatchedWriteAheadLog$$getByteArray(java.nio.ByteBuffer)Array[Byte] in object org.apache.spark.streaming.util.BatchedWriteAheadLog does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.streaming.util.BatchedWriteAheadLog.org$apache$spark$streaming$util$BatchedWriteAheadLog$$getByteArray")
[error] * the type hierarchy of class org.apache.spark.streaming.util.OpenHashMapBasedStateMap#LimitMarker has changed in new version. Missing types {scala.Serializable}
[error] filter with: ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.streaming.util.OpenHashMapBasedStateMap$LimitMarker")
[error] * synthetic method org$apache$spark$streaming$util$FileBasedWriteAheadLogWriter$$stream()org.apache.hadoop.fs.FSDataOutputStream in class org.apache.spark.streaming.util.FileBasedWriteAheadLogWriter does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.streaming.util.FileBasedWriteAheadLogWriter.org$apache$spark$streaming$util$FileBasedWriteAheadLogWriter$$stream")
[error] * the type hierarchy of class org.apache.spark.streaming.util.OpenHashMapBasedStateMap has changed in new version. Missing types {scala.Serializable}
[error] filter with: ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.streaming.util.OpenHashMapBasedStateMap")
[error] * the type hierarchy of class org.apache.spark.streaming.util.EmptyStateMap has changed in new version. Missing types {scala.Serializable}
[error] filter with: ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.streaming.util.EmptyStateMap")
[error] * method this(scala.reflect.ClassTag,scala.reflect.ClassTag)Unit in class org.apache.spark.streaming.util.EmptyStateMap does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.streaming.util.EmptyStateMap.this")
[error] * method empty(scala.reflect.ClassTag,scala.reflect.ClassTag)org.apache.spark.streaming.util.StateMap in object org.apache.spark.streaming.util.StateMap does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.streaming.util.StateMap.empty")
[error] * the type hierarchy of class org.apache.spark.streaming.util.StateMap has changed in new version. Missing types {scala.Serializable}
[error] filter with: ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.streaming.util.StateMap")
[error] * method this(scala.reflect.ClassTag,scala.reflect.ClassTag)Unit in class org.apache.spark.streaming.util.StateMap does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.streaming.util.StateMap.this")
[error] * the type hierarchy of class org.apache.spark.streaming.scheduler.StreamingListenerBus has changed in new version. Missing types {org.apache.spark.util.AsynchronousListenerBus}
[error] filter with: ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.streaming.scheduler.StreamingListenerBus")
[error] * method onPostEvent(java.lang.Object,java.lang.Object)Unit in class org.apache.spark.streaming.scheduler.StreamingListenerBus does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.streaming.scheduler.StreamingListenerBus.onPostEvent")
[error] * method onPostEvent(org.apache.spark.streaming.scheduler.StreamingListener,org.apache.spark.streaming.scheduler.StreamingListenerEvent)Unit in class org.apache.spark.streaming.scheduler.StreamingListenerBus does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.streaming.scheduler.StreamingListenerBus.onPostEvent")
[error] * method onDropEvent(java.lang.Object)Unit in class org.apache.spark.streaming.scheduler.StreamingListenerBus does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.streaming.scheduler.StreamingListenerBus.onDropEvent")
[error] * method onDropEvent(org.apache.spark.streaming.scheduler.StreamingListenerEvent)Unit in class org.apache.spark.streaming.scheduler.StreamingListenerBus does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.streaming.scheduler.StreamingListenerBus.onDropEvent")
[error] * method this()Unit in class org.apache.spark.streaming.scheduler.StreamingListenerBus does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.streaming.scheduler.StreamingListenerBus.this")
[error] * method updatePythonGatewayPort(py4j.GatewayServer,Int)Unit in object org.apache.spark.streaming.api.python.PythonDStream does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.streaming.api.python.PythonDStream.updatePythonGatewayPort")
[info] Compiling 27 Scala sources to /Users/sim/dev/spx/spark/mllib/target/scala-2.11/classes...
[info] spark-sql: found 80 potential binary incompatibilities (filtered 349)
[error] * the type hierarchy of class org.apache.spark.rdd.SqlNewHadoopRDD has changed in new version. Missing types {org.apache.spark.mapreduce.SparkHadoopMapReduceUtil}
[error] filter with: ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.rdd.SqlNewHadoopRDD")
[error] * method newTaskAttemptID(java.lang.String,Int,Boolean,Int,Int)org.apache.hadoop.mapreduce.TaskAttemptID in class org.apache.spark.rdd.SqlNewHadoopRDD does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.rdd.SqlNewHadoopRDD.newTaskAttemptID")
[error] * method newTaskAttemptContext(org.apache.hadoop.conf.Configuration,org.apache.hadoop.mapreduce.TaskAttemptID)org.apache.hadoop.mapreduce.TaskAttemptContext in class org.apache.spark.rdd.SqlNewHadoopRDD does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.rdd.SqlNewHadoopRDD.newTaskAttemptContext")
[error] * method newJobContext(org.apache.hadoop.conf.Configuration,org.apache.hadoop.mapreduce.JobID)org.apache.hadoop.mapreduce.JobContext in class org.apache.spark.rdd.SqlNewHadoopRDD does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.rdd.SqlNewHadoopRDD.newJobContext")
[error] * method inputTypes()scala.collection.Seq in class org.apache.spark.sql.UserDefinedFunction has now a different result type; was: scala.collection.Seq, is now: scala.Option
[error] filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.UserDefinedFunction.inputTypes")
[error] * method copy(java.lang.Object,org.apache.spark.sql.types.DataType,scala.collection.Seq)org.apache.spark.sql.UserDefinedFunction in class org.apache.spark.sql.UserDefinedFunction's type has changed; was (java.lang.Object,org.apache.spark.sql.types.DataType,scala.collection.Seq)org.apache.spark.sql.UserDefinedFunction, is now: (java.lang.Object,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.UserDefinedFunction
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UserDefinedFunction.copy")
[error] * synthetic method copy$default$3()scala.collection.Seq in class org.apache.spark.sql.UserDefinedFunction has now a different result type; was: scala.collection.Seq, is now: scala.Option
[error] filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.UserDefinedFunction.copy$default$3")
[error] * method this(java.lang.Object,org.apache.spark.sql.types.DataType,scala.collection.Seq)Unit in class org.apache.spark.sql.UserDefinedFunction's type has changed; was (java.lang.Object,org.apache.spark.sql.types.DataType,scala.collection.Seq)Unit, is now: (java.lang.Object,org.apache.spark.sql.types.DataType,scala.Option)Unit
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UserDefinedFunction.this")
[error] * method SPECIALIZE_SINGLE_DISTINCT_AGG_PLANNING()org.apache.spark.sql.SQLConf#SQLConfEntry in object org.apache.spark.sql.SQLConf does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.SQLConf.SPECIALIZE_SINGLE_DISTINCT_AGG_PLANNING")
[error] * method DIALECT()org.apache.spark.sql.SQLConf#SQLConfEntry in object org.apache.spark.sql.SQLConf does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.SQLConf.DIALECT")
[error] * synthetic method apply$default$3()scala.collection.Seq in object org.apache.spark.sql.UserDefinedFunction does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.UserDefinedFunction.apply$default$3")
[error] * method apply(java.lang.Object,org.apache.spark.sql.types.DataType,scala.collection.Seq)org.apache.spark.sql.UserDefinedFunction in object org.apache.spark.sql.UserDefinedFunction does not have a correspondent with same parameter signature among (java.lang.Object,java.lang.Object,java.lang.Object)java.lang.Object, (java.lang.Object,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.UserDefinedFunction
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UserDefinedFunction.apply")
[error] * synthetic method <init>$default$3()scala.collection.Seq in object org.apache.spark.sql.UserDefinedFunction does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.UserDefinedFunction.<init>$default$3")
[error] * deprecated method isNaN(org.apache.spark.sql.Column)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.isNaN")
[error] * deprecated method percentRank()org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.percentRank")
[error] * deprecated method cumeDist()org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.cumeDist")
[error] * deprecated method denseRank()org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.denseRank")
[error] * deprecated method inputFileName()org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.inputFileName")
[error] * deprecated method callUdf(java.lang.String,scala.collection.Seq)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.callUdf")
[error] * deprecated method rowNumber()org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.rowNumber")
[error] * deprecated method callUDF(scala.Function10,org.apache.spark.sql.types.DataType,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.callUDF")
[error] * deprecated method callUDF(scala.Function9,org.apache.spark.sql.types.DataType,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.callUDF")
[error] * deprecated method callUDF(scala.Function8,org.apache.spark.sql.types.DataType,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.callUDF")
[error] * deprecated method callUDF(scala.Function7,org.apache.spark.sql.types.DataType,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.callUDF")
[error] * deprecated method callUDF(scala.Function6,org.apache.spark.sql.types.DataType,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.callUDF")
[error] * deprecated method callUDF(scala.Function5,org.apache.spark.sql.types.DataType,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.callUDF")
[error] * deprecated method callUDF(scala.Function4,org.apache.spark.sql.types.DataType,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.callUDF")
[error] * deprecated method callUDF(scala.Function3,org.apache.spark.sql.types.DataType,org.apache.spark.sql.Column,org.apache.spark.sql.Column,org.apache.spark.sql.Column)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.callUDF")
[error] * deprecated method callUDF(scala.Function2,org.apache.spark.sql.types.DataType,org.apache.spark.sql.Column,org.apache.spark.sql.Column)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.callUDF")
[error] * deprecated method callUDF(scala.Function1,org.apache.spark.sql.types.DataType,org.apache.spark.sql.Column)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.callUDF")
[error] * deprecated method callUDF(scala.Function0,org.apache.spark.sql.types.DataType)org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent with same parameter signature among (java.lang.String,Array[org.apache.spark.sql.Column])org.apache.spark.sql.Column, (java.lang.String,scala.collection.Seq)org.apache.spark.sql.Column
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.functions.callUDF")
[error] * deprecated method sparkPartitionId()org.apache.spark.sql.Column in object org.apache.spark.sql.functions does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.functions.sparkPartitionId")
[error] * deprecated method toSchemaRDD()org.apache.spark.sql.DataFrame in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.toSchemaRDD")
[error] * deprecated method insertInto(java.lang.String)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.insertInto")
[error] * deprecated method insertInto(java.lang.String,Boolean)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.insertInto")
[error] * deprecated method saveAsParquetFile(java.lang.String)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsParquetFile")
[error] * deprecated method saveAsTable(java.lang.String,java.lang.String,org.apache.spark.sql.SaveMode,scala.collection.immutable.Map)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsTable")
[error] * deprecated method saveAsTable(java.lang.String,java.lang.String,org.apache.spark.sql.SaveMode,java.util.Map)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsTable")
[error] * deprecated method saveAsTable(java.lang.String,java.lang.String,org.apache.spark.sql.SaveMode)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsTable")
[error] * deprecated method saveAsTable(java.lang.String,java.lang.String)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsTable")
[error] * deprecated method saveAsTable(java.lang.String,org.apache.spark.sql.SaveMode)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsTable")
[error] * deprecated method saveAsTable(java.lang.String)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.saveAsTable")
[error] * deprecated method insertIntoJDBC(java.lang.String,java.lang.String,Boolean)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.insertIntoJDBC")
[error] * deprecated method save(java.lang.String,org.apache.spark.sql.SaveMode,scala.collection.immutable.Map)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.save")
[error] * deprecated method save(java.lang.String,org.apache.spark.sql.SaveMode,java.util.Map)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.save")
[error] * deprecated method save(java.lang.String,java.lang.String,org.apache.spark.sql.SaveMode)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.save")
[error] * deprecated method save(java.lang.String,java.lang.String)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.save")
[error] * deprecated method save(java.lang.String,org.apache.spark.sql.SaveMode)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.save")
[error] * deprecated method save(java.lang.String)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.save")
[error] * deprecated method createJDBCTable(java.lang.String,java.lang.String,Boolean)Unit in class org.apache.spark.sql.DataFrame does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.DataFrame.createJDBCTable")
[error] * deprecated method in(scala.collection.Seq)org.apache.spark.sql.Column in class org.apache.spark.sql.Column does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.Column.in")
[error] * method in(Array[java.lang.Object])org.apache.spark.sql.Column in class org.apache.spark.sql.Column does not have a correspondent in new version
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.Column.in")
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$3(scala.collection.Seq,scala.Function1,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function1,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function1,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$3")
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$11(scala.collection.Seq,scala.Function9,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function9,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function9,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$11")
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$14(scala.collection.Seq,scala.Function12,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function12,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function12,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$14")
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$6(scala.collection.Seq,scala.Function4,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function4,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function4,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$6")
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$20(scala.collection.Seq,scala.Function18,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function18,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function18,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$20")
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$17(scala.collection.Seq,scala.Function15,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function15,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function15,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$17")
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$23(scala.collection.Seq,scala.Function21,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function21,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function21,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$23")
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$9(scala.collection.Seq,scala.Function7,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function7,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function7,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$9")
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$16(scala.collection.Seq,scala.Function14,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function14,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function14,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$16") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$8(scala.collection.Seq,scala.Function6,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function6,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function6,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$8") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$22(scala.collection.Seq,scala.Function20,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function20,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function20,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$22") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$19(scala.collection.Seq,scala.Function17,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function17,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function17,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$19") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$10(scala.collection.Seq,scala.Function8,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function8,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function8,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$10") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$2(scala.collection.Seq,scala.Function0,org.apache.spark.sql.types.DataType,scala.collection.immutable.Nil#)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function0,org.apache.spark.sql.types.DataType,scala.collection.immutable.Nil#)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function0,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$2") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$13(scala.collection.Seq,scala.Function11,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function11,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function11,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$13") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$5(scala.collection.Seq,scala.Function3,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function3,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function3,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$5") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$21(scala.collection.Seq,scala.Function19,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function19,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function19,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$21") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$15(scala.collection.Seq,scala.Function13,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function13,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function13,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$15") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$7(scala.collection.Seq,scala.Function5,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function5,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function5,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$7") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$24(scala.collection.Seq,scala.Function22,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function22,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function22,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$24") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$18(scala.collection.Seq,scala.Function16,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function16,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function16,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$18") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$12(scala.collection.Seq,scala.Function10,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function10,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function10,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$12") | |
[error] * synthetic method org$apache$spark$sql$UDFRegistration$$builder$4(scala.collection.Seq,scala.Function2,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF in class org.apache.spark.sql.UDFRegistration's type has changed; was (scala.collection.Seq,scala.Function2,org.apache.spark.sql.types.DataType,scala.collection.immutable.List)org.apache.spark.sql.catalyst.expressions.ScalaUDF, is now: (scala.collection.Seq,scala.Function2,org.apache.spark.sql.types.DataType,scala.Option)org.apache.spark.sql.catalyst.expressions.ScalaUDF | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$4") | |
[error] * method sqlParser()org.apache.spark.sql.execution.SparkSQLParser in class org.apache.spark.sql.SQLContext has now a different result type; was: org.apache.spark.sql.execution.SparkSQLParser, is now: org.apache.spark.sql.catalyst.ParserInterface | |
[error] filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.sql.SQLContext.sqlParser") | |
[error] * method specializeSingleDistinctAggPlanning()Boolean in class org.apache.spark.sql.SQLConf does not have a correspondent in new version | |
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.SQLConf.specializeSingleDistinctAggPlanning") | |
[error] * method dialect()java.lang.String in class org.apache.spark.sql.SQLConf does not have a correspondent in new version | |
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.SQLConf.dialect") | |
[error] * method buildInternalScan(Array[java.lang.String],Array[org.apache.spark.sql.sources.Filter],Array[java.lang.String],org.apache.spark.broadcast.Broadcast)org.apache.spark.rdd.RDD in class org.apache.spark.sql.sources.HadoopFsRelation's type has changed; was (Array[java.lang.String],Array[org.apache.spark.sql.sources.Filter],Array[java.lang.String],org.apache.spark.broadcast.Broadcast)org.apache.spark.rdd.RDD, is now: (Array[java.lang.String],Array[org.apache.spark.sql.sources.Filter],Array[org.apache.hadoop.fs.FileStatus],org.apache.spark.broadcast.Broadcast)org.apache.spark.rdd.RDD | |
[error] filter with: ProblemFilters.exclude[IncompatibleMethTypeProblem]("org.apache.spark.sql.sources.HadoopFsRelation.buildInternalScan") | |
[error] * method createRelation(org.apache.spark.sql.SQLContext,Array[java.lang.String],scala.Option,scala.Option,scala.Option,scala.collection.immutable.Map)org.apache.spark.sql.sources.HadoopFsRelation in trait org.apache.spark.sql.sources.HadoopFsRelationProvider does not have a correspondent in old version | |
[error] filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.sql.sources.HadoopFsRelationProvider.createRelation") | |
[warn] /Users/sim/dev/spx/spark/mllib/src/main/scala/org/apache/spark/mllib/clustering/KMeans.scala:505: method setRuns in class KMeans is deprecated: Support for runs is deprecated. This param will have no effect in 2.0.0.
[warn]       .setRuns(runs)
[warn]
[warn] /Users/sim/dev/spx/spark/mllib/src/main/scala/org/apache/spark/mllib/clustering/KMeans.scala:531: method setRuns in class KMeans is deprecated: Support for runs is deprecated. This param will have no effect in 2.0.0.
[warn]       .setRuns(runs)
[warn]
[warn] /Users/sim/dev/spx/spark/mllib/src/main/scala/org/apache/spark/mllib/util/MFDataGenerator.scala:109: method round in package math is deprecated: This is an integer type; there is no reason to round it. Perhaps you meant to call this with a floating-point value?
[warn]       math.round(sampSize * testSampFact), math.round(mn - sampSize)).toInt
[warn]
[warn] /Users/sim/dev/spx/spark/mllib/src/main/scala/org/apache/spark/mllib/clustering/PowerIterationClustering.scala:390: method setRuns in class KMeans is deprecated: Support for runs is deprecated. This param will have no effect in 2.0.0.
[warn]       .setRuns(5)
[warn]
[warn] /Users/sim/dev/spx/spark/mllib/src/main/scala/org/apache/spark/mllib/api/python/PythonMLLibAPI.scala:360: method setRuns in class KMeans is deprecated: Support for runs is deprecated. This param will have no effect in 2.0.0.
[warn]       .setRuns(runs)
[warn]
[info] spark-mllib: found 12 potential binary incompatibilities (filtered 4)
[error]  * synthetic method org$apache$spark$mllib$feature$Word2Vec$$trainWordsCount()Int in class org.apache.spark.mllib.feature.Word2Vec has now a different result type; was: Int, is now: Long
[error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.mllib.feature.Word2Vec.org$apache$spark$mllib$feature$Word2Vec$$trainWordsCount")
[error]  * method trainWord2VecModel(org.apache.spark.api.java.JavaRDD,Int,Double,Int,Int,Long,Int)org.apache.spark.mllib.api.python.PythonMLLibAPI#Word2VecModelWrapper in class org.apache.spark.mllib.api.python.PythonMLLibAPI has now a different result type; was: org.apache.spark.mllib.api.python.PythonMLLibAPI#Word2VecModelWrapper, is now: org.apache.spark.mllib.api.python.Word2VecModelWrapper
[error]    filter with: ProblemFilters.exclude[IncompatibleResultTypeProblem]("org.apache.spark.mllib.api.python.PythonMLLibAPI.trainWord2VecModel")
[error]  * class org.apache.spark.mllib.api.python.PythonMLLibAPI#Word2VecModelWrapper does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingClassProblem]("org.apache.spark.mllib.api.python.PythonMLLibAPI$Word2VecModelWrapper")
[error]  * the type hierarchy of class org.apache.spark.ml.source.libsvm.LibSVMRelation has changed in new version. Missing types {org.apache.spark.sql.sources.TableScan}
[error]    filter with: ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.ml.source.libsvm.LibSVMRelation")
[error]  * method buildScan()org.apache.spark.rdd.RDD in class org.apache.spark.ml.source.libsvm.LibSVMRelation does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.ml.source.libsvm.LibSVMRelation.buildScan")
[error]  * method appendColumn(org.apache.spark.sql.types.StructType,java.lang.String,org.apache.spark.sql.types.DataType)org.apache.spark.sql.types.StructType in object org.apache.spark.ml.util.SchemaUtils does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.ml.util.SchemaUtils.appendColumn")
[error]  * method this(java.lang.String,org.apache.spark.mllib.linalg.DenseMatrix)Unit in class org.apache.spark.ml.feature.PCAModel does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.ml.feature.PCAModel.this")
[error]  * the type hierarchy of object org.apache.spark.ml.feature.PCAModel#PCAModelWriter#Data has changed in new version. Missing types {scala.runtime.AbstractFunction1}
[error]    filter with: ProblemFilters.exclude[MissingTypesProblem]("org.apache.spark.ml.feature.PCAModel$PCAModelWriter$Data$")
[error]  * method apply(java.lang.Object)java.lang.Object in object org.apache.spark.ml.feature.PCAModel#PCAModelWriter#Data does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.ml.feature.PCAModel#PCAModelWriter#Data.apply")
[error]  * method apply(org.apache.spark.mllib.linalg.DenseMatrix)org.apache.spark.ml.feature.PCAModel#PCAModelWriter#Data in object org.apache.spark.ml.feature.PCAModel#PCAModelWriter#Data does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.ml.feature.PCAModel#PCAModelWriter#Data.apply")
[error]  * method copy(org.apache.spark.mllib.linalg.DenseMatrix)org.apache.spark.ml.feature.PCAModel#PCAModelWriter#Data in class org.apache.spark.ml.feature.PCAModel#PCAModelWriter#Data does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.ml.feature.PCAModel#PCAModelWriter#Data.copy")
[error]  * method this(org.apache.spark.ml.feature.PCAModel#PCAModelWriter,org.apache.spark.mllib.linalg.DenseMatrix)Unit in class org.apache.spark.ml.feature.PCAModel#PCAModelWriter#Data does not have a correspondent in new version
[error]    filter with: ProblemFilters.exclude[MissingMethodProblem]("org.apache.spark.ml.feature.PCAModel#PCAModelWriter#Data.this")
java.lang.RuntimeException: spark-streaming: Binary compatibility check failed!
	at scala.sys.package$.error(package.scala:27)
	at com.typesafe.tools.mima.plugin.SbtMima$.reportErrors(SbtMima.scala:64)
	at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaReportSettings$3.apply(MimaPlugin.scala:23)
	at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaReportSettings$3.apply(MimaPlugin.scala:23)
	at scala.Function5$$anonfun$tupled$1.apply(Function5.scala:35)
	at scala.Function5$$anonfun$tupled$1.apply(Function5.scala:34)
	at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
	at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
	at sbt.std.Transform$$anon$4.work(System.scala:63)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
	at sbt.Execute.work(Execute.scala:235)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
	at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
java.lang.RuntimeException: spark-sql: Binary compatibility check failed!
	at scala.sys.package$.error(package.scala:27)
	at com.typesafe.tools.mima.plugin.SbtMima$.reportErrors(SbtMima.scala:64)
	at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaReportSettings$3.apply(MimaPlugin.scala:23)
	at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaReportSettings$3.apply(MimaPlugin.scala:23)
	at scala.Function5$$anonfun$tupled$1.apply(Function5.scala:35)
	at scala.Function5$$anonfun$tupled$1.apply(Function5.scala:34)
	at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
	at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
	at sbt.std.Transform$$anon$4.work(System.scala:63)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
	at sbt.Execute.work(Execute.scala:235)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
	at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
java.lang.RuntimeException: spark-mllib: Binary compatibility check failed!
	at scala.sys.package$.error(package.scala:27)
	at com.typesafe.tools.mima.plugin.SbtMima$.reportErrors(SbtMima.scala:64)
	at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaReportSettings$3.apply(MimaPlugin.scala:23)
	at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaReportSettings$3.apply(MimaPlugin.scala:23)
	at scala.Function5$$anonfun$tupled$1.apply(Function5.scala:35)
	at scala.Function5$$anonfun$tupled$1.apply(Function5.scala:34)
	at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
	at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
	at sbt.std.Transform$$anon$4.work(System.scala:63)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
	at sbt.Execute.work(Execute.scala:235)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
	at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
java.lang.RuntimeException: spark-core: Binary compatibility check failed!
	at scala.sys.package$.error(package.scala:27)
	at com.typesafe.tools.mima.plugin.SbtMima$.reportErrors(SbtMima.scala:64)
	at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaReportSettings$3.apply(MimaPlugin.scala:23)
	at com.typesafe.tools.mima.plugin.MimaPlugin$$anonfun$mimaReportSettings$3.apply(MimaPlugin.scala:23)
	at scala.Function5$$anonfun$tupled$1.apply(Function5.scala:35)
	at scala.Function5$$anonfun$tupled$1.apply(Function5.scala:34)
	at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
	at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
	at sbt.std.Transform$$anon$4.work(System.scala:63)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:226)
	at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
	at sbt.Execute.work(Execute.scala:235)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:226)
	at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
	at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
[error] (streaming/*:mimaReportBinaryIssues) spark-streaming: Binary compatibility check failed!
[error] (sql/*:mimaReportBinaryIssues) spark-sql: Binary compatibility check failed!
[error] (mllib/*:mimaReportBinaryIssues) spark-mllib: Binary compatibility check failed!
[error] (core/*:mimaReportBinaryIssues) spark-core: Binary compatibility check failed!
[error] Total time: 104 s, completed Feb 7, 2016 9:55:21 PM
[error] running /Users/sim/dev/spx/spark/dev/mima ; received return code 1
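Each `filter with:` line above is a ready-made MiMa exclusion rule. If these breakages are intentional, the fix is to collect those rules into the sbt-mima-plugin's exclusion filters (Spark keeps them in `project/MimaExcludes.scala`). A minimal sketch, assuming the sbt-mima-plugin's `ProblemFilters` API; the object name, grouping, and comments are illustrative, and only a few of the reported filters are shown:

```scala
// Sketch: collecting the "filter with:" hints from the log into one
// exclusion list consumed by the sbt-mima-plugin's issue filters.
import com.typesafe.tools.mima.core._

object MimaExcludes {
  val excludes: Seq[ProblemFilter] = Seq(
    // UDFRegistration: internal ScalaUDF builder signature changed
    // (scala.collection.immutable.List -> scala.Option); one entry per builder$N.
    ProblemFilters.exclude[IncompatibleMethTypeProblem](
      "org.apache.spark.sql.UDFRegistration.org$apache$spark$sql$UDFRegistration$$builder$2"),
    // SQLContext.sqlParser result type widened to ParserInterface.
    ProblemFilters.exclude[IncompatibleResultTypeProblem](
      "org.apache.spark.sql.SQLContext.sqlParser"),
    // Word2Vec internal word counter widened from Int to Long.
    ProblemFilters.exclude[IncompatibleResultTypeProblem](
      "org.apache.spark.mllib.feature.Word2Vec.org$apache$spark$mllib$feature$Word2Vec$$trainWordsCount"),
    // Removed private streaming receiver helper class.
    ProblemFilters.exclude[MissingClassProblem](
      "org.apache.spark.streaming.receiver.ActorSupervisorStrategy")
  )
}
```

With the full set of hints copied in, the `mima` check passes again for changes that are deliberate; genuinely accidental breakages should be fixed in the code instead of filtered.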