Created September 29, 2020 20:20
Gist: paulfryzel/0dc55556cb1df6198d05523a6617b761
$ sbt clean compile test
[info] welcome to sbt 1.3.13 (AdoptOpenJDK Java 11.0.7)
[info] loading global plugins from C:\Users\paul\.sbt\1.0\plugins
[info] loading settings for project sparksql-scalapb-build from plugins.sbt ...
[info] loading project definition from C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\project
[info] loading settings for project root from build.sbt,sonatype.sbt,version.sbt ...
[info] set current project to root (in build file:/C:/Users/paul/Documents/GitHub/forks/sparksql-scalapb/)
[info] Executing in batch mode. For better performance use sbt's shell
[success] Total time: 0 s, completed Sep 29, 2020, 4:12:35 PM
[info] Compiling 9 Scala sources to C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\target\scala-2.12\classes ...
[info] Done compiling.
[success] Total time: 8 s, completed Sep 29, 2020, 4:12:43 PM
[info] Compiling 8 protobuf files to C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\target\scala-2.12\src_managed\test
[info] Compiling schema C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\all_types2.proto
[info] Compiling schema C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\base.proto
[info] Compiling schema C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\defaultsv3.proto
[info] Compiling schema C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\wrappers.proto
[info] Compiling schema C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\defaults.proto
[info] Compiling schema C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\maps.proto
[info] Compiling schema C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\demo.proto
[info] Compiling schema C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\all_types3.proto
protoc-jar: protoc version: 3.11.4, detected platform: windows-x86_64 (windows 10/amd64)
protoc-jar: embedded: bin/3.11.4/protoc-3.11.4-windows-x86_64.exe
protoc-jar: executing: [C:\Users\paul\AppData\Local\Temp\protocjar893352014095741815\bin\protoc.exe, --plugin=protoc-gen-jvm_0=C:\Users\paul\AppData\Local\Temp\protocbridge14870202225594409922.bat, --jvm_0_out=C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\target\scala-2.12\src_managed\test, -IC:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\main\protobuf, -IC:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\target\protobuf_external_src, -IC:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\target\protobuf_external, -IC:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf, C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\base.proto, C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\wrappers.proto, C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\demo.proto, C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\defaultsv3.proto, C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\all_types2.proto, C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\maps.proto, C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\defaults.proto, C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\src\test\protobuf\all_types3.proto]
[info] Compiling 60 Scala sources to C:\Users\paul\Documents\GitHub\forks\sparksql-scalapb\sparksql-scalapb\target\scala-2.12\test-classes ...
[info] Done compiling.
[info] DefaultsSpec:
[info] Proto2 RDD[DefaultsRequired]
[info] - should have non-null default values after converting to Dataframe
[info] Proto2 RDD[DefaultsOptional]
[info] - should have null values after converting to Dataframe
[info] Proto3 RDD[DefaultsV3]
[info] - should have non-null default values after converting to Dataframe
[info] SchemaOptionsSpec:
[info] converting df with primitive wrappers
[info] - should unpack primitive wrappers by default
[info] converting df with primitive wrappers
[info] - should retain value field when option is set
[info] schema
[info] - should use scalaNames when option is set
+------------+
| attributes|
+------------+
|[foo -> bar]|
+------------+
[info] MapsSpec:
[info] converting maps to df
[info] - should work
root
|-- data: binary (nullable = true)
root
|-- sha1(data): string (nullable = true)
+------+---+------+----+-------------------+----+-----+----------+
| name|age|gender|tags| addresses|base|inner| data|
+------+---+------+----+-------------------+----+-----+----------+
|Owen M| 35| MALE| []|[[, San Francisco]]|null| null|[01 02 03]|
+------+---+------+----+-------------------+----+-----+----------+
root
|-- name: string (nullable = true)
|-- age: integer (nullable = true)
|-- gender: string (nullable = true)
|-- tags: array (nullable = true)
| |-- element: string (containsNull = true)
|-- addresses: array (nullable = true)
| |-- element: struct (containsNull = true)
| | |-- street: string (nullable = true)
| | |-- city: string (nullable = true)
|-- base: struct (nullable = true)
|-- inner: struct (nullable = true)
| |-- inner_value: string (nullable = true)
|-- data: binary (nullable = true)
+------+---+--------------------+------+----+----+-----+----------+-------+----+
| name|age| addresses|gender|tags|base|inner| data|address|nums|
+------+---+--------------------+------+----+----+-----+----------+-------+----+
|Owen M| 35|[[foo, bar], [baz...| MALE| []|null| [V1]|[01 02 03]| null| []|
+------+---+--------------------+------+----+----+-----+----------+-------+----+
+-------+--------------------+----------+
|eventId| action| foo|
+-------+--------------------+----------+
| xyz|[type.googleapis....|[pK, foo]|
+-------+--------------------+----------+
[info] PersonSpec:
[info] mapping datasets
[info] - should work
[info] Creating person dataset
[info] - should work
[info] Creating enum dataset
[info] - should work
[info] Creating bytestring dataset
[info] - should work
[info] Dataset[Person]
[info] - should work
[info] as[SimplePerson]
[info] - should work for manual building *** FAILED ***
[info] org.apache.spark.sql.AnalysisException: Try to map struct<name:string,age:int,addresses:array<struct<street:string,city:string>>,gender:string,tags:array<string>,base:struct<>,inner:struct<inner_value:string>,data:binary,address:struct<street:string,city:string>,nums:array<int>> to Tuple5, but failed as the number of fields does not line up.;
[info] at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveDeserializer$.fail(Analyzer.scala:3098)
[info] at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveDeserializer$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveDeserializer$$validateTopLevelTupleFields(Analyzer.scala:3115)
[info] at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveDeserializer$$anonfun$apply$31$$anonfun$applyOrElse$170.applyOrElse(Analyzer.scala:3067)
[info] at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveDeserializer$$anonfun$apply$31$$anonfun$applyOrElse$170.applyOrElse(Analyzer.scala:3059)
[info] at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDown$1(TreeNode.scala:309)
[info] at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:72)
[info] at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:309)
[info] at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$transformExpressionsDown$1(QueryPlan.scala:96)
[info] at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$1(QueryPlan.scala:118)
[info] at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:72)
[info] ...
[info] as[Person]
[info] - should work for manual building *** FAILED ***
[info] org.apache.spark.sql.AnalysisException: Try to map struct<name:string,age:int,addresses:array<struct<street:string,city:string>>,gender:string,tags:array<string>,base:struct<>,inner:struct<inner_value:string>,data:binary,address:struct<street:string,city:string>,nums:array<int>> to Tuple8, but failed as the number of fields does not line up.;
[info] at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveDeserializer$.fail(Analyzer.scala:3098)
[info] at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveDeserializer$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveDeserializer$$validateTopLevelTupleFields(Analyzer.scala:3115)
[info] at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveDeserializer$$anonfun$apply$31$$anonfun$applyOrElse$170.applyOrElse(Analyzer.scala:3067)
[info] at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveDeserializer$$anonfun$apply$31$$anonfun$applyOrElse$170.applyOrElse(Analyzer.scala:3059)
[info] at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDown$1(TreeNode.scala:309)
[info] at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:72)
[info] at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:309)
[info] at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$transformExpressionsDown$1(QueryPlan.scala:96)
[info] at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$mapExpressions$1(QueryPlan.scala:118)
[info] at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:72)
[info] ...
[info] converting from rdd to dataframe
[info] - should work
[info] selecting message fields into dataset should work
[info] - should work
[info] serialize and deserialize
[info] - should work on dataset of bytes
[info] UDFs that involve protos
[info] - should work when using ProtoSQL.udfs
[info] UDFs that returns protos
[info] - should work when using ProtoSQL.createDataFrame
[info] UDFs that returns protos
[info] - should work when reading local files
[info] OuterCaseClass
[info] - should use our type encoders
[info] AllTypesSpec:
[info] AllTypes
[info] - should work for int32 *** FAILED ***
[info] IllegalArgumentException was thrown during property evaluation.
[info] Message: requirement failed: Join keys from two sides should have same types
[info] Occurred when passed generated values (
[info] arg0 = Vector(Int32Test(Some(1),Some(1229914175),Some(0),Some(0),Some(-2147483648),-1,1,2147483647,-2147483648,1666010783,Vector(),Vector(),Vector(),Vector(),Vector())) // 3 shrinks
[info] )
[info] - should work for int64 *** FAILED ***
[info] IllegalArgumentException was thrown during property evaluation.
[info] Message: requirement failed: Join keys from two sides should have same types
[info] Occurred when passed generated values (
[info] arg0 = Vector(Int64Test(Some(1),Some(-9223372036854775808),None,Some(-9223372036854775808),Some(0),-3885044029316536083,6832096517673684120,8348352819039621292,1,6809668556522415544,Vector(-8854076712796512039),Vector(),Vector(),Vector(0),Vector())) // 3 shrinks
[info] )
[info] - should work for bools *** FAILED ***
[info] IllegalArgumentException was thrown during property evaluation.
[info] Message: requirement failed: Join keys from two sides should have same types
[info] Occurred when passed generated values (
[info] arg0 = Vector(BoolTest(Some(true),true,Vector(true))) // 3 shrinks
[info] )
[info] - should work for strings *** FAILED ***
[info] TestFailedException was thrown during property evaluation.
[info] Message: Array(null) did not contain the same elements as Vector(StringTest(None,욂Ⱪ炠,Vector()))
[info] Location: (AllTypesSpec.scala:43)
[info] Occurred when passed generated values (
[info] arg0 = Vector(StringTest(None,욂Ⱪ炠,Vector())) // 2 shrinks
[info] )
[info] - should work for floats *** FAILED ***
[info] IllegalArgumentException was thrown during property evaluation.
[info] Message: requirement failed: Join keys from two sides should have same types
[info] Occurred when passed generated values (
[info] arg0 = Vector(FloatTest(Some(1.950832E-16),-1.8113771E-22,Vector())) // 2 shrinks
[info] )
[info] - should work for doubles *** FAILED ***
[info] IllegalArgumentException was thrown during property evaluation.
[info] Message: requirement failed: Join keys from two sides should have same types
[info] Occurred when passed generated values (
[info] arg0 = Vector(DoubleTest(Some(1.0324295084573865E243),1.2392118057976329E-49,Vector(-4.2813577237598723E-72)))
[info] )
[info] - should work for bytes *** FAILED ***
[info] TestFailedException was thrown during property evaluation.
[info] Message: Array(null) did not contain the same elements as Vector(BytesTest(None,<ByteString@3f9cd00d size=3 contents="\2153\244">,Vector(<ByteString@75e79093 size=2 contents="\2425">, <ByteString@6966f434 size=2 contents="\000\005">)))
[info] Location: (AllTypesSpec.scala:43)
[info] Occurred when passed generated values (
[info] arg0 = Vector(BytesTest(None,<ByteString@3f9cd00d size=3 contents="\2153\244">,Vector(<ByteString@75e79093 size=2 contents="\2425">, <ByteString@6966f434 size=2 contents="\000\005">))) // 3 shrinks
[info] )
[info] - should work for enums *** FAILED ***
[info] IllegalArgumentException was thrown during property evaluation.
[info] Message: requirement failed: Join keys from two sides should have same types
[info] Occurred when passed generated values (
[info] arg0 = Vector(EnumTest(Some(BAZ),BAZ,Vector(BAZ),Some(FOREIGN_BAZ),FOREIGN_BAR,Vector())) // 2 shrinks
[info] )
[info] - should work for messages *** FAILED ***
[info] IllegalArgumentException was thrown during property evaluation.
[info] Message: requirement failed: Join keys from two sides should have same types
[info] Occurred when passed generated values (
[info] arg0 = Vector(MessageTest(Some(NestedMessage(Some(-1575184072))),NestedMessage(Some(1)),Vector(NestedMessage(Some(-1560820724))),Some(TopLevelMessage(Some(1))),TopLevelMessage(Some(1242203210)),Vector(TopLevelMessage(Some(1))))) // 3 shrinks
[info] )
[info] - should work for oneofs *** FAILED ***
[info] TestFailedException was thrown during property evaluation.
[info] Message: Array(null) did not contain the same elements as Vector(OneofTest(OneofBytes(<ByteString@2acddafe size=8 contents="\177A\365\177#\001oc">)))
[info] Location: (AllTypesSpec.scala:43)
[info] Occurred when passed generated values (
[info] arg0 = Vector(OneofTest(OneofBytes(<ByteString@2acddafe size=8 contents="\177A\365\177#\001oc">))) // 2 shrinks
[info] )
[info] - should work for levels *** FAILED ***
[info] TestFailedException was thrown during property evaluation.
[info] Message: Array(null) did not contain the same elements as Vector(Level1(None,Some(频獰⮟)))
[info] Location: (AllTypesSpec.scala:43)
[info] Occurred when passed generated values (
[info] arg0 = Vector(Level1(None,Some(频獰⮟))) // 4 shrinks
[info] )
[info] - should work for any *** FAILED ***
[info] TestFailedException was thrown during property evaluation.
[info] Message: Array(null) did not contain the same elements as Vector(AnyTest(None))
[info] Location: (AllTypesSpec.scala:43)
[info] Occurred when passed generated values (
[info] arg0 = Vector(AnyTest(None)) // 3 shrinks
[info] )
[info] - should work for time types *** FAILED ***
[info] IllegalArgumentException was thrown during property evaluation.
[info] Message: requirement failed: Join keys from two sides should have same types
[info] Occurred when passed generated values (
[info] arg0 = Vector(WellKnownTypes(Some(Timestamp(1589022557055990817,1,UnknownFieldSet(Map()))),Some(Duration(1,-2147483648,UnknownFieldSet(Map()))),Vector(Timestamp(7578350744175665721,-2147483648,UnknownFieldSet(Map()))),Vector(Duration(5320457540890993270,1564860612,UnknownFieldSet(Map()))))) // 1 shrink
[info] )
[info] - should work for wrapper types *** FAILED ***
[info] IllegalArgumentException was thrown during property evaluation.
[info] Message: requirement failed: Join keys from two sides should have same types
[info] Occurred when passed generated values (
[info] arg0 = Vector(WrappersTest(Some(false),Some(<ByteString@3ad274c2 size=0 contents="">),None,Some(1.2095752E-18),Some(14745403),Some(0),Some(),Some(1),Some(1),Vector(),Vector(<ByteString@15b8507f size=0 contents="">),Vector(),Vector(),Vector(),Vector(9223372036854775807),Vector(),Vector(),Vector())) // 1 shrink
[info] )
[info] - should work for maps
[info] ScalaTest
[info] Run completed in 17 seconds, 294 milliseconds.
[info] Total number of tests run: 36
[info] Suites: completed 5, aborted 0
[info] Tests: succeeded 20, failed 16, canceled 0, ignored 0, pending 0
[info] *** 16 TESTS FAILED ***
[error] Failed: Total 36, Failed 16, Errors 0, Passed 20
[error] Failed tests:
[error] scalapb.spark.PersonSpec
[error] scalapb.spark.AllTypesSpec
[error] (sparkSqlScalaPB / Test / test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 55 s, completed Sep 29, 2020, 4:13:38 PM
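To iterate on only the two failing suites without re-running the whole build, sbt's standard `testOnly` task can target them by the fully qualified names reported above. This is a sketch, assuming the default sbt setup; the `sparkSqlScalaPB` project id and the suite names are taken directly from the `[error]` lines in the log.

```shell
# Re-run just the failing suites reported in the log above
sbt "sparkSqlScalaPB / testOnly scalapb.spark.PersonSpec scalapb.spark.AllTypesSpec"
```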