Apache Spark master running on OpenJDK11u
$ bin/spark-shell
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/home/krismok/code/work/oss-spark-readonly/assembly/target/scala-2.12/jars/spark-unsafe_2.12-3.0.0-SNAPSHOT.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
19/07/30 06:03:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at <snip>:4040
Spark context available as 'sc' (master = local[*], app id = local-1564466603611).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.0.0-SNAPSHOT
      /_/

Using Scala version 2.12.8 (OpenJDK 64-Bit Server VM, Java 11.0.4)
Type in expressions to have them evaluated.
Type :help for more information.
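The WARNING block at startup is expected when running Spark on JDK 11: org.apache.spark.unsafe.Platform reflectively accesses the java.nio.DirectByteBuffer(long, int) constructor, which JDK 9 and later restrict. The session works regardless. If you want the JVM to warn on every such access rather than only the first, the flag suggested in the log itself can be handed to the driver JVM; a minimal sketch, using the standard --driver-java-options pass-through of spark-shell/spark-submit:

$ bin/spark-shell --driver-java-options "--illegal-access=warn"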
scala> spark.range(10).select('id + 100 / 10).show()
+---------+
|(id + 10)|
+---------+
|       10|
|       11|
|       12|
|       13|
|       14|
|       15|
|       16|
|       17|
|       18|
|       19|
+---------+
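The column name (id + 10) shows what happened: in 'id + 100 / 10, Scala's operator precedence makes / bind tighter than +, and since both operands of / are plain Ints, 100 / 10 is evaluated by Scala as integer division before Column.+ is ever called. Spark therefore only sees 'id + 10 and returns longs. A minimal sketch of the equivalent explicit forms, assuming the same spark-shell session (where spark.implicits._ is pre-imported):

// 100 / 10 is plain Scala Int division, so the query above builds the same expression as either of these:
import org.apache.spark.sql.functions.lit
spark.range(10).select('id + 10)
spark.range(10).select('id + lit(100 / 10))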
scala> spark.range(10).select(('id + 100) / 10).show()
+-----------------+
|((id + 100) / 10)|
+-----------------+
|             10.0|
|             10.1|
|             10.2|
|             10.3|
|             10.4|
|             10.5|
|             10.6|
|             10.7|
|             10.8|
|             10.9|
+-----------------+
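With the parentheses in place, the division is performed by Spark rather than by Scala, and Spark SQL's / on integral operands produces a double, hence 10.0 through 10.9. A minimal sketch, assuming the same session, of checking the result type:

// The division yields a double column; use .cast("long") on the expression if integral results are wanted.
spark.range(10).select(('id + 100) / 10 as 'x).printSchema()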
scala> spark.range(10).select(('id + 100) / 10 as 'x).createOrReplaceTempView("foo")

scala> spark.table("foo").filter('x > 10.5).show()
+----+
|   x|
+----+
|10.6|
|10.7|
|10.8|
|10.9|
+----+
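Registering the projection as a temp view also makes it reachable from SQL. A minimal sketch of the same filter written as a query against the view, assuming the same session:

// Equivalent SQL formulation of the filter above, against the temp view "foo".
spark.sql("SELECT x FROM foo WHERE x > 10.5")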
scala> spark.table("foo").filter('x > 10.5).write.format("parquet").save("the_foo_table")

scala> spark.read.parquet("the_foo_table").show
+----+
|   x|
+----+
|10.8|
|10.9|
|10.7|
|10.6|
+----+
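Note that the rows come back as 10.8, 10.9, 10.7, 10.6 rather than in the order they were written: the data lands in several Parquet part files, and a plain read makes no ordering guarantee across them. A minimal sketch, assuming the same path, of imposing an order explicitly on read-back:

// Sort after reading, since Parquet read order is not deterministic.
spark.read.parquet("the_foo_table").orderBy("x")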
scala> :q