For Linux and macOS, let's use grep:

java -XshowSettings:properties -version 2>&1 > /dev/null | grep 'java.home'

And for Windows, let's use findstr:

java -XshowSettings:properties -version 2>&1 | findstr "java.home"
----------------------------------------------------------------------------------------------------
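The same filtering can be done portably in Python by parsing the properties output. This is a sketch only: the parsing runs against a canned sample, and the `java.home` path in that sample is illustrative, not taken from these notes.

```python
import re

def extract_java_home(properties_text):
    # Properties lines look like: "    java.home = /path/to/jre"
    match = re.search(r"^\s*java\.home\s*=\s*(.+?)\s*$", properties_text, re.MULTILINE)
    return match.group(1) if match else None

# `java -XshowSettings:properties -version` prints the settings on stderr;
# in a real run, capture that with subprocess.run(..., stderr=subprocess.PIPE)
# and pass the decoded text in. Here we use a sample of that output:
sample = """Property settings:
    java.class.version = 55.0
    java.home = /usr/lib/jvm/java-11-openjdk-amd64
    java.io.tmpdir = /tmp
"""
print(extract_java_home(sample))  # /usr/lib/jvm/java-11-openjdk-amd64
```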
Spring Boot is an open-source, Java-based framework used to develop stand-alone, production-grade Spring applications that you can just run.
# MySQL installation in WSL2 Ubuntu
# How to access MySQL with the default password in Ubuntu 20.04 ::
--------------------------------------------------------
sudo apt update
sudo apt upgrade
sudo apt install mysql-server
sudo apt install mysql-client
mysql --version
sudo usermod -d /var/lib/mysql/ mysql
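The steps above install MySQL but don't show the actual login the heading promises. On Ubuntu 20.04, MySQL's root account authenticates via the auth_socket plugin rather than a password, so running the client under sudo logs straight in. A minimal Python sketch of that command (the helper name is mine, not from these notes):

```python
def mysql_root_login_command():
    # Ubuntu's default MySQL root account uses the auth_socket plugin,
    # so no password is needed when the client runs as the root OS user:
    return ["sudo", "mysql", "-u", "root"]

print(" ".join(mysql_root_login_command()))
# Inside the mysql prompt you can then switch root to password auth if you prefer:
#   ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY '<new-password>';
```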
start-dfs.sh
start-yarn.sh
jps
sudo mkdir /tmp/spark-events
sudo chown -R hduser:hadoop /tmp/spark-events
hduser@thanoojubuntu-Inspiron-3521: start-master.sh
hduser@thanoojubuntu-Inspiron-3521: start-slave.sh spark://thanoojubuntu-Inspiron-3521:7077
starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark-2.4.6-bin-hadoop2.7/logs/spark-hduser-org.apache.spark.deploy.worker.Worker-1-thanoojubuntu-Inspiron-3521.out
hduser@thanoojubuntu-Inspiron-3521:/tmp$ spark-shell --master spark://thanoojubuntu-Inspiron-3521:7077
Pre-requisites:
1. Hadoop and Hive are installed and well configured
2. The hive CLI is working as expected
Then, we can try running HiveServer2:
hduser@thanoojubuntu-Inspiron-3521:~/softwares/apache-hive-2.3.3-bin/conf$ hive --service hiveserver2 --hiveconf hive.server2.thrift.port=10000 --hiveconf hive.root.logger=INFO,console
2020-11-20 21:25:32: Starting HiveServer2
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hduser/softwares/apache-hive-2.3.3-bin/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
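Once HiveServer2 reports it is starting, a quick way to confirm that the Thrift port (10000, from the `--hiveconf` above) is accepting connections is a plain socket probe. A minimal sketch:

```python
import socket

def is_port_open(host, port, timeout=2.0):
    # Returns True if something is listening on host:port.
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# After HiveServer2 is up, is_port_open("localhost", 10000) should return True.
```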
package com.autogenclass;
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;
hduser@thanoojubuntu-Inspiron-3521:~$ nc -lk 9999
helllo word hello python hello spark hello pyspark hellow streaming pyspark
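The line above is what a socket word count (for example PySpark reading from localhost:9999) would consume. The per-batch counting such a job performs reduces to splitting and tallying, which can be checked directly in plain Python; the DStream pipeline in the comment is a sketch and assumes a StreamingContext named `ssc`:

```python
from collections import Counter

# The line typed into `nc -lk 9999`, exactly as entered (typos and all):
line = "helllo word hello python hello spark hello pyspark hellow streaming pyspark"

# Equivalent PySpark DStream pipeline (sketch):
#   ssc.socketTextStream("localhost", 9999) \
#      .flatMap(lambda l: l.split(" ")) \
#      .map(lambda w: (w, 1)) \
#      .reduceByKey(add)
counts = Counter(line.split(" "))
print(counts["hello"], counts["pyspark"])  # 3 2
```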
package com.kafkaconnectone;
import java.util.Map.Entry;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.KafkaException;
import org.apache.kafka.common.errors.AuthorizationException;
By default Hive uses the MR engine, but we can set it to Tez or even the Spark engine (in-memory computation).
But:
Hive has a SQL-like language, HiveQL (HQL), and is the more natural fit when you are a SQL developer.
Even though we have UDFs, there is no extra room to implement core/complex business logic.
Spark has Spark SQL, and we can move from DataFrame to RDD and back from RDD to DataFrame to perform core/complex business logic.
Hive has no resume capability.
Hive cannot drop encrypted databases.
import subprocess
from pyspark.sql import functions as f
from operator import add
from pyspark.sql import Row, SparkSession
from pyspark.sql.types import StructField, StringType, StructType

def sparkwithhiveone():
    sparkwithhive = getsparkwithhive()
    try:
        assert (sparkwithhive.conf.get("spark.sql.catalogImplementation") == "hive")