Deepak j-thepac
package ch06

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.lit
import org.apache.spark.sql.types.StringType

// Start a test source first: nc -l -p 9999
object SimpleStream extends App {
  val spark = SparkSession.builder
    .master("local")
    .appName("StructuredNetworkWordCount")
    .getOrCreate()
}
j-thepac / DStreamSoacket.scala (Created April 5, 2022 03:47)
Data into Socket
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.dstream.DStream

// Start a test source first: nc -l -p 9999
object DStreamSoacket extends App {
  val conf = new SparkConf().setMaster("local[*]").setAppName("App name")
  val sc = new SparkContext(conf)
  val ssc = new StreamingContext(sc, Seconds(5))
  val lines: DStream[String] = ssc.socketTextStream(hostname = "localhost", port = 9999)
  val wordCounts = lines.flatMap(_.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
  wordCounts.print()
  ssc.start()
  ssc.awaitTermination()
}
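The flatMap/map/reduceByKey pipeline above can be sketched on plain Scala collections, with groupBy standing in for reduceByKey (the sample input is invented for illustration):

```scala
// Pure-Scala word count mirroring the DStream pipeline (no Spark needed)
val lines = Seq("spark streaming demo", "spark demo")
val wordCounts = lines
  .flatMap(_.split(" "))      // split each line into words
  .map(word => (word, 1))     // pair each word with a count of 1
  .groupBy(_._1)              // group pairs by word, like reduceByKey's shuffle
  .map { case (word, pairs) => (word, pairs.map(_._2).sum) }
println(wordCounts)
```

The same lambdas would run unchanged on a DStream; only groupBy/sum is replaced by the distributed reduceByKey.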
j-thepac / DStreamSimpleText.scala (Created April 5, 2022 03:45)
Send Simple text Files as Dstream
import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.dstream.DStream

object DStreamSimpleText extends App {
  val conf = new SparkConf().setMaster("local[*]").setAppName("App name")
  val sc = new SparkContext(conf)
  val ssc = new StreamingContext(sc, Seconds(5))
  // Picks up only files created in this directory after the stream starts
  val filestream: DStream[String] = ssc.textFileStream(
    "/Users/deepakjayaprakash/Downloads/testing"
  )
  filestream.print()
  ssc.start()
  ssc.awaitTermination()
}
j-thepac / DStreamSendRDD.scala (Created April 5, 2022 03:44)
Send RDD in Streaming
import org.apache.spark.rdd.RDD
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object DStreamSendRDD extends App {
  val conf = new SparkConf().setMaster("local[*]").setAppName("StreamingTransformExample")
  val ssc = new StreamingContext(conf, Seconds(5))
  val rdd1 = ssc.sparkContext.parallelize(Array(1, 2, 3))
  // Queue of RDDs fed into the stream; use Queue[RDD[MyObject]] for custom types
  val rddQueue = scala.collection.mutable.Queue[RDD[Int]]()
  rddQueue += rdd1
  val stream = ssc.queueStream(rddQueue)
  stream.print()
  ssc.start()
  ssc.awaitTermination()
}
j-thepac / stream.scala (Created April 4, 2022 10:17)
spark read stream data
import org.apache.spark.sql.SparkSession

object SimpleStream extends App {
  val spark = SparkSession.builder
    .master("local")
    .appName("StructuredNetworkWordCount")
    .getOrCreate()
  spark.sparkContext.setLogLevel("ERROR") // keep console output readable
}
j-thepac / sttpApiClient.scala (Created March 12, 2022 06:10)
Simple Http Client Implementation in Scala
// Ref: https://sttp.softwaremill.com/en/latest/examples.html
/*
libraryDependencies += "com.softwaremill.sttp.client3" %% "core" % "3.5.1"
Bazel Dependencies :
com.softwaremill.sttp.client3.core
com.softwaremill.sttp.client3.model
com.softwaremill.sttp.client3.shared
*/
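With the core dependency above on the classpath, a minimal synchronous request might look like this. This is a sketch, not the original gist's code; the httpbin.org URL is only a placeholder endpoint:

```scala
import sttp.client3._

// Simple blocking backend shipped with sttp core (no extra modules needed)
val backend = HttpURLConnectionBackend()

val response = basicRequest
  .get(uri"https://httpbin.org/get") // placeholder endpoint, not from the gist
  .send(backend)

println(response.code)
response.body match {
  case Right(body) => println(body.take(100)) // first 100 chars of the payload
  case Left(err)   => println(s"request failed: $err")
}
```

`basicRequest` returns the body as `Either[String, String]`, so failures surface as a `Left` rather than an exception.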

Create an XML file at .../spark-3.0.0-bin-hadoop2.7/conf/hive-site.xml:

<configuration>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/Users/deepakjayaprakash/Downloads/spark_database/</value>
  </property>
</configuration>

Save the file.
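Spark reads hive-site.xml from its conf directory automatically, so a session can then use the configured warehouse via Hive support. A minimal sketch, assuming spark-hive is on the classpath and with `demo_db` as a hypothetical database name:

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: assumes the spark-hive module is available
val spark = SparkSession.builder
  .master("local")
  .appName("HiveWarehouseDemo")
  .enableHiveSupport() // databases/tables persist under hive.metastore.warehouse.dir
  .getOrCreate()

spark.sql("CREATE DATABASE IF NOT EXISTS demo_db") // hypothetical name
spark.sql("SHOW DATABASES").show()
```

After a restart, the database is still listed, since it lives on disk under the warehouse path rather than in memory.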

j-thepac / Coding Principles.md (Last active May 19, 2024 13:43)
Coding Principles

Principles

  • Loose Coupling
  • High Cohesion
  • Change is Local
  • It is Easy to Remove

Smells

  • Rigidity (a change ripples through dependencies: A -> B -> C, e.g. a value hardcoded in C forces edits all the way up the chain)
  • Fragility (a change in one place breaks seemingly unrelated parts)
  • Immobility (code is hard to reuse elsewhere because it drags its dependencies with it)
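As a concrete illustration of the principles above (all names invented): depend on a small trait rather than a concrete class, so a change stays local and the component is easy to remove or swap.

```scala
// Loose coupling: Report depends on the Storage trait, not on a concrete class
trait Storage {
  def save(data: String): Boolean
}

// High cohesion: this implementation does exactly one thing
class InMemoryStorage extends Storage {
  private val buffer = scala.collection.mutable.ListBuffer[String]()
  def save(data: String): Boolean = { buffer += data; true }
  def contents: List[String] = buffer.toList
}

// Change is local / easy to remove: swap Storage impls without touching Report
class Report(storage: Storage) {
  def publish(title: String): Boolean = storage.save(s"report: $title")
}

val store = new InMemoryStorage
val report = new Report(store)
report.publish("Q1")
println(store.contents) // List(report: Q1)
```

Replacing `InMemoryStorage` with, say, a database-backed implementation requires no change to `Report`, which is the point of coupling through the narrow trait.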

postgres-Django

Ref: Tutorial

Azure App Service

Create App

    git clone https://github.com/Azure-Samples/djangoapp
    cd djangoapp
    az webapp up \
        --resource-group DjangoPostgres-tutorial-rg \

    az extension add --name db-up --debug

Error:

    Building wheel for pymssql (PEP 517): finished with status 'error'
      ERROR: Failed building wheel for pymssql
    Failed to build pymssql
    ERROR: Could not build wheels for pymssql which use PEP 517 and cannot be installed directly

Exception information: