Jacek Laskowski (jaceklaskowski)

:octocat: Enjoying developer life...
jaceklaskowski / langchain.md
Last active February 18, 2024 13:37
langchain

LangChain in Action

Install poetry

Install poetry to manage dependencies (langchain, jupyter)

Requires pipx

Create Project
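A minimal sketch of the setup described above, assuming pipx is already installed (the project name `langchain-in-action` is made up for illustration):

```shell
# Install poetry with pipx so it stays isolated from project environments
pipx install poetry

# Create a new project (hypothetical name) and add the dependencies
poetry new langchain-in-action
cd langchain-in-action
poetry add langchain jupyter
```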

jaceklaskowski / keras.md
Last active November 18, 2023 22:43
Keras


Why Keras

I considered Keras and PyTorch to immerse myself in the Python mindset and learn something brand new. Deep Learning seemed challenging enough 😉

I looked at the GitHub repos and found that Keras is 99.9% Python (with 0.1% Shell), while PyTorch is just 48.4% Python, with tons of other languages.

I've also got the O'Reilly book by Aurélien Géron that covers Keras (Hands-On Machine Learning with Scikit-Learn, Keras & TensorFlow).

jaceklaskowski / macos_maintenance.md
Last active December 23, 2023 20:19
macOS maintenance


Various commands I've been using to manage the local development environment on macOS.

brew update && brew upgrade && brew cleanup

Tools

jaceklaskowski / dask-pip-install.md
Created February 19, 2023 09:29
What `python -m pip install -e .` does


While installing dask-distributed, you can see the following output:

$ python -m pip install -e .
Obtaining file:///Users/jacek/dev/oss/dask/distributed
  Preparing metadata (setup.py) ... done
Requirement already satisfied: ...
...
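As a sketch of what the command does: an editable (development) install records a link back to the working tree instead of copying the package into `site-packages`, so source edits take effect without reinstalling. The `import distributed` check below assumes the dask/distributed checkout from the output above:

```shell
# Editable install: pip links the environment to the local source tree
python -m pip install -e .

# The imported package now resolves to the working copy,
# so code changes are picked up without reinstalling
python -c "import distributed; print(distributed.__file__)"
```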
jaceklaskowski / dask.md
Last active February 19, 2023 14:55
Dask
jaceklaskowski / python.md
Last active June 22, 2023 08:07
Random Python Notes
jaceklaskowski / hadoop-spark-properties.md
Last active April 22, 2021 11:58
Hadoop Properties for Spark in Cloud (s3a, buckets)


The following is a list of Hadoop properties for Spark to use HDFS more effectively.

spark.hadoop.-prefixed Spark properties are used to configure the Hadoop Configuration that Spark broadcasts to tasks. Use spark.sparkContext.hadoopConfiguration to review the properties.

  • spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version = 2
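One way to set such a property, sketched as a command-line `--conf` (the same key could equally go into `spark-defaults.conf`):

```shell
# Pass a spark.hadoop.-prefixed property at launch time (illustrative)
spark-shell --conf spark.hadoop.mapreduce.fileoutputcommitter.algorithm.version=2
```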

Google Cloud Storage

jaceklaskowski / workshop.md
Last active January 19, 2019 15:50
Kafka Streams Workshop

Workshop

Exercise: KStream.transformValues

Use KStream.transformValues

Exercise: Using Materialized

val materialized = Materialized.as[String, Long, ByteArrayKeyValueStore]("poznan-state-store")
jaceklaskowski / person.md
Last active January 8, 2019 19:37
PersonSerde
import java.util
import org.apache.kafka.common.serialization.Serializer

case class Person(id: Long, name: String)

class PersonSerializer extends Serializer[Person] {
  // No configuration needed
  override def configure(configs: util.Map[String, _], isKey: Boolean): Unit = {}

  override def serialize(topic: String, data: Person): Array[Byte] = {
    println(s">>> serialize($topic, $data)")
    // Encode as a simple "id,name" CSV line
    s"${data.id},${data.name}".getBytes
  }

  override def close(): Unit = {}
}
jaceklaskowski / anatolyi.md
Created December 16, 2017 12:14
Anatolyi - Facebook Profiles
// Let's create a sample dataset with just a single line, i.e. facebook profile
val facebookProfile = "ActivitiesDescription:703 likes, 0 talking about this, 4 were here; Category:; Email:[email protected]; Hours:Mon-Fri: 8:00 am - 5:00 pm; Likes:703; Link:https://www.facebook.com/pvhvac; Location:165 W Wieuca Rd NE, Ste 310, Atlanta, Georgia; Name:PV Heating & Air; NumberOfPictures:0; NumberOfReviews:26; Phone:(404) 798-9672; ShortDescription:We specialize in residential a/c, heating, indoor air quality & home performance.; Url:http://www.pvhvac.com; Visitors:4"
val fbs = Seq(facebookProfile).toDF("profile")

scala> fbs.show(truncate = false)