Pawarit Laosunthara (PawaritL) — GitHub Gists
@PawaritL
PawaritL / geojson_example.py
Created July 16, 2021 10:59
geojson_example.py
geojson = {
    "type": "Polygon",
    "coordinates": [
        [
            [179.0, 0.0],
            [-179.0, 0.0],
            [-179.0, 1.0],
            [179.0, 1.0],
            [179.0, 0.0]
        ]
    ]
}
@PawaritL
PawaritL / antimeridian_crossing.py
Last active July 16, 2021 10:43
Checks if two provided longitude coordinates cross the antimeridian (180th meridian)
def check_crossing(lon1: float, lon2: float, validate: bool = False, dlon_threshold: float = 180.0):
    """
    Assuming a minimum travel distance between two provided longitude coordinates,
    checks if the 180th meridian (antimeridian) is crossed.
    """
    if validate and any(abs(x) > 180.0 for x in (lon1, lon2)):
        raise ValueError("longitudes must be in degrees [-180.0, 180.0]")
    return abs(lon2 - lon1) > dlon_threshold
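A minimal usage sketch of `check_crossing` (the function is repeated here so the snippet runs standalone; the coordinate pairs are illustrative, not from the gist):

```python
def check_crossing(lon1: float, lon2: float, validate: bool = False, dlon_threshold: float = 180.0):
    """Checks if the shortest path between two longitudes crosses the antimeridian."""
    if validate and any(abs(x) > 180.0 for x in (lon1, lon2)):
        raise ValueError("longitudes must be in degrees [-180.0, 180.0]")
    return abs(lon2 - lon1) > dlon_threshold

# 179°E to 179°W: the naive difference is 358°, far more than any plausible
# travel distance, so the short path must wrap across the 180th meridian.
print(check_crossing(179.0, -179.0))  # True
print(check_crossing(10.0, 20.0))     # False
```

Note the built-in assumption: if two consecutive vertices are more than `dlon_threshold` degrees of longitude apart, the segment is taken to wrap the short way around the antimeridian rather than the long way around the globe.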
@PawaritL
PawaritL / README.md
Last active January 13, 2023 16:24
Parse nested JSON into your ideal, customizable Spark schema (StructType)

Is Spark's JSON schema inference too inflexible for your liking?

Common Scenarios:

  • Spark's automatic schema inference does not apply your desired type casting
  • You want to completely drop irrelevant fields when parsing
  • You want to avoid some highly nested fields simply by casting some outer fields as strings

Step 1: Provide your (ideal) JSON data example

REFERENCE_EXAMPLE = {
  "firstName": "Will",
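The idea behind the gist can be sketched in plain Python: walk a reference example recursively and emit a schema description for each field. This is a hypothetical stand-in, not the gist's implementation; the real gist produces a pyspark `StructType`, whereas here plain strings stand in for Spark types so the sketch runs without Spark installed:

```python
def infer_schema(example):
    """Recursively derive a schema description from a reference example value.

    Dicts become nested schemas, lists take the schema of their first element,
    and scalars map to Spark-like type names (strings here, for illustration).
    """
    if isinstance(example, dict):
        return {key: infer_schema(value) for key, value in example.items()}
    if isinstance(example, list):
        return [infer_schema(example[0])] if example else ["string"]
    # type(True) is exactly bool, so booleans are matched before ints
    return {str: "string", int: "long", float: "double", bool: "boolean"}[type(example)]

# "age" is an assumed extra field for illustration
print(infer_schema({"firstName": "Will", "age": 30}))
# {'firstName': 'string', 'age': 'long'}
```

Because the schema comes from your hand-written example rather than Spark's inference, you control the types, and any field absent from the example is simply dropped when parsing.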
@PawaritL
PawaritL / example_input.py
Last active October 26, 2022 07:13
Converts a GeoJSON polygon to its antimeridian-compatible constituent polygon(s)
geojson = {
    "type": "Polygon",
    "coordinates": [
        [
            [179.0, 0.0],
            [-179.0, 0.0],
            [-179.0, 1.0],
            [179.0, 1.0],
            [179.0, 0.0]
        ]
    ]
}
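A common first step for making such a polygon antimeridian-compatible is to "unwrap" the ring into a continuous longitude range, using the same crossing check as above. This is only a sketch of that step under the 180°-threshold assumption, not the gist's full splitting logic (which goes on to cut the unwrapped ring back into valid constituent polygons):

```python
def check_crossing(lon1: float, lon2: float, dlon_threshold: float = 180.0):
    """Checks if the shortest path between two longitudes crosses the antimeridian."""
    return abs(lon2 - lon1) > dlon_threshold

def unwrap_ring(ring):
    """Shift longitudes so the ring is continuous (e.g. -179 becomes 181)
    whenever consecutive vertices cross the antimeridian."""
    out = [ring[0]]
    for lon, lat in ring[1:]:
        prev_lon = out[-1][0]
        if check_crossing(prev_lon, lon):
            # Wrap toward the previous vertex instead of jumping ~360°
            lon += 360.0 if lon < prev_lon else -360.0
        out.append([lon, lat])
    return out

ring = [[179.0, 0.0], [-179.0, 0.0], [-179.0, 1.0], [179.0, 1.0], [179.0, 0.0]]
print(unwrap_ring(ring))
# [[179.0, 0.0], [181.0, 0.0], [181.0, 1.0], [179.0, 1.0], [179.0, 0.0]]
```

After unwrapping, the thin strip straddling the antimeridian occupies the contiguous range 179°–181°, which can then be clipped at 180° into its two constituent polygons.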