
Tylar 7yl4r

@7yl4r
7yl4r / minutes.md
Created December 8, 2022 18:59
SE US MBON Meeting 2022-12-08
  • MBON data portal training is upcoming.
  • when developing products remember to keep international users in mind
    • USAID funding might be available
    • Adriano Lima from the Air Centre can also help develop products
      • he has technical ability
      • was already able to leverage seascapR to create shiny app
      • Portuguese translation could be helpful
        • Ben has had good experiences using google translate for i18n
        • Thiago also willing to help
  • We need better data management
@7yl4r
7yl4r / notes-2022_11_17.md
Created November 17, 2022 20:41
IMaRS + Planet meeting
@7yl4r
7yl4r / _tasks.md
Created November 17, 2022 18:40
MBON data management checklists

The bare minimum to register an MBON-facilitated dataset:

  • a unique dataset_id
  • supporting MBON node(s)
  • a contact email
  • a plain-text description of the dataset

Multiple google forms are used to collect different aspects of metadata.
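The bare-minimum registration fields above can be sketched as a small validation helper. This is a hypothetical illustration, not actual MBON tooling; the field names other than `dataset_id` are assumed for the example.

```python
# Hypothetical sketch (not actual MBON tooling): check that a dataset
# record carries the bare-minimum registration fields listed above.
# Field names other than `dataset_id` are illustrative assumptions.
REQUIRED_FIELDS = ["dataset_id", "mbon_nodes", "contact_email", "description"]

def missing_fields(record):
    """Return the required registration fields that are absent or empty in `record`."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]
```

A record missing any of these would be rejected (or routed back to the submitter) before registration.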

MBON-facilitated dataset forms:

@7yl4r
7yl4r / DasDds_output.txt
Created October 26, 2022 17:58
no NFS but still failing
root@4c2b94c118bb:/usr/local/tomcat/webapps/erddap/WEB-INF# bash DasDds.sh OC_c7fe_e1ee_913d
////**** EDStatic Low Level Startup
localTime=2022-10-26T17:57:42+00:00
erddapVersion=2.18
Java 1.8.0_322 (64 bit, Oracle Corporation) on Linux (4.18.0-348.7.1.el8_5.x86_64).
MemoryInUse= 50 MB (highWaterMark= 50 MB) (Xmx ~= 958 MB)
logLevel=info: verbose=true reallyVerbose=false
got bigParentDirectory from ERDDAP_bigParentDirectory
got emailFromAddress from ERDDAP_emailFromAddress
root@302d01eec869:/erddapData/logs# ls -lah /test_local_data_dir/
total 64M
drwxr-xr-x. 2 root root 126 Oct 26 16:22 .
drwxr-xr-x. 1 root root 118 Oct 26 16:30 ..
-rwxr-xr--. 1 4747 4504 22M Oct 26 16:19 MODA_2002185_2002189_7D_FK_OC.nc
-rwxr-xr--. 1 4747 4504 22M Oct 26 16:19 MODA_2002190_2002196_7D_FK_OC.nc
-rwxr-xr--. 1 4747 4504 22M Oct 26 16:19 MODA_2002197_2002203_7D_FK_OC.nc
root@302d01eec869:/erddapData/logs# head -1 /test_local_data_dir/MODA_2002185_2002189_7D_FK_OC.nc
�HDF
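The `head -1` output above shows the HDF5 magic bytes, i.e. these `.nc` files are netCDF-4/HDF5 rather than classic netCDF-3. A quick way to confirm which flavor a file is (an illustrative helper, not ERDDAP tooling) is to read the leading magic bytes:

```python
# Hedged sketch: classify a .nc file by its magic bytes.
# Classic netCDF-3 files begin with b"CDF"; netCDF-4 files are HDF5
# containers and begin with the 8-byte HDF5 signature.
def nc_flavor(path):
    """Return a rough netCDF flavor label based on the file's magic bytes."""
    with open(path, "rb") as f:
        magic = f.read(8)
    if magic.startswith(b"CDF"):
        return "netCDF-3 (classic)"
    if magic.startswith(b"\x89HDF\r\n\x1a\n"):
        return "netCDF-4 (HDF5)"
    return "unknown"
```

This distinction matters for ERDDAP because some dataset types and readers handle the two formats differently.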
@7yl4r
7yl4r / DasDds_output.txt
Last active October 26, 2022 17:56
NFS + Docker ERDDAP datasets.xml issues
root@302d01eec869:/usr/local/tomcat/webapps/erddap/WEB-INF# bash DasDds.sh OC_c7fe_e1ee_913c
////**** EDStatic Low Level Startup
localTime=2022-10-26T16:38:54+00:00
erddapVersion=2.18
Java 1.8.0_322 (64 bit, Oracle Corporation) on Linux (4.18.0-348.7.1.el8_5.x86_64).
MemoryInUse= 50 MB (highWaterMark= 50 MB) (Xmx ~= 958 MB)
logLevel=info: verbose=true reallyVerbose=false
got bigParentDirectory from ERDDAP_bigParentDirectory
got emailFromAddress from ERDDAP_emailFromAddress
root@95c41cfa96c1:/usr/local/tomcat/webapps/erddap/WEB-INF# bash DasDds.sh OC_c7fe_e1ee_913c | tee OC_c7fe_e1ee_913c_dasdds_log.txt
////**** EDStatic Low Level Startup
localTime=2022-10-22T16:18:13+00:00
erddapVersion=2.18
Java 1.8.0_322 (64 bit, Oracle Corporation) on Linux (4.18.0-348.7.1.el8_5.x86_64).
MemoryInUse= 50 MB (highWaterMark= 50 MB) (Xmx ~= 958 MB)
logLevel=info: verbose=true reallyVerbose=false
got bigParentDirectory from ERDDAP_bigParentDirectory
got emailFromAddress from ERDDAP_emailFromAddress

This example demonstrates a rebase and how it differs from a merge. Understanding this is critical for working within most collaborative git workflows.

Example Setup

I have a local clone of my fork (remote named origin), which was forked from an upstream repo (remote named upstream):

tylar@laptop:~/pyobis$ git remote -v
origin  git@github.com:7yl4r/pyobis.git (fetch)
origin  git@github.com:7yl4r/pyobis.git (push)
upstream git@github.com:iobis/pyobis.git (fetch)
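To make the merge-vs-rebase contrast concrete, here is a self-contained toy demo in a scratch repo (not the pyobis remotes above): a merge joins the two diverged histories with a merge commit, while a rebase replays the feature commit on top of the updated base, leaving history linear.

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q -b main demo
cd demo
git config user.email dev@example.com
git config user.name dev

# one shared base commit, then divergence on two branches
echo base > base.txt
git add base.txt && git commit -q -m "base"
git switch -q -c feature
echo feature > feature.txt
git add feature.txt && git commit -q -m "feature work"
git switch -q main
echo upstream > upstream.txt
git add upstream.txt && git commit -q -m "upstream work"

# merge: join the two histories with a merge commit
git switch -q -c merged feature
git merge -q -m "merge main" main
merge_count=$(git rev-list --merges --count HEAD)

# rebase: replay the feature commit on top of main; history stays linear
git switch -q feature
git rebase -q main
rebase_count=$(git rev-list --merges --count HEAD)

echo "merge commits after merge:  $merge_count"   # 1
echo "merge commits after rebase: $rebase_count"  # 0
```

The rebased branch has the same content but no merge commit, which is why most collaborative workflows ask contributors to rebase their fork's branch onto upstream before opening a pull request.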
@7yl4r
7yl4r / implementation-01.py
Created July 20, 2022 19:04
pyOBIS pagination implementations
def search(
    scientificname=None, taxonid=None, nodeid=None, datasetid=None, startdate=None,
    enddate=None, startdepth=None, enddepth=None, geometry=None, year=None,
    flags=None, fields=None, size=5000, offset=0, mof=False, hasextensions=None,
    **kwargs
):
    args = {
        'taxonid': taxonid, 'nodeid': nodeid, 'datasetid': datasetid,
        'scientificname': scientificname, 'startdate': startdate,
        'enddate': enddate, 'startdepth': startdepth, 'enddepth': enddepth,
        'geometry': geometry, 'year': year, 'fields': fields,
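The truncated snippet above builds the query-args dict; the pagination itself can be sketched as an offset loop. This is a hypothetical illustration assuming a page-fetching callable, not pyobis's actual implementation.

```python
# Hypothetical pagination sketch (not pyobis's actual code): step `offset`
# forward by `size` until a short page signals the end of the results.
def fetch_all(get_page, size=5000):
    """`get_page(size=..., offset=...)` returns one page of records as a list."""
    results, offset = [], 0
    while True:
        batch = get_page(size=size, offset=offset)
        results.extend(batch)
        if len(batch) < size:  # short page => no more records
            return results
        offset += size
```

Against an OBIS-style HTTP API, `get_page` would wrap a request that passes `size` and `offset` as query parameters and returns the decoded record list.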

next steps

  1. find funding for 2030 effort
  2. producing products (product = map layers in AWS)

product production details

  1. want to auto-update the layer when data is submitted
  2. crowd-sourced submissions go to AWS, then are converted to xyz data files in AWS buckets
    1. a VM in Paperspace (via Parsec) can then be used
  3. not working in ArcGIS because the data is too big: too many polygons (300 billion+)
  4. also tried ArcOnline & Enterprise; Enterprise could maybe do it with more cores
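If "xyz data files" here refers to standard slippy-map (XYZ) tiles, the bucket layout is usually `{z}/{x}/{y}`, where the tile indices for a point follow the usual Web Mercator formula. The sketch below is illustrative of that convention, not this project's actual pipeline.

```python
import math

# Hedged sketch: compute standard slippy-map (XYZ) tile indices for a
# lon/lat point at a given zoom, per the common Web Mercator convention.
def lonlat_to_tile(lon, lat, zoom):
    """Return the (x, y) tile indices containing (lon, lat) at `zoom`."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y
```

Pre-rendering tiles this way sidesteps ArcGIS's polygon-count limits, since each zoom level is cut into fixed-size files served straight from the bucket.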