
Sjuul Janssen obeleh

@obeleh
obeleh / gist:4451005
Last active March 10, 2021 09:47
Python autovivifying tree with parent reference (variation of the one-line tree)

I was reading the gist at https://gist.github.com/2012250

I found its autovivification functionality pretty cool. If only I could have a parent reference...

Obviously this was not going to be a one-line tree, but that wasn't the goal.

A simple variant:

from collections import defaultdict
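
The preview cuts off after the import. A minimal sketch of a tree with a parent back-reference (an assumed reconstruction for illustration, not the gist's exact code; the original builds on defaultdict, but a plain dict subclass works just as well):

class Tree(dict):
    # Autovivifying tree: missing keys create child nodes that know their parent.
    def __init__(self, parent=None):
        super().__init__()
        self.parent = parent

    def __missing__(self, key):
        child = Tree(parent=self)
        self[key] = child
        return child

# Nodes spring into existence on first access, each holding a parent reference.
root = Tree()
leaf = root["a"]["b"]["c"]
assert leaf.parent is root["a"]["b"]
assert leaf.parent.parent.parent is root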
@obeleh
obeleh / tlsimap.py
Last active May 19, 2020 15:15 — forked from kiowa/tlsimap.py
The original version tried to import SSLFakeFile and SSLFakeSocket, which aren't present in imaplib 2.58, so I've added them manually (after simply googling them)
""" Python IMAP with TLS/SSL support """
##
## Author: Alexander Brill <[email protected]>
## Copyright (C) 2004 Alexander Brill
##
## This program is free software; you can redistribute it and/or
## modify it under the terms of the GNU General Public License
## as published by the Free Software Foundation; either version 2
## of the License, or (at your option) any later version.
##
# This program uploads all the .jpg files in a selected folder to Google Storage using boto and eventlet.
import eventlet
bcon = eventlet.import_patched("boto.gs.connection")
import glob
FOLDER = "/Users/myself/Documents/" # replace this with your chosen folder
BUCKET_NAME = "whateveryourbucketname" # replace this with your bucket name
def upload(myfile):
c = bcon.GSConnection()
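
The preview is cut off here. A rough sketch of how the upload function and the green-thread loop might continue (an assumed reconstruction; the bucket and key handling below is illustrative, not the gist's exact code):

def upload(myfile):
    # One green thread per file: open a connection and push the file into the bucket.
    c = bcon.GSConnection()
    bucket = c.get_bucket(BUCKET_NAME)
    key = bucket.new_key(myfile.split("/")[-1])
    key.set_contents_from_filename(myfile)
    return myfile

pool = eventlet.GreenPool()
for uploaded in pool.imap(upload, glob.glob(FOLDER + "*.jpg")):
    print("uploaded", uploaded)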
@obeleh
obeleh / gist:10820635
Created April 16, 2014 07:00
Python simple deepcopy

A while ago I needed a deepcopy function in Python. I found, however, that for my use case I was better off building my own. I want to share it so that others might benefit as well.

If the data you're copying is simple, deepcopy might be overkill. By simple I mean data that is representable as JSON. Let me illustrate with code:

I've used [json-generator](http://www.json-generator.com/) to get some sample JSON data.

def deepCopyList(inp):
    for vl in inp:
        if isinstance(vl, list):
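
The preview stops mid-function. A sketch of how a JSON-only deep copy might continue (an assumed reconstruction, not the gist's exact code; deepCopyDict is the obvious companion helper):

def deepCopyList(inp):
    out = []
    for vl in inp:
        if isinstance(vl, list):
            out.append(deepCopyList(vl))
        elif isinstance(vl, dict):
            out.append(deepCopyDict(vl))
        else:
            # JSON scalars (str, int, float, bool, None) are immutable, so share them.
            out.append(vl)
    return out

def deepCopyDict(inp):
    out = {}
    for ky, vl in inp.items():
        if isinstance(vl, list):
            out[ky] = deepCopyList(vl)
        elif isinstance(vl, dict):
            out[ky] = deepCopyDict(vl)
        else:
            out[ky] = vl
    return out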

Keybase proof

I hereby claim:

  • I am obeleh on github.
  • I am obeleh (https://keybase.io/obeleh) on keybase.
  • I have a public key ASClAogO1GpyxiZ2LahxpI1taQ--PR57vOu5RDLZvAQ--Ao

To claim this, I am signing this object:

Keybase proof

I hereby claim:

  • I am obeleh on github.
  • I am obeleh (https://keybase.io/obeleh) on keybase.
  • I have a public key ASBwd1i0mSdpcbZI_ZzKPEkjCyDyE97edqo8SZ3EWpp1jwo

To claim this, I am signing this object:

About docker ADD...

Recently I was asked to review parts of an automated-tests PR that contained a Dockerfile. Even after using Docker for about six years, I learned something new when I found this SO question.

Because image size matters, using ADD to fetch packages from remote URLs is strongly discouraged; you should use curl or wget instead. That way you can delete the files you no longer need after they’ve been extracted and you won’t have to add another layer in your image. For example, you should avoid doing things like:

ADD http://example.com/big.tar.xz /usr/src/things/
RUN tar -xJf /usr/src/things/big.tar.xz -C /usr/src/things
RUN make -C /usr/src/things all
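
The same best-practices page suggests fetching and unpacking in a single RUN step instead, roughly like this (quoted from memory, so treat it as a sketch rather than the exact wording):

RUN mkdir -p /usr/src/things \
    && curl -SL http://example.com/big.tar.xz \
    | tar -xJC /usr/src/things \
    && make -C /usr/src/things all

Because the download, extraction and build all happen in one layer, the tarball itself never ends up baked into the image.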
@obeleh
obeleh / max_open_files.md
Last active April 28, 2022 08:28
Max Open Files

Unfortunately this is still in a state of trial and error. What I can describe are the steps I took to get it to work and the steps I took to validate that it works:

Source articles for this article:

First off... there are multiple levels at which you can specify the maximum number of open files (a quick way to inspect the per-process limits follows the list below):

  • Based on user
  • Soft limit / hard limit
  • Based on process
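
Not from the gist itself, but one quick way to inspect the per-process soft and hard limits is Python's resource module:

import resource

# Per-process soft and hard limits for open file descriptors.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft:", soft, "hard:", hard)

# An unprivileged process may raise its soft limit up to, but not beyond, the hard limit.
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))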