
@kmadac
kmadac / HowTo.md
Last active December 16, 2022 09:57
Vagrant environment for Ceph cluster

How to prepare dev/test Ceph environment

When you are using Ceph in production, it is important to have an environment where you can test upcoming upgrades, configuration changes, integration of new clusters, or any other significant changes without touching the real production clusters. Such an environment can easily be built with a tool called Vagrant, which can very quickly create a virtualized environment described in one relatively simple config file.

We are using Vagrant on Linux with the libvirt and hostmanager plugins. Libvirt is a toolkit for managing Linux KVM VMs. Vagrant can also create virtualized networks to interconnect those VMs, as well as storage devices, so you can have an almost identical copy of your production cluster if you need it.

Let's create a 5-node Ceph cluster. The first 3 nodes will be dedicated to the control daemons, all nodes will also be OSD nodes (2 x 10 GB disks on each node by default), and one node will be a client node. The client node can be used for testing access to the cluster.
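A sketch of what such a Vagrantfile could look like (the box name, IP range, node count split, and resource sizes here are my assumptions, not a tested production config; the vagrant-libvirt and vagrant-hostmanager plugins must be installed):

```ruby
# Hypothetical Vagrantfile sketch: 4 Ceph nodes plus 1 client on the libvirt provider.
Vagrant.configure("2") do |config|
  # vagrant-hostmanager keeps /etc/hosts in sync on all guests
  config.hostmanager.enabled = true
  config.hostmanager.manage_guest = true

  (1..4).each do |i|
    config.vm.define "ceph#{i}" do |node|
      node.vm.box = "generic/ubuntu2004"          # assumption: any libvirt-capable box works
      node.vm.hostname = "ceph#{i}"
      node.vm.network :private_network, ip: "192.168.100.#{10 + i}"
      node.vm.provider :libvirt do |lv|
        lv.memory = 2048
        lv.cpus = 2
        # two extra 10 GB disks per node for the OSDs
        lv.storage :file, size: "10G"
        lv.storage :file, size: "10G"
      end
    end
  end

  config.vm.define "client" do |node|
    node.vm.box = "generic/ubuntu2004"
    node.vm.hostname = "client"
    node.vm.network :private_network, ip: "192.168.100.20"
  end
end
```

With this in place, `vagrant up` brings the whole environment up and `vagrant destroy -f` throws it away again.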

configoptions="
--enable-cli \
--enable-debug \
--enable-bcmath \
--enable-calendar \
--enable-exif \
--enable-ftp \
--enable-mbstring \
--enable-pcntl \
--enable-soap \
#!/usr/bin/env python
# creates file account_key.json for simp_le.py
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa
from acme import jose

key = jose.JWKRSA(key=rsa.generate_private_key(
    public_exponent=65537, key_size=4096, backend=default_backend()))
with open('account_key.json', 'wb') as persist_file:
    persist_file.write(key.json_dumps())
# using: rqworker -P /Path/To/Dir/With/NewWorker -w NewWorker.NewWorker -c settings
from rq import Worker
# import CustomLibs


class NewWorker(Worker):
    def __init__(self, *args, **kwargs):
        # Let's prepare the database connection here;
        # take settings from environment variables or a configuration file
        # self.db = CustomLibs.DB.Connect(*settings)
        super(NewWorker, self).__init__(*args, **kwargs)
@kmadac
kmadac / ponyhostingdb.py
Last active August 29, 2015 14:07
PonyORM first try
from pony.orm import *
from datetime import date, datetime

db = Database('sqlite', ':memory:')
sql_debug(True)


class UhUser(db.Entity):
    uname = PrimaryKey(unicode)
    password_sha1 = Required(unicode)
    email_contacts = Required(unicode)
@kmadac
kmadac / gist:05bdbc795a4bab773857
Created July 29, 2014 12:54
OnTap 7mode audit log regexp
\w{3}\s{1,2}\d{1,2} \d\d:\d\d:\d\d (\w)+ \[.+?\]: (\w+)@\[(.+)_\d+\]:IN:ssh2 shell:(SSH INPUT COMMAND is )*(.+)
### Examples of log file:
Jul 29 09:48:39 filer1 [rshd_0: debug]: user@[10.229.144.23_46673]:IN:ssh2 shell:SSH INPUT COMMAND is rdfile /etc/rc
Jul 29 10:14:23 filer2 [telnet_0: debug]: user@[10.229.144.23_47147]:IN:ssh2 shell:ping 10.10.10.10
Jul 29 10:15:08 filer3 [filer3: rshd_0:debug]: user@[10.229.144.23_39181]:IN:ssh2 shell:SSH INPUT COMMAND is snapvault status
Jul 4 12:45:11 filer4 [filer4: rshd_0:debug]: user@[10.229.144.25_34672]:IN:ssh2 shell:SSH INPUT COMMAND is ifstat e4b
### Use in grep:
grep -P '\w{3}\s{1,2}\d{1,2} \d\d:\d\d:\d\d (\w)+ \[.+?\]: (\w+)@\[(.+)_\d+\]:IN:ssh2 shell:(SSH INPUT COMMAND is )*(.+)' auditlog
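The same pattern can be reused outside grep; a small Python sketch (the helper name is mine, not part of the gist) that pulls the user, source IP, and command out of an audit line:

```python
import re

# OnTap 7-mode audit log pattern from the gist above
PATTERN = re.compile(
    r'\w{3}\s{1,2}\d{1,2} \d\d:\d\d:\d\d (\w)+ \[.+?\]: '
    r'(\w+)@\[(.+)_\d+\]:IN:ssh2 shell:(SSH INPUT COMMAND is )*(.+)'
)


def parse_audit_line(line):
    """Return (user, source_ip, command), or None when the line does not match."""
    m = PATTERN.match(line)
    if m is None:
        return None
    return m.group(2), m.group(3), m.group(5)


print(parse_audit_line(
    'Jul 29 09:48:39 filer1 [rshd_0: debug]: '
    'user@[10.229.144.23_46673]:IN:ssh2 shell:SSH INPUT COMMAND is rdfile /etc/rc'
))
# -> ('user', '10.229.144.23', 'rdfile /etc/rc')
```

Note that the optional `(SSH INPUT COMMAND is )*` group also lets the second example line (plain `ping 10.10.10.10`) parse, with the full command landing in the last group either way.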
@kmadac
kmadac / ChmMigSim.py
Last active August 29, 2015 14:03
Migration Change process simulation
__author__ = 'kmadac'

import random
from bisect import bisect

people = 5
max_changes_per_person_intime = 5

# change_complete_time_days = ([(14, 3), (27, 5), (41, 3), (60, 1), (90, 1)])
# real data from migration of real cluster
@kmadac
kmadac / outer_var.py
Created May 28, 2014 21:52
Access variables in outer class
class Meter(object):
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent

    def __str__(self):
        return self.name

    def __repr__(self):
        return self.__str__()
@kmadac
kmadac / README
Last active August 29, 2015 13:59
backup mysql database for particular domain by mysqldump
The script creates testing databases and then backs up all databases that belong to the domain testdom123.sk.
The result of the script is the file testdom123_sk.sql.
A restore can be done by applying the file back to mysql:
mysql -u root -plol < testdom123_sk.sql
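The dump step of such a script could look roughly like this (a hypothetical sketch, not the original script; the credentials and the LIKE-prefix naming convention for per-domain databases are assumptions):

```shell
#!/bin/sh
# Hypothetical sketch: dump every database belonging to one domain into a
# single file, with dots in the domain name replaced by underscores.
DOMAIN="testdom123.sk"
DUMPFILE="$(echo "$DOMAIN" | tr '.' '_').sql"    # -> testdom123_sk.sql
echo "dumping databases of $DOMAIN into $DUMPFILE"
# mysqldump -u root -plol --databases \
#     $(mysql -u root -plol -N -e "SHOW DATABASES LIKE 'testdom123%'") > "$DUMPFILE"
```

The mysqldump line is commented out so the sketch runs without a live server; `--databases` makes the dump include CREATE DATABASE statements, which matters for restoring several databases from one file.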
.idea/*
*.pyc