- Create certificates
- Edit Docker options
- Restart Docker
- Copy client certificates from host
- (optional) Add remote endpoint in Portainer
Tested on a standard $5/mo DigitalOcean VPS running Ubuntu 16.04.
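The list above only names the steps; below is a rough sketch of what steps 2-4 can look like on a systemd-based Ubuntu 16.04 host. This is not the original guide's exact commands: the paths, hostname, user, and the daemon.json approach are all assumptions.

```bash
# Rough sketch of steps 2-4; assumes the certificates already exist under /etc/docker.
# Paths, hostname, and user are placeholders.

# 2. Edit Docker options: enable TLS verification and listen on TCP 2376
#    in addition to the local Unix socket.
sudo tee /etc/docker/daemon.json > /dev/null <<'EOF'
{
  "tlsverify": true,
  "tlscacert": "/etc/docker/ca.pem",
  "tlscert": "/etc/docker/server-cert.pem",
  "tlskey": "/etc/docker/server-key.pem",
  "hosts": ["unix:///var/run/docker.sock", "tcp://0.0.0.0:2376"]
}
EOF
# Note: Ubuntu 16.04's packaged systemd unit already passes -H to dockerd, which
# conflicts with "hosts" in daemon.json, so a drop-in overriding ExecStart may be needed.

# 3. Restart Docker.
sudo systemctl daemon-reload
sudo systemctl restart docker

# 4. Copy the client certificates from the host to the machine you manage it from.
scp user@docker-host:/etc/docker/{ca,cert,key}.pem ~/.docker/

# Then point the Docker client (or Portainer's remote endpoint) at the daemon:
export DOCKER_HOST=tcp://docker-host:2376 DOCKER_TLS_VERIFY=1
```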
```dockerfile
FROM user/baseimage:version-tag

# 1. This is VERY basic
# 2. Need to load RVM first during build
# 3. Create empty folders for runtime files
RUN \
  rm /bin/sh && ln -s /bin/bash /bin/sh && \
  mkdir -p \
    /home/app/webapp \
```
```
node_modules
dist/
yarn.lock
wwwroot
```
```bash
#!/bin/bash
#
# By default, the Docker daemon runs without any network
# communication and listens only on a Unix socket.
# To establish secure communication between the Docker client
# and the daemon over HTTPS, TLS must be enabled.
# We need to create the client and server certificates
# required for a secure communication channel between
# the client and the server/daemon.
#
```
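The comment header above describes the script's purpose, but its body is not included in this excerpt. As a hedged sketch of what creating the CA, server, and client certificates can look like (following the procedure from the Docker documentation; HOST, the validity period, and the file names are assumptions):

```bash
# Sketch: generate a CA plus server and client certificates with openssl.
# HOST, key sizes, and -days are assumptions; adjust for your setup.
HOST=docker-host.example.com

# CA key and certificate
openssl genrsa -aes256 -out ca-key.pem 4096
openssl req -new -x509 -days 365 -key ca-key.pem -sha256 -subj "/CN=${HOST}" -out ca.pem

# Server key, CSR, and certificate signed by the CA
openssl genrsa -out server-key.pem 4096
openssl req -subj "/CN=${HOST}" -sha256 -new -key server-key.pem -out server.csr
echo "subjectAltName = DNS:${HOST},IP:127.0.0.1" > extfile.cnf
echo "extendedKeyUsage = serverAuth" >> extfile.cnf
openssl x509 -req -days 365 -sha256 -in server.csr -CA ca.pem -CAkey ca-key.pem \
  -CAcreateserial -out server-cert.pem -extfile extfile.cnf

# Client key, CSR, and certificate (extendedKeyUsage = clientAuth)
openssl genrsa -out key.pem 4096
openssl req -subj '/CN=client' -new -key key.pem -out client.csr
echo "extendedKeyUsage = clientAuth" > extfile-client.cnf
openssl x509 -req -days 365 -sha256 -in client.csr -CA ca.pem -CAkey ca-key.pem \
  -CAcreateserial -out cert.pem -extfile extfile-client.cnf
```

With this naming, ca.pem, server-cert.pem, and server-key.pem stay on the host, while ca.pem, cert.pem, and key.pem are the client pair you copy to the managing machine.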
---
METHOD 1
This should roughly sort the items by distance in MySQL, and should also work in SQLite.
If you need to sort them more precisely, you could try using the Pythagorean theorem (a^2 + b^2 = c^2) to get the exact distance.
---

```sql
SELECT *
FROM table
ORDER BY ((lat - $user_lat) * (lat - $user_lat)) + ((lng - $user_lng) * (lng - $user_lng)) ASC
```
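A hypothetical follow-up to the Pythagorean suggestion, wrapped in a shell call so it can be run directly against MySQL. The database name, table name, and coordinates are placeholders, not from the original note:

```bash
#!/usr/bin/env bash
# Sketch: exact planar distance via the Pythagorean theorem, SQRT(a^2 + b^2).
# "mydb", "places", and the coordinates are assumptions.
USER_LAT=40.7128
USER_LNG=-74.0060

mysql mydb <<SQL
SELECT *,
       SQRT(POW(lat - ${USER_LAT}, 2) + POW(lng - ${USER_LNG}, 2)) AS distance
FROM places
ORDER BY distance ASC;
SQL
```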
```bash
# This is just a cheat sheet:
# On production
sudo -u postgres pg_dump database | gzip -9 > database.sql.gz
# On local
scp -C production:~/database.sql.gz .
dropdb database && createdb database
gunzip < database.sql.gz | psql database
```
```ruby
namespace :json do
  desc "Export all data to JSON files"
  task :export => :environment do
    Rails.application.eager_load!
    ActiveRecord::Base.descendants.each do |model|
      file = File.open(File.join(Rails.root, "db", "export", "#{model.table_name}.json"), 'w')
      file.write model.all.to_json
      file.close
    end
  end
end
```
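A possible way to run it (hypothetical; the db/export directory name comes from the task above, everything else is an assumption):

```bash
# Hypothetical invocation; assumes a standard Rails project checkout.
mkdir -p db/export          # the task writes here but does not create the directory
bundle exec rake json:export
```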
```ruby
stats = Sidekiq::Stats.new
stats.queues     # hash of queue name => size
stats.enqueued   # total number of jobs currently enqueued
stats.processed  # total number of jobs processed
stats.failed     # total number of jobs that have failed
```
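A quick, hypothetical way to print a few of these counters from the shell, assuming the app's bundle includes Sidekiq and Redis is reachable:

```bash
# Hypothetical one-liner; Sidekiq::Stats reads these counters from Redis.
bundle exec rails runner '
  s = Sidekiq::Stats.new
  puts "enqueued=#{s.enqueued} processed=#{s.processed} failed=#{s.failed}"
'
```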
For this configuration you can use whichever web server you like; I decided to use nginx because it is what I work with most.
A properly configured nginx can generally handle 400K to 500K requests per second (clustered); the most I have seen is 50K to 80K requests per second (non-clustered) at around 30% CPU load. Granted, that was on 2 x Intel Xeon CPUs with Hyper-Threading enabled, but it can run without problems on slower machines.
Keep in mind that this config is used in a testing environment, not in production, so you will need to adapt most of these features to whatever works best for your own servers.