Dennis Zheleznyak (denzhel)
denzhel / scheduled_ec2_shutdown.md
Last active April 22, 2021 09:01
scheduled_ec2_shutdown

To schedule the shutdown and startup of your EC2 instances, you can use this Lambda Python code. Please note that the code selects EC2 instances by their tag (the filters line), e.g. role = runner.

import boto3

# Define variables
client = boto3.client('ec2', region_name="us-east-1")
filters = [{'Name': 'tag:<KeyName>', 'Values': ['<ValueName>']}]

# Get the list of instances by searching for the tag configured above
reservations = client.describe_instances(Filters=filters)['Reservations']
instance_ids = [i['InstanceId'] for r in reservations for i in r['Instances']]

# The gist is truncated at this point; a plausible completion: stop the
# matching instances (use client.start_instances for the scheduled startup)
if instance_ids:
    client.stop_instances(InstanceIds=instance_ids)
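To run this on a schedule, the Lambda needs a timer trigger. A minimal sketch using the AWS CLI and an EventBridge (CloudWatch Events) cron rule; the rule name, schedule, account ID, and function name below are placeholders of my choosing, not from the original gist:

# Fire every day at 20:00 UTC (hypothetical rule name and schedule)
aws events put-rule --name ec2-nightly-shutdown \
  --schedule-expression "cron(0 20 * * ? *)"

# Point the rule at the Lambda function
aws events put-targets --rule ec2-nightly-shutdown \
  --targets "Id"="1","Arn"="arn:aws:lambda:us-east-1:<accountId>:function:<functionName>"

EventBridge also needs permission to invoke the function, which aws lambda add-permission grants.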
denzhel / travis_scala_sbt_error.md
Created April 22, 2021 04:39
fix travis scala download error

If you're getting the following error while using Travis CI:

Downloading sbt launcher for 1.2.6:
  From  https://repo.scala-sbt.org/scalasbt/maven-releases/org/scala-sbt/sbt-launch/1.2.6/sbt-launch-1.2.6.jar
    To  /home/travis/.sbt/launchers/1.2.6/sbt-launch.jar
Downloading sbt launcher 1.2.6 md5 hash:
  From  https://repo.scala-sbt.org/scalasbt/maven-releases/org/scala-sbt/sbt-launch/1.2.6/sbt-launch-1.2.6.jar.md5
    To  /home/travis/.sbt/launchers/1.2.6/sbt-launch.jar.md5
cat: /home/travis/.sbt/launchers/1.2.6/sbt-launch.jar.md5: No such file or directory
md5sum: 'standard input': no properly formatted MD5 checksum lines found
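The gist cuts off before the actual fix; one workaround that circulated at the time (an assumption on my part, not necessarily the author's) is to pre-seed the launcher cache from Maven Central so the sbt script skips the broken repo.scala-sbt.org download, e.g. in a before_install step:

# Pre-download the launcher so sbt does not fetch it from repo.scala-sbt.org
mkdir -p $HOME/.sbt/launchers/1.2.6
curl -L -o $HOME/.sbt/launchers/1.2.6/sbt-launch.jar \
  https://repo1.maven.org/maven2/org/scala-sbt/sbt-launch/1.2.6/sbt-launch-1.2.6.jar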
denzhel / delete_files_older_than.md
Last active April 22, 2021 04:34
delete files older than

To delete files that have not been modified for a given number of days, you can use the following:

sudo find . -mtime +<AmountOfDays> -type f -delete
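For example, to remove plain files under /var/log untouched for the last 30 days (path and day count are purely illustrative):

sudo find /var/log -mtime +30 -type f -delete

Swapping -delete for -print first is a safe way to preview what would be removed.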

#linux #bash

denzhel / generate_data_elasticsearch.md
Created February 15, 2021 01:34
Generate random data for elasticsearch

If you need to generate some random data, use this tool:

git clone https://github.com/oliver006/elasticsearch-test-data.git
cd elasticsearch-test-data
pip install -r requirements.txt

Then run it, pointing it at your Elasticsearch URL:

./es_test_data.py --es_url=http://<elasticsearchHost>:9200
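Afterwards you can verify the documents landed by listing the indices and their document counts (assuming Elasticsearch answers on localhost:9200):

curl "http://localhost:9200/_cat/indices?v"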
denzhel / unique_keys_mongo_query.md
Created February 11, 2021 06:37
Return only unique keys in Mongo query

If you want to return only the unique values of a key/field while searching inside a Mongo collection:

db.users.distinct("name")

If you want to know how many entries were returned:

db.users.distinct("name").length
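The same queries can also be run non-interactively from the shell; a small sketch, with the database name as a placeholder:

mongo <databaseName> --quiet --eval 'db.users.distinct("name").length'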
denzhel / mongo_increase_screen_buffer.md
Created February 9, 2021 07:30
Increase Mongo shell screen buffer

If you're trying to query Mongo and get this message:

Type "it" for more

Run this command:

DBQuery.shellBatchSize = 200;
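This only lasts for the current session; to make it the default, the same line can go into ~/.mongorc.js, which the mongo shell evaluates on startup:

echo 'DBQuery.shellBatchSize = 200;' >> ~/.mongorc.js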
denzhel / git_show_commits_file.md
Last active January 28, 2021 06:28
Show all commits that changed a file

If you want to know when and by whom a specific file in your GitHub repo was changed:

git log --follow -- <fileName>
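To also see the actual changes, or a compact author/date summary, the usual git log flags combine with --follow:

# Show each commit's patch for the file, following renames
git log --follow -p -- <fileName>

# One line per commit: abbreviated hash, author, date, subject
git log --follow --format='%h %an %ad %s' -- <fileName>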
denzhel / elasticsearch_disk_usage.md
Created January 28, 2021 06:26
Query elasticsearch cluster for disk usage

Use this to query the disk usage of your Elasticsearch cluster:

curl -XGET "http://localhost:9200/_cat/allocation?v&pretty"

Output:

shards disk.indices disk.used disk.avail disk.total disk.percent host          ip            node
  3213      460.2gb   584.9gb      1.3tb      1.9tb           29 1.1.1.1       1.1.1.1       host1.example.com
  3213      479.5gb   599.3gb      1.3tb      1.9tb           29 1.1.1.2       1.1.1.2       host2.example.com
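On reasonably recent Elasticsearch versions the _cat endpoints also accept a sort parameter, so you can put the fullest nodes first:

curl -XGET "http://localhost:9200/_cat/allocation?v&pretty&s=disk.percent:desc"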
denzhel / mongo_export_collections.md
Last active January 26, 2021 21:32
Export all collections from a database

If you want to export specific collections from a specific database, you can use this bash loop:

for collection in \
  $(mongo <databaseName> --quiet --eval "rs.slaveOk(); db.getCollectionNames().join('\n')" | grep <collectionPrefix>) ; \
do mongoexport --collection=$collection --db=<databaseName> --out=$collection ; \
done
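Each collection ends up in a JSON file named after itself; to load one back, mongoimport mirrors the export options (names are placeholders):

mongoimport --db=<databaseName> --collection=<collectionName> --file=<collectionName>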
denzhel / es_download_docs_from_index.md
Last active January 26, 2021 13:15
Elasticsearch download all documents from an index

To download all documents from an index, first you need to open a scroll context and get its scroll ID:

curl -X POST "localhost:9200/<indexName>/_search?scroll=5m&pretty&size=10000"

This keeps the search context alive for 5 minutes and returns the first page of results along with a _scroll_id.

Then, send follow-up queries to scroll through the pages of data until no more hits come back:

curl -X GET "localhost:9200/_search/scroll?pretty&scroll=5m&scroll_id=<searchIdFromThePreviousStep>"
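Put together, a sketch of the full download as a shell loop; it assumes jq is installed, and the index name and output file are placeholders:

# Open the scroll and grab the first page
resp=$(curl -s -X POST "localhost:9200/<indexName>/_search?scroll=5m&size=10000")
scroll_id=$(echo "$resp" | jq -r '._scroll_id')

# Keep scrolling until a page comes back empty
while [ "$(echo "$resp" | jq '.hits.hits | length')" -gt 0 ]; do
  echo "$resp" | jq -c '.hits.hits[]' >> all_docs.json
  resp=$(curl -s -X GET "localhost:9200/_search/scroll?scroll=5m&scroll_id=$scroll_id")
  scroll_id=$(echo "$resp" | jq -r '._scroll_id')
done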