valeriocos / logs_inspector.py
Last active June 3, 2019 10:37
Inspect log info
# -*- coding: utf-8 -*-
#
# Copyright (C) 2015-2019 Bitergia
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
valeriocos / get-oauth2-token-meetup-api
Last active May 4, 2022 07:03
Get a bearer/OAuth2 token for Meetup application-only requests in Python 3
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# Copyright (C) 2015-2019 Bitergia
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 3 of the License, or
# (at your option) any later version.
#
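The preview above stops at the license header. Below is a minimal sketch of what an application-only token request could look like, assuming a standard OAuth2 client-credentials style exchange; the token endpoint URL, grant type and parameter names are assumptions, not taken from the gist, so check the Meetup API docs and the full gist for the actual flow.

#!/usr/bin/env python3
# Hypothetical sketch (not the gist's code): request a bearer token for
# application-only calls. TOKEN_URL and the grant type are assumptions.
import requests

TOKEN_URL = "https://secure.meetup.com/oauth2/access"  # assumed endpoint


def get_token(client_id, client_secret):
    """Exchange client credentials for an OAuth2 bearer token (assumed flow)."""
    resp = requests.post(TOKEN_URL, data={
        "client_id": client_id,
        "client_secret": client_secret,
        "grant_type": "client_credentials",  # assumption
    })
    resp.raise_for_status()
    return resp.json()["access_token"]


if __name__ == "__main__":
    token = get_token("<client-id>", "<client-secret>")
    print("Authorization: Bearer {}".format(token))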
valeriocos / scava_stats.py
Last active July 9, 2019 09:04
Generates a list of stats (i.e., min, max, avg, sum, median, last) for a set of metrics per project, and saves the data obtained to a CSV file.
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
#
# Get metrics from Scava and publish them in Elasticsearch
# If the collection is an OSSMeter one, add project and other fields to items
#
# Copyright (C) 2018 Bitergia
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
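The preview is cut off at the license header. As a rough illustration of the stats listed in the description (min, max, avg, sum, median, last), here is a small self-contained sketch; the input layout and the CSV columns are assumptions, not the gist's actual format.

#!/usr/bin/env python3
# Hypothetical sketch (not the gist's code): compute min, max, avg, sum,
# median and last value per metric and project, then write them to a CSV.
import csv
import statistics


def metric_stats(values):
    """Return the summary stats of one metric series."""
    return [min(values), max(values), statistics.mean(values),
            sum(values), statistics.median(values), values[-1]]


def write_stats(projects, out_path):
    """projects maps project name -> {metric name: [values]}."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["project", "metric", "min", "max", "avg", "sum", "median", "last"])
        for project, metrics in projects.items():
            for metric, values in metrics.items():
                writer.writerow([project, metric] + metric_stats(values))


if __name__ == "__main__":
    write_stats({"scheduling": {"bugs": [3, 5, 2, 4]}}, "scava_stats.csv")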
omm_section,omm_score,scava_project_name,ow2_project_name
OMM_COM,"1,9",scheduling,proactive
OMM_CMM,4,scheduling,proactive
OMM_DOC,3,scheduling,proactive
OMM_INF,"2,9",scheduling,proactive
OMM_MGT,"4,4",scheduling,proactive
OMM_LIC,"1,2",scheduling,proactive
OMM_DEV,"0,8",scheduling,proactive
OMM_TST,"0,8",scheduling,proactive
OMM_REL,"4,3",scheduling,proactive
valeriocos / dump-all.sh
Last active September 26, 2019 10:43
[crossminer] Dump Elasticsearch indexes (remember to create the BACKUP_FOLDER; elasticdump must be installed)
#!/bin/bash
BACKUP_FOLDER=backup
ELASTICSEARCH=https://admin:admin@localhost:9200
echo "scava-conf-smells"
NODE_TLS_REJECT_UNAUTHORIZED=0 elasticdump \
--input=$ELASTICSEARCH/scava-conf-smells \
--output=$BACKUP_FOLDER/scava-conf-smells.mapping.json \
--limit=100 \
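
# [sketch, not part of the gist] The preview above is cut off after --limit;
# a full mapping + data dump for one index typically takes two elasticdump
# calls, selected with the --type flag:
NODE_TLS_REJECT_UNAUTHORIZED=0 elasticdump \
--input=$ELASTICSEARCH/scava-conf-smells \
--output=$BACKUP_FOLDER/scava-conf-smells.mapping.json \
--limit=100 \
--type=mapping
NODE_TLS_REJECT_UNAUTHORIZED=0 elasticdump \
--input=$ELASTICSEARCH/scava-conf-smells \
--output=$BACKUP_FOLDER/scava-conf-smells.data.json \
--limit=100 \
--type=data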
valeriocos / upload-all.sh
Created September 26, 2019 10:42
[crossminer] Upload Elasticsearch indexes (remember to create the BACKUP_FOLDER; elasticdump must be installed)
#!/bin/bash
BACKUP_FOLDER=backup
ELASTICSEARCH=https://admin:admin@localhost:9200
echo "scava-conf-smells"
NODE_TLS_REJECT_UNAUTHORIZED=0 elasticdump \
--input=$BACKUP_FOLDER/scava-conf-smells.mapping.json \
--output=$ELASTICSEARCH \
--output-index=scava-conf-smells \
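
# [sketch, not part of the gist] The preview above is cut off after
# --output-index; restoring an index is typically done in two passes,
# mapping first and then data, again selected with --type:
NODE_TLS_REJECT_UNAUTHORIZED=0 elasticdump \
--input=$BACKUP_FOLDER/scava-conf-smells.mapping.json \
--output=$ELASTICSEARCH \
--output-index=scava-conf-smells \
--type=mapping
NODE_TLS_REJECT_UNAUTHORIZED=0 elasticdump \
--input=$BACKUP_FOLDER/scava-conf-smells.data.json \
--output=$ELASTICSEARCH \
--output-index=scava-conf-smells \
--type=data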
valeriocos / external_data.json
Last active September 28, 2019 07:39
An example of external data
[
{
"query_fields": [
{
"field": "Author_name",
"value": "Miguel Ángel Fernández"
}
],
"set_extra_fields": [
{
[
{
"conditions": [
{
"field": "author_name",
"value": "Valerio Cosentino"
}
],
"date_range": {
"field": "grimoire_creation_date",
import os
import subprocess
current_directory = os.getcwd()
REPO_URL = "https://github.com/chaoss/grimoirelab-perceval"
REPOSITORY_NAME = os.path.join(current_directory, REPO_URL.split("/")[-1])
# commit to check out
sha = "076953e95735401b4d9266562f9ae406a30751a0"
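# [sketch, not part of the gist] The preview stops before the git commands;
# cloning the repository and checking out the pinned commit would plausibly
# continue along these lines:
subprocess.run(["git", "clone", REPO_URL, REPOSITORY_NAME], check=True)
subprocess.run(["git", "checkout", sha], cwd=REPOSITORY_NAME, check=True)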
[
{
"conditions": [
{
"field": "origin",
"value": "/tmp/perceval_mc84igfc/gittest"
}
],
"set_extra_fields": [