Marcos Felipe Pimenta Rodrigues (@oddlyfunctional)
// Programmatically download a resource by creating a temporary <a> element,
// triggering a click on it, and removing it again.
const downloadURI = (uri, name) => {
  const link = document.createElement("a");
  link.download = name; // suggested filename for the download
  link.href = uri;
  document.body.appendChild(link);
  link.click();
  document.body.removeChild(link);
};

// Track URIs that have already been downloaded, to avoid duplicates.
const downloaded = new Set();
oddlyfunctional / README.md
Last active November 26, 2021 06:09
Normalize reports db

How to run

Indexing normalized db

  • Get the name of the database you want to copy from and paste it into the originalDbName variable at normalize.js:16
  • Create a normalized_db database using the schema at schema.sql
  • Run node --max-old-space-size=16384 normalize.js (I had to raise Node's memory limit to the maximum, otherwise it would fail when copying abeam)
  • Run time psql normalized_db < normalized-indexes.sql to measure how long it takes to index the normalized db

Indexing denormalized db

New reports system

In order to improve the response time of our reports, and to enable new features such as visualizations, we are comparing two possible solutions for storing, querying, and exporting precompiled datasets: Elasticsearch and PostgreSQL.

Requirements

Within a dataset of up to millions of documents and adding up to GBs, we want

/*
 * Usage:
 * $ node decode.js https://test-idp1.gakunin.nii.ac.jp/idp/profile/SAML2/Redirect/SSO?SAMLRequest=nVPBjtowEP2VyPckNgvqyiKsKKgq0rYbkWwPe6mMMyxuEzv1THbp39cJUNFqlwOnWDPPzzPvvUzv9k0dvYBH42zGRMLZ3WyKqqlbOe9oZ9fwqwOkKMAsyqGRsc5b6RQalFY1gJK0LOZf7uUo4bL1jpx2NYvmiOAp8C6cxa4BX4B%2FMRoe1%2FcZ2xG1KNOUAnmMrUie1c%2FOGptYYxKlkx9tWuzMZuNqoF2C6NL%2BhVGaPxQli5bhlrGKhqH%2FoTLVm1yhnIbJtqaGI9EaKuNBU1oUDyxaLTP2XUy4vp2Mt2IrPqgbzicCbkMLsYOVRVKWMjbiIx5zEd%2FwUnApxnLEk8l48sSi%2FLj4R2MrY58vq7Q5gFB%2BLss8Piz17eRCALCDCXJ43J%2Bpf5lWnSRns%2F8EjkWiXecRNgohHKfpGf%2FJ8a%2BBcLXMXW3072sc%2F%2BR8o%2Bgyuq%2BYKt4OUNn2OyOBpZCXunavCw%2BKIGPkO2DpabBjCqEaMhnyRLC%2FKpML17TKG%2Bxlhr3SdBL6nHhRBx3XsL1G9oswLXVPHcp5%2BLw6X%2FWhCSGEqvTKYus8HY15a57ZofeOHH%2B75%2F%2Ft7A8%3D
 *
 * Output (decoded AuthnRequest, truncated):
 * <?xml version="1.0"?><samlp:AuthnRequest xmlns:samlp="urn:oasis:names:tc:SAML:2.0:protocol" AssertionConsumerServiceURL="https://test-sp1.gakunin.nii.ac.jp/Shibboleth.sso/SAML2/POST" Destination="https://test-idp1.gakunin.nii.ac.jp/idp/profile/SAML2/Redirect/SSO" ID="_150c854f1f1
 */

batch.submit

This method allows performing other API methods as a single batch operation.

URL

https://yourdomain.com/api/batch.submit/{method}

Request
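The request body is not documented in this excerpt; a hypothetical call might look like the following (the `{ operations: [...] }` payload shape, the `users.create` method name, and the `buildBatchRequest` helper are all assumptions, not part of the actual API):

```javascript
// Hypothetical helper to build a batch.submit call. The payload shape
// ({ operations: [...] }) and the wrapped method name are assumptions.
const buildBatchRequest = (domain, method, operations) => ({
  url: `https://${domain}/api/batch.submit/${method}`,
  options: {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ operations }),
  },
});

const req = buildBatchRequest("yourdomain.com", "users.create", [
  { name: "Alice" },
  { name: "Bob" },
]);
// fetch(req.url, req.options) would then submit both operations at once.
```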

Adding a new identity provider

  1. Make sure the org has an entry in org_domain
  2. Prepare the configuration for passport-saml. All keys are accepted, but the minimum necessary are: idpIssuer (the idP's ID), entryPoint (the idP's endpoint for sending SAML requests), cert (the idP's public certificate), and callbackUrl (Coursebase's endpoint for the SAML response).
  3. Serialize the configuration object with JSON.stringify(config) (if you'll copy and paste the result, make sure to do it in a Node environment, since the browser can print serialized strings with escaping issues). For example:
let config = {
  idpIssuer: "https://app.onelogin.com/saml/metadata/c7ad6f53-52e0-4ff1-996c-3222c0850812",
  entryPoint: "https://greyhound-dev.onelogin.com/trust/saml2/http-redirect/sso/930729",
  cert: "-----BEGIN CERTIFICATE-----\\nMIID4jCCAsqgAwIBAgIUAipD1o1iXRi/BVk4iHeBGhei+S4wDQYJKoZIhvcNAQEF\\nBQAwRzESMBAGA1UECgwJR3JleWhvdW5kMRUwEwYDVQQLDAxPbmVMb2dp

Setting up local environment to test SAML

You'll need to:

  1. clone this branch for api2
  2. clone this branch for ui
  3. run these migrations
  4. add this entry to your /etc/hosts: 127.0.0.1 coursebase.onelogin.com
  5. set up your .env in the coursebase repo to use HTTP_SERVER_PORT=3000
  6. set up your .env in the ui repo to use DEFAULT_API_V2_URL=http://coursebase.onelogin.com:3000/v2
open Task;
type task('a) = Task.t('a);

/* Opaque types for the domain model. */
type issuer;
type nameId;
type email;
type user;
/* Wrapping in a constructor marks a user as having passed validation. */
type validatedUser = ValidatedUser(user);
type unvalidatedUser;
oddlyfunctional / graphql-clients-benchmark.md
Last active May 26, 2021 04:57
Comparison between GraphQL clients

Client-side GraphQL

graphql_ppx

Let's first analyze how to write the queries. I experimented with graphql_ppx using a few different features (nullable and non-nullable variables, fragments, automatic conversion of the response into a record). The type safety is very neat: both the input variables and the response have specific types (no Js.Json.t!), and the queries/mutations are validated against the schema! This makes it really easy to write queries, although I worry about integrating new versions of the schema since the repos are separate (we should probably re-fetch the schema as part of CI).

As for the drawbacks, there are not many:

  • The ppx really messes up my language server sometimes.
  • I haven't tried much, but I couldn't easily find a way to use refmt to format the queries.
  • I couldn't find anything about custom directives, the lack of which would prevent us from using some features of the GraphQL clients (for example, managing local state with Apollo's @client directive).
  • I haven't