
@titovanton
Created January 15, 2020 20:00

Repo review

The repo is available at the following link.

README

Local setup

I started writing this section before noticing that the developer uses Docker; inattentive me... So you can skip it, or read it as a detailed explanation.

There is no reference to any sort of environment, which may cause incompatibility between developers' machines: some of them may work on Windows (like me), some on macOS (the most popular choice, as I understand it). The old-school Python way to create a working environment is the virtual environment (venv) package, which lets you switch between different projects on one machine and install the packages each particular project needs.
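For reference, the classic venv workflow looks roughly like this (the paths are illustrative):

```shell
# Create an isolated environment in the project directory
python3 -m venv .venv

# Activate it (on Windows: .venv\Scripts\activate)
. .venv/bin/activate

# From now on, pip installs only into this project's environment
pip --version
```

After activation, `pip install -r requirements.txt` pulls dependencies into this project's environment only, without touching other projects on the same machine.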

There are newer ways to create an environment: Vagrant, Docker, etc. If you create one Vagrant/Docker setup per project, the projects do not share a Python packages directory. More than that, and as an advantage over the venv approach, you encapsulate other services and tools (such as the web server, databases, etc.) inside the Vagrant/Docker project. I use Docker to run projects locally; there is also k8s (Kubernetes) to deploy Docker instances, which is useful as well, but that is up to DevOps.
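As a sketch of what I mean, here is a minimal hypothetical docker-compose.yml for a Django project; the service names, images, and versions below are my assumptions, not what this repo actually ships:

```yaml
version: "3"

services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/app
    ports:
      - "8000:8000"
    depends_on:
      - db
      - redis

  db:
    image: postgres:11
    environment:
      POSTGRES_PASSWORD: example  # illustrative only, use a real secret

  redis:
    image: redis:5
```

The point is that the database and the broker live inside the Compose project, so a new developer does not install anything on the host machine by hand.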

Actual claims about the Docker setup

  1. If you use Docker, then make the other developers use it too; don't let them diverge between your setup and theirs, which causes needless communication ("how do I install XXX on YYY..."). The same goes for any other setup approach, not only Docker.
  2. OK, I'm going to use Docker for the setup, but then I see "Install RabbitMQ" in the README. Either the README is out of date and the developer uses Redis instead (because I did not find RabbitMQ in the docker-compose.yml), or I need to install it for Celery outside of Docker; why? Django's settings.py will give me the answer, but it creates some mess at the start.
  3. Dockerfile code style: you just need to check out the official documentation to get a sense of the style. Nothing criminal, the file is small, but anyway.
  4. "Don't forget to create your own settings file at </ibis_crawl_engine/conf>": what is that? I thought it was some sort of .env file, but then I see a .env file and its sample in the README.
  5. I found pinned versions in requirements.txt, which is a good sign; maintaining an old Django version without pinned dependency versions can be a pain. But the file looks like raw pip freeze output and is very hard to navigate. Are we sure we need all of these? Nothing criminal.
  6. Messy indentation between services in the docker-compose.yml; again, nothing criminal.
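Regarding claim 3, Docker's best-practices documentation suggests a style roughly like the following; this is a generic sketch, not this repo's actual Dockerfile:

```dockerfile
# Pin the base image instead of relying on :latest
FROM python:3.7-slim

WORKDIR /app

# Copy only the requirements first, so this layer is cached
# until requirements.txt actually changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
```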

I'm not sure whether a simple docker-compose up will be enough to make it work, given that the README is not up to date, so I'm not going to try because of the time limit.

Nginx

I would cache static files. And again, some messy indentation (between rows); it seems this is going to chase me throughout the review. But the config style here is better than the Dockerfile's.
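A common nginx pattern for caching static files looks like this; the paths and durations are illustrative, not taken from the repo's config:

```nginx
# Serve Django's collected static files directly and let browsers cache them
location /static/ {
    alias /app/static/;  # path is an assumption, adjust to the project layout
    expires 30d;         # far-future Expires header
    add_header Cache-Control "public";
    access_log off;      # skip logging for cheap static hits
}
```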

PEP8

It is hard to write complete crap with flake8, but there are things flake8 does not track, for example correct import ordering:

  1. The first group is for the standard library.
  2. Then there should be one or two groups for third-party packages, such as Django, Celery, etc.
  3. Then we import local packages, with relative imports (from . import module) going last.
  4. Plain import rows must go before the from ... import rows; this holds for every group above.
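The ordering above might look like this in practice; the commented third-party and local module names are illustrative:

```python
# Group 1: the standard library; plain `import` rows go before `from` rows
import os
import sys
from datetime import timedelta

# Group 2: third-party packages would go here, for example:
#   from celery import shared_task
#   from django.conf import settings

# Group 3: local packages, with relative imports last, for example:
#   from crawl_engine import utils
#   from . import tasks

print(sys.platform, os.sep, timedelta(days=1))
```

Tools like isort automate exactly this grouping, which is why flake8 alone is not enough.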

He uses comments; that is good.

In alchemyapi.py I found a very strange declaration of the ENDPOINTS dict: a long spaghetti of ENDPOINTS[XXX][YYY] assignments. What is the purpose of writing it that way, instead of a single PEP8-style literal(?):

ENDPOINTS = {
    'attr1': 'fee',
    'attr2': 123,
    'some_obj': {
        ...
    },
    ...
}
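To illustrate the difference concretely (the keys and values below are made up, not the actual AlchemyAPI endpoints):

```python
# Spaghetti style: building the dict item by item across many lines
ENDPOINTS = {}
ENDPOINTS['sentiment'] = {}
ENDPOINTS['sentiment']['url'] = '/calls/url/GetTextSentiment'
ENDPOINTS['sentiment']['text'] = '/calls/text/GetTextSentiment'

# The same data as one readable literal
ENDPOINTS_LITERAL = {
    'sentiment': {
        'url': '/calls/url/GetTextSentiment',
        'text': '/calls/text/GetTextSentiment',
    },
}

# Both produce identical data; only the literal reads at a glance
assert ENDPOINTS == ENDPOINTS_LITERAL
```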

A strange thing with blank rows (spacing between blocks of code) in the settings.params module; flake8 should catch this style error, which makes me feel like he does not actually run flake8, but I'm not 100% sure.

Now I'm in crawl_engine.views, and the perfect import block makes me feel that there were two different developers. The developer uses generics the right way and does not implement everything via ViewSets, even when only one HTTP-method endpoint is needed; that is a good sign. He overrides get_queryset instead of writing an ugly custom method. All the other modules in the crawl_engine app look very neat.
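Since I can't paste the project's code here, a stripped-down plain-Python sketch of the pattern I mean: DRF's generic views expose a get_queryset() hook, and overriding it is cleaner than inventing a custom ad-hoc method. Real code would subclass rest_framework.generics.ListAPIView; the data below is hypothetical.

```python
class ListAPIView:
    """Minimal stand-in for a DRF-style generic list view."""
    queryset = None

    def get_queryset(self):
        # The hook the framework already provides
        return self.queryset

    def list(self):
        return list(self.get_queryset())


class RecentArticleList(ListAPIView):
    # Hypothetical data; a real view would filter a Django ORM queryset
    articles = [{'id': 1, 'fresh': True}, {'id': 2, 'fresh': False}]

    def get_queryset(self):
        # Filtering lives in the existing hook, not in a custom method
        return [a for a in self.articles if a['fresh']]


print(RecentArticleList().list())
```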

Summary

I like this developer more than I expected; he looks pretty experienced for the average price rate I found in your profile ;) All the claims above are fixable, and believe me, I've seen enough crappy legacy code to say that.

  1. He uses Docker, which is just an awesome tool.
  2. He uses a test engine I like: pytest. And he writes tests! That is unusual for low-price programmers and is a good sign. I'm not sure whether the project is covered with tests well enough and whether they are up to date, but still good.
  3. He tracks his code with flake8; a good sign as well, though not a panacea, of course.
  4. Every Celery job is dockerized; that is how I expect it should be.