@juslintek
Last active February 5, 2021 11:59
Alna task

Alna Software task

The tasks do not require full completion; do as much as you can. This is a full-stack test with a little DevOps. We understand that you might not know how to use some of the tools, and that's completely OK.

  • When applying for frontend only, you can simulate the backend however you desire.
  • When applying for backend only, you can simulate the frontend however you desire.
  • If you're a full-stack developer, it would be nice to demonstrate knowledge of Vue.js.

DevOps:

You can skip this part if you don't know Docker and create a default Laravel project instead. Set up two projects:

  • a frontend that uses Vue.js (if you want to add styling, you can use Tailwind CSS)
  • a backend built with Laravel
  • Connect the two with docker-compose into a single network.

Frontend:

The user can:

  • submit a list of websites along with a selector
  • submit as many lists as they want in parallel
  • track the progress of any running job in the backend
  • see the results of the crawled websites, grouped by crawl start time

Bonus points:

  • Use WebSockets/socket.io to push updates to the frontend so the progress updates automatically.
  • Show the progress of the current job as a progress bar or a percentage.

Backend:

The backend should expose all the RESTful endpoints needed to cover the frontend functionality.

It should have at least three endpoints (one possible route layout is sketched below):

  • one for handling crawl job submission
  • a second for checking job status
  • a third for displaying a list of parsed texts and page titles from the submitted URLs as articles.

Must use the Laravel framework.
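
For orientation, here is a minimal sketch of how those three endpoints could be laid out in routes/api.php. The controller name, route paths, and parameter names are assumptions for illustration, not part of the task.

```php
<?php

// routes/api.php (hypothetical route layout for the three endpoints)

use App\Http\Controllers\CrawlJobController;
use Illuminate\Support\Facades\Route;

// 1. Submit a crawl job (list of URLs plus a selector)
Route::post('/crawl-jobs', [CrawlJobController::class, 'store']);

// 2. Check the status/progress of a running job
Route::get('/crawl-jobs/{jobId}/status', [CrawlJobController::class, 'status']);

// 3. List parsed texts and page titles as articles, grouped by crawl start time
Route::get('/articles', [CrawlJobController::class, 'articles']);
```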

Bonus points:

Crawl job submission endpoint

  • Handles multiple URLs per job sequentially.
  • Dispatches a Laravel job and passes the submitted data to it.
  • Returns the job ID with a 202 HTTP response code (as in the sketch below).
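
A sketch of what that submission action might look like, assuming the routes above and a hypothetical ProcessCrawlJob queued job (sketched further below); the field names and validation rules are illustrative only.

```php
<?php

namespace App\Http\Controllers;

use App\Jobs\ProcessCrawlJob; // hypothetical queued job, sketched further below
use Illuminate\Http\JsonResponse;
use Illuminate\Http\Request;
use Illuminate\Support\Str;

class CrawlJobController extends Controller
{
    public function store(Request $request): JsonResponse
    {
        $data = $request->validate([
            'urls'     => ['required', 'array', 'min:1'],
            'urls.*'   => ['required', 'url'],
            'selector' => ['required', 'string'],
        ]);

        // Generate our own public job id so the client can poll or subscribe to it.
        $jobId = (string) Str::uuid();

        // Dispatch a single queued job that will crawl the URLs sequentially.
        ProcessCrawlJob::dispatch($jobId, $data['urls'], $data['selector']);

        // 202 Accepted: the crawling happens asynchronously on the queue.
        return response()->json(['job_id' => $jobId], 202);
    }
}
```

Returning 202 rather than 200 signals that the request was accepted but the work has not finished yet.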

Job Status monitoring endpoint

  • Returns progress information: how many URLs have been processed and how many are left.
  • Returns the percentage of completion.
  • Is accessible via WebSocket/socket.io and pushes messages to the frontend (see the event sketch below).
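
One way to handle the push side in Laravel is a broadcast event fired whenever a URL finishes. The event and channel names below are made up, and a broadcast driver (e.g. laravel-echo-server with socket.io, or Pusher) still has to be configured separately.

```php
<?php

namespace App\Events;

use Illuminate\Broadcasting\Channel;
use Illuminate\Broadcasting\InteractsWithSockets;
use Illuminate\Contracts\Broadcasting\ShouldBroadcast;
use Illuminate\Foundation\Events\Dispatchable;
use Illuminate\Queue\SerializesModels;

class CrawlProgressUpdated implements ShouldBroadcast
{
    use Dispatchable, InteractsWithSockets, SerializesModels;

    public string $jobId;
    public int $done;
    public int $total;

    public function __construct(string $jobId, int $done, int $total)
    {
        $this->jobId = $jobId;
        $this->done  = $done;
        $this->total = $total;
    }

    public function broadcastOn(): Channel
    {
        // The frontend subscribes to this channel to receive live progress.
        return new Channel('crawl-jobs.' . $this->jobId);
    }

    public function broadcastWith(): array
    {
        return [
            'done'    => $this->done,
            'total'   => $this->total,
            'percent' => $this->total > 0 ? (int) round($this->done / $this->total * 100) : 0,
        ];
    }
}
```

The plain status endpoint can return the same numbers read from the cache, so polling still works without WebSockets.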

Tips

  • To display progress you can use your HTTP client's progress feature, for example cURL's progress callback.
  • Store the progress result in a cache for on-demand retrieval (as in the job sketch below).
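
Tying the tips together, here is a sketch of a queued job that crawls the submitted URLs sequentially, keeps its progress in the cache for the status endpoint, and fires the broadcast event above. The class name, cache key, and the omitted parsing/persistence step are assumptions.

```php
<?php

namespace App\Jobs;

use App\Events\CrawlProgressUpdated;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Http;

class ProcessCrawlJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public string $jobId;
    public array $urls;
    public string $selector;

    public function __construct(string $jobId, array $urls, string $selector)
    {
        $this->jobId    = $jobId;
        $this->urls     = $urls;
        $this->selector = $selector;
    }

    public function handle(): void
    {
        $total = count($this->urls);

        foreach ($this->urls as $index => $url) {
            // Fetch the page. Extracting the title and the text matched by
            // $this->selector (e.g. with symfony/dom-crawler) and saving the
            // result as an article is left out of this sketch.
            $response = Http::get($url);

            $done = $index + 1;

            // Keep progress in the cache so the status endpoint can read it on demand.
            Cache::put("crawl-jobs:{$this->jobId}:progress", [
                'done'    => $done,
                'total'   => $total,
                'percent' => (int) round($done / $total * 100),
            ], now()->addHour());

            // Push the same numbers over WebSockets for live progress updates.
            event(new CrawlProgressUpdated($this->jobId, $done, $total));
        }
    }
}
```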
Neophen commented Feb 5, 2021

For the full-stack position: please integrate the frontend with the backend using a REST API.
