Trying to choose the most appropriate persistent message queue for the following needs:
1) It should be persistent in the deepest sense of the word:
all events must remain available forever, and the persistent
storage will never be cleaned up.
2) Several producers will generate approximately 10 million new events per day
(see the rough rate estimate after this list).
3) When the system starts, there will be a default set of different
consumers, each fed a filtered stream of those events.
4) At runtime, users will be able to define their own custom consumers,
which start consuming events from the very beginning. The starting point
for a consumer could be an arbitrary timestamp, and the performance of
reading batches of events should not depend on the chosen timestamp.
5) Of course, events have to be ordered by creation time.
6) Each message is simply a tuple of [id, create_timestamp, json_body];
the message body size may vary.
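
For a rough sense of scale on requirement 2: 10 million events per day
averages out to about 10,000,000 / 86,400 ≈ 116 events per second,
although peak rates will presumably be noticeably higher than the average.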
I've already considered RabbitMQ and Apache Kafka, but unfortunately neither
of them fits these needs perfectly. Maybe I should look at some modern
database that is suitable for this kind of task.
At the moment I'm thinking about using a plain PostgreSQL table (a sketch of
what I have in mind is below), but I suspect this is far from the right
choice. So any advice would be appreciated.
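
To make the PostgreSQL idea concrete, here is a minimal sketch of the table
I have in mind, using Python with psycopg2. The table name, index name, and
connection DSN are just placeholders, not an existing system; the btree index
on (create_timestamp, id) is what should keep batch reads roughly independent
of the chosen starting timestamp (requirement 4).

    import psycopg2

    # Placeholder schema: all names here are illustrative.
    DDL = """
    CREATE TABLE IF NOT EXISTS events (
        id               bigserial PRIMARY KEY,
        create_timestamp timestamptz NOT NULL DEFAULT now(),
        json_body        jsonb NOT NULL
    );
    CREATE INDEX events_created_at_idx
        ON events (create_timestamp, id);
    """

    def read_batch(conn, since_ts, batch_size=1000):
        # Fetch the next batch of events created at or after since_ts,
        # ordered by creation time (requirement 5). The index should make
        # this a range scan regardless of how far back since_ts lies.
        with conn.cursor() as cur:
            cur.execute(
                "SELECT id, create_timestamp, json_body "
                "FROM events "
                "WHERE create_timestamp >= %s "
                "ORDER BY create_timestamp, id "
                "LIMIT %s",
                (since_ts, batch_size),
            )
            return cur.fetchall()

    if __name__ == "__main__":
        conn = psycopg2.connect("dbname=events_db")  # placeholder DSN
        with conn:
            with conn.cursor() as cur:
                cur.execute(DDL)
        rows = read_batch(conn, "2014-03-10 00:00:00+00")

Whether a single table like this keeps performing once it holds billions of
rows, without partitioning or some other trick, is exactly the part I'm
unsure about.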