Add the EnrichCommand middleware to your command router:
defmodule MyApp.Router do
  use Commanded.Commands.Router

  # register the middleware so it runs for every dispatched command
  # (module name/namespace assumed -- adjust to wherever EnrichCommand lives)
  middleware MyApp.EnrichCommand
end
This is way more complicated than it should be. The following conditions need to be met:
In this particular case, I'm interested in bringing in the 'default' template of jsdoc as a sub-directory in my project so I could potentially make changes to the markup it generates while also being able to update from upstream if there are changes. Ideally their template would be a separate repo added to jsdoc via a submodule -- that way I could fork it and things would be much easier... but it is what it is.
After much struggling with git, subtree and git-subtree, I ended up finding this http://archive.h2ik.co/2011/03/having-fun-with-git-subtree/ -- it basically sets up separate branches tracking the remote and the particular sub-directory, then uses the git subtree contrib module to pull it all together.
defmodule EventStore.CategoryStreamLinker do
  @moduledoc """
  Links streams from aggregate instances to their respective category streams.

  Example: events from stream_uuid of `contractors_contract-07c52787-da0c-444f-9783-5d380f7093f9`
  will be linked to stream_uuid of `contractors_contract`.
  """

  use Commanded.Event.Handler,
    application: My.App,
    name: __MODULE__

  # Link each event into its category stream, derived from the source stream id.
  # (Completion of the truncated snippet: assumes an `EventStore` instance
  # module is available for `link_to_stream/3`.)
  def handle(_event, %{stream_id: stream_id, event_id: event_id}) do
    category = stream_id |> String.split("-") |> hd()
    EventStore.link_to_stream(category, :any_version, [event_id])
  end
end
// helps us in parsing the frontmatter from text content
const matter = require('gray-matter')
// helps us safely stringify the frontmatter as a json object
const stringifyObject = require('stringify-object')
// helps us in getting the reading time for a given text
const readingTime = require('reading-time')
// please make sure you have installed these dependencies
// before proceeding further, or remove the require statements
// that you don't use
-module(generic_proxy).
-export([run/0]).

-define(PORT_FROM, 63790).
-define(PORT_TO, 6379).
-define(BACKLOG, 10000).

run() ->
    %% listen on the source port (the original snippet passed 0 here,
    %% which would pick a random ephemeral port instead of ?PORT_FROM)
    {ok, Socket} = gen_tcp:listen(?PORT_FROM, [binary,
                                               {active, false},
                                               {reuseaddr, true},
                                               {backlog, ?BACKLOG}]),
    accept_loop(Socket).

%% the snippet was truncated here; a real proxy would connect each
%% accepted socket to ?PORT_TO and relay bytes in both directions
accept_loop(Socket) ->
    {ok, Conn} = gen_tcp:accept(Socket),
    gen_tcp:close(Conn),
    accept_loop(Socket).
// This file was generated from JSON Schema using quicktype, do not modify it directly.
// To parse and unparse this JSON data, add this code to your project and do:
//
//    txns, err := UnmarshalTransactionList(bytes)
//    bytes, err = txns.Marshal()

package main

import "encoding/json"

type TransactionList []Transaction

type Transaction struct{} // fields elided in the original snippet

func UnmarshalTransactionList(data []byte) (TransactionList, error) {
	var r TransactionList
	err := json.Unmarshal(data, &r)
	return r, err
}

func (r *TransactionList) Marshal() ([]byte, error) {
	return json.Marshal(r)
}
package domain // domain layer

import "google.golang.org/protobuf/proto"

// Event represents a domain event that has been retrieved from the event store.
type Event interface {
	ID() string
	Revision() uint64
	Data() proto.Message
}
This code is extracted from one of my private projects as an example of how to implement encryption of PII in event streams using two keys: a master key for each "data subject" that is stored in Vault and never transported to the systems that process the PII, and a key unique to each event that is stored (itself encrypted) with the event.
To be clear, the key that is stored with the data is encrypted by another key that is not stored with the data. The idea is that each "data subject" has an encryption key that is stored in Vault (external). When you encrypt data, the library will:
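That two-key ("envelope") scheme can be sketched as follows. This is a minimal illustration, not the project's actual code: it uses AES-256-GCM for both layers, and in the real setup described above the master-key step would happen inside Vault so the master key never leaves it -- here both keys are local so the example is self-contained.

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
)

// encrypt seals plaintext with AES-256-GCM under key, prepending the nonce.
func encrypt(key, plaintext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := rand.Read(nonce); err != nil {
		return nil, err
	}
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

// decrypt reverses encrypt: split off the nonce, then open the ciphertext.
func decrypt(key, sealed []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	if len(sealed) < gcm.NonceSize() {
		return nil, fmt.Errorf("ciphertext too short")
	}
	nonce, ct := sealed[:gcm.NonceSize()], sealed[gcm.NonceSize():]
	return gcm.Open(nil, nonce, ct, nil)
}

// EncryptPII generates a fresh per-event data key, encrypts the PII with it,
// and returns the data key encrypted under the subject's master key. Only
// the two ciphertexts are stored with the event; the master key is not.
func EncryptPII(masterKey, pii []byte) (encPII, encDataKey []byte, err error) {
	dataKey := make([]byte, 32)
	if _, err = rand.Read(dataKey); err != nil {
		return nil, nil, err
	}
	if encPII, err = encrypt(dataKey, pii); err != nil {
		return nil, nil, err
	}
	encDataKey, err = encrypt(masterKey, dataKey)
	return encPII, encDataKey, err
}

func main() {
	master := make([]byte, 32) // per-subject master key (held in Vault in the real system)
	encPII, encKey, _ := EncryptPII(master, []byte("alice@example.com"))
	// To read the event back: unwrap the data key, then decrypt the payload.
	dataKey, _ := decrypt(master, encKey)
	pii, _ := decrypt(dataKey, encPII)
	fmt.Println(string(pii)) // prints "alice@example.com"
}
```

Deleting the subject's master key from Vault then makes every event's PII unrecoverable ("crypto-shredding"), without touching the immutable event streams themselves.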
using System;
using System.Collections.Generic;
using System.Configuration;
using System.IO;
using System.Text;
using System.Threading;
using System.Xml;
using System.Xml.Serialization;
using System.Linq;
using domain;
{
  "environment": "staging"
}