Part of collection: Hyper-converged Homelab with Proxmox
How to create an Erasure Coded Pool in Ceph and use 'directory pinning' to connect it to the CephFS filesystem.
To use an Erasure Coded Pool with CephFS, a directory inside the CephFS filesystem needs to be connected to the Erasure Coded Pool; this is called 'directory pinning'.
The directory needs to be created inside the CephFS filesystem folder on one of the Proxmox Hosts.
mkdir /mnt/pve/cephfs/test-pool
This only has to be done on one of the Proxmox Ceph cluster nodes!
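If you're not sure where CephFS is mounted on a host: Proxmox mounts a CephFS storage under /mnt/pve/<storage-id> by default ('cephfs' in this example, an assumption that should match your storage name); you can confirm the mount with findmnt:
findmnt /mnt/pve/cephfs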
Now create an Erasure Coded Ceph pool. This example uses the Ceph Dashboard, but it can also be done via the command line; a rough CLI equivalent is sketched after the dashboard steps below.
Open the Ceph Dashboard; if it is installed, it can be found at http://192.168.1.11:8443
[Screenshot: Create the pool]
Fill in the details and click + next to 'Erasure code profile'.
[Screenshot: Create EC Profile 1]
Select the desired settings; this example stores the data on large but slower SSD drives, avoiding the faster NVMe drives in the cluster.
[Screenshot: Create EC Profile 2]
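For reference, roughly equivalent commands on the CLI could look like the following. The profile name and the k/m values here are only assumptions for illustration; match them to whatever you select in the dashboard. Setting allow_ec_overwrites is required before an erasure coded pool can be used by CephFS.
# Create an EC profile that stores data on SSD-class OSDs (example: k=2 data chunks, m=1 coding chunk)
ceph osd erasure-code-profile set ec-ssd-profile k=2 m=1 crush-failure-domain=host crush-device-class=ssd
# Create the pool with that profile and allow overwrites so CephFS can use it
ceph osd pool create test-pool erasure ec-ssd-profile
ceph osd pool set test-pool allow_ec_overwrites true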
Add the folder to the CephFS filesystem using 'directory pinning'.
Open a terminal on one of the hosts:
Add the Data Pool to CephFS
ceph fs add_data_pool cephfs test-pool
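To check that the pool is now listed as a data pool of the filesystem:
ceph fs ls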
Connect the Data Pool to the directory.
setfattr -n ceph.dir.layout -v pool=test-pool /mnt/pve/cephfs/test-pool/
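To confirm the layout was applied, read the attribute back (getfattr comes from the same attr package as setfattr). Note that a layout only affects files created after it is set; existing files stay in their original pool.
getfattr -n ceph.dir.layout /mnt/pve/cephfs/test-pool/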
Now the folder is ready to be used with Proxmox VMs via virtio-fs or other methods.
Performance is not yet battle-tested, but 'brute force' testing looks very good; even from a VM with virtio-fs, I get almost identical performance to what I get on the host.
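For a quick 'brute force' check of your own, a simple sequential write with fio (run on the host, or inside the VM against the virtio-fs mount) is one option; the job parameters below are only an example, not the exact test used here:
# Sequential 1M writes, 4 GiB total, bypassing the page cache
fio --name=seq-write --directory=/mnt/pve/cephfs/test-pool --rw=write --bs=1M --size=4G --ioengine=libaio --direct=1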