First I'll say that whoever is telling you that you need to understand Farkas' lemma to understand loop-carried dependencies is completely fucking with you. Even the people who should understand it probably don't (I can think of 5 people in my professional circle who are "polyhedral" people, and I guarantee they don't remember/don't care about Farkas).
Secondly, Farkas clicked for me after thinking about penalty/barrier functions and duality. Bear with me because I haven't thought about this in a while:
Ax = b
is a linear system of constraints and you're looking for a solution, right? Constrained optimization is "hard" because, in principle, your search algorithm has to satisfy the constraints at every step.
What's easier is unconstrained optimization, i.e. gradient descent. How can you (sort of) turn Ax = b into an unconstrained optimization problem? Write down a penalty function that penalizes violating each of the constraints, then just run gradient descent on it.
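To make that concrete, here's a toy sketch of the penalty idea (my own made-up example, not from the post): take the squared residual P(x) = ||Ax - b||^2 as the penalty and run plain gradient descent on it. The specific A, b, step size, and iteration count are all arbitrary choices for illustration.

```python
# Toy penalty-method sketch: "solve" A x = b by minimizing the
# penalty P(x) = ||A x - b||^2 with plain gradient descent.
# A, b, the step size, and the iteration count are made up for the demo.

# System: x + y = 3, x - y = 1  (exact solution: x = 2, y = 1)
A = [[1.0, 1.0],
     [1.0, -1.0]]
b = [3.0, 1.0]

def residual(x):
    # r_i = (A x - b)_i : how badly each constraint is violated
    return [sum(A[i][j] * x[j] for j in range(2)) - b[i] for i in range(2)]

def grad(x):
    # gradient of ||A x - b||^2 is 2 A^T (A x - b)
    r = residual(x)
    return [2.0 * sum(A[i][j] * r[i] for i in range(2)) for j in range(2)]

x = [0.0, 0.0]          # arbitrary starting point
step = 0.1              # arbitrary step size
for _ in range(500):
    g = grad(x)
    x = [x[j] - step * g[j] for j in range(2)]

# x converges to (2, 1), the point where every penalty term is zero
```

The point isn't that this is a good linear solver (it isn't), just that each constraint became a term in an unconstrained objective, which is the mental stepping stone toward Lagrange multipliers and duality.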