Immutable values are facts: events that happened, snapshots of the world. Manipulating immutable values by means of pure functions is a deterministic process, meaning that for the same input you always get the same output. But once you introduce time into the picture, you have streams of events, concurrency, and race conditions.
Non-determinism basically means that you can't rely on the ordering of events. And without an order, all bets are off.
One thing we can do is control ordering for the things we care
about. As an example, this is how synchronization works on the JVM: with
synchronized blocks, volatile reads/writes, compare-and-set, etc.,
all the runtime does is introduce memory barriers that enforce
ordering between certain events, creating what are known as
happens-before relationships. Based on these we can then come up with
algorithms that process things in parallel but synchronize at
the end, aggregating the workers' results in a way that appears
referentially transparent (e.g. Task.map2(fork, fork) would return
the same T). This is the essence of IO.
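To make that concrete, here is a minimal Java sketch of a map2-style combinator (a hypothetical stand-in for Task.map2, not its actual implementation): it forks two computations onto a thread pool, and each Future.get() establishes a happens-before edge between the worker's writes and the joining thread's reads. Because the results are aggregated in a fixed order, the combined value is the same on every run, no matter which worker finishes first.

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.function.BiFunction;

public class Map2 {
    // Hypothetical map2: fork both computations, then join them in a
    // fixed order, so the aggregation looks referentially transparent.
    static <A, B, C> C map2(ExecutorService pool,
                            Callable<A> fa, Callable<B> fb,
                            BiFunction<A, B, C> f) throws Exception {
        Future<A> a = pool.submit(fa); // fork
        Future<B> b = pool.submit(fb); // fork
        // Future.get() gives us a happens-before edge with each worker,
        // so reading their results here is safe and ordered.
        return f.apply(a.get(), b.get());
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        int sum = map2(pool, () -> 1 + 2, () -> 3 + 4, Integer::sum);
        System.out.println(sum); // prints 10, on every run
        pool.shutdown();
    }
}
```

The workers race, but the program's observable result doesn't depend on who wins, which is exactly the property we want from a pure-looking parallel combinator.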
Another thing we can do is treat time as a stream of events. We
can observe the passage of time, we can even record it for a finite
period and refer back to important events. And we can take snapshots
of the world, which are values we may want to understand, so we
can continuously evolve state machines, specified by pure functions,
triggered as the rivers of information flow. So you evolve
your state machines and then you react to certain states. Sometimes
you care about determinism and ordering and sometimes you don't, and
some tools make that an explicit choice. For example, there's
a difference between streams of events being concatenated (i.e.
flattened/joined) versus merged. This is reactive programming, and
Observable is one interpretation of it, being basically a
producer-consumer relationship. And Task can be viewed as a stream
that emits a single event, which can make things fun.
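The concat-versus-merge distinction can be sketched without any reactive library (this models the idea, not Observable's actual API): concatenation fixes the order between streams, while merging lets concurrent producers interleave, so only the set of emitted events is guaranteed, not their ordering.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;

public class MergeVsConcat {
    public static void main(String[] args) throws Exception {
        List<Integer> a = List.of(1, 2, 3);
        List<Integer> b = List.of(4, 5, 6);

        // concat: all of a, then all of b -- the ordering is fully
        // determined by the inputs, so the result is deterministic.
        List<Integer> concatenated = new ArrayList<>(a);
        concatenated.addAll(b);
        System.out.println(concatenated);

        // merge: two concurrent producers push into a shared queue; the
        // interleaving depends on scheduling and varies between runs,
        // so only the elements are guaranteed, not their order.
        ConcurrentLinkedQueue<Integer> merged = new ConcurrentLinkedQueue<>();
        Thread t1 = new Thread(() -> a.forEach(merged::add));
        Thread t2 = new Thread(() -> b.forEach(merged::add));
        t1.start(); t2.start();
        t1.join(); t2.join();
        // Sort before printing so we only assert the deterministic part.
        List<Integer> elements = new ArrayList<>(merged);
        Collections.sort(elements);
        System.out.println(elements);
    }
}
```

A tool that makes you pick concat or merge explicitly is making the determinism trade-off visible in the types and operators, rather than leaving it to scheduling luck.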
Does that make sense? :-)