This seems like egregious hype mongering. Two examples:
a) "Pure functions are easily parallelizable since they encourage immutable data structures which reduce the side-effects that make code hard to run on multiple processors. This is how Bitcoin will reach its infinite scalability."
Pretty abrupt transition there from a sentence that had me nodding along to one that gives me whiplash. Wait, what? Pure functions are used in lots of places outside Bitcoin, and none of them boasts infinite scalability. So perhaps you need more than pure functions?
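And there's nothing Bitcoin-specific about the trick. Mapping a pure function across inputs in parallel really is this easy (a toy Python sketch, function name mine), precisely because no call touches shared mutable state:

```python
from concurrent.futures import ThreadPoolExecutor

def square(x):
    # Pure: result depends only on the argument, mutates nothing shared.
    return x * x

# Safe to fan out across workers with zero coordination.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square, range(8)))
# results == [0, 1, 4, 9, 16, 25, 36, 49]
```

That buys you easy parallelism across however many cores you have, which is a long way from "infinite scalability."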
b) "@TensorFlow uses the functional programming paradigm of lazy evaluation. A tensor flow graph exists separately to the computation of that graph.
"Bitcoin can and will be used to create true artificial intelligence."
I don't think I need to say anything about the last sentence. As for the rest, the author needs to make up their mind whether the paradigm is functional programming or lazy evaluation. Lazy eval is just a mechanism. TensorFlow uses it in the part of the system that is stateless; other parts of ML are incredibly (and incredibly subtly) stateful.
My homebrew blog server has a part with lazy evaluation. It's a few lines of code. It's not going to give me infinite scalability anytime soon.
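To make that concrete, here's lazy evaluation in miniature (a toy Python sketch, not my actual server code): you build a description of a computation now and only pay for it when someone asks.

```python
def lazy(f, *args):
    # Wrap a computation in a thunk; memoize so it runs at most once.
    cache = []
    def thunk():
        if not cache:
            cache.append(f(*args))
        return cache[0]
    return thunk

# Nothing is rendered at this point, only described:
page = lazy(lambda title: f"<h1>{title}</h1>", "Hello")

# The work happens here, on demand:
html = page()
# html == "<h1>Hello</h1>"
```

Useful, cheap, and entirely orthogonal to scalability claims.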
And since many ML systems can't explain their models, there's a state dependency from any result they produce back to all the training data they've ever seen.