
Sunday, December 13, 2015

Comments on: Functional programming in JavaScript


I highly recommend a wonderful set of interactive exercises here!

It starts with a set of exercises on functional programming but later takes an unexpected turn and goes on to demonstrate features of Microsoft's Reactive Extensions library for JavaScript. Nevertheless, I jumped into it, initially curious about functional programming techniques commonly used in JavaScript.

The functional programming portion has at least 25 exercises that show how defining a small set of simple abstractions can lead to short and powerful programs. I worked through defining and using the five primary functions (map, reduce, filter, zip and concatAll).
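
For a flavour of that style, here is a minimal sketch. The video data and the standalone concatAll helper are my own assumptions (the tutorial defines its own versions, including one attached to arrays), but the way of composing the primitives is the idea:

// concatAll is not a JavaScript built-in; this standalone helper (an assumption
// of mine) flattens a two-level array by one level using reduce.
function concatAll(lists) {
  return lists.reduce(function (flattened, inner) {
    return flattened.concat(inner);
  }, []);
}

// Hypothetical data for illustration: two lists of videos with ratings.
var videoLists = [
  [ { id: 1, title: "A", rating: 5.0 }, { id: 2, title: "B", rating: 3.0 } ],
  [ { id: 3, title: "C", rating: 4.5 } ]
];

// The "business logic" in one chained expression: ids of all highly rated videos.
var topRatedIds = concatAll(videoLists)
  .filter(function (video) { return video.rating >= 4.0; })
  .map(function (video) { return video.id; });

console.log(topRatedIds); // prints [1, 3]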

It is a totally convincing exposition of the power of functional programming - truly complex business logic can be described in surprisingly few lines of code through intelligent application of those five primary functions! This was easily the most 'fun' part of the tutorial.

Later on, the Reactive Extensions library is introduced; at that point I did not know how asynchronous coding and functional programming techniques are related. It was striking to learn about the similarities between traversing arrays (using the functional techniques) and processing events (using special objects called 'Observables'). Observables are, conceptually, sequences of events, possibly of infinite length. Since such sequences relate naturally to arrays, we can use the same functional programming primitives to process them. Being observable, though, allows the object and its underlying sequence to change over time: whenever the observable emits something new, listeners can run code to process the event.
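
As a rough sketch of that similarity (assuming the RxJS 4 style of API from this tutorial's era, where operators are methods on Rx.Observable), a stream of clicks can be mapped and filtered just like an array, with subscribe taking the place of a loop:

// A stream of click events treated like a collection: map it, filter it,
// then subscribe instead of iterating.
var clicks = Rx.Observable.fromEvent(document, 'click');

clicks
  .map(function (e) { return { x: e.clientX, y: e.clientY }; })
  .filter(function (point) { return point.x > 100; })
  .subscribe(function (point) {
    console.log('click at', point.x, point.y);
  });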

So the crux of the tutorial is to show how the same kind of programming techniques can help process any kind of data, be it collections or events. It is a unified way of dealing with synchronous data like arrays as well as asynchronous sources like Observables.

I am quite used to multi-threaded programming in C++, but not to writing the asynchronous JavaScript code that event handling commonly involves. So even though I have not faced the difficulties of handling asynchronous actions in code first hand, I can still appreciate that asynchronicity is bound to add considerable complexity to writing efficient and correct code. The tutorial also comes with examples that show how complex the code can get once several asynchronous calls are made in parallel and we define success and failure callbacks on each request. If some activity must happen only once *all* parallel tasks complete, then we have to check and track the status of each request and act accordingly.
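
To make that concrete, here is a minimal sketch of the manual bookkeeping involved; getJSON and its callback signature are hypothetical stand-ins for whatever request helper is actually in use:

// Hypothetical helper assumed here: getJSON(url, onSuccess, onError) issues an
// async request. The counter and flag below are exactly the bookkeeping that
// gets painful as the number of parallel requests grows.
function loadAll(urls, onAllDone, onAnyError) {
  var results = new Array(urls.length);
  var remaining = urls.length;
  var failed = false;

  urls.forEach(function (url, i) {
    getJSON(url,
      function (data) {                 // success callback for this request
        results[i] = data;
        remaining -= 1;
        if (remaining === 0 && !failed) {
          onAllDone(results);           // fire only once *all* requests are in
        }
      },
      function (err) {                  // failure callback for any request
        if (!failed) {
          failed = true;
          onAnyError(err);
        }
      });
  });
}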

I still found the narrative promising - that a unified programming model is very helpful in managing that complexity. Functional programming strikes one as hard; sometimes I wondered why one should bother with the functional style when the equivalent procedural code was so easy and mechanical to write (even if it was 20 times the size). One has to get over a significant learning curve to understand why.

But that, it seems, is how programming has evolved over the years: having pursued procedural code for its lower learning curve, we now realize that some things cannot be made that simple.



Saturday, January 18, 2014

Beginning with Haskell

I just gave myself permission to spend some time learning a new programming language, and here are my notes on the initial effort. I bet anyone reading this has learning some functional programming language somewhere on their radar. I chose Haskell. Judging by the experiences of others who attempted the same thing with vastly more neural firepower than me, maybe that is a brave decision!

So, yes, Haskell is daunting from the outside, but for now I am only checking whether it is really possible to make any progress without being an expert in Category Theory or Type Systems... After all, Haskell is supposed to be very general purpose. Can it give me a leg up by letting a humble "Hello World" come out easily?

Here is one of the first programs I attempted: calculating the n'th Fibonacci number.

With Haskell, here is a first attempt:

-- Naive Fibonacci, written with guards: a direct transcription of the recurrence.
fib n
  | n == 0 = 0
  | n == 1 = 1
  | n > 1  = fib (n - 1) + fib (n - 2)


So terse and clean. I could challenge someone to write a specification of the Fibonacci program in plain English in fewer characters. Even without any introduction to the syntax, we can see through this affair easily, as if someone had just written the pseudo code. Of course, it performs as horribly as a naive C++ program following the same pattern! Calculating the 30th Fibonacci number takes upwards of 5 seconds, and if someone asks for the 60th, this program won't return before Halley's Comet does, and then some. Exponential algorithmic complexity can bite hard, even in Haskell.

So here is the result after a few more hours of reading the fantastic introductory online book "Learn You a Haskell":

-- Fast Fibonacci: carry the pair (fib k, fib (k + 1)) and step it n times.
fib_improved n = fst $ fib_helper n (0, 1)

-- Each step shifts the pair: (a, b) becomes (b, a + b).
fib_helper 0 (a, b) = (a, b)
fib_helper n (a, b) = fib_helper (n - 1) (b, a + b)


Somehow, this is about as short as the earlier version but is super fast. Asking for the 100th Fibonacci number instantly gets me this result: "354224848179261915075"!

The last line of this code tells the whole story: recursion is used to iterate over a "tuple" repeatedly, transforming the tuple in the process. This textual explanation probably does a far worse job (at explaining) than what the single line of Haskell makes obvious. The rest is just the base case and presentation.

The patterns I see so far are twofold:
1. Recursion is cool, but algorithmic considerations are paramount. Haskell, with its principles about immutable data, presents additional twists to the efficient coding game. We need to worry about unnecessary copying (to implement data immutability), and about how not to blow up the stack while using recursion.
2. One technique for making things efficient in Haskell is to pass additional arguments to function calls and "build" the answer through them, instead of first calling the function and then processing the data it returns. That is what the second version of the Fibonacci program does differently from the first.

Encouraging!