Hey, I'm the author of this library. It's definitely inspired by mori (and clojure and Haskell) and the reason I ended up building something different was to present a more JavaScript friendly API (and academically, to learn about HAMT). I've built this over the last couple weeks, and we are not using it internally yet - but I wanted to ensure development of it happened in public.

A question: how influential has swannodette's (David Nolen's) work on ClojureScript/Om in relation to React been on the direction in which you (yourself and Facebook) are taking the evolution of React and its supporting libraries?

From what I've seen, I'm guessing: quite a bit, but I'm not sure that I've noticed it being mentioned. In particular, I'm guessing that the performance characteristics of Om might be especially inspiring.

Edit: Oh, and good work on the library! It's great to see efforts to make the case for immutability within the world of JS.

I contribute to React. I'm definitely a big fan of ClojureScript and I borrow ideas from it when I can. I currently have a few crazy ideas that are direct results of looking into the Clojure (or Haskell) ecosystem and finding that they've solved my problems, but better. In general, some of the problems we face, at language/library/ecosystem level, might simply not exist in these languages. It's a shame they're not more popular.

Regarding popularity, I feel that sometimes the web development community can be ironically dogmatic, considering that it's a free and open platform (see: HN/Reddit's initial reaction to React). Looking back, I still find it incredible that React actually gained so much traction when it defies so many of the pre-established "best practices". I think lots of people, including me, are learning more and more to take a step back before dismissing an idea.

So if you were one of those that dismissed Clojure/ClojureScript because "parentheses in front of my function call?", "snake case isn't for me", or "it's too different from idiomatic js" (I hope this one isn't too much of a strawman here), then I urge you to give it another try. Learning React doesn't have to be the only time you open your mind to new ideas and reconsider best practices.

Personally, a good amount! I know a lot of the React team are also big fans of Om. React has always been inspired by functional programming, and I think in another universe we would have preferred to write it in Scala or Haskell or Clojure. With React (and this Immutable lib) we've been trying to incorporate ideas from functional programming into JavaScript products while still feeling familiar and "native to JS".

I wrote an immutable vector library once, but it turned out impractical to do game physics with because it was too slow. Switching to self-modifying vectors fixed the problem. I think the performance hit of allocating new JS objects might be too high for games.

ClojureScript (and thus Mori) has been continuously tuned for 3 different major JS engines - JavaScriptCore, V8, and SpiderMonkey. Running some basic benchmarks overall Mori comes out ahead on most operations. But there's nothing surprising here, immutable-js is brand spanking new :). Also there are micro-optimizations (which add up) in ClojureScript that are impossible to expose to JS users w/o a compiler.

In any case this is great stuff and I'm excited to see how JavaScript and, more specifically, React devs leverage these to explore the ideas currently being played around with in ClojureScript/Om/Elm/etc.

The second is a less interesting pure JS implementation that I show as faster than mori and slower than hamt on the hashtrie-benchmark. I've been meaning to test them on a wider variety of benchmarks (more aggregate operations and vs transients) but I only ran across them last week and I've been busy.

Thanks for the links. Sadly, doing mori benchmarks like this doesn't say as much about the performance of ClojureScript data structures as it does about mori specifically - in many cases you are likely just benchmarking multi-arity function dispatch overhead, because you don't have the ClojureScript compiler doing direct dispatch for you. If you want accurate benchmarking of ClojureScript data structures, you will need to write the benchmark in ClojureScript for the time being and call it from JavaScript.

Also benchmarking in Node.js doesn't reflect the wide variance you find in JS engines. In several cases in the past we avoided V8 specific optimizations because it punished other JS engines.

UPDATE: I also ran some benchmarks and I can't replicate the perf degradation as the size of the hash map increases. My suspicion is that the random key generation may result in many hash collisions with Murmur3, which would explain the dramatic perf degradation - one that is unlikely to occur in the wild.

With the js-hashtrie-benchmark, I wanted to get a rough idea of the relative performance characteristics of the various libraries. Compiling the Mori benchmarks in ClojureScript may produce technically more accurate results, while also giving Mori a sizable compiler optimization advantage over the other libraries, but I was mainly interested in how the various implementations performed when called from JS.

I'm not a benchmarking or ClojureScript expert, so please let me know if you find any problems with my benchmarks or can make any improvements to make them more accurate. Hashing is one area that I need to investigate more in my libraries and in the benchmarks.

Getting more than one datapoint would be nice too. None of my code is optimized specifically for Node, so seeing benchmarks on other engines would be interesting.

The feature set is comparable, sans the yet-to-come SortedSet and SortedMap (i.e. red-black trees). Performance should also be comparable, if not slightly faster than mori in some iteration cases, as it's tuned for JS instead of ClojureScript and I've forgone some of the functional purity for speed. But this is anecdotal and I don't have any perf tests yet.

I'm curious what you are hinting at here. Especially when discussing the data structures themselves, I've not seen much magic that an optimizing compiler can really achieve. That is, compilers are usually geared more toward speeding up the code, not so much the data.

does facebook intend to use this internally? The code is (C) Facebook, so Facebook is at least interested enough to pay for the development, right? or is this just some sort of R&D that won't have near term impact at facebook?

Mori also provides Clojure-like functions for the new collections, greatly simplifying how you would use them - if you don't use Clojure, you might think of the additional functions as an Underscore for immutable collections.

I don't see a reason to use any other immutable js lib when Mori exists.

That would be true for any properly annotated JS fed to Google Closure Compiler, right?

I definitely need to learn more about it, as dead code elimination for JavaScript seems very, very hard due to its dynamic nature and I'm curious how they do it. Does anyone know about other JS compilers which try to perform dead code elimination?

Let's say you have a regular JS array. There is a reference to it in memory and it has a certain value. Now you change an element in that array. You've just mutated it. In JS, the reference in memory doesn't change, but the value of what it's pointing to has. So in order to know whether the value has changed, you need to do a comparison on every element of that array, which is expensive.

Now let's say you have an immutable array (such as from the Immutable FB library or Mori). If you change an element in that array, you get a new array and a new array reference in memory. In other words, if you check that the reference in memory is the same, you are guaranteed that the value is the same (in this case, the array). So checking for equality is fast and cheap because you are only checking the reference, not each value in the array.
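Here's a minimal sketch of the two situations. The `set` helper below is hypothetical and just copies the array - real persistent structures share structure instead of copying - but the reference-equality behavior is the same:

```javascript
// Mutable: the reference stays the same after a change,
// so a reference check can't detect the mutation.
const mutable = [1, 2, 3];
const before = mutable;
mutable[0] = 99;
console.log(before === mutable); // true, even though the value changed

// Immutable-style update: every change produces a new reference,
// so `===` alone tells you whether anything changed.
function set(arr, i, value) {
  const next = arr.slice(); // full copy here; real libraries share structure
  next[i] = value;
  return next;
}
const v1 = [1, 2, 3];
const v2 = set(v1, 0, 99);
console.log(v1 === v2); // false: something changed
console.log(v1[0]);     // 1: the original is untouched
```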

One way this is important in React is that now if your component state (or entire app state) is represented by an immutable data structure, you can just check for reference equality in the shouldComponentUpdate() method when deciding to re-render that component. If the reference is equal, you are guaranteed that the data backing that component hasn't changed and you can return false, telling React not to re-render.
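The check itself can be sketched in plain JS (no React needed; the `items` prop name is hypothetical). This is the comparison a shouldComponentUpdate() implementation would perform when props are immutable:

```javascript
// With immutable data, "did anything change?" reduces to one
// reference comparison - the heart of the shouldComponentUpdate idea.
function shouldComponentUpdate(prevProps, nextProps) {
  return nextProps.items !== prevProps.items;
}

const items = Object.freeze([{ text: 'buy milk' }]);

// Same reference: nothing can have changed, skip the re-render.
console.log(shouldComponentUpdate({ items }, { items })); // false

// New reference: the data changed, so re-render.
const newItems = items.concat([{ text: 'walk the dog' }]);
console.log(shouldComponentUpdate({ items }, { items: newItems })); // true
```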

What's surprising is that people have made these data structures also very memory efficient. You can see David Nolen talking about how they work, with memory comparisons, here: https://www.youtube.com/watch?v=mS264h8KGwk

Note that you can have cheap equality detection by using e.g. flags or revision counters and so on (but there are limits due to JavaScript's nature..). The problem is that JavaScript currently doesn't support this properly out of the box, but Object.observe is coming in the next standard.
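A rough sketch of the revision-counter alternative mentioned above (this is just an illustration, not how any particular library does it):

```javascript
// Wrap an object and bump a counter on every write; comparing
// counters is then a cheap "did it change?" check. But unlike
// immutability, nothing stops code with direct access from
// mutating the data behind the counter's back.
function tracked(initial) {
  let rev = 0;
  const data = Object.assign({}, initial);
  return {
    get rev() { return rev; },
    get(key) { return data[key]; },
    set(key, value) { data[key] = value; rev += 1; },
  };
}

const t = tracked({ a: 1 });
const seen = t.rev;
t.set('a', 2);
console.log(t.rev !== seen); // true: a change happened since `seen`
```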

I personally like the safety benefits of immutability. You can give objects to functions and not worry about the functions changing the object. The only way to guarantee that otherwise is to clone the object, but that takes quite a bit of performance.

Persistent data structures allow for the appearance of cloning (you have a new thing) but are much more efficient behind the scenes (you don't actually copy everything). So if you find yourself doing a lot of defensive cloning, then Mori or this library would probably be a lot faster for you.
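A toy illustration of that sharing (hedged: real HAMT-based libraries like this one are far more sophisticated - this `assoc` helper is hypothetical and only shares one level):

```javascript
// "Modify" an object by copying just the top level; branches you
// didn't touch are shared by reference, not cloned.
function assoc(obj, key, value) {
  const next = Object.assign({}, obj);
  next[key] = value;
  return Object.freeze(next);
}

const state1 = Object.freeze({
  user: Object.freeze({ name: 'Ada' }),
  settings: Object.freeze({ theme: 'dark' }),
});
const state2 = assoc(state1, 'settings', Object.freeze({ theme: 'light' }));

console.log(state1.user === state2.user);         // true: shared, no copy
console.log(state1.settings === state2.settings); // false: the changed branch
console.log(state1.settings.theme);               // 'dark': the old snapshot survives
```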

In multi-threaded environments, immutable objects are thread-safe.
When multiple threads are sharing data, you would normally have to worry about synchronization. However, immutable data structures guarantee that the shared data will not change, so synchronization issues simply go away.

That said, I'm not sure how this applies to JS, which is single-threaded.

I'm not familiar with React, but why would that not be the case without this lib? {} !== {}

EDIT: Maybe I understand. You're saying a parent component might do something like: foo.bar = 'blah'; and then doing a simple equality check foo === foo is not enough, you have to check that the properties have not changed either. Is that correct?

Because a data structure is useless if you can't change it. You do want to 'change' the data. Immutable structures let you take a data structure and 'modify' it, getting another structure. Anything with a pointer to the first one has a guarantee that it won't ever change.

With C-style language structures immutable structures wouldn't be much use. With Clojure, for example, immutability is one of a few mutually complementary features of the language in which it makes perfect sense.

Not good enough. When the time comes that you do need to add something, then what? Clone it? In languages that allow you to freeze a collection, adding something to a frozen collection is often a O(n) operation. Most of the collections in this library provide operations that grow collections in (almost) constant time.
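The contrast can be sketched with a frozen array versus a persistent cons list (both helpers here are hypothetical; the library's actual vectors use tries to get near-constant-time growth with array-like indexing):

```javascript
// Frozen collection: "adding" means cloning all n elements - O(n).
function frozenPush(frozen, value) {
  return Object.freeze(frozen.concat([value]));
}
const xs = Object.freeze([1, 2, 3]);
const ys = frozenPush(xs, 4); // copies all four slots
console.log(ys.length); // 4

// Persistent linked list: growing is O(1) because the new cell
// simply points at the old list, which is safe to share since
// it can never change.
function cons(head, tail) {
  return Object.freeze({ head, tail });
}
const a = cons(1, null);
const b = cons(2, a); // no copying at all
console.log(b.tail === a); // true: the old list is reused as-is
```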

I have to admit, I find it odd that we name immutable data structures based on what they don't do instead of what they do.

Of course, what they do is efficient persistence. You want a snapshot of the data? You want a non-volatile reference? You've already got one! That's the feature. They should be called persistent data structures.

That is true! In fact, an earlier version of this library was called "persistent-js", and I know this is a pretty subtle distinction - I found it easier to talk to people about when I talked about it via immutability and so I ended up naming the library that way.

React's documentation also discusses their immutability helpers [1]. This new library looks like a better way of implementing this, though. Will the documentation be amended to discuss use of immutable-js?

Can someone give me a good example of where immutable data structures are better than mutable ones. I know that you are more likely to mess up and have side effects with mutable data structures, but so far all I have heard is theory.

The big win is they enable lazy evaluation: you can defer the evaluation of an expression until it's needed, or discard the result and compute it again later, or cache the result of an expression and not need to re-evaluate, a flexibility which can be useful in many situations.

You also get undo/redo stacks basically for free, since you simply maintain references to the previous n immutable states.
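A minimal sketch of that undo idea (names hypothetical): because each state is immutable, "history" is just an array of references - no deep copies needed.

```javascript
const history = [];
let state = Object.freeze({ count: 0 });

function update(next) {
  history.push(state); // cheap: stores a reference, not a copy
  state = Object.freeze(next);
}

function undo() {
  if (history.length > 0) state = history.pop();
}

update({ count: 1 });
update({ count: 2 });
undo();
console.log(state.count); // 1: the previous snapshot, intact
```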

As Om demonstrates, they also make possible significant optimizations of code that takes various view of the same underlying data, like rendering routines.

I actually started by writing it in TS and later moved to just a TS declaration file and raw JS source. This means it can be used smoothly in both environments (similar to using a DefinitelyTyped resource).