Hello everyone. There isn't a hashtag of my name, but as you may

00:24

have noticed, I love open source software. I believe in continuous reinvestment, as the previous speaker mentioned. I'm also the husband of a

00:34

very patient wife, and she's actually here, so thank you, everyone, for being kind to her. And we have

00:48

birds, and they like to pair program. One of

00:52

them matches my color scheme; we did, in fact, pick the bird based on my colors. Here we have a

00:59

new feathery friend, and unfortunately, when I tried to pair with him, it turns out he can one-shot headphones and power adapters. He chews everything, but he is lots of fun, and he actually helped us navigate to

01:14

Portland. It turns out I accidentally washed my passport and can't fly, but since I'm on this coast, driving up from San Jose was actually quite fun. So, an announcement:

01:26

I now work at Yahoo. They believe heavily in open source, and a large part of the teams there run some pretty massive Ember applications, so it's pretty exciting for me. At first I was kind of

01:44

apprehensive. My first e-mail address, 17 years ago, was a Yahoo one, and I was pretty apprehensive, but being there, everyone has this spark: they're smart, motivated, and excited. It's really cool to see. So what am I doing at

02:02

Yahoo? Fixing Ember misalignments. And, I think my wife appreciates this, I also have a little bit more time to spend with her, and the work is well defined. So: Yahoo is hiring,

02:19

and if anyone wants to come join and have some fun: we are riding the Canary wave on as many applications as we can, so talk to me or one of the other Yahoo folks. OK, one more thing: I love discussion,

02:34

so feel free to ask me questions, but we went a little bit over earlier, so if I cut you off, that just means find me afterwards. Also, a warning: browser internals are layered, and while I understand a few things and can say a few things, this kind of knowledge is volatile. It is very useful, but I might be wrong. So, this talk:

03:00

we'll talk about some performance myths, mechanical sympathy, and then Ember's misalignments.

03:06

So, we all know that JavaScript performance is all about how you enumerate arrays, and in that case I'm curious about all the different ways of enumerating arrays. But it turns out

03:19

somebody fixed all the problems; the runtime folks fixed all the problems; we're done, thank you!

03:32

So although we were neck-and-neck with React, I think we can do a little bit better. Up until somewhat recently, this is what JavaScript performance looked like to me, and I think to a lot of you as

03:43

well, until I started getting involved with promises. There were lots of complaints about how slow they were, but they provide this fantastic mechanism by which we can abstract away complexity. Promises

03:58

were considered really slow. There's actually a blog post that enumerated these problems, and then all of a sudden there was another blog post

04:08

recommending switching to promises, and basically it was because of this library called Bluebird, a promise

04:13

implementation. Bluebird's author had come along and, basically without changing the external API, made promises at least an order of magnitude faster. So at this point,

04:27

having done work on RSVP myself, I was like: how on earth has he accomplished this? I expected, when I opened the Bluebird box, to find 12 weird tricks to speed up JavaScript performance. When I opened the box, after

04:41

some investigation... OK, what I say next might be shocking. What if I told you

04:46

promises using, get this, normal for loops were actually fast?

04:54

I'd like to hear that in talk after talk; it's fantastic.

04:57

So basically the TL;DR of why Bluebird was faster is: it did less work. Want something to go faster? Make it do less work. The question, often, is: what is the work? In the Bluebird case it was two things: allocate fewer objects, and align with the underlying primitives, that is, figure out what the JavaScript runtimes are doing and try not to fight them but to work with them. And now promises are fast.
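That TL;DR is easy to show in miniature. A hypothetical sketch (these functions are invented for illustration, not taken from Bluebird): two versions of the same computation, where the second is faster simply because it does less work, allocating nothing per call.

```javascript
// Allocates a throwaway intermediate array (from map) on every call.
function sumOfSquaresAllocating(values) {
  return values.map(v => v * v).reduce((a, b) => a + b, 0);
}

// Same result, plain loop, no intermediate allocations.
function sumOfSquares(values) {
  let total = 0;
  for (let i = 0; i < values.length; i++) {
    total += values[i] * values[i];
  }
  return total;
}

console.log(sumOfSquares([1, 2, 3])); // 14
```

Bluebird's actual wins were promise-specific, but the principle is the same: the hot path allocates as little as possible.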

05:25

But the most important part of this was done without an external API change. We

05:31

care about making things faster without breaking external APIs, or at least minimizing external API breakages. That is very appealing: stability

05:41

without stagnation. So, the keynote demo was pretty

05:44

awesome, and it made me a little bit nervous, since I'm giving a performance talk right after it,

05:50

but I had early access to it, so I pulled up the profiler, and I saw these things and thought: this is something we can do better.

06:06

So even with this

06:09

idempotent re-render branch, the ceiling, I believe, is still pretty high. We have a lot of fun work left to do, and we can get way more performance than we have

06:17

even beyond that branch. How much faster can we go? Who knows. So, there's another part to this,

06:26

and that is developer speed: we actually want people to develop faster. We don't want to deal with esoteric ways of making things faster; we also want to be productive. It turns out that the same ways we confuse ourselves are the same ways we confuse the runtime, and often realizing what the runtime is doing under the hood can actually encourage good programming patterns. So it turns out that by reducing confusion we can actually make Ember even faster than it is today. So: who here has played

06:54

Pokémon? Who here was able to clone rare candies? For those that don't know what that is: there was a glitch that allowed you to clone items in your inventory. There's this item called a Rare Candy that raises the level of your Pokémon, and you could use this glitch to get many rare candies and level up your Pokémon really quickly. Unfortunately, if you allowed your Pokémon to evolve too early while leveling them up, you'd miss out on some important skills. As we work on Ember, we have to be sure that we're not jumping the gun and not throwing away things that are actually good in favor of something that looks a little bit better but might actually lead us down the wrong path. So we have some important choices to

07:45

make, but it is often not clear which choices to make. So as Ember evolves, how can we make choices that are both productive for developers and don't screw us when it comes to performance?

08:01

Often, I feel this is captured by cargo-culting. Who here doesn't know what cargo-culting is?

08:09

Well, anyway: South Pacific islanders around World War II recognized that there was a correlation between the arrival of cargo and things that looked like runways, planes, and people working. They weren't quite sure what was actually happening, so they tried to emulate the setup, but they couldn't understand why sometimes cargo would arrive and sometimes it wouldn't.

08:30

Unfortunately, a lot of the time in JavaScript performance we make the same kinds of choices.

08:35

We often hear "X is slow"; we fixate on the X, and we fixate on not using it. But as we know,

08:46

simply saying "yes, X is slow" is

08:51

the wrong thing to say. The more important question is: when is X slow? And not just that: when is X used? Relative magnitude matters. Sometimes a very slow thing will actually enable fewer allocations, or less work being done overall. So the question is: when is X reasonable, and when is it not? But to make this choice we have to understand what X is. Missing that connection, we

09:20

end up trying something and then we end up with

09:27

this miserable performance calculus: two patches, each a 5% improvement, that together are a 10% regression. We have done this countless times in Ember. I'm at fault for a lot of them, and I'm sure everyone who has committed to Ember has contributed to this

09:41

problem, and we need to fix this. So:

09:45

mechanical sympathy. This is a cool term that I picked up from Martin Thompson of LMAX, a high-performance trading platform. They achieved high performance not by scaling out, but by understanding the underlying hardware, or in our case, the underlying runtime. He stole this idea from a dude called Jackie Stewart, who was a three-time F1 champion. Basically,

10:06

the best drivers had an understanding of how the machine worked, so that they could work in harmony with it. Rather than working against the runtimes, we can work with them and probably get a little bit more oomph out of them.

10:18

By correctly aligning the car with the driver, and the driver with the car, you're able to utilize the car to its full capacity. In terms of F

10:26

1: cars, V8 engines, same problem. Yes, they're on V6s now, but

10:32

whatever. And don't worry: although I'm talking a lot about V8, these common paradigms and patterns really hold true across most of the runtimes. So: the world's simplest

10:45

example, a plus b. I stole this example from Petka; sorry, but it's a great example. What V8 does is look at that source code and emit some assembly. This assembly is fast, but it could be faster: it is extraordinarily generic, because we could be adding two integers together, or two numbers, or two strings; we don't know. The important part here is that when V8 generates this code, it also adds hooks to record the type information for later specialization. Then, as the code gets hot,

11:19

the type

11:28

information is consulted to see: hey, can we specialize further? So basically, if a function of a and b is used often, and a and b are always 32-bit integers, we can specialize.
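As a sketch of what that feedback sees (the specialization itself happens inside the VM, not in our code; `add` here is just the a-plus-b example from the slide):

```javascript
function add(a, b) {
  return a + b;
}

// Monomorphic usage: this call site only ever sees small integers, so the
// VM's recorded type feedback can justify specializing add() into a fast
// integer add.
let total = 0;
for (let i = 0; i < 1000; i++) {
  total = add(total, 1);
}

// Polymorphic usage: the same function now also sees a string operand,
// which pushes the VM back toward the generic (slower) path for this case.
const label = add("total: ", total);

console.log(total, label); // 1000 "total: 1000"
```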

11:41

How does this happen? In V8 there's an optimizing compiler called Crankshaft that keeps track of the types as control flows through the unoptimized code, and then emits native assembly for the particular machine. How does this actually

11:55

work? Here's another diagram: source comes in, and we compile something that's quick to compile and has these hooks. As stuff gets hot, we consult the optimizing compiler, which gives us new code that is more efficient.

12:09

Ultimately, after the optimizing compiler is done, we end up with, ideally, something like this, which is many times faster but maybe more brittle. So, to review: the AST comes in,

12:22

we first generate something that is safe but not specialized; then, as it's used, we can generate more optimized code for that case.

12:31

Mad science. So how does this work?

12:34

Basically: type stability. And can that mad-science stuff do more than just

12:41

arithmetic? It turns out there's this concept

12:45

of shapes in the runtime, and they allow the same types of optimizations to be applied to the rest of our code. So if we look at this

12:56

ES6, ES2015, class: the class is the shape, in this case two slots, the firstName slot and the lastName slot. Here we have a Rect (not React),

13:07

which also has two slots. But these two shapes are entirely different, just like an integer is different from a string.
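In code, the two shapes being described look roughly like this (hypothetical `Person` and `Rect` classes with two slots each):

```javascript
// Each class defines a shape: the set of slots and their order.
class Person {
  constructor(firstName, lastName) {
    this.firstName = firstName; // slot 1
    this.lastName = lastName;   // slot 2
  }
}

// Rect also has exactly two slots, but its shape is entirely different
// from Person's, just as an integer differs from a string.
class Rect {
  constructor(width, height) {
    this.width = width;   // slot 1
    this.height = height; // slot 2
  }
}

const p = new Person("Ada", "Lovelace");
const r = new Rect(2, 3);
```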

13:16

So when we go and instantiate a few of these Rects, we have one shape shared by multiple instances.

13:23

Great. So remember this thing where, if we knew the type information of a and b, maybe we could specialize? Well, if we have an addHeights

13:31

method that takes a and b, Rect a and Rect b, and adds them together, it's awfully similar. So, as the dynamic

13:42

portions are used and the function gets hot, type information is recorded, and if we're always dealing with the same shapes, and those shapes always have the same number of values, then rather than having to do a very expensive property lookup, we literally know the offset of the integer.
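A sketch of that in practice, with a hypothetical `addHeights` that stays monomorphic because every call passes the same shape:

```javascript
class Rect {
  constructor(width, height) {
    this.width = width;
    this.height = height;
  }
}

// If this function only ever sees Rects constructed the same way, the
// recorded type feedback holds a single shape, and the optimizer can turn
// the generic property lookups into loads at fixed offsets.
function addHeights(a, b) {
  return a.height + b.height;
}

console.log(addHeights(new Rect(1, 2), new Rect(3, 4))); // 6
```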

14:01

And we end up with, hopefully, something as close as possible to this ideal.

14:07

So what this boils down to is: when the runtime believes the code is stable and predictable, it's able to make things fast. And, as it

14:17

turns out, well-factored code that humans can understand and read often has these characteristics. There's another aspect to

14:27

performance, and that is the time-versus-space trade-off. The previous stuff was time-based: can we make something execute faster? The second one is space. Space is space, but space also costs time: allocations take time, and cleanup by the garbage collector, in a garbage-collected language, takes a huge amount of time. In Ember apps this is a big place where we're causing lots of problems. The things that cost us

14:57

GC space are closures, objects, and unoptimized code. It turns out unoptimized code allocates more objects, even compiled code allocates extra objects, and if we have too many different shapes, the functions have to be specialized too many times, and we end up with compiled code actually causing more garbage-collector problems than our app data, which is just bonkers.
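One everyday source of that garbage is per-call closures. A small invented illustration (hypothetical functions, not Ember code):

```javascript
// A fresh closure object is allocated for the arrow function on every call.
function countEvensAllocating(numbers) {
  return numbers.filter(n => n % 2 === 0).length;
}

// Hoisting the predicate out means the hot path allocates nothing.
const isEven = n => n % 2 === 0;

function countEvens(numbers) {
  let count = 0;
  for (let i = 0; i < numbers.length; i++) {
    if (isEven(numbers[i])) count++;
  }
  return count;
}

console.log(countEvens([1, 2, 3, 4])); // 2
```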

15:24

So all we need to do in Ember is: do less work, allocate less, and align with the underlying primitives. I have an expectation that reasonable code should be reasonably fast. In some places there's a bug in the runtime that we can get fixed; in other cases we are just doing unreasonable things, and we need to fix them. So: I have three misalignments in Ember that we're going to cover today. The first one is that Ember just does way too much work. The solution is to do less work. What work,

15:56

exactly? Well, we have a great example of this in the earlier talk: with idempotent re-rendering we mitigate a whole category of extra work. This is fantastic. The other thing is that, as we move to

16:08

Ember 2.0, we are suggesting people use actions up and bindings down. It turns out that explicit data flow is also less work: we don't have all these two-way bindings everywhere, potentially wasteful, potentially not even used. Or, even worse: often in applications I see that we're relying on data sloshing between two properties that are bound, and it just happens and no one even notices; we're burning time that we don't even want or need. And another part of this doing-too-much-work problem: singleton controllers make it even worse. They don't really have an explicit lifecycle, so when we transition routes, we're causing change events that we don't care about. When the user goes to, let's say, a settings route, and I'm tearing down information from the users controller, it doesn't know that we don't care about it anymore; it actually propagates the same change events. This is exceptionally costly. As we move to routable components and explicit state, we will just be able to delete the component because it's dead, and that will cost us much less than allowing it to live on. So this basically means that as we

17:14

make Ember clearer, our apps get faster. And this is absolutely the case. So the

17:20

second misalignment is init and _super. init and _super, I believe, are exceptionally hard to learn. Even I, and I like to think I know a few things about Ember, am often surprised. Who here has tried to subclass ObjectProxy? It turns out you have to do things in a very strange order, and it's because we're misaligned with how super should actually work. This causes us to allocate extra shapes, which keeps the runtime from specializing correctly; it results in more deoptimized code that we're not going to use, and just a big mess. And it's not quite aligned with what the new specification's classes and super actually do. So the solution here is to embrace super. Who's written code

18:06

like this before? We have firstName and lastName defined on the object. Who has written code like this before? Everyone in this room, everyone who has written an Ember app, has written code this way. And it turns out that the shapes

18:23

here come back to bite us; they hurt us here,

18:27

because what's actually happening is this: we create a new Person, we `new` a Person, and we have the Person proto, which represents the class; on the left we have the Person shape; and then two instances, which are of the same shape.

18:42

But when we go to set fullName, it turns out that firstName and lastName were actually set on the slots of the parent. When we set them ourselves, we're actually creating a whole new shape, and sometimes the runtime is smart enough to catch this, but as we have more and more properties, we run the risk of creating multiple different shapes. That means code that should all have been dealing with the same Person shape may have to deal with different combinations and permutations of the Person shape, which is not cool. Also, it is not clear how to define these properties on a prototype; especially, if someone defines an array on a prototype, all instances share it.
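Both pitfalls fit in a few lines of plain JavaScript (a hypothetical `Person`, not Ember's actual object model):

```javascript
// Pitfall 1: a mutable default on the prototype is shared by every instance.
function Person() {}
Person.prototype.tags = []; // one array for ALL instances

const alice = new Person();
const bob = new Person();
alice.tags.push("admin");
console.log(bob.tags.length); // 1, because bob sees alice's mutation

// Pitfall 2: assigning instance properties in different orders creates
// different shapes, so code handling "a Person" may face several shape
// permutations instead of one.
const p1 = new Person();
p1.firstName = "A";
p1.lastName = "B";

const p2 = new Person();
p2.lastName = "B"; // different insertion order, different shape
p2.firstName = "A";
```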

19:22

So the solution to this, I believe, is to change init, and to change super, so that we actually pass the attributes along and call super, and at the root, super actually sets the properties on the instance. What we do right now is set the properties before init is even called, so there's no way for you to know whether a property is a default or a property that was passed to create. And another cool thing about this:

19:58

right now there is actually no simple set of rules that defines how to use super. With this change there are two rules: we call super if we are overriding a framework method, and we call it before we touch `this` in that method. I'm thinking that, moving forward, that is all you will need to know about how to use super. So not only is it easier to explain, teach, and understand; it's also more aligned with the underlying runtime.
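A rough sketch of that contract using plain ES2015 classes. This is an illustration of the proposed rules, not Ember's implementation; `EmberishObject` and its `init` are invented for the example.

```javascript
class EmberishObject {
  constructor(props) {
    this.init(props);
  }
  // Root init: this is where super ultimately sets the passed-in
  // properties onto the instance, so a subclass's init can tell defaults
  // apart from values handed to create().
  init(props) {
    Object.assign(this, props);
  }
}

class Person extends EmberishObject {
  init(props) {
    // Rule 1: we are overriding a framework method, so we call super.
    // Rule 2: we call it before touching `this`.
    super.init(props);
    if (this.fullName === undefined) {
      this.fullName = `${this.firstName} ${this.lastName}`;
    }
  }
}

const p = new Person({ firstName: "Ada", lastName: "Lovelace" });
console.log(p.fullName); // "Ada Lovelace"
```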

20:24

Thoughts? If anyone has a counterexample, please, please show me; I'm excited. Misalignment number

20:38

3. This one is rough, but I believe we have a solution. Basically, Ember.Object.reopen is super powerful: it allows Ember to lazily augment objects as you pull in a new feature; if you add in the router, we can add features. It is also a great way for us to add deprecations, and to pull existing parts of Ember out into add-ons but mix them back in, so that we let people opt back into old functionality. But it turns out it is kind of a problem: it requires complex internals, and it is the root cause of massive amounts of our allocations and shapes. The solution: basically, I would like to propose that we limit reopen to only work until the object has its first instantiation.
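As a toy sketch of what such a restriction could look like (entirely hypothetical, nothing like Ember's internals): a class factory whose `reopen` refuses to run once an instance exists.

```javascript
function makeClass() {
  let instantiated = false;

  function Klass(props) {
    instantiated = true; // from here on, the shape is considered final
    Object.assign(this, props);
  }

  Klass.reopen = function (mixin) {
    if (instantiated) {
      throw new Error("reopen() is not allowed after first instantiation");
    }
    Object.assign(Klass.prototype, mixin);
    return Klass;
  };

  return Klass;
}

const User = makeClass();
User.reopen({ greet() { return `hi ${this.name}`; } }); // fine: no instances yet

const u = new User({ name: "sam" });
console.log(u.greet()); // "hi sam"

// Any later reopen would throw, which is exactly the proposed limitation.
```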

21:32

Remember that chart I showed from the keynote branch? Basically, the most expensive things that are left are related to this lazy reopen code. So: who here doesn't know what

21:47

Meta is in Ember? OK, for those hands: who has seen it but doesn't know what it is? There we go; that sounds a little better. So,

21:57

although Meta and the listeners code are very intertwined and need work, Meta is a good thing, and I

22:02

think we can fix it. This is Meta. It's the

22:07

heart of everything that powers Ember: how the listeners, watchers, events, CPs, bindings, and chains all work. Everything is made of these

22:14

things. In a normal application, every class has a meta, every instance has a meta, and the metas inherit the same way that the class hierarchy inherits. It turns out that most of these metas are fine, but the metas for the instances are what kill us. The flexibility that we need to support reopen after the first instantiation basically means that we are going against the grain of the underlying runtimes.

22:45

Because metas are inherited live, and can change at any time, basically every Object.create gives us a new shape. If we create 4,000 records in Ember Data, we will basically have 4,000 different metas, meaning that the code that touches meta, which is all the performance-sensitive code, can't possibly be optimized. It turns out that if we can get rid of reopen-after-first-instantiation, we can actually have every meta be of the same shape, and we can then potentially benefit from actual runtime optimizations in this code.
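A sketch of the single-shape idea, with a hypothetical `Meta` far simpler than Ember's real one: initialize every slot up front, in the same order, so all instance metas look identical to the runtime.

```javascript
class Meta {
  constructor(parent) {
    // All slots are assigned here, always in the same order, so every Meta
    // instance has the same shape regardless of how it is used later.
    this.parent = parent;   // class-level meta to inherit from
    this.watching = null;
    this.listeners = null;
    this.cache = null;
  }
}

// 4,000 "records", but only one meta shape for the runtime to specialize on.
const classMeta = new Meta(null);
const metas = [];
for (let i = 0; i < 4000; i++) {
  metas.push(new Meta(classMeta));
}
console.log(metas.length); // 4000
```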

23:19

And one trick that allows us to do this: no one inherits from the meta on an instance. No one makes a subclass of a user instance; they might make a subclass of the User class. This little detail will allow us to shed even more of the performance problems that we

23:32

saw. And if we thought that was bad: it turns out the listeners code, which is heavily entangled with meta, is a little bit crazy. Who here has run into this code before and looked at it like, "I do not want to deal with this"? As it turns out, by applying the same rules, we can totally make meta and listeners substantially faster. So how

23:56

does this actually work? Each meta, which already has some problems, has another object called listeners, which has all the problems that meta has, and more. Here is kind of a graph of an Ember object. As you can see, the metas and the listeners in green are pretty fine, because we have only a few classes in a system, maybe hundreds of them, but we can have thousands and thousands of instances. So if we can optimize those last ones, the last piece of the profile goes away. So:

24:26

we will work with V8 to handle this better; we will, hopefully, maybe, maybe not, but hopefully, kill reopen after instantiation; and we will optimize metas for instantiation and listeners for subscription. I have an early spike of this. I can't demo it because it is, um, not quite ready, but the early numbers are substantially faster than what we have today. I'm quite hopeful about this, and if

24:55

anyone wants to help: Yahoo is hiring, so come see me, or come see any of the other Ember folks here; we'd love to work with you. Also, when doing

25:05

optimizations: it turns out premature optimization is not necessarily great. Only do what actually matters; only optimize the code that will actually buy us performance. But it is fun. Another

25:23

thing that I have learned is that benchmarks can trick you. Not only can they be tricky; they will almost certainly trick you. The only way to have reasonable certainty about genuine improvements is to use all the tooling available to you, and to have someone else check your work. And remember: actions up and bindings down. That's it, thank you. Any questions?

Content metadata

One overarching theme in the Ember.js philosophy is to put developers on a path to success; primarily, this is done by making that path the one of lowest resistance. On several fronts this pattern has been quite successful, but on one important front we fall short: performance. This is the result of framework and runtime misalignment. This talk describes how we, as a community, can solve this problem in our apps and in the framework.