Monday, June 20, 2016

Jason Smith — Stock flow accounting with calculus

…We're back to the case where the initial stock was zero. Essentially a change in stock over a time scale (t) is equivalent to a flow, and everything I said about scales and metrics and free parameters in this post follows. I do not understand the resistance to the idea that calculus can handle accounting. There are no definitions of stocks, flows, time intervals or accounting rules that are logically consistent that cannot be represented where a stock is an integral of a flow over a time scale. Attempts to do so just introduce logical inconsistencies (like stocks being equal to flows above).
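A minimal numerical sketch of that claim, with made-up numbers (a constant flow of 100 units per period and zero initial stock), showing the stock change over four periods as the integral of the flow:

```python
import numpy as np

# Hypothetical setup: zero initial stock, constant flow of 100 units/period.
# Stock(T) - Stock(0) = integral of the flow from 0 to T.
t = np.linspace(0.0, 4.0, 401)        # four periods, dt = 0.01
flow = np.full_like(t, 100.0)         # flow rate at each instant

# trapezoid-rule integral of the flow over [0, 4]
stock_change = np.sum(0.5 * (flow[1:] + flow[:-1]) * np.diff(t))
print(stock_change)  # ~400.0: four periods at 100 units/period
```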

Well, maybe it takes a while to come up with valid ones, and that time can span a good portion of a finite human lifetime, so the funerals are a correlation but not a causation....

Keen seems to be studying differentials of "debt!" and is trying to use them as predictive analysis to predict when "debt!" becomes unsustainable... I guess based on the total system stock of "debt!" correlating with "crashes!" or defaults or something.... seems to be based on Minsky's "stability creates instability!" thing so I don't give it much hope...

Doesn't seem to be based on typical system 'source/sink' dynamics; seems like he is just treating the system as one big 'sink'... no 'source'.... he seems to be an evo/atheist/Darwin person so not surprising.... both Minsky and Keen appear to me as Peter Schiffs with some intellectual polish on them... "it's gonna crash!... some day!..."

Nobody told Jason that he could not use integration, rather that it was a bad idea. Instead, what I pointed out was that there are no transactions that occur as continuous flows; they occur only at discrete points in time. That's real world behaviour, not a theoretical assumption.

The only way of simulating such behaviour is with a Dirac delta "function", and that is not really a function. If you cared about doing the mathematics properly, you would need to show that your time series live within a well-defined space. Otherwise you end up with hand-waving, not proper mathematics. (Every entity is either a set, a member of a set, or a rule about operations on sets.)
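For what it's worth, the discrete view needs no deltas at all to write down; a sketch with a hypothetical transaction list, where the balance only ever changes at transaction instants, by the full amount:

```python
# Hypothetical data: (time, amount) pairs for two discrete transactions.
transactions = [(1.0, 100.0), (2.5, -30.0)]

def balance(t, txns, initial=0.0):
    """Stock at time t = initial stock + sum of all transactions up to t."""
    return initial + sum(amount for (when, amount) in txns if when <= t)

print(balance(0.9, transactions))  # 0.0   - nothing has happened yet
print(balance(1.0, transactions))  # 100.0 - the full $100 arrives at once
print(balance(3.0, transactions))  # 70.0
```

The balance is a step function: there is no instant at which only $50 of the $100 has moved.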

He just refuses to accept that people use discrete time models, for very good reasons. He is not telling us anything about SFC models, rather revealing his own unwillingness to understand SFC modelling methodology.

But Brian, I don't see any Derivative Action in SFC modeling; am I missing something there?

I'm not saying Integral Action is worthless, but it is different from Derivative Action and should not be used to make predictive statements or to project some sort of outcome in our efforts to control/regulate... which I think these SFC people are doing, using it (Integral) to form opinions on future outcomes...

"To show that there were difficulties in reasoning about speed at the time, Zeno produced a large number of paradoxes, of which we shall mention one to illustrate his point that there are obvious difficulties in thinking about motion. “Listen,” he says, “to the following argument: Achilles runs 10 times as fast as a tortoise, nevertheless he can never catch the tortoise. For, suppose that they start in a race where the tortoise is 100 meters ahead of Achilles; then when Achilles has run the 100 meters to the place where the tortoise was, the tortoise has proceeded 10 meters, having run one-tenth as fast. Now, Achilles has to run another 10 meters to catch up with the tortoise, but on arriving at the end of that run, he finds that the tortoise is still 1 meter ahead of him; running another meter, he finds the tortoise 10 centimeters ahead, and so on, ad infinitum. Therefore, at any moment the tortoise is always ahead of Achilles and Achilles can never catch up with the tortoise.” What is wrong with that? It is that a finite amount of time can be divided into an infinite number of pieces, just as a length of line can be divided into an infinite number of pieces by dividing repeatedly by two. And so, although there are an infinite number of steps (in the argument) to the point at which Achilles reaches the tortoise, it doesn’t mean that there is an infinite amount of time. We can see from this example that there are indeed some subtleties in reasoning about speed.

In order to get to the subtleties in a clearer fashion, we remind you of a joke which you surely must have heard. At the point where the lady in the car is caught by a cop, the cop comes up to her and says, “Lady, you were going 60 miles an hour!” She says, “That’s impossible, sir, I was travelling for only seven minutes. It is ridiculous—how can I go 60 miles an hour when I wasn’t going an hour?” How would you answer her if you were the cop?"

http://www.feynmanlectures.caltech.edu/I_08.html
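Zeno's step argument can be checked numerically. With the quote's numbers (Achilles at 10 m/s, the tortoise at 1 m/s, a 100 m head start), the infinite number of steps sums to a finite 100/9 seconds:

```python
# Zeno's infinitely many "steps" sum to a finite time (a geometric series).
achilles_speed, tortoise_speed, head_start = 10.0, 1.0, 100.0

total = 0.0
gap = head_start
for _ in range(60):                        # 60 of Zeno's steps is plenty
    step_time = gap / achilles_speed       # time to reach where the tortoise was
    total += step_time
    gap = tortoise_speed * step_time       # tortoise's progress during that step

print(total)                                           # ~11.111... seconds
print(head_start / (achilles_speed - tortoise_speed))  # closed form: 100/9
```

Each step takes one-tenth the time of the previous one, so the series converges instead of piling up forever.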

I think we would do well to seek as small a deltaT as possible, or maybe a few different deltaTs (day/week/month/quarter/etc... like Mike is doing in his analysis...) but still be aware of the proper application of mathematical approaches (PID) and their associated limits as we collapse the deltaT....
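To make the P/I/D distinction concrete, here is a minimal sketch of one discrete PID update (all gains and numbers are hypothetical): the Integral term accumulates error over deltaT, while the Derivative term differences it.

```python
def pid_step(error, prev_error, integral, dt, kp=1.0, ki=0.5, kd=0.1):
    """One discrete PID update. The integral term is a running sum of
    error * dt (a stock of error); the derivative term is the change
    in error over dt (a flow)."""
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, integral

# one step with made-up values: error went from 1.0 to 2.0 over dt = 0.5
out, acc = pid_step(error=2.0, prev_error=1.0, integral=0.0, dt=0.5)
print(out, acc)  # 2.7 1.0  (P: 1*2.0, I: 0.5*1.0, D: 0.1*2.0)
```

The Integral term only summarizes the past; it is the Derivative term that reacts to where the error is heading, which is why the two should not be conflated when making forward-looking claims.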

Not sure I follow. Jason wants to use continuous time models, which convert flows to stocks by integrating them. That is, the change in the stock in a period is equal to the integral of the flow changes.

SFC models are generally discrete time (monthly, quarterly, whatever). The changes in stocks are the sum of the flows over the accounting period.

You could do an SFC model with continuous time, and use integrals. It's just a bad idea, as it makes things harder to work with. And it is unrealistic, since there are no continuous flows; when I transfer money to someone in the banking system, it jumps instantly to them at the point of transaction (from a legal perspective). If I transfer someone $100, there is no point where I have handed over $50; it's all or nothing. We cannot properly simulate this with continuous time, as we need to bring in not very well defined Dirac delta functions to simulate this.
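A toy version of that all-or-nothing jump, with hypothetical accounts: the transfer debits one balance and credits the other in a single step, the books balance at every instant, and there is never a half-transferred state.

```python
# Hypothetical two-account ledger; discrete-time, all-or-nothing transfers.
balances = {"alice": 500.0, "bob": 0.0}

def transfer(frm, to, amount):
    balances[frm] -= amount   # the full amount leaves in one step...
    balances[to] += amount    # ...and arrives in the same step

total_before = sum(balances.values())
transfer("alice", "bob", 100.0)
assert sum(balances.values()) == total_before  # stocks are conserved
print(balances)  # {'alice': 400.0, 'bob': 100.0}
```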

None of this has anything to do with how you actually use SFC models, or things like integral action in a PID control.

Brian maybe Jason is just using a non-zero deltaT... iow he is not letting his deltaT converge to zero... he's keeping it as small as possible but non-zero... and then doing Integral analysis based on measured data at those VERY SMALL time intervals and summing those measurements up...

Like in our analysis here of daily US Treasury spending our deltaT is one day...
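That is just a Riemann sum: summing flow × deltaT over small intervals approximates the integral, and the approximation tightens as deltaT shrinks. A sketch with a made-up flow whose exact integral over one period is 2:

```python
import numpy as np

# Left Riemann sum of a hypothetical flow rate f(t) = 3t^2 + 1 over [0, 1];
# the exact integral (the true change in stock) is 2.
def summed_flow(dt):
    t = np.arange(0.0, 1.0, dt)     # measurement times, spacing dt
    flow = 3.0 * t ** 2 + 1.0       # measured flow at each time
    return np.sum(flow * dt)        # sum of flow * deltaT

exact = 2.0
for dt in (1 / 4, 1 / 365, 1 / 86400):    # quarterly, daily, per-second
    print(dt, abs(summed_flow(dt) - exact))   # error shrinks with dt
```

With a daily deltaT the error here is already under one percent, which is Matt's point about seeking the smallest practical interval.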

Here:

"Instantaneous velocity is very different from ordinary velocity, which, to calculate, requires an interval of time. "Instantaneous velocity," like any limit, is defined at a specific value of time t. It is purely logical; it can never be observed or measured. To measure a velocity, it is necessary to know both a distance Δs and a time Δt, however small."

http://www.themathpage.com/acalc/instantaneous-velocity.htm

So we can't measure instantaneous anything... yet we still use calculus all the time...

It says above "however small"... so that implies it is small but still non-zero...
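That limit is easy to watch numerically: difference quotients over ever-smaller (but non-zero) Δt approach the instantaneous velocity, which itself is never directly measured. A sketch using free fall as the made-up example:

```python
# Average velocity over a shrinking (but non-zero) interval approaches
# the instantaneous velocity at t0, here ds/dt = 9.8 * t0 = 29.4 m/s.
def position(t):
    return 4.9 * t ** 2   # free fall: s = (1/2) g t^2 with g = 9.8

t0 = 3.0
for dt in (1.0, 0.1, 0.001, 1e-6):
    avg_velocity = (position(t0 + dt) - position(t0)) / dt
    print(dt, avg_velocity)   # tends toward 29.4 as dt shrinks
```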

Here:

"the fundamental idea of calculus is to study change by studying "instantaneous" change, by which we mean changes over tiny intervals of time."

Our universe is discrete in nature, but that does not mean we cannot measure things through continuous integration.

Sometimes we use integration because we lack the discrete data and/or the knowledge to express the information through more complex algebraic structures and multidimensional matrices, applying the appropriate operations over them.

In the case of financial matters (which I refuse to conflate with economics, which is the domain of the REAL, even if it's outside the scope of what most 'economists' deal with) most datasets are readily available as discrete data points due to the nature of transactions.

But it may be the case that you can aggregate data over intervals of time and apply calculus tools to try to find regularities and trends which you won't be able to find by looking at 'photographs' of the data.

Calculus isn't anything more than operations over algebraic structures at a larger, macro level; it's a level of abstraction over the discrete nature of mathematical objects. And in nature it's pretty much the same, except for certain 'abstractions' like time and distance, which are continuous in nature, while matter and particles themselves are "discrete", at least at the most macro level, without falling down into the quantum mess.

It's curious how the discrete and the continuous blend in nature, but also in our social abstractions. What is important though is finding the appropriate tool for the job, and certain tools can help discover certain attributes or regularities, making us think about a problem in a different way.

Matt, I imagine it's a matter of parsimony (probably why Brian above says "it's a bad idea"). The thing with finance is that you usually have all the necessary data, with all the data points, which you can then build upon; why spend the time, sometimes futilely hitting your head against a dataset, trying to abstract/extract a function from it and having to rely on other "imprecise" tools to approach the data (probably the point about Dirac delta functions)?

For working with large datasets you can use large matrices and manipulate them easily (especially with the computational power we have nowadays). IDK if this is what is done with SFC, but it certainly can be done with any large set of financial data.
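As a sketch of that kind of matrix-style manipulation (synthetic data, not any particular SFC tool): a million discrete transactions aggregated into per-account balances in one vectorized pass.

```python
import numpy as np

# Synthetic transaction log: one account index and one signed amount per txn.
rng = np.random.default_rng(0)
accounts = rng.integers(0, 1000, size=1_000_000)   # which account each txn hits
amounts = rng.normal(0.0, 50.0, size=1_000_000)    # signed transaction amounts

# Scatter-add every transaction into its account's balance in one pass.
balances = np.zeros(1000)
np.add.at(balances, accounts, amounts)

# The aggregate books still balance: total of stocks == total of flows.
assert np.isclose(balances.sum(), amounts.sum())
print(balances[:3])
```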

We don't do this with natural phenomena because we have neither the data nor the power. Imagine 'accounting' for the 'transactions' of G-proteins over the membrane of a single neuron over the period of a year, let alone a whole section of the brain.

When we engage in such work we find ourselves with a mindblowing amount of information overload, or a lack of methods to collect the datasets, etc. Take the CERN collider, where more than 30 petabytes of data points are collected and processed in a year... You can sometimes have too much data, or too little...

And that's why we usually go the other way around, from the discrete to the continuous, as a form of abstraction, losing accuracy in the process, abstracting at a larger, macro level for parsimony reasons, when dealing with some complex phenomena.

But in finance, if we've got the data, all pre-processed and accessible, why complicate things by building continuous models which require indirection and abstraction through equations that may not end up telling us what is really happening under the hood (as we are 'making them up', because we have to make them up from observing the dataset)? In general it is probably a good idea to avoid complicating things while losing resolution.

BUT, in some cases it may make sense, because the mere use of certain tools allows you to observe things in a different way and approach problems which may be unapproachable otherwise (your point). Especially when we are losing resolution through the aggregation of that discrete data. What's the difference then between using integration or not? One method allows us to use some analytical tools which we otherwise don't have.

So I think having an open mind is useful. But no one is saying it would be easy to do it 'correctly', although SFC models also include equations we have to derive from observation, so they are not that different, essentially speaking.