Posts

Optimisation is brilliant. It can increase profitability, reduce risk, increase output and even turn a non-viable project or factory into a viable one. It can save time, produce less waste and better utilise inputs so that costs are reduced. But it’s difficult, right? That’s why so few people pay real attention to it, and why so many organisations don’t invest in the process, right? Well, the basic concepts are straightforward, and there are often valuable easy wins to be had from simply looking at the problem in the right way and understanding what the key drivers of production or profitability are.

The Goal

Eliyahu M. Goldratt wrote a book called “The Goal” in the 1980s describing a practical, real-world approach to optimising a production plant. Written in the Socratic method, it guides the reader through one approach to optimisation and explains how and why it works rather than insisting on taking the author’s word for it. The approach isn’t perfect, but it is a brilliant introduction to how to look at problems, and to where typical management accounting falls down as a tool to manage production.

Demonstrating a principle – Making a cup of Milo

Milo is a warm, chocolate and malt drink produced by Nestle in some markets, including South Africa, Australia and some South-east Asian countries. If you don’t have Milo in your market, you can still follow the story considered here by thinking of Ovaltine or even just your favourite brand of hot chocolate.

8. Place the mug into the microwave and set on full power for approximately 2 minutes

9. Wait for two minutes

10. Retrieve the mug of hot milk from the microwave

11. Add several spoons (usually 3 or 4) of Milo powder to the mug and stir

12. Add sugar to taste and stir some more

13. Return the Milo and sugar to their respective cupboards

14. Return the mug to the microwave for a further 10 seconds to give the drink that extra rich and foamy texture

15. Enjoy!

This will take approximately 215 seconds as shown by the table below:

The problem area here is step 9. We wait for a full two minutes while the rest of the plant (our brain, hands, feet and eyes) is shut down with nothing to do.

A simple re-ordering of the steps can save 30 seconds out of this process.

The key here is that step 9 now only has a plant downtime of 100 seconds (compared with 120 before) and step 14 now has no downtime at all (the 5 seconds is required to insert the mug into the microwave).

Principles revisited

The example above is trivial, and I’m not pretending that it is anything else. However, the principle behind the process is important. This is one of the key ideas put forward in the book “The Goal”. In order to optimise an entire, complex process or plant, one needs to identify the bottlenecks and optimise those first. There is little point in shaving a few microseconds off the time taken to put the milk back in the fridge when more than half of the total time spent in production is waiting for the “bottleneck” microwave to finish its job. If one could make the microwave work faster, that would also be a key component of the optimisation process.

The principle is: identify the bottleneck, and optimise it and the processes surrounding it.
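The Milo example can be sketched in a few lines of Python. The step durations and the choice of which steps can overlap with the microwave’s run are illustrative assumptions, not the figures from the original timing table:

```python
# A minimal sketch of bottleneck-driven optimisation using the Milo example.
# Durations (in seconds) are illustrative, not measured.

steps = {
    "heat milk in microwave": 120,  # the bottleneck: the plant is idle while it runs
    "add Milo powder and stir": 20,
    "add sugar and stir": 10,
    "return tins to cupboard": 15,
    "final 10s reheat": 10,
}

# The bottleneck is simply the longest-running step.
bottleneck = max(steps, key=steps.get)

# Naive schedule: everything runs in sequence.
sequential_time = sum(steps.values())

# Optimised schedule: any step that doesn't need the mug can run while the
# microwave (the bottleneck) is busy, so its time comes off the total.
overlappable = {"return tins to cupboard"}
overlapped = sum(steps[s] for s in overlappable)
optimised_time = sequential_time - min(overlapped, steps[bottleneck])

print(bottleneck)       # heat milk in microwave
print(sequential_time)  # 175
print(optimised_time)   # 160
```

The point carries over directly: shaving time off a non-bottleneck step barely moves `sequential_time`, while every second of work overlapped with the bottleneck comes straight off the total.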

Slightly different subject matter today. I’m not a news reporter or a champion of the downtrodden. However, this story outlines some useful concepts of arbitrage, borrowing and lending rates, and how not to build a brand, market your company or do business.

First let me say that I don’t know all the details, and I can’t advise anybody on the basis of what I know, but I can say that I won’t be financing anything from Rudco.

Here are some facts:

Rate unrealistically low

What they’re offering is a fixed-rate home loan at 6% in South Africa. According to the South African Reserve Bank, prime is currently at 13%. ABSA, First National Bank, Standard Bank and Nedbank all have 13% as their prime rate. Home loans to the very best risks are rarely offered at less than prime – 2%. Currently, that lower limit stands at 11% – and this is a flexible rate, so the bank doesn’t have to hedge the risk of changes in interest rates. ABSA has a useful guide. SA HomeLoans have attractive deals, but nowhere near the level Rudco purports to offer.
It doesn’t require an actuarial model to figure out that Rudco is offering something that sounds too good to be true.

Website oddities and strange business practices

For a business that has already begun wooing customers, not even having a working “www.rudco.co.za” website is odd. The site was only registered in February this year. At least it has been paid up. A few other sites with more information exist under various names but seem definitely to be connected with Rudco. Interesting that they are displaying adverts by Google – to raise a little extra cash perhaps? Or maybe to fund the 6% rate offered? If the adverts on my home-loan-offering site were advertising FNB Home Loans (they were when I visited the site) I’d be a bit concerned.

One of the affiliated sites hasn’t paid their domain registration fees. Call me old-fashioned, but I prefer to deal with financial services companies who pay their bills on time, and who deal with other companies who pay their bills on time.

Moneyweb, with a record of calling bad apples bad apples, had some things to say that should be read. This wouldn’t be the first time they have highlighted dubious dealings in the early stages only to be proved right after a lot more crying than was ever necessary.

Arbitrage on this scale is a whopper of a warning bell

Credit to Moneyweb for raising this, but I think it deserves highlighting. The investors into Rudco are prepared to lend money at lower than SA government risk-free bond rates. This means two things:

As a borrower, you could borrow as much money as you could from Rudco and invest it in government bonds to make a profit at about as close to no risk as can be imagined. The South African government would need to default on its local-currency obligations.

The investors placing their money into Rudco could, with lower expenses, less effort, less risk and more liquidity achieve higher returns (and yes, higher risk-adjusted returns too) by purchasing South African bonds.

These scenarios are a “free lunch” or arbitrage opportunity. Nobody makes money by being on the wrong end of an arbitrage opportunity. SA HomeLoans, who have attractive rates depending on particular circumstances, tie their funding directly to the rates available in the market. Almost by definition they are providing no-arbitrage pricing (provided one appropriately values the “service” of lending money retail, repackaging, and effectively borrowing at wholesale rates). The larger banks may be slightly on the receiving end of an arbitrage opportunity (which is partly the reason for the existence of SA HomeLoans in the first place). However, the banks have different cost structures and different distribution approaches, which offsets some of the superficially apparent economic profits.
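To make the borrower-side arbitrage concrete, here is a rough sketch with assumed numbers: the 6% borrowing rate is from the Rudco offer described above, while the 8% government bond yield is purely hypothetical for illustration:

```python
# Sketch of the carry trade implied by borrowing below the risk-free rate.
# The 8% bond yield is an assumed figure; actual SA bond yields will differ.

def carry_profit(amount, borrow_rate, bond_yield, years=1):
    """Profit from borrowing at borrow_rate and investing at bond_yield."""
    cost = amount * (1 + borrow_rate) ** years
    proceeds = amount * (1 + bond_yield) ** years
    return proceeds - cost

# Borrow R1m at the purported 6% and invest at an assumed 8% bond yield.
profit = carry_profit(1_000_000, 0.06, 0.08)
print(round(profit))  # 20000 per million borrowed, in the first year alone
```

The only residual risk in this trade is the South African government defaulting on its local-currency debt, which is why a lender offering such a rate should set off alarm bells.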

The end game

I don’t have all the info, but everything I’ve seen suggests that Rudco are worth staying away from. I’ll wait for the tears to start. Let’s stay away from Tigon and Masterbond and Fidentia and the myriad other schemes that South Africans have been conned into over the last few decades. Let’s apply some critical thinking, some logic and a clear and cold reality-check.
Well done Alec Hogg and Moneyweb for spotting this one and drawing my attention to it.

Zimbabwe’s economic problems have long been related to inflation and the exchange rate. While it is difficult to determine the extent to which one leads the other in this case, it is clear that both the volatility and the dramatic change in prices and exchange rates are the results of a real economic meltdown.

Risk management breaks down

Over the last seven days, the value of the Zimbabwean Dollar (as traded on the black market, rather than at the fixed and mostly irrelevant government-sanctioned rate) has halved. Inflation measures are in the thousands of percent (3,000% to 9,000% depending on who you ask and what statistics you read) and some prices are almost irrelevant as there are no goods to trade. Queues for fuel have been the norm for years now, and it can only be a matter of time before there is no fuel left at all. Bicycles, along with the money-counting machines needed to count the tens of thousands of banknotes used to pay for basic items, are possibly the only products seen more often than five or ten years ago.

What currency risk management tools are usually available?

So, as a business in Zimbabwe, what options are available? Usually, protection against unexpected or adverse changes in currencies can be hedged in large quantities in the OTC forward market. Increasingly, this sort of currency hedging is becoming available to smaller businesses through banks around the world. Although the price reflects the near-retail nature of the transactions, the protection is available to companies and industries particularly sensitive to exchange rate movements (any industry where a large portion of costs are in a different currency from revenues, or more subtly, where substitute products and complementary products are priced in a foreign currency).

But in Zimbabwe, quite understandably, no such protection is available. Anybody foolhardy enough to take a position in the current market would be unlikely to survive for long and thus the contracts issued would be exposed to tremendous credit risk. In effect, we have a failed market and the sophisticated risk management available elsewhere in the world has long fallen by the wayside as commercial transactions are increasingly based on the South African Rand or even simple barter arrangements.

So, if you are lucky enough not to be doing business in Zimbabwe, and these instruments and techniques are available, you should seriously consider using them for a competitive advantage. Better risk management leads to increased focus on operational results, better risk-adjusted returns to shareholders and a better quality of sleep.

Five Basic Steps To Currency Risk Management

The five basic steps in assessing currency exposure are:

Risk identification – identify areas of currency exposure (including the non-obvious ones such as substitute and complementary products as mentioned above). All risks should be entered into a risk register, which should be updated at regular intervals. The risk register should become an integral part of managing the business.

Develop risk measures for each risk identified. These can be qualitative, but quantitative is more useful to make management decisions. A single metric may not be sufficient, as high-probability-low-impact risks should be discernible from low-probability-high-impact risks that could be catastrophic for the business in the rare case when it happens. Which is more critical for your organisation? A one-day 5% increase in the exchange rate? Or a five-year consistent slide in the exchange rate that erodes your competitive advantage?

Gather information regarding the risk management tools available to you. Although expert advice is useful at every step of this process, “you don’t know what you don’t know” and thus independent, expert advice can be critically important here. Your bank may not be the ideal advisor here – if they have an incentive to sell you a product, their advice may not be independent. Even if it is, you may be tempted to second-guess it because you perceive them to be biased towards the risks that they are in a position to help you manage. Is Value At Risk useful to you? Can you understand the number in a management context?

Choose the optimal risk management strategies based on the expected costs, expected benefits, and risk adjusted return on the investment. Usually, considering expected costs and expected benefits results in a “no purchase” decision since under expected conditions, the protection acquired looks very expensive. Buying protection via options is more expensive than locking in current prices with forward contracts, but for some the upside profit potential is worth the price. Here is where you need to do some scenarios, and some hard number-crunching. Good decisions require good analysis, and good analysis requires a thorough understanding of the problems, the business, and the financial economic theory behind it all.

Continually monitor the risks through the risk measures and the risk register. Major changes may require a significant rethink in risk management strategy. Smaller changes may require tweaks and refinements. Part of the process can be to identify trigger levels at which the risk management strategy needs to be revisited. This saves management time by not diverting excessive attention to small changes that won’t have a significant impact on the optimal strategy.
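As a small illustration of step 2 (developing risk measures), here is a rough parametric Value-at-Risk sketch. The exposure and the daily return series are made-up figures, and the normal-distribution assumption behind the 1.645 quantile is a simplification rather than a recommendation:

```python
import statistics

# Rough one-day 95% parametric VaR for a foreign-currency exposure.
# 1.645 is the one-sided 95% quantile of the standard normal distribution.

def currency_var(exposure, daily_returns, z=1.645):
    """Approximate one-day 95% Value at Risk of a currency exposure."""
    mu = statistics.mean(daily_returns)
    sigma = statistics.stdev(daily_returns)
    return exposure * (z * sigma - mu)

# Invented daily exchange-rate returns for illustration only.
returns = [0.001, -0.004, 0.002, -0.001, 0.003,
           -0.002, 0.000, 0.004, -0.003, 0.001]

var_95 = currency_var(5_000_000, returns)
```

Whether a single number like this is useful in a management context is exactly the question raised in step 3: a one-day VaR says nothing about the slow five-year slide that might matter far more to your competitive position.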

The more thoroughly thought-through the process is, the more likely your organisation is to achieve above average risk-adjusted returns on capital.

Most people involved with insurance recognise that more premium is better (ceteris paribus of course). This is usually true (and occasionally not) but while some of the reasons are obvious, there are a variety of more subtle factors to take into account. This post will cover many of these factors, and point out a few cautionary tales around seeking large average premium size above all else.

When is value created?

For a particular product-type, it is usual for larger premiums to be more profitable than smaller premiums. By profitability here I mean the increase in shareholder wealth resulting from having sold that additional policy. The value creation at time of sale arises from:

A customer relationship has been confirmed and cemented through an agreement to do business for a few months or many years. The customer relationship was already in the process of being developed in the period up to the sale (from broad advertising campaigns, brand-development, specific distribution channel contact and the quotation process). However, this is also true for any other industry, so we will restrict the analysis in this post to the “point of sale”.

This customer relationship means that for short-term or annually renewable business, there is a non-zero probability of renewal, and this probability is likely to be higher than the probability of a random individual with no previous contact with the insurance company buying a new product under the same conditions.

The costs of renewing an existing policy are usually lower than those of creating a new policy. (Policyholder and risk details are already captured on the system, the sales process is quicker, legal and regulatory compliance (for example, around identifying customers) is already complete and payment details / credit checks have been performed.)

For life insurance business, a long-term, legal contract has been entered into. Traditionally, these contracts can be cancelled at the option of the policyholder (usually with a fair and sometimes controversially unfair penalty). In spite of the cancellation option, signing a long-term contract provides some evidence that the policyholder has an intention to enter into a long-term agreement with the insurance company.

Since costs are largely fixed per policy, the greater absolute charges on a larger policy are matched against the same fixed costs, yielding a higher margin.

Larger policies are usually more persistent

No question this is subjective, but one only needs to consider the 25% – 50% first year lapse rates on low-income products with small sums assured and small premiums. Large policies are likely to be sold to educated consumers who are less likely to be hoodwinked by smooth-talking commission-driven salespersons.

One can understand logically how this could be true, and the data supports these conclusions as well.

Fairly standard actuarial knowledge this. Higher income means better access to healthcare for current ailments. More importantly, high income now is strongly correlated with high income in the previous years, which implies consistently better access to good healthcare and thus better overall life expectancy. Moreover, higher income is correlated with higher education. Education is correlated with family having money, which is correlated with good healthcare since birth, which is positive for life expectancy. Certain diseases (particularly heart disease) are related to stress and high cholesterol, which are positively correlated with wealth and income and act in the opposite direction.

Lower mortality means both lower claims experience (for non-annuity risk products) and, very marginally, higher persistency, since dead policyholders don’t pay premiums. Since a portion of all premiums is earmarked for the repayment of initial expenses, the more premiums paid the higher the overall margin will be.

And what about the impact of discounted rates?

Absolutely right. Higher premiums often attract discounted rates, including lower asset management fees, higher allocation rates and lower mortality charges. These cost elements shouldn’t be ignored in the analysis, but experience usually shows that the benefits outweigh these costs. Results may vary!

An element that is often forgotten is medical underwriting. Most underwriting manuals have limits below which certain components of the comprehensive underwriting process are omitted because they aren’t cost effective. Thus, for the largest policies, the underwriting costs are often the highest. Analysis of actual experience and the costs involved should provide reasonable estimates of this cost.

One final, even more subtle impact is that of statistical variation. Individual policyholders will die (and we are continuing with the focus on non-annuity risk products here) with a certain probability at each age. Thus, the overall distribution of the number of deaths in a year should follow a binomial distribution, ignoring catastrophes and the slight theoretical correlation between deaths of spouses. Since the number of deaths follows a binomial distribution, we can determine likely variation from the expected number of deaths using basic statistical methods. What this also shows us is that as the number of policyholders increases, so the percentage variation from the mean decreases due to the diversification benefit. I’m not going to go into the detail of this for now – those familiar with insurance should be comfortable so far.

So, ideally we want lots of policies. If we also want to hold the premium constant between two comparable companies (S and B) but where S has small premiums per policy and B has big premiums per policy, then S will have a greater number of policyholders than B and will experience less volatility in financial results through better diversification of risks. You can also think of this like every Rand (or Pound or Dollar or Euro) of benefit for a particular policyholder is perfectly correlated with every other unit of currency for that same policyholder. Either the policyholder lives and all units of currency don’t get paid, or the policyholder dies and every single unit of currency is paid out as the total Sum Assured. Thus, larger premiums make larger benefits make more correlation and less diversification. This slightly unusual way of looking at the problem is what most people are familiar with as concentration risk, except here we are considering concentration within individual policyholders. This increase in risk increases the economic capital required (and often the regulatory capital too) which will likely have a cost to be considered.
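The diversification effect described above can be checked directly. For a binomial death count, the mean is n·q and the standard deviation is sqrt(n·q·(1−q)), so the coefficient of variation (percentage volatility) shrinks as the book grows. The mortality rate q below is an illustrative assumption:

```python
import math

# Coefficient of variation of a binomial death count: sd / mean,
# which simplifies to sqrt((1 - q) / (n * q)).
# q = 0.005 is an illustrative annual mortality rate, not a real table.

def death_count_cv(n_policies, q=0.005):
    mean = n_policies * q
    sd = math.sqrt(n_policies * q * (1 - q))
    return sd / mean

small_book = death_count_cv(1_000)    # roughly 0.45, i.e. ~45% swings
large_book = death_count_cv(100_000)  # roughly 0.045, a tenfold improvement
```

Holding total premium constant, company S with many small policies sits at the `large_book` end of this curve, while company B with few big policies sits nearer `small_book` – which is the concentration-within-policyholders argument in numbers.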

So large premiums matter

Most people involved in life insurance will intuitively feel that larger premiums are “better” or more profitable – here are some of the reasons why. Most of these reasons are familiar to actuaries, and if you give an actuary a little bit of time he or she will likely come up with these and some others as well. However, this article has focussed on premium size in the absence of other factors and incentives. I’ll post soon with an example of how the external environment can distort these natural operational conclusions.

Reading the article made me wonder for a moment whether Google is laying the groundwork for an ever increasingly difficult task. Increasingly difficult to the point of insurmountable, even within a relatively short time, with their current approach. I don’t know enough about what Mr Singhal and his team do, and it’s pretty clear that they are doing an incredible job and have only scraped the surface of an iceberg in terms of telling the general public (you and me!) what they get up to. So I don’t know the details of what they do, but here is an important idea to generate some thought for their business, and as an example for yours.
You start with a rule, a simple rule, a good rule that ranks pages. It does a good job and is intuitive to understand. Mortals (albeit really, really smart mortals) can think through the implications of tweaking a few parameters, the weights in the calculations. Any small change can be managed with ease because the system is simple.

Then you realise that one rule isn’t enough. It’s too crude, too blunt and all that’s needed to fix some major short-comings is to introduce a second rule. Now the system is more complicated. Two, interacting rules must be considered with every change in parameter. It’s still possible to understand how changes will affect the rankings of all the millions and millions of pages ranked, but it is more difficult.

As each successful adjustment and rule-change and rule addition and new algorithm and special case and “French Revolution” and “Apple versus apple” is embedded, the system becomes more complex, less intuitive. And the number of people who can understand and appreciate it dwindles rapidly. Every change requires more thought, more consideration. The risks are greater – the system is already so finely balanced that it is that much easier to tilt it off balance. At some point, maybe the marginal cost of making any change comes so close to the marginal benefit of the change that changes are no longer possible, and the State of Search Stagnates.

Some of the other avenues they seem to be exploring, based on analysing information about individual users’ web habits from things like Gmail or the Google toolbar, may be less susceptible. This is a new source of information, rather than just increasingly complex tweakings and additions to a system. However, the insights obtained from these new data sources must still be integrated into the search results for you and me when we search “technical business advantage” and hope to find something useful.

From the article:

“People still think that Google is the gold standard of search,” Mr. Battelle says. “Their secret sauce is how these guys are doing it all in aggregate. There are 1,000 little tunings they do.”

The lesson for the rest of us

Optimising any business process should not primarily be about making it work for now at all costs. Simple rules that have longevity and can continue to function usefully in a variety of different scenarios, over long periods of time, are important. Quite frankly, a single business process in most businesses is just that: a process important to the efficient functioning of the business, but not the entire be-all and end-all of the business. Search is a highly competitive field, and every last ounce of performance that can be squeezed out of the algorithms is important. For search, maybe the benefits do outweigh the negatives, but for most other business decisions, it is important to optimise and structure and analyse with a long-term, sustainable, maintainable, manageable and transferable set of rules and algorithms.

Inflation and Sean Summers

I remember a story about a lady calling Sean Summers and complaining that inflation was much higher than reported. Apparently (and I heard this from him live on Moneyweb Radio) he took her credit card details (I would probably trust Mr Summers too) and called up her purchases over the previous year. Based on a like-for-like comparison of products, her “actual inflation” was significantly lower than reported inflation.

Why we spend more

Differences in the basket used (which has to be an average by definition) will give different inflation from what an individual experiences. However, there are other factors that influence how much money we spend each day. In an economic expansion, we spend more because we buy more goods and services, not only because the price of goods has increased. As everybody spends more, and conspicuously spends more, the normal frequency and standard of eating out increases, the types of cars purchased move upmarket, more food is bought from Woolies and less from Shoprite.
This means consumers have less money left at the end of the month, but it is not necessarily all attributable to inflation. The inflation an individual experiences may well be higher than reported – and I don’t deny the possibility that the Stats SA numbers are wrong – but until you do a detailed like-for-like analysis of your own standard basket of goods to calculate your own “experienced inflation”, I am not convinced by the community comments below the article on Moneyweb.
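A like-for-like check of your own “experienced inflation” is straightforward to sketch. The basket, quantities and prices below are invented purely to show the calculation:

```python
# Like-for-like "experienced inflation": price the SAME basket of goods at
# last year's and this year's prices. All figures are invented examples.

basket = {
    # item: (annual quantity, price last year, price this year)
    "bread":  (52, 8.50, 9.20),
    "milk":   (104, 6.00, 6.60),
    "petrol": (600, 7.20, 7.90),
}

cost_then = sum(q * p0 for q, p0, p1 in basket.values())
cost_now = sum(q * p1 for q, p0, p1 in basket.values())

experienced_inflation = cost_now / cost_then - 1
```

Holding the quantities fixed is the key step – it strips out the “we simply bought more, and bought fancier” effect that gets confused with price inflation.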

Monetary Policy Committee decision

Also, the interest rate decision will take into account the nature of inflation (food prices, petrol and possibly future electricity increases) rather than only the crude number. External shocks will result in higher inflation regardless of how much we squeeze consumers and business with higher interest rates.
We may or may not see an increase in spite of the breach. Time to wait and see.

I first heard this quote when dealing with performance measurement and remuneration structures for senior management. In that scenario, the danger is that you get exactly what you measure rather than the good behaviours related to or driven by the metrics chosen. The measure starts as Earnings Per Share (EPS) growth, which is generally a good thing. However, once management do the maths, they realise that reducing dividends to zero will boost EPS growth, even if it means pursuing projects with a return lower than shareholders’ cost of capital. The measure becomes a target; the measure ceases to be useful.
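A quick sketch of that distortion: retain all earnings and reinvest them at 5%, well below an assumed 12% cost of capital, and EPS still grows every year. All figures here are illustrative:

```python
# EPS growth can be "manufactured" by retaining earnings and reinvesting
# them, even at returns below the cost of capital. Illustrative figures only.

def eps_after_retention(earnings, shares, retention, reinvest_return, years):
    """EPS after reinvesting a fraction of earnings each year."""
    for _ in range(years):
        earnings += earnings * retention * reinvest_return
    return earnings / shares

# Start: earnings of 100 across 100 shares, i.e. EPS of 1.00.
eps_start = eps_after_retention(100.0, 100, retention=0.0,
                                reinvest_return=0.05, years=0)

# Retain everything for five years at a 5% reinvestment return,
# below an assumed 12% cost of capital: EPS rises to about 1.28
# even though each retained Rand earned less than shareholders require.
eps_later = eps_after_retention(100.0, 100, retention=1.0,
                                reinvest_return=0.05, years=5)
```

The metric reports “growth” while shareholder value is being destroyed – which is exactly why a raw EPS-growth target stops being a useful measure once it is gamed.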
More on that some other time – it is an interesting point itself.

Now on to the magic and mystery, and science and great skill, and analysis and mathematics and theories, and occasional quack and snake-oil salesman – Search Engine Optimisation. Isn’t there an argument to say that, if the aim of search engines with their great yet imperfect algorithms is to reward fresh, relevant and useful content for relevant search terms, then the best long-term strategy would be to continue to write and publish fresh, relevant and useful content? No quick wins, and with less of the alchemy involved SEO companies wouldn’t get as many customers, but why isn’t this the best advice for long-term traffic and search engine ranking? Rather than pursuing loopholes and quirks in any particular (temporary) search system, the measure should match something more fundamental – being a useful website. That approach is unlikely to need changing when Google adopts “nofollow” links or omits duplicate stories or starts recognising your “invisible white-on-white text”.
Ok, before the backlash begins, there are practical lessons that can help with search engines. “Obvious” things like “search engines are not people and therefore will struggle to read text if it is really a picture embedded in a fancy Flash animation”. This is probably a bad example – I expect fresh, useful and relevant content appears less often in glitzy Flash clips.

So isn’t it time to take the difficult medicine, and build a brand and loyalty and readership and customers and repeat business and structure value and goodwill by actually earning it?

Apparently, it was Benjamin Franklin who said “In this world, nothing can be said to be certain, except death and taxes.” Without going into a detailed analysis of whether death is certain, and whether there are tax-haven countries with sufficiently low taxes to stretch the point a little, I have some comments to make on the throw-away use of the word “certain”.

Taxes are not certain. Even if some amount of tax is unavoidable, the actual tax payable is not certain. This is not a massively complex idea, but does require a shift in mindset to consider taxes as something other than merely a cost that must be paid, something that reduces profits and returns to the owners of a business. I’m not even talking about optimising the amount of tax paid through careful tax structuring (which can be a good idea, if it is legal, and if the loophole stays open long enough to be beneficial, and if the extent of structuring makes business and moral sense).

I’m talking about considering the impact that tax has on business strategy, target market selection, business mix choices and competitive advantage.

A current example for me is the taxation of life insurance companies in Lebanon. Corporate tax on profits is 15% in Lebanon. However, for life insurers, the tax authorities have deemed it too difficult to nail down a clear measure of insurer profitability (another point for another blog, but in fairness to the tax authorities, insurers are rather notorious for adjusting actuarial reserves to arrive at the desired financial result …). Thus, insurers are taxed on “assumed profit” which is set to be 5% of revenue (mostly premiums written, which are considered as revenue, and investment income).

Some things to note:

The tax calculation is thus simple, which for most businesses is a good thing.

If a company can make a higher margin than 5% of revenue, then they will benefit from the simplified tax system. If a company’s margins are thin and their net profit is less than 5% of premiums, they will pay a disproportionately large amount of tax.
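The break-even point is easy to verify with a quick sketch (the 15% rate and 5% deemed margin are as described above; the revenue figure and the two margin scenarios are arbitrary examples):

```python
# Lebanese deemed-profit regime for life insurers: tax = 15% of an assumed
# profit equal to 5% of revenue, versus a standard 15% tax on actual profit.

TAX_RATE = 0.15
DEEMED_MARGIN = 0.05

def deemed_tax(revenue):
    """Tax under the deemed-profit regime: effectively 0.75% of revenue."""
    return TAX_RATE * DEEMED_MARGIN * revenue

def standard_tax(profit):
    """Tax under an ordinary 15% corporate tax on actual profit."""
    return TAX_RATE * max(profit, 0.0)

revenue = 10_000_000  # arbitrary example figure

# A fat 10% margin pays less under the deemed regime...
fat_margin_wins = deemed_tax(revenue) < standard_tax(0.10 * revenue)

# ...while a thin 2% margin pays disproportionately more.
thin_margin_loses = deemed_tax(revenue) > standard_tax(0.02 * revenue)
```

The break-even is exactly a 5% net margin: above it the simplified regime is a subsidy, below it a penalty – which is where the strategic distortions discussed next come from.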

The last point is where tax becomes interesting, and this is particularly ironic because in this case tax is more certain than usual (given it depends only on a single factor, revenue, rather than revenue and expenses). I’ll expand in my next posts on two important impacts this has for insurers and the economy as a whole.