As can be seen on the Insolvency Service’s dedicated RPB-monitoring web page – https://www.gov.uk/government/collections/monitoring-activity-reports-of-insolvency-practitioner-authorising-bodies – their efforts to review each RPB’s regulatory activities systematically seemed to grind to a halt a year ago. The Service did report last year that their “future monitoring schedule” would be “determined by risk assessment and desktop monitoring” and they gave the impression that their focus would shift from on-site visits to “themed reviews”. Although their annual report indicates that such reviews have not always been confined to the desktop, their comments are much more generic, with no explanation of how specific RPBs are performing – a step backwards, I think.

Themed review on fees

An example of this opacity is the Service’s account of their themed review “into the activities, and effectiveness, of the regulatory regime in monitoring fees charged by IPs”.

After gathering and reviewing information from the RPBs, the InsS reports: “RPBs responses indicate that they have provided guidance to members on fee matters and that through their regulatory monitoring; fee-related misconduct has been identified and reported for further consideration”.

For this project, the InsS also gathered information from the Complaints Gateway and has reported: “Initial findings indicate that fee related matters are being reported to the IP Complaints Gateway and, where appropriate, being referred to the RPBs”.

Ohhhkay, so that describes the “activities” of the regulatory regime (tell us something we don’t know!), but how exactly does the Service expect to review their effectiveness? The report states that their work is ongoing.

Don’t get me wrong, it’s not that I necessarily want the Service to dig deeper. For example, if the Service’s view is that successful regulation of pre-packs is achieved by scrutinising SIP16 Statements for technical compliance with the minutiae of the disclosure checklist, I dread to think how they envisage tackling any abusive fee-charging. It’s just that, if the Service believes they are really getting under the skin of these issues, I hope they are doing far more behind the scenes… especially as the Service is surely beginning to gather threads on the question of whether the world would be a better place with a single regulator.

So let’s look at the stats…

How frequently are you receiving monitoring visits?

There is a general feeling that every IP will receive a monitoring visit every three years. But is this the reality?

This shows quite a variation, doesn’t it? For two years in a row, significantly less than one third of all IPs were visited in the year. Does this mean the RPBs have been slipping from the Principles for Monitoring’s 3-year norm?

The spiky CAI line in particular demonstrates how an RPB’s visiting cycle can make the number of visits per year fluctuate wildly; nevertheless, the CAI’s routine 3-yearly peaks and troughs suggest that, in general, that RPB is following a 3-yearly schedule. So what picture do we see if we iron out the annual fluctuations?

This looks more reasonable, doesn’t it? As we would expect, most RPBs are visiting not-far-off 100% of their IPs over three years… with the clear exceptions of CAI, which seems to be oddly enthusiastic, and the ICAEW, which seems to be consistently ploughing its own furrow. This may be the result of the ICAEW’s style of monitoring large firms with many IPs, where each year some IPs are the subject of a visit, but this may not mean that all IPs receive a visit in three years. Alternatively, could it mean they are following a risk-based monitoring programme..?

There are benefits to routine, regular and relatively frequent monitoring visits for everyone, almost irrespective of the firm’s risk profile: it reduces the risk that a serious error may be repeated unwittingly (or even deliberately). However, this model isn’t an indicator of Better Regulation (see, for example, the Regulators’ Compliance Code at https://www.gov.uk/government/publications/regulators-compliance-code-for-insolvency-practitioners). With the InsS revisiting their MoU (and presumably also the Principles for Monitoring) with the RPBs, I wonder if we will see a change.

Focussing on the Low-Achievers?

The alternative to the one-visit-every-three-years-irrespective-of-your-risk-profile model is to take a more risk-based approach: to spend one’s monitoring efforts on those that appear to present the highest risk. This makes sense to me. If a firm/IP has proven that they are more than capable of self-regulation – they keep up with legislative changes, keep informed even of the non-legislative twists and turns, and don’t leave it solely to the RPBs to examine whether their systems and processes are working, but take steps quickly to resolve issues on specific cases and across entire portfolios and systems – why should licence fees be spent on 3-yearly RPB monitoring visits, which pick up non-material non-compliances at best? Should not more effort go towards monitoring those who seem consistently and materially to fail to meet required standards or to adapt to new ones?

But perhaps that’s what’s being done already. Are many targeted visits being carried out?

It seems that for several years few targeted visits have been conducted, although perhaps the tide is turning in Scotland and Ireland. The ACCA also performed a number, although now that the IPA team is carrying out monitoring visits on ACCA-licensed IPs, I’m not surprised to see the number drop.

It seems that targeted visits have never really been the ICAEW’s weapon of choice. At first glance, I was a little surprised at this, considering that their monitoring schedule seems less rigidly 3-yearly than the other RPBs’. Aren’t targeted visits a good way to monitor progress outside the routine visit schedule? Evidently, the ICAEW is not using targeted visits to focus effort on low-achievers. Perhaps they are tackling them in another way…

Wielding Different Sticks

I think this demonstrates that the ICAEW isn’t lightening up: they may be carrying out less frequent monitoring visits on some IPs, but their post-visit actions are by no means infrequent. So perhaps this indicates that the ICAEW is focusing its efforts on those seriously missing the mark.

The ICAEW’s preference seems to be to require their IPs to carry out ICRs. Jo’s and my experiences are that the ICAEW often requires those ICRs to be carried out by an external reviewer and that a copy of the reviewer’s report be sent to the ICAEW. They also make more use than the other RPBs of requiring IPs to undertake/confirm that action will be taken. I suspect that these are often required in combination with ICR requests so that the ICAEW can monitor how the IP is measuring up to their commitments.

And in case you’re wondering, external ICRs cost less than an IPA targeted visit (well, the Compliance Alliance’s do, anyway) and I like to think that we hold generally to the same standards, so external ICRs are better for everyone.

In contrast, the IPA appears to prefer referring IPs for disciplinary consideration or for further investigation (the IPA’s constitution means that technically no penalties can arise from monitoring visits unless they are first referred to the IPA’s Investigation Committee). However, the IPA makes comparatively fewer post-visit demands of its IPs. But isn’t that an unfair comparison, because of course the ICAEW carried out more monitoring visits in 2017? What’s the picture per visit?

No better and no worse?

Hmm… I’m not sure this graph helps us much. Inevitably, the negative outcomes from monitoring visits are spiky. We’re not talking about vast numbers of RPB slaps here (that’s why I’ve excluded the smaller RPBs – sorry guys, nothing personal!) and the “All” line (which does include the other RPBs) illustrates a smoother trend overall. But the graph does suggest that ICAEW-licensed IPs are over three times as likely to receive a negative outcome from a monitoring visit as IPA-licensed IPs.

Before you all get worried about your impending or just-gone RPB visit, you should remember that a single monitoring visit can lead to more than one negative outcome. For example, as I mentioned above, the RPB could instruct an ICR or targeted visit as well as requiring the IP to make certain undertakings. One would hope that much more than 25% of all IPs visited last year had a clean outcome!

This doubling-up of outcomes may be behind the disparity between the RPBs: perhaps the ICAEW is using multiple tools to address a single IP’s problems more often than the other two RPBs… although why should this be? Alternatively, perhaps the ICAEW’s record again suggests that the ICAEW is focusing their efforts on the most wayward IPs.

Choose Your Poison

I observed in my last blog (https://tinyurl.com/y8b4cgp7) that the complaints outcomes indicated that the IPA was far more likely to sanction its IPs over complaints than the ICAEW was. I suggested that maybe this was because the IPA licenses more than its fair share of IVA specialists. Nevertheless, I find it interesting that the monitoring outcomes indicate the opposite: that the ICAEW is far more likely to sanction on the back of a visit than the IPA is.

Personally, I prefer a regime that focuses more heavily on monitoring than on complaints. Complaints are too capricious: to a large extent, it is pot luck whether someone (a) spots misconduct and (b) takes the effort to complain. As I mentioned in the previous blog, the subjects of some complaints decisions are technical breaches… and which IP can say hand-on-heart that they’ve never committed similar?

Also by their nature, complaints are historic – sometimes very historic – but it might not matter if an IP has since changed their ways or whether the issue was a one-off: if the complaint is founded, the decision will be made; the IP’s later actions may just help to reduce the penalty.

In my view, the monitoring regime is far more forward-looking and much fairer. Monitors look at fresh material; they consider whether the problem was a one-off incident or systemic and whether the IP has since made changes. The monitoring process also generally doesn’t penalise IPs for past actions: rather, what’s important are the steps an IP takes to rectify issues and to reduce the risks of recurrence. The process enables the RPBs to keep an eye on whether, when and how an IP makes systems- or culture-based changes – interests that are usually absent from the complaints process.

As the following graph shows, the number of appointment-taking IPs has fallen for the third year in a row:

In ICAS’ 2017 monitoring report (https://www.icas.com/regulation/insolvency-monitoring-annual-reports), that RPB puts the decrease down to the number of IPs who have retired, which I suspect is probably the case across the board. And we’re not seeing their number being replaced by new appointment-takers. I can’t say I’m surprised at that either: regulatory burdens and personal risks continue to mushroom, formal insolvency cases (especially those with assets) appear more sparse and the media has nothing good to say about the profession. Why would anyone starting out choose formal insolvency as a career?

Admittedly, it’s not an alarming fall… not yet… but one has to wonder how the Insolvency Service proposes to address this trend, given that one of their regulatory objectives introduced in 2015 was to encourage an independent and competitive profession.

But what is life like for current IPs? Is there no good news?

Another dramatic fall in complaints

Much more striking is the fall in the numbers of complaints referred to the RPBs:

No one – the Insolvency Service, the RPBs or R3 – is shouting about this good news: the number of complaints has halved since 2015, the first full year of the Complaints Gateway’s operation. I would have thought that the InsS could have easily spun it into a story about the success of the Gateway or of their policing of insolvency regulation generally, no? 😉

Where are the rem and pre-pack complaints?

I wonder if the subject matter of the complaints is one reason why the InsS may not be keen to draw attention to complaints trends.

The following analyses the complaints put through the Gateway:

If we were asked what areas of apparent misconduct we thought were at the top of the InsS’s hit-list, I suspect most of us would answer: IP fees and pre-packs. But, as you can see, these two topics have never featured prominently in complaints.

Despite the fees regime becoming ever more complex and delivering more information and rights to creditors to question or challenge fees, you can see that complaints about fees have dropped: there were 19 in 2014 and only one last year. And last year, there were no complaints about pre-packs.

This graph demonstrates what might be behind the drop in complaint numbers: there is a marked decrease in complaints about SIP3 and communication breakdowns. I think that’s certainly good news to shout about.

So in what areas could we perhaps try harder to avoid attracting complaints?

Complaint danger zones?

The following analysis supports the perception that IVAs are attracting fewer complaints than in recent years, although IVAs are still number one. In fact, it demonstrates that all insolvency proceedings are attracting fewer complaints.

However, when looked at as a percentage of complaints received… it would seem that complaints about ADMs and PTDs aren’t dropping quite as quickly as those for other processes. Putting the two analyses together leads me to wonder whether ethics-related complaints involving ADMs now form a disproportionately large category of complaints, particularly in view of the relatively small number of ADMs compared with IVAs and LIQs. Press coverage would also appear to support this area as a growing concern.

Creditors are lodging more complaints

The following graph gives us a little more insight into the origin of complaints:

This shows that creditors are the only category of complainant that has seen an increase in the number of complaints lodged over the past year. Could the profession do more to help creditors understand insolvency processes and especially ethics?

The Insolvency Service has reported for a few years now that the Insolvency Code of Ethics has been under review. As we know, the JIC/RPBs launched a consultation on a draft Code last year – the consultation closure date has almost hit its anniversary! The InsS 2017 review reported that a revised Insolvency Code of Ethics “is expected to be issued later this year”. It seems to me that a fresh and clear revised Code could help us address the number of complaints lodged.

Not every complaint is a complaint

I highlighted last year that the InsS seemed to be sifting out a greater number of complaints as not meeting the criteria for referral to the relevant RPB. This shows how that trend has developed:

Wow! So for the first time, the InsS rejected more complaints than it referred: almost half of all complaints were rejected (48%) and only 41% were referred. Compare this to the first few months of the Gateway’s operation, when only 25% were rejected and 72% were referred. Nevertheless, setting aside the number of rejected complaints, it is good to see that even the trend for the number of complaints received is a nice downward slope. And in case you’re wondering, I suspect that the remaining 11% of complaints received are still being processed by the InsS – a fair old number, but pleasingly a lot less than existed at the end of 2016.

Of course, the Gateway is still relatively young and it is good to read that the InsS is continually refining its sifting processes, as can be seen from the following graph:

This indicates that a large part of the increase in rejected complaints is because more complainants have not responded to the Insolvency Service’s requests for further information.

For 2017, the Insolvency Service added a new category of rejections: complaints that were about the effect of an insolvency procedure. Although there will always be some creditors and debtors who complain about the fairness of insolvency processes, perhaps an unintended benefit of the Complaints Gateway is that the InsS receives first-hand expressions of dissatisfaction about the design of the insolvency process… although let’s hope the InsS considers using such intelligence to amend legislation where sensible, rather than try to force IPs to fudge legislative flaws via Dear IPs and the like.

You might expect that, as the Insolvency Service rejects more complaints, so the percentage of sanctions arising from complaints that make it past the sifting process should increase.

Roughly one complaint out of every five results in a sanction

Well, you’d be right.

The trendline here suggests that a complaint was twice as likely to end up in a sanction in 2017 as it was 10 years ago.

You might be wondering what is going on with ACCA-licensed IPs: how can over half of their complaints result in a sanction compared to an average elsewhere of around 10-20%?!

I agree that the figures are odd. However, it should be remembered that complaints are not always closed in the year that they are opened. And in this respect, the ACCA’s stats appear particularly odd. For example, last year’s InsS report stated that the ACCA had only one 2013 complaint remaining open, but in this year’s report there are apparently thirteen open 2013 complaints against ACCA-licensed IPs! The ACCA went through some enormous changes last year, as their complaints-handling and monitoring functions were taken over by the IPA with effect from 1 January 2017. Could this structural change be behind the unusual stats? Or perhaps the ACCA had been handling some particularly sticky complaints in 2014 and 2015, when their sanctions were low, and those investigations have now come to fruition.

The same effect of sanction clustering could be operating within the other RPBs in view of the spiky lines above. Therefore, perhaps it would be wise to avoid drawing conclusions about apparent inconsistencies between RPBs’ complaints processes based on 2017’s figures alone. However, averaging out the figures over the past three years, we can see that 23% of complaints against IPA-licensed IPs resulted in a sanction, whereas only 5% of complaints against ICAEW-licensed IPs did so. I believe that the IPA licenses more than its fair share of IVA-specialists, so this might account for at least some of the difference.

Increased sanctions are not just a Gateway-sifting effect

But what about my suggestion above: that the increased number of sifted-out complaints has led to a larger proportion of complaints allowed through the Gateway leading to a sanction?

That’s not the whole story:

This shows that the number of complaint-related sanctions per IP has also been on an upward trend: around 1 in 100 IPs received a sanction in 2008, whereas the figure was closer to 1 in 20 in 2017.

What is behind this trend? I really don’t believe that it’s because more IPs now conduct themselves in ways meriting sanctions (or because there are a few IPs who behave badly more often). And as we’ve seen, the number of complaints lodged doesn’t support a theory that more people complain now.

It must be because expectations have been raised, don’t you think? Or perhaps because the increased prescription in rules and SIPs has led to more traps?

Hidden measuring-sticks?

For example, the InsS report describes one IP’s disciplinary order, stating that the IP had breached SIP16 “by failing to provide a statement as to whether the connected party had been made aware of their ability to approach the pre-pack pool and/or had approached the pre-pack pool and whether a viability statement had been requested from the connected party but not provided”. Firstly, SIP16 doesn’t strictly require IPs to state whether connected parties have been made aware of the pool. Secondly, SIP16 states that the SIP16 Statement should include “one of” two listed statements, only one being whether the pool had been approached. Yes, I’ll accept that it seems the IP did not provide information on the existence of a viability statement, although I would have thought that, if a copy of a viability statement were not provided with the SIP16 Statement, then surely the likelihood is that the IP was not provided with one. I appreciate I am splitting hairs here, but if a SIP is not crystal-clear on what is required of IPs, is it any wonder that slip-ups will be made? And if a disciplinary consent order were generated every time an IP omitted to meet every last letter of the SIPs and Rules, then I suspect no IP would be found entirely blameless. OK, yes, there exists a mysterious fanaticism around SIP16 compliance and we would do well to check, check and check again that SIP16 Statements are complete (and hang the cost?). However, I think this demonstrates how standards have changed: 10 years ago, would an IP have been fined £2,500 and have his name in lights for omitting one line from a report (hint: SIP16 began life in 2009)?

In my next blog, I’ll explore the RPB statistics on monitoring visits.

In an unprecedented step, the IPA and the ICAEW have issued largely consistent articles on fees, SIP9 and reporting. I think some of the points are well worth repeating, not only because in the past few months, I’ve seen more IPs get into a fix over fees than anything else, the new rules having simply compounded the complexities, but also because the articles contain some important new messages.

In this post, I explore how you can make your fee proposals bullet-proof.

The effort seems to have originated from a well-received presentation at the autumn’s R3 SPG Forum, given by the ICAEW’s Manager, Alison Morgan (née Timperley), and the IPA’s Senior Monitoring Manager, Shelley Bullman.

As the ICAEW and the IPA monitor c.90% of all appointment-taking IPs, I think this is a fantastic demonstration of how the RPBs can get useful guidance out to us. Of course, such articles do not have the regulatory clout of SIPs or statute (see below). However, I believe it is an essential part of the RPBs’ role to reach out to members in this way in written form. Although roadshow presentations are valuable, they can only reach the ears of a proportion of those in need and the messages soon settle into a foggy memory (if you’re lucky!).

Do the articles represent the RPBs’ views?

The IPA article ends with a disclaimer that “IPA staff responses” cannot fetter the determinations of the IPA’s committees and the ICAEW article is clearly authored by Alison Morgan, rather than being something that can strictly be relied upon as representing the ICAEW’s views (for the sake of simplicity, I have referred throughout to the articles as written by “the monitors”).

That’s a shame, but I know only too well how extraordinarily troublesome it is to push anything through the impenetrable doors of an RPB – that’s why SIPs so often seem to emerge long after the horse has bolted… and, I suspect, why we are still waiting for an insolvency appendix to the new CCAB MLR guidance. However, at a time when the Insolvency Service’s mind is beginning to contemplate again the question of a single regulator, issuing prompt and authoritative guidance serves the RPBs’ purposes, not only ours.

Pre-Administration Costs

Over the past few years, I’ve seen an evolving approach from the RPBs. In the early days, the focus was on the process of getting pre-administration costs approved. The statutory requirement for pre-administration costs to be approved by a resolution separate from the Proposals has taken a while to sink in… and the fact that the two articles repeat this requirement suggests that it is still being overlooked on occasion.

Then the focus turned to the fact that it was not only pre-administration fees that required approval, but also other costs. I still see cases where IPs seek approval of their own costs only, apparently not recognising that, if the Administration estate is going to be paying, say, agents’ or solicitors’ costs incurred pre-administration, these also need to go through the approval process.

What pre-administration work is an allowable expense?

Now, it seems that the monitors’ focus has returned to the IP’s own fees. Their attention seems fixed on the definition of pre-administration costs being (R3.1):

“fees charged, and expenses incurred by the administrator, or another person qualified to act as an insolvency practitioner in relation to the company, before the company entered administration but with a view to it doing so.”

The IPA article states that this “would exclude any insolvency or other advice that may or may not lead directly to the administration appointment” and the ICAEW article states that it “would exclude any general insolvency or other advice”.

I do wonder at the fuzzy edges: if a secured creditor who is hovering over the administration red button asks an IP to speak with a director, doesn’t the IP’s meeting with the director fit the description? Or if an IP seeks the advice of an agent or solicitor about what might happen if an administration were pursued, wouldn’t this advice count? But nevertheless, the monitors do have a point. If a firm were originally instructed to conduct an IBR, this work would not appear to fall into the definition of pre-administration costs. Also, if an IP originally took steps to help a company into liquidation but then the QFCH decided to step in with an Administration, the pre-liquidation costs could not be paid from the Administration estate.

What pre-administration costs detail is often missing?

As mentioned above, the monitors remind us that pre-administration costs require a decision separate from any approval of the Proposals – there is no wriggle-room on this point and deemed consent will not work. The monitors also list other details required by statute that are sometimes missing, of which these are my own bugbears:

R3.35(10): a statement that the payment of any unpaid pre-administration costs as an expense of the Administration is subject to approval under R3.52 and is not part of the Proposals subject to approval under Para 53 of Schedule B1

R3.36(a): details of any agreement about pre-administration fees and/or expenses, including the parties to the agreement and the date of the agreement

R3.36(b): details of the work done

R3.36(c): an explanation of why the work was done before the company entered administration and how it had been intended to further the achievement of an Administration objective

R3.36(d): details of pre-administration costs already paid, as well as of any that we don’t envisage paying from the Administration estate

R3.36(e): the identities of anyone who has made a payment in respect of the pre-administration costs and which type(s) of costs they discharged

R3.36(g): the balance of unpaid costs (per category) – although it will be a statement of the obvious if you have provided the above

Pre-CVL Costs

Another example of an evolving approach relates to the scope of pre-CVL costs allowable for payment from the liquidation estate. Again, over recent years we have seen the RPB monitors get tougher on the fact that the rules (old and new) do not provide that the IP’s costs of advising the company can be charged to the liquidation estate. This has been repeated in the recent articles, but the IPA’s article chips away further still.

A new category of pre-CVL work that is not allowable as an expense?

R6.7 provides that the following may be paid from the company’s assets:

R6.7(1): “Any reasonable and necessary expenses of preparing the statement of affairs under Section 99” and

R6.7(2): “Any reasonable and necessary expenses of the decision procedure or deemed consent procedure to seek a decision from the creditors on the nomination of a liquidator under Rule 6.14”.

Consequently, the IPA article states that:

“Pre-appointment advice and costs for convening a general meeting of the company cannot be drawn from estate funds after the date of appointment, even if you have sought approval for them.”

So how do you protect yourself from tripping up on this?

If you’re seeking a fixed fee for the pre-CVL work, make sure that your paperwork reflects that the fee is to cover only the costs of the R6.7(1) and (2) work listed above. Of course, SIP9 also requires an explanation of why the fixed fee sought is expected to produce a fair and reasonable reflection of the R6.7(1)/(2) work undertaken. Does this mean that you should be setting the quantum lower than you would have done under the 1986 Rules, given that you should now exclude the costs of obtaining the members’ resolutions? Well, personally, I don’t see that the effort expended under the 2016 Rules is any less than it was before, even if you cut out the work in dealing with the members, but you will need to consider (and, at least in exceptional cases, document) how you assess that the quantum reflects the “reasonable and necessary” costs of dealing with the R6.7(1)/(2) work.

Alternatively, if you’re seeking pre-CVL fees on a time costs basis, make sure that you isolate the time spent in carrying out only the R6.7(1)/(2) work and that you don’t seek to bill anything else to the liquidation estate.

Although the articles don’t cover it, I think it’s also worth mentioning that, as liquidator, you need to take care that any other party’s pre-CVL costs you discharge also fall within the R6.7(1)/(2) work.

Proposing a Decision on Office Holders’ Fees

What Rules/SIP9 detail is commonly missing from fee proposals?

The articles list some relatively common shortcomings in fee proposals (whether involving time costs or otherwise):

lack of detail of anticipated work and why the work is necessary

no statement about whether the anticipated work will provide a financial benefit to creditors and, if so, what benefit

no indication of the likely return to creditors (SIP9 requires this “where it is practical to do so” – personally, I cannot see how it would be impractical if you’re providing an SoA/EOS and proposed fees/expenses)

generic listings of tasks to be undertaken that include items irrelevant to the case in question

last-minute delivery of information, resulting in the approving body having insufficient time to make an informed judgment

The IPA article states that “presenting the fee estimate to the meeting is not considered to be giving creditors as a body sufficient time to make a reasoned judgement”. Personally, I would go further and question whether giving the required information to only some of the creditors (i.e. only those attending a meeting) meets the requirement in R18.16(4) to “deliver [it] to the creditors”. At the R3 SPG Forum, one of the monitors also expressed the view that, if fee-related information is being delivered along with the Statement of Affairs at the one business day point for a S100 decision, this is “likely to be insufficient time”.

fee estimates not based on the information available or providing for alternative scenarios or bases

I wonder whether the monitors are referring primarily to the fairly common approaches to investigation work, where an IP might estimate the time costs if nothing of material concern is discovered, alongside those that might arise if an action to be pursued is identified down the line. You might also be tempted to set out different scenarios when dealing with, say, a bankrupt’s property: will a straightforward deal be agreed or will you need to go the whole hog with an order for possession and sale?

Some IPs’ preference for seeking fee approval only once is understandable – it would save the costs of reverting to creditors and potentially of hassling them to extract a decision – but at the SPG Forum the monitors recommended a milestone approach to deal with such uncertainties: a fee estimate to deal with the initial assessment and later an “excess fee” request for anything over and above this once the position is clearer. This approach would often require a sensitive touch, as you would need to be careful how you presented your second request as regards the next steps you proposed to undertake to pursue a contentious recovery and the financial benefit you were hoping to achieve. But it better meets what is envisaged by SIP2 and would help to justify your decision either to pursue or to drop an action.

Alternatively, perhaps the monitors have in mind the fees proposed on the basis of only a Statement of Affairs containing a string of “uncertain”-valued assets. Depending on what other information you provide, it could be questioned whether creditors have sufficient information to make an informed judgment.

no disclosure of anticipated expenses

Under the Rules, this detail must be “deliver[ed] to the creditors” prior to the determination of the fee basis, whether time costs or otherwise, for all but MVLs and VAs… and SIP9 and the SIP3s require it in those other cases as well. It is important to remember also that this relates to all expenses, not simply Category 2 disbursements, and including those to be paid directly from the estate, e.g. to solicitors and agents.

How do the monitors view Rules/SIP9 omissions?

At the R3 SPG Forum, one of the monitors stated that, if the Rules and SIP9 requirements are not strictly complied with, the RPB could ask the IP to revert to creditors with the omitted information in order to make sure that the creditors understood what they were approving and that this would be at the cost of the IP, not the estate. The IPA’s article states that “where a resolution for fees has been passed and insufficient information is provided we would recommend that the correct information is provided to creditors at the next available opportunity and ratification of the fee sought”. Logically, such a recommendation would depend on the materiality of the omission.

When considering the validity of any fee decision, personally I would put more weight on the Rules’ requirements, rather than SIP9 (nothing personal RPBs, but I believe the court would be more concerned with a breach of the Rules). For example, I would have serious concerns about the validity of a fees decision where no details of expenses are provided – minor technical breaches may not be fatal to a fees decision, but surely there comes a point where the breach kills the purported decision.

Fixed and Percentage Fees

How can you address the SIP9 “fair and reasonable” explanation?

It is evident that in some cases the SIP9 (paragraph 10) requirement for a “fair and reasonable” explanation for proposed fixed or % fees is not being met to the monitors’ expectations. The ICAEW article highlights the need to deal with this even for IVAs… which could be difficult, as I suspect that most IPs proposing an IVA would consider that the fee that would get past creditors is both unfair and unreasonable! MVL fixed fees also are usually modest sums in view of the work involved.

The articles don’t elaborate on what kind of explanation would pass the SIP9 test. Where the fee is modest, I would have thought that a simple explanation of the work proposed to be undertaken would demonstrate the reasonableness, but a sentence including words such as “I consider the proposed fee to be a fair and reasonable reflection of the work to be undertaken, because…” might help isolate the explanation from the surrounding gumpf. For IVAs, it might be appropriate to note how the proposed fee compares to the known expectations of what the major/common creditors believe to be fair and reasonable.

What is an acceptable percentage?

Soon after the new fees regime began, the RPB monitors started expressing concern about large percentage fees sought on simple assets, such as cash at bank. Their concerns have now crystallised into something that I think is sensible. Although a fee of 20% of cash at bank may seem alarming in view of the work involved in recovering those funds, very likely the fee is intended to cover other work, perhaps all other work involved in the case from cradle to grave. In addressing the fair and reasonable test, clearly it is necessary to explain what work will be covered by the proposed fee. Of course, if you were to seek 20% of a substantial bank balance simply to cover the work in recovering the cash, you can expect to be challenged!

Equally, it is important to be clear on what the proposed fee does not cover. For example, as mentioned above, the extent of investigation work and potential recoveries may be largely unknown when you seek fee approval. It may be wise to define to which assets a % fee relates and flag up to creditors the potential for other assets to come to light, which may involve other work excluded from the early-day proposed fee. The IPA article repeats the message that a fee cannot be proposed on unknown assets.

Mixed Fee Bases

It seems to me that it can be tricky enough to get the fee decision and billing right on a single fee basis, without complicating things by looking for more than one basis! To my relief, personally I have seen few mixed fee bases being used.

How is mixing time costs with fixed/% viewed?

In particular, I think it is hazardous to seek a fee on time costs plus one other basis. Only where tasks are clearly defined – for example, a % on all work related to book debt collections and time costs on everything else – could I see this working reasonably successfully. The IPA article notes that:

when proposing fees, you need to state clearly to what work each basis relates; and

your time recording system must be “sufficiently robust to ensure the correct time is accurately recorded against the appropriate tasks”.

I would add a third: mistakes are almost inevitable, so I would recommend a review of the time costs incurred before billing – the narrative or staff members involved should help you spot mis-postings.

Of course, there are plenty of other Rules/SIP areas where mistakes are commonly made – for example, the two articles highlight some common issues with progress reports, which are well worth a read. However, few breaches of Rules or SIPs have the potential to be more damaging. Therefore, I welcome the RPB monitors’ efforts in highlighting the pitfalls around fees. Prevention is far better than cure.

The Review reveals another drop in the number of appointment-taking IPs. In fact, the number on 1 January 2017 was the same as on that day in 2009: 1,303.

Is it a surprise that the number of appointment-taking IPs has dropped again? The 2016 insolvency statistics show modest increases in the numbers of CVLs and IVAs compared with 2015 and of course there was a bumper crop of MVLs in early 2016. Why is it that fewer IPs seem to be responsible for more cases?

My hunch is that the complexity of cases in general is decreasing. I also suspect that the additional hurdles put in place as regards fees have encouraged IPs to look at efficiencies, to create slicker processes, and to be more risk-averse and less inclined to go out on a limb, with the result that some cases are despatched more swiftly and require less IP input.

I also suspect the IP number for next January will show another drop. The expense and effort to adapt to the 2016 Rules will make some think again, won’t it?

Does the presence of the regulators breathing down one’s neck erode IPs’ keenness to remain in the profession? How worried should IPs be about the risk of a regulatory sanction?

Regulatory actions on the increase

The RPBs seem to have shown varying degrees of enthusiasm when it comes to taking regulatory action.

To me, this hints at regulatory scrutiny of a different kind. Is it coincidental that the ACCA issued proportionately far more sanctions than any other RPB last year? Could the Insolvency Service’s repeated monitoring visits to the ACCA over 2015 and 2016 have had anything to do with this spike?

What is behind these sanctions? Are they generated by the RPBs’ monitoring visits or by complaints?

Monitoring v complaints sanctions return to normality

Last year, I observed that for the first time RPBs’ investigations into complaints had generated more sanctions than their monitoring visits. Regulatory actions in 2016 returned to a more typical pattern.

Does this reflect a shifting RPB behaviour or is it more a result of the number of complaints received and/or the number of monitoring visits undertaken?

Dramatic fall in complaints

Well, no wonder there were fewer disciplinary actions on the back of complaints: the RPBs received 28% fewer complaints in 2016 than they did in 2015.

Why is this? Is it because fewer complaints were made? Undoubtedly, IVAs have generated a flood of complaints in recent years not least because of the issues surrounding ownership of PPI claims, but those issues were still live in 2016, weren’t they?

Perhaps we can explore this by looking at the complaint profile by case type:

Yes, it looks like IVAs continued to be contentious last year, although perhaps the worst is over. It seems, however, that the most significant drop has been felt in complaints relating to bankruptcies and liquidations. The reduction in bankruptcy complaints is understandable, as the numbers of bankruptcies have dropped enormously over the past few years, but liquidation numbers have kept reasonably steady, so I am not sure what is going on there.

But are fewer people really complaining or is there something else behind these figures?

An effective Complaints Gateway sift?

When the Complaints Gateway was set up in 2014, it was acknowledged that the Insolvency Service would ensure that complaints met some simple criteria before they were referred to the RPBs. There must be an indication of a breach of legislation, SIP or the Code of Ethics and the allegations should be capable of being supported with evidence. Where this is not immediately apparent, the Service seeks additional information from the complainant.

The graphs above are based on the complaints referred to the RPBs, so what is the picture as regards complaints received before the sifting process occurs?

This shows that the Complaints Gateway sifted out more complaints last year: the percentage rejected rose from 25% in 2014, to 27% in 2015, to 29% in 2016.

The Insolvency Service’s review explains that in 2016 a new criterion was added: “Complainants are now required in the vast majority of cases to have raised the matter of concern with the insolvency practitioner in the first instance before the complaint will be considered by the Gateway”. This is a welcome development, but it did not affect the numbers much: it resulted in only 13 complaints being turned away for this reason.

But this rejected pile is not the whole story. The graph also demonstrates that a significant number of complaints – 144 (17%) – were neither rejected nor referred last year, which is a much larger proportion than previous years. Presumably these complaints are being held pending further exchanges between the Service and the complainant. Personally, I am comforted by this demonstration of the Service’s diligence in managing the Gateway, but I hope that this does not hint at a system that is beginning to get snarled up.

How many complaints led to sanctions?

When I looked at the Insolvency Service’s review last year, I noted that the IPA’s sanctions record appeared out of kilter to the other RPBs. It is interesting to note that 2016 appears to have been a more “normal” year for the IPA, but instead the ACCA seems to have had an exceptional year. As mentioned above, I wonder if the Insolvency Service’s focus on the ACCA has had anything to do with this unusual activity (I appreciate that 2010 was another exceptional year… and I wonder if the fact that 2010 was the year that the Insolvency Service got heavy with its SIP16-reviewing exercise had anything to do with that particular flurry).

The obvious conclusion to draw from this graph might be that an ACCA-licensed IP has a 1 in 3 chance that any complaint will result in a sanction. However, perhaps these IPs can rest a little easier, given that the ACCA’s complaints-handling is now being dealt with by the IPA.

What about sanctions arising from monitoring visits? How do the RPBs compare on that front?

All but one RPB reported an increase in monitoring sanctions

These percentages look rather spectacular, don’t they? It gives the impression that on average almost one third of all monitoring visits result in some kind of negative outcome… and it appears that 90% of all the CAI’s monitoring visits gave rise to a negative outcome! Well, not quite. It is likely that some monitoring visits led to more than one black mark, say a plan for improvement and a targeted visit to review how those plans had been implemented.

Nevertheless, it is interesting to note that almost all RPBs recorded increases in the number of negative outcomes from monitoring visits over the previous year. I am not sure why the IPA seems to have bucked the trend. It will be interesting to see how the populations of ACCA and IPA-licensed IPs fare this year, as they are now being monitored and judged by the same teams and Committees.

How frequently are visits being undertaken?

The Principles for Monitoring, which form part of a memorandum of understanding (“MoU”) between the Insolvency Service and the RPBs, state that the period between monitoring visits “is not expected to significantly exceed three years but may, where satisfactory risk assessment measures are employed, extend to a period not exceeding six years”. However, most if not all of the RPBs publicise that their monitoring programmes are generally on a 3-yearly cycle.

The following graph shows that the RPBs are not quite meeting this timescale:

If we look at each RPB’s visits for the past 3 years as a percentage of their appointment-taking licence-holders, how far off the 100% mark were they..?

ICAEW’s missing of the mark is not surprising, given that they publicise that their IPs in the larger practices are on 6-year cycles. At the other end of the spectrum is the ACCA, which managed to visit all their IPs over the past 3 years and then some. However, as we know, the ACCA has relinquished its monitoring function to the IPA, so it seems unlikely that this will continue.
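The coverage measure behind this comparison is simple arithmetic: three years of visits expressed against the IP population, where anything over 100% means some IPs were seen more than once. A minimal sketch, using illustrative placeholder numbers rather than any RPB’s published figures:

```python
# Rough sketch of the 3-year coverage measure discussed above.
# The figures below are hypothetical, not the RPBs' published numbers.

def three_year_coverage(visits_per_year, licence_holders):
    """Sum three years of visits and express them as a percentage of
    appointment-taking licence-holders.

    A result over 100% means some IPs were visited more than once in
    the window; under 100% means some cannot have been visited at all.
    """
    return 100 * sum(visits_per_year) / licence_holders

# Hypothetical RPB: 120 licence-holders, visited 35, 40 and 38 times
# over the three-year window.
print(round(three_year_coverage([35, 40, 38], 120), 1))  # 94.2
```

On this measure, a body can exceed 100% (as the ACCA apparently did) without every IP having been visited, which is why the percentage alone doesn’t prove full coverage.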

What is the future for monitoring visits?

The Insolvency Service’s 2015 review hinted that the days of the MoU were numbered. Their 2016 review strengthens this message:

“We propose to withdraw the MoU as soon as is reasonably feasible, subject to working through some final details”.

The review goes on to explain that the Service will be adding to their existing guidance (https://goo.gl/wDHElg). As it currently stands, prescriptive requirements such as the frequency of monitoring visits are conspicuously absent from this guidance. Instead, it is largely outcomes-based and reflects the Regulator’s Code to which the Insolvency Service itself is subject and that emphasises the targeting of monitoring resources where they should be most effective at addressing priority risks. The Service itself seems to be lightening up on its own monitoring visits: the review states that, having completed their round of full monitoring visits to the RPBs, they are now moving towards a number of risk based themed reviews. If this approach filters through to the RPBs’ monitoring visits, will we see a removal of the 3-yearly standard cycle?

Current priorities for the regulators

Does the 2016 review reveal any priorities for this year?

Unsurprisingly, given one particularly high-profile failure, IVAs feature heavily. The review refers to “general concerns around the volume IVA business model and developments in practice” and continues:

“The Insolvency Service is working with the profession to tackle some of these concerns; for example, through changes to guidance on monitoring and protections for client funds, and also a review of insurance arrangements. We are also engaging with stakeholder groups to better understand their concerns and how these may be tackled. We expect that this will be a key focus of our work for the coming year.”

Other projects mentioned in the review include:

Possible legislative changes to the bonding regime – consultation later this year;

Progression of the Insolvency Service’s recommendation that the RPBs introduce a compensation mechanism for complainants who have suffered inconvenience, loss or distress;

Publication of the Insolvency Service’s review into the RPBs’ monitoring and regulation processes, including consistency of outcomes, the extent of independence between the membership and regulatory functions, and the RPBs’ financial capabilities – report to be released within 12 months;

Progress on a review into the RPBs’ approach to the regulatory objective to encourage a profession which delivers services at a fair and reasonable cost, including how they are assessing compliance with the Oct-15 fee estimate regime – report to be released by the end of the year; and

A consultation on revisions to the Code of Ethics – expected in the spring.

The Insolvency Service’s 2014 Review had the target of transparency at its core. This time, the Insolvency Service has added consistency. Do the Annual Reviews reveal a picture of consistency between the RPBs?

My second post on the Insolvency Service’s 2015 Annual Review of IP regulation looks at the following:

Are the RPBs sticking to a 3-year visit cycle?

How likely is it that a monitoring visit will result in some kind of regulatory action?

What action are the RPBs likely to take and is there much difference between the RPBs?

What can we learn from 6 years of SIP16 monitoring?

How have the RPBs been faring in their own monitoring visits conducted by the Insolvency Service?

What have the Service set in their sights for 2016?

RPBs converge on a 3-yearly visit cycle

The graph of the percentages of IPs that had a monitoring visit last year gives me the impression that a 3-yearly visit cycle has most definitely become the norm:

(Note: because the number of SoS IPs dropped so significantly during the year – from 40 to 28 – all the graphs in this article reflect a 2015 mid-point of SoS-authorised IPs of 34.)

Does this mean that IPs can predict the timing of their next routine visit? I’m not sure. It seems to me that some standard text is slipping into the Insolvency Service’s reports on their monitoring visits to the RPBs. The words: “[RPB] operates a 3-year cycle of rolling monitoring visits to its insolvency practitioners. The nature and timing of visits is determined annually on a risk-assessment basis” have appeared in more than one InsS report.

What do these words mean: that every IP is visited once in three years, but some are moved up or down the list depending on their risk profile? Personally, this doesn’t make sense to me: either visits are timed according to a risk assessment or they are carried out on a 3-year cycle; I don’t see how you can achieve both. If visit timings are sensitive to risk, then some IPs are going to receive more than one visit in a 3-year period and, unless the RPB records >33% of their IP number as having a visit every year (which the graph above shows is generally not the case), the corollary will be that some IPs won’t be visited in a 3-year period.

My perception on the outside is that, generally, the timing of visits is pretty predictable and is now pretty-much 3-yearly. I’ve seen no early parachuting-in on the basis of risk assessments, although I accept that my field of vision is very narrow.

Most RPBs report reductions in negative outcomes from monitoring visits

The following illustrates the percentage of monitoring visits that resulted in a “negative outcome” (my phrase):

As you can see, most RPBs are clocking up between c.10% and 20% of monitoring visits leading to some form of negative consequence and, although individual records have fluctuated considerably in the past, the overall trend across all the regulatory bodies has fallen from 30% in 2008 to 20%.

However, two bodies seem to be bucking the trend: CARB and the SoS.

Last year, I didn’t include CARB (the regulatory body for members of the Institute of Chartered Accountants in Ireland), because its membership was relatively small. It still licenses only 41 appointment-taking IPs – only 3% of the population – but, with the exit of SoS authorisations, I thought it was worth adding them to the mix.

I am sure that CARB’s apparently erratic history is a consequence of its small population of licensed IPs, and this may well explain why it is still recording a much greater percentage of negative outcomes than the other RPBs. Nevertheless, CARB does seem to have recorded exceptionally high levels for the past few years.

The high SoS percentage is a little surprising: 50% of all 2015 visits resulted in some form of negative outcome – these were all “plans for improvement”. CARB’s were a mixture of targeted visits, undertakings and one penalty/referral for disciplinary consideration.

So what kind of negative outcomes are being recorded by the other RPBs? Are there any preferred strategies for dealing with IPs falling short of expected standards?

What responses are popular for unsatisfactory visits?

The following illustrates the actions taken by the top three RPBs over the last 4 years:

* The figures for ICR/self certifications requested and further visits should be read with caution. These categories do not appear in every annual review, but, for example, it is clear that RPBs have been conducting targeted visits, so this graph probably does not show the whole picture for the 2012 and 2013 outcomes. In addition, of course the ICAEW requires all IPs to carry out annual ICRs, so it is perhaps not surprising that this category has rarely featured.

I think that all this graph suggests is that there is no trend in outcome types! I find this comforting: it might be difficult to predict what outcome to expect, but it suggests to me that the RPBs are flexible in their approaches and will implement whatever tool they think is best fitted for the task.

Looking back on 6 years of SIP16 monitoring

We all remember how over the years so many people seemed to get hot under the collar about pre-packs and we recall some appallingly misleading headlines that suggested that around one third of IPs were failing to comply with regulations. Where have the 6 years of InsS monitoring of SIP16 Statements got us? I will dodge that question, but I’ll simply illustrate the statistics:

Note: several years are “estimates” because the InsS did not always review all the SIP16 Statements they received. Also, the Service ended its monitoring in October 2015. Therefore, in these cases I have taken the stats and pro-rated them up to a full year’s worth.

Does the graph above suggest that a consequence of SIP16 monitoring has been to discourage pre-packs? Well, have a look at this one…

As you can see, the dropping number of SIP16s is more to do with the drop in Administrations. In fact, the percentage of pre-packs has not changed much: it peaked at 31% of all Administrations in 2012 and was at its lowest, 24%, in 2014.

I guess it could still be argued that the SIP16 scrutiny has persuaded some to sell businesses/assets in the pre (or immediately post) liquidation period, rather than use Administration. I’m not sure how to test that particular theory.

So, back to SIP16 compliance, the graph-but-one above shows that the percentage of Statements that were compliant has increased. It might be easier to see from the following:

A hidden downside of all this focus on improving SIP16 compliance, I think, is the costs involved in drafting a SIP16 Statement and then, as often happens, in getting someone fairly senior in the practice to double-check the Statement to make sure that it ticks every last SIP16 box. Is this effort a good use of resources and of estate funds?

Now that the Insolvency Service has dropped SIP16 monitoring, does that mean we can all relax a bit? I think this would be unwise. The Service’s report states that it “will review the outcome of the RPBs’ consideration of SIP16 compliance and will continue to report details in the Annual Review”, so I think we can expect SIP16 to remain a hot regulatory topic for some time to come.

The changing profile of pre-packs

The Service’s reports on SIP16 Statements suggest other pre-pack trends:

Personally, I’m surprised at the number of SIP16 Statements that disclose that the business/assets were marketed by the Administrator: last year it was 56%. I’m not sure if that’s because some SIP16 Statements are explaining that the company was behind some marketing activities, but, if that’s not the reason, then 56% seems very low to me. It would be interesting to see if the revised SIP16, which introduced the “marketing essentials”, makes a difference to this rate.

Have some pity for the RPBs!

The Service claimed to have delivered on their commitments in 2015 (incidentally, one of their 2014 expectations was that the new Rules would be made in the autumn of 2015 and they would come into force in April 2016 – I’m not complaining that the Rules are still being drafted, but I do think it’s a bit rich for the Executive Foreword to report pleasure in having met all the 2014 “commitments”).

The Foreword states that the reduction in authorising bodies is “a welcome step”. With now only 5 RPBs to monitor and the savings made in dropping SIP16 monitoring (which was the reported reason for the levy hike in 2009), personally I struggle to see the Service’s justification for increasing the levy this year. The report states that it was required in view of the Service’s “enhanced role as oversight regulator”, but I thought that the Service did not expect to have to flex its new regulatory muscles as regards taking formal actions against RPBs or directly against IPs.

However, the tone of the 2015 Review does suggest a polishing of the thumb-screws. The Service refers to the power to introduce a single regulator and states that this power will “significantly shape” the Service’s work to come.

In 2015, the Service carried out full monitoring visits to the ICAEW, ICAS and CARB, and a follow-up visit to the ACCA. This is certainly more visits than previous years, but personally I question whether the visits are effective. Of course, I am sure that the published visit reports do not tell the full stories – at least, I hope that they don’t – but it does seem to me that the Service is making mountains out of some molehills and their reports do give me the sense that they’re concerned with processes ticking the Principles for Monitoring boxes, rather than being effective and focussing on good principles of regulation.

For example, here are some of the molehill weaknesses identified in the Service’s visits that were resisted at least in part by some of the RPBs – to which I say “bravo!”:

Pre-visit information requested from the IPs did not include details of complaints received by the IP. The ICAEW responded that it was not convinced of the merits of asking for this on all visits but agreed to “consider whether it might be appropriate on a visit by visit basis”.

Closing meeting notes did not detail the scope of the visit. The ICAEW believed that it is important for the closing meeting notes to clearly set out the areas that the IP needs to address (which they do) and it did not think it was helpful to include generic information… although it seems that, by the time of the follow-up visit to the ICAEW in February 2016, this had been actioned.

The Service remains “concerned” that complainants are not provided with details of the independent assessor on their case. “ACCA regrets it must continue to reject this recommendation as ACCA does not believe naming assessors will add any real value to the process… There is also the risk of assessors being harassed by complainants where their decision is not favourable to them.”

Late bordereaux were only being chased at the start of the following month. The Service wanted procedures put in place to “ensure that cover schedules are provided within the statutory timescale of the 20th of each month and [to] follow up any outstanding returns on 21st or the next working day of each month”. Actually, CARB agreed to do this, but it’s just a personal bug-bear of mine. The Service’s report to the ICAEW went on about the “vital importance” of bonding – with which I agree, of course – but it does not follow that any bordereaux sent by IPs to their RPB “demonstrate that they have sufficient security for the performance of their functions”. It simply demonstrates that the IP can submit a schedule on time every month. I very much suspect that bordereaux are not checked on receipt by the RPBs – what are they going to do: cross-check bordereaux against Gazette notices? – so simply enforcing a zero tolerance attitude to meeting the statutory timescale is missing the point and seems a waste of valuable resources, doesn’t it?

Future Focus?

The Annual Review describes the following on the Insolvency Service’s to-do list:

Complaint-handling: in 2015, the Service explored the RPBs’ complaint-handling processes and application of the Common Sanctions Guidance. The Service has made a number of recommendations to improve the complaints process and is in discussion with the RPBs. They expect to publish a full report on this subject “shortly”.

Debt advice: also in 2015, they carried out a high-level review of how the RPBs are monitoring IPs’ provision of debt advice and they are currently considering recommendations for discussion with the RPBs.

Future themed reviews: The Service is planning themed reviews (which usually mean topic-focussed questionnaires to all RPBs) over 2016 and 2017 covering: IP monitoring; the fees rules; and pre-packs.

Bonding: the Service has been examining “the type and level of cover offered by bonds and considering both the legislative and regulatory arrangements to see if they remain fit for purpose”. They are cagey about the outcomes but do state that they “will work with the industry to effect any regulatory changes that may be necessary” and they refer to “any legislative change” being subject to consultation.

Relationship with RPBs: the Service is contemplating whether the Memorandum of Understanding (“MoU”) with the RPBs is still needed, now that there are statutory regulatory objectives in place. The MoU is a strange animal – https://goo.gl/J6wmuN. I think that it reads like a lot of the SIPs: a mixture of principles and prescription (e.g. a 10-day acknowledgement of complaints); and a mixture of important standards and apparent OTT trivia. It would be interesting to see how the Service approaches monitoring visits to the RPBs if the MoU is removed: they will have to become smarter, I think.

Ethics? The apparent focus on ethical issues seems to have fallen from the list this year. In 2015, breaches of ethics moved from third to second place in the list of complaints received by subject matter (21% in 2014 and 27% in 2015), but reference to the JIC’s work on revising the Ethics Code has not been repeated in this year’s Review. Presumably the work is ongoing… although there are certainly more than enough other tasks to keep the regulators busy!

The Insolvency Service’s 2015 review of IP regulation was released in March and, as usual, I’ve dug around the statistics in comparison with previous years.

They indicate that complaint sanctions have increased (despite complaint numbers dropping), but monitoring sanctions have fallen. Why is this? And why was one RPB alone responsible for 93% of all complaints sanctions?

I honestly had no idea that the R3 member survey issued earlier today was going to ask about the effectiveness of the regulatory system. I would encourage R3 members to respond to the survey (but don’t let this blog post influence you!).

IP number falls to 6-year low

I guess it was inevitable: no IP welcomes the hassle of switching authorising body, and word on the street has always been that being authorised by the SoS is a far different experience to being licensed by an RPB. Therefore, I think that the withdrawal of the SoS (even with a run-off period) and the Law Societies from authorising, courtesy of the Deregulation Act 2015, was likely to affect the IP numbers.

Here is how the landscape has shifted:

As you can see, the remaining RPBs have not gained all that the SoS and Law Societies have lost and ACCA’s and CARB’s numbers have dropped since last year. It is also a shame to note that, not only has the IP number fallen for the first time in 4 years, it has also dropped to below the 2010 total.

Personally, I expect the number to drop further during 2016: I am sure that the prospect of having to adapt to the new Insolvency Rules 2016 along with the enduring fatigue of struggling to get in new (fee-paying) work and of taking the continual flak from regulators and government will persuade some to hang up their boots. I also don’t see that the industry is attracting sufficient new joiners who are willing and able to take up the responsibility, regardless of the government’s partial licence initiative that has finally got off the ground.

Maybe this next graph will make us feel a bit better…

Number of regulatory sanctions fall

Although the numbers are spiky, I guess there is some comfort to be had in seeing that the regulatory bodies issued fewer sanctions against IPs in 2015. [To try to put 2010’s numbers into context, you’ll remember that 1 January 2009 was the start of the Insolvency Service’s monitoring of the revised SIP16, which led to a number of referrals to the RPBs, although I cannot be certain that this was behind the unusual 2010 peak in sanctions.]

But what interests me is that the number of sanctions in 2015 arising from complaints far outstripped those arising from monitoring visits, which seems quite a departure from the picture of previous years. What is behind this? Is it simply a consequence of our growing complaint-focussed society?

Complaints on the decrease

Well actually, as you can see here, it seems that fewer complaints were registered last year… by quite a margin.

I confess that some of these years are not like-for-like comparisons: before the Complaints Gateway, the RPBs were responsible for reporting to the Insolvency Service how many complaints they had received and it is very likely that they incorporated some kind of filter – as the Service does – to deal with communications received that were not truly complaints. However, it cannot be said for certain that the RPBs' pre-Gateway filters worked in the same way as the Service's does now. Nevertheless, what this graph does show is that fewer complaints were referred to the regulatory bodies in 2015 than in 2014 (which was c.half a Gateway year – the "Gateway (adj.)" column represents a pro rata'd full 12 months of Gateway operation based on the partial 2014 Gateway number).
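The "Gateway (adj.)" pro rata adjustment mentioned above is simple to reproduce. This is a minimal sketch of the calculation; the partial-year figure used below is hypothetical, purely for illustration, not the actual 2014 Gateway count.

```python
# Sketch of the "Gateway (adj.)" calculation: scaling a partial-year
# complaint count up to a notional full 12 months of Gateway operation.
# The numbers below are hypothetical, for illustration only.

def annualise(complaints_in_partial_year: int, months_of_operation: int) -> float:
    """Scale a partial-year complaint count to a full 12-month equivalent."""
    return complaints_in_partial_year * 12 / months_of_operation

# e.g. if the Gateway had operated for roughly 6 months of the year
# and referred 350 complaints in that time:
print(annualise(350, 6))  # 700.0
```

The same scaling obviously can't correct for seasonality in complaint volumes, which is one more reason the pre- and post-Gateway years aren't truly comparable.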

It is also noteworthy that the Insolvency Service is chalking up a similar year-on-year percentage of complaints filtered out: in 2014, this ran at 24.5% of the complaints received, and in 2015, it was 26.5%.

So, if there were fewer complaints lodged, then why have complaints sanctions increased?

How long does it take to process complaints?

The correlation between complaints lodged and complaint sanctions is an interesting one:

Is it too great a stretch of the imagination to suggest that complaint sanctions take somewhere around 2 years to emerge? I suggest this because, as you can see, the 2010/11 sanction peak coincided with a complaints-lodged trough and the 2013 sanctions trough coincided with a complaints lodged peak – the pattern seems to show a 2-year shift, doesn’t it..?

I am conscious, however, that this could simply be a coincidence: why should sanctions form a constant percentage of all complaints? Perhaps the sanctions simply have formed a bit of a random cluster in otherwise quiet years.

Could there be another reason for the increased complaints sanctions in 2015?

One RPB breaks away from the pack

How strange! Why has the IPA issued so many complaints sanctions when compared with the other RPBs?

I have heard more than one IP suggest that the IPA licenses more than its fair share of IPs who fall short of acceptable standards of practice. Personally, I don't buy this. Also, more sanctions do not necessarily mean that more sanctionable offences are being committed. It reminds me of the debates that often surround the statistics on crime: does an increase in convictions mean that there are more crimes being committed or does it mean that the police are getting better at dealing with them?

Nevertheless, the suggestion that the IPA’s licensed population is different might help explain the IPA peak in sanctions, mightn’t it? To test this out, perhaps we should compare the number of complaints received by each RPB.

Ok, so yes, IPA-licensed IPs have received more complaints than those licensed by the other RPBs (although SoS-authorised IPs came out on top again this past year). If the complaints were shared evenly, then 58% of all IPA-licensed IPs would have received a complaint last year, compared to only 43% of those licensed by the other three largest RPBs. I hasten to add that, personally, I don't think this indicates differing standards of practice depending on an IP's licensing body: it could indicate that IPA-licensed (and perhaps also SoS-authorised) IPs work in a more complaints-heavy environment, as I mention further below.

Nevertheless, let’s see how these complaints-received numbers would flow through to sanctions, if there were a direct correlation. For simplicity’s sake, I will assume that a complaint lodged in 2013 concluded in 2015 – although I think this is highly unlikely to be the average, I think it could well be so for the tricky complaints that lead to sanctions. This would mean that, across all the RPBs (excluding the Insolvency Service, which has no power to sanction SoS-authorised IPs in respect of complaints), 12% of all complaints led to sanctions. On this basis, the IPA might be expected to issue 36 complaint-led sanctions, so this doesn’t get us much closer to explaining the 76 sanctions issued by the IPA.
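The back-of-envelope check above – applying the overall 12% complaints-to-sanctions rate to one RPB's complaint intake, with a two-year lag – can be sketched as follows. The complaint count used below is hypothetical (chosen only so the arithmetic matches the expected figure of 36 quoted above), not an actual IPA statistic.

```python
# Sketch of the uniform-rate check described above: if every RPB converted
# complaints into sanctions at the same overall rate (with a two-year lag),
# how many sanctions would we expect? Figures are hypothetical illustrations.

def expected_sanctions(complaints_lodged: int, sanction_rate: float) -> int:
    """Expected sanctions under a uniform complaints-to-sanctions rate."""
    return round(complaints_lodged * sanction_rate)

# e.g. a hypothetical 300 complaints lodged in 2013 at the 12% overall rate:
print(expected_sanctions(300, 0.12))  # 36
```

The gap between an expected figure of this order and the 76 sanctions actually issued is the anomaly the rest of this section tries to explain.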

I can suggest some factors that might be behind the increase in the number of complaints sanctions granted by the IPA:

The IPA licenses the majority of IVA-specialising IPs, which do seem to have attracted more than the average number of sanctions: last year, two IPs alone were issued with seven reprimands for IVA/debtor issues.

The IPA’s process is that matters identified on a monitoring visit that are considered worthy of disciplinary action are passed from the Membership & Authorisation Committee to the Investigation Committee as internal complaints. Therefore, I think this may lead to some IPA “complaint” sanctions actually originating from monitoring visits. However, analysis of the sanctions arising from monitoring visits (which I will cover in another blog) indicates that the IPA sits in the middle of the RPB pack, so it doesn’t look like this is a material factor.

Connected to the above, the IPA’s policy is that any incidence of unauthorised remuneration spotted on monitoring visits is referred to the Investigation Committee for consideration for disciplinary action. Given that it seems that such incidences include failures that have already been rectified (as explained in the IPA’s September 2015 newsletter) and that unauthorised remuneration can arise from a vast range of seemingly inconspicuous technical faults, I would not be surprised if this practice were to result in more than a few unpublished warnings and undertakings.

But this cannot be the whole story, can it? The IPA issued 93% of all complaints sanctions last year, despite only licensing 35% of all appointment-takers. The previous year followed a similar pattern: the IPA issued 82% of all complaints sanctions.

To put it another way, over the past two years the IPA issued 111 complaints sanctions, whilst all the other RPBs put together issued only 14 sanctions.

What is going on? It is difficult to tell from the outside, because the vast majority of the sanctions are not published. Don’t get me wrong, I’m not complaining about that. If the sanctions were evenly-spread, I could not believe that c.16% of all IPA-licensed IPs conducted themselves so improperly that they merited the punitive publicity that .gov.uk metes out on IPs (what other individual professionals are flogged so publicly?!).

The Regulators’ objective to ensure fairness

This incongruence, however, makes me question the fairness of the RPBs’ processes. It cannot be fair for IPs to endure different treatment depending on their licensing body.

You might say: what's the damage, when the majority of sanctions went unpublished? I have witnessed the anguish that IPs go through when a disciplinary committee is considering their case, especially if that process takes years to conclude. It hung like the sword of Damocles over many of my conversations with the IPs. The apparent disparity in treatment also does not help those of us (myself included) who argue that a multiple-regulator system can work well.

One of the new regulatory objectives introduced by the Small Business Enterprise & Employment Act 2015 was to secure "fair treatment for persons affected by [IPs'] acts and omissions", but what about fair treatment for IPs? In addition, isn't it possible that any unfair treatment of IPs will trickle down to those affected by their acts and omissions?

The Insolvency Service has sight of all the RPBs’ activities and conducts monitoring visits on them regularly. Therefore, it seems to me that the Service is best placed to explore what’s going on and to ensure that the RPBs’ processes achieve consistent and fair outcomes.

In my next blog, I will examine the Service’s monitoring of the RPBs as well as take a closer look at the 2015 statistics on the RPBs’ monitoring of IPs.

Sorry for the long silence. SIP9/fees have ruled my life for the past few months and I’ll share my thoughts on those when the fog has cleared. In the meantime, I thought I’d catch up on something far less controversial (you’d think!): SIP1’s requirement to “report” IPs to the Complaints Gateway or to the RPB. Does this mean that reports will be handled as full-blown complaints or is there another way?

Why shouldn’t all reports be handled as formal complaints?

Well, imagine you are a licensed IP working for other licensed IPs. Maybe you're in that boat now. Maybe you're in a firm's compliance department. Maybe you're a case manager. Say you become uncomfortable about something you've seen, something that you think triggers the SIP1 reporting requirement. Should you report it via the Insolvency Service's Complaints Gateway?

What would happen next? Would the RPB write to the IP providing a copy of the report? The IPA’s complaints procedure, for example, states that this is done in all complaint cases.

Clearly, this is unhelpful. But does elevating the need to report concerns to a SIP requirement rule out any alternative to lodging a formal complaint?

Does SIP1 allow IPs to discharge their reporting duty by whistle-blowing to the RPB?

SIP1 states:

“An insolvency practitioner who becomes aware of any insolvency practitioner who they consider is not complying or who has not complied with the relevant laws and regulations and whose actions discredit the profession, should report that insolvency practitioner to the complaints gateway operated by the Insolvency Service or to that insolvency practitioner’s recognised professional body.”

This appears to give IPs a choice: either they may lodge a (formal) complaint via the Gateway or they can report to the IP’s RPB.

What is the destiny of a “report” to the RPB?

The MoU between the Insolvency Service and the RPBs (https://goo.gl/ICqHEo) suggests that there is no practical distinction. It defines a complaint as “a communication about a person authorised as an insolvency practitioner expressing dissatisfaction with that person’s conduct as it relates to his or her professional work as an insolvency practitioner in Great Britain, or with the conduct of others carrying out such work on that person’s behalf.” The MoU then states: “Each Recognised Professional Body will forward to the Authority any Complaint received by it within five Working Days of receipt” and then the Authority, the Insolvency Service, will process the Complaint in the usual manner.

So this would appear to complete the circle. It appears that however an IP seeks to report a matter, it is going to be handled as a complaint sooner or later.

Is there no way to whistle-blow to a regulatory body?

So it seems that all reports will end up in the Complaints Gateway. This seems wrong, doesn’t it? After all, the Insolvency Service is a “prescribed person” for the purposes of whistle-blowing about misconduct in companies generally (https://goo.gl/cIkGL4). It doesn’t make sense to leave those working within the insolvency profession with nowhere to turn.

Surely the Service appreciates that IPs (and others employed by IPs) might want to use a far more discreet method than a formal complaint to bring their concerns to the attention of the regulatory bodies. I certainly hope that the Service would not look to enforce this aspect of the MoU against the RPBs. We must be able to trust our regulatory bodies to act sensibly when dealing with such sensitive situations.

To be honest, I haven’t asked anyone at the Service for comments. However, I have sought the views of some within the RPBs.

The IPA’s view

Alison Curry gave me this answer:

“If the practitioner is reporting regulatory intelligence, in discharge of their SIP 1 obligations (and their membership rules, as the case may be) then they may do so to the RPB of the practitioner reported upon. In such an instance, presumably, they could maintain anonymity if they chose, but could not be expected to be appraised of an outcome (i.e. they would not be a complainant in the formal sense). Presumably then the RPB will have a process by which that intelligence is fed into their monitoring processes. We certainly do and expect the IS to be monitoring that others do also.”

Alison also pointed out that, as information may end up in the monitoring stream, it could result in a referral to the Investigation Committee (which deals with complaints). However, this would be a referral from the Membership & Authorisation Committee (which deals with monitoring), so I think the whistle-blower’s identity would be unlikely to feature in the “complaint” referral, as the chances are that the IPA’s monitoring team will have gathered their own evidence in order for the M&A Committee to consider the issue in the first place.

ICAS’ view

David Menzies gave me this answer:

“You will be aware that the normal complaint procedures as agreed by the IS and the RPBs are that complaints should be made through the Complaints Gateway. RPBs also receive regulatory intelligence and it is possible that information relating to an IP’s misconduct could also be received by the RPB in that manner. In reality whether information is submitted through the complaints gateway or via an RPB is not critical, the important aspect being that the information is transmitted in the first place…

“The issue of the reporter’s identity being disclosed is of course something that no guarantees can ever be given on. If matters eventually proceeded to a disciplinary tribunal then certain documents would have to be put before the tribunal and that would most likely include correspondence with the complainer. There is also the possibility that if the IP who was being complained against submitted a subject access request under Data Protection legislation then it may be difficult to justify not disclosing the correspondence containing the complaint. There may well be circumstances where we can withhold a complainant’s identity but I think that this would need to be looked at on a case by case basis.”

The Other RPBs

I won’t quote my ACCA contact here, as it wasn’t an “official” response. Nevertheless, I did learn that ACCA’s monitoring team receives intelligence – from IPs as well as the other RPBs – and this is similarly absorbed into its monitoring processes, rather than put through the formal complaints process where the discloser doesn’t wish to lodge a formal complaint.

I suspect also that this is the case with the ICAEW and, to be fair to them, they were hoping to revert to me with a consensus view once this matter had been discussed at the Regulators' Forum a couple of months ago. I expect that the demands of other SIP revisions have overtaken the publication of any guidance on this matter.

So whistle-blowing to the IP’s RPB can count as SIP1 compliance?

From the comments I have received, it would seem so. It also seems to me that the RPBs would not treat it as a formal complaint and thus pass it to the Insolvency Service for processing via the Gateway. Confidential intelligence-delivery worked within RPBs before the revised SIP1. The revision certainly was not intended to close any doors that were previously open.

What about your duty under your RPB’s Membership Rules?

Within all the RPBs’ membership rules/regulations, there is an obligation to report the misconduct of another member. The purpose of the revised SIP1 was to expand this obligation so that, in effect, the same rules apply whether the offending IP is a member of your RPB or not.

However, this means that, technically, if you have lodged a complaint via the Insolvency Service’s Gateway, you may need to report the matter also to your RPB so that you comply with its membership rules. This does seem a bit of unnecessary duplication, however, and I would hope that an IP would not be beaten about the head for complaining only to the Gateway.

What acts should be reported?

As quoted above, SIP1 sets out two criteria:

non-compliance with “the relevant laws and regulations” AND

actions that discredit the profession.

I am pleased to see that, at least with the IPA, its rules have been amended in the past few months to bring them clearly into line with the revised SIP1. Previously, their rules had stated that "misconduct" needed to be reported, which could have constituted simply a breach of a SIP, statutory provision or the Ethics Code. Now, the IPA has also imported reference to discrediting the profession (although also, interestingly, discredit to either the member, the IPA, or any other member) as a must-have in order to trigger the reporting requirement.

What actions discredit the profession? Actions at the far end of the spectrum will be blindingly obvious, but I reckon there is a huge swathe of greyness where subjectivity reigns. To be fair though, we have always lived with this issue. The revised SIP1 wasn’t meant to make our lives more difficult – I don’t think so anyway – but rather to emphasise our personal responsibility to keep our profession clean. With this objective in mind, I have no complaints about the revised SIP1.

In this post, I analyse the Insolvency Service’s annual review of IP regulation, asking the following questions:

Are the regulators visiting their IPs once every three years?

How likely is it that a monitoring visit will result in some kind of negative outcome?

How likely is a targeted visit?

Has the Complaints Gateway led to more complaints?

What are the chances of an IP receiving a complaint?

How likely is it that a complaint will result in a sanction?

The Insolvency Service’s reports can be found at: http://goo.gl/MZHeHK. As I did last year (http://wp.me/p2FU2Z-6C), I have only focussed attention on the authorising bodies with the largest number of IPs (but included stats for the others in the figures for “all”) and only in relation to appointment-taking IPs. Again, regrettably, I don’t see how I can embed the graphs into this page, so they can be found at: Graphs 23-04-15. You might find it easier to read the full article along with the graphs here(2).

Monitoring Visits

Are the regulators visiting their IPs once every three years?

Graph (i) (here(2)) looks at how much of each regulator’s population has been visited each year:

Is it a coincidence that the two regulators that were visited by the Service last year – the ACCA and the Service's own monitoring team – have both reported huge changes in monitoring visit numbers? Of course, this graph also shows that those two regulators carried out significantly fewer monitoring visits in 2013, so perhaps they were already conscious that they had some catching-up to do.

I’m not convinced that it was the Service’s visit that prompted ACCA’s increase in inspections: the Service’s February 2015 report on its 2014 visits to the ACCA did not disclose any concerns regarding the visit cycle and I think it is noteworthy that ACCA had a lull in visits in 2010, so perhaps the 2013 trough simply reflects the natural cycle. Good on the Insolvency Service, though, for exerting real efforts, it seems, to get through lots of monitoring visits in 2014!

The trend line is interesting and reflects, I think, the shifting expectations. The Service’s Principles for Monitoring continue to set the standard of a monitoring visit once every three years with a long-stop date of six years if the regulator employs satisfactory risk assessment processes. However, I think most regulators now profess to carry out 3-yearly visits as the norm and most seem to be achieving something near this.

The ICAEW seems a little out-of-step with the other regulators, though. At its 2014 rate, the ICAEW would take 4½ years to get around all its IPs. The report does explain, however, that the ICAEW also carried out 32 other reviews, most of which were "phone reviews" of new appointment-taking IPs. The Service hasn't counted these in the stats as true visits, so neither have I.
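The visit-cycle figure above comes from straightforward division: the licensed population over the number of visits per year. Here is a minimal sketch of that arithmetic; the population and visit numbers are hypothetical, chosen only to illustrate how a 4½-year cycle arises, not the ICAEW's actual figures.

```python
# Sketch of the visit-cycle arithmetic: at a given annual visit rate,
# how many years would it take to see every IP once?
# The figures below are hypothetical, for illustration only.

def years_to_cover(population: int, visits_per_year: int) -> float:
    """Years needed to visit every IP once at the current annual rate."""
    return population / visits_per_year

# e.g. a hypothetical 180 appointment-taking IPs at 40 visits a year:
print(years_to_cover(180, 40))  # 4.5
```

Set against the Principles for Monitoring's 3-yearly norm (and 6-year long-stop), anything much above 3.0 on this measure suggests the visit cycle is slipping.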

How likely is it that a monitoring visit will result in some kind of negative outcome?

It's spiky, but you can see that, overall, around 1 in 4 visits in 2014 ended up with some kind of action needed.

Above this line, ACCA and ICAEW reported the most negative outcomes. Most of the ACCA’s negative outcomes related to the ordering of a further visit (20% of their visits). The majority of ICAEW’s negative outcomes related to the request for a compliance review (16% of their visits). Of course, ICAEW IPs are required to carry out compliance reviews every year in any event. I understand that this category involves the ICAEW specifically asking to see and consider the following year’s compliance review and/or requiring that the review be carried out by an external provider, where weaknesses in the IP’s internal review system have been identified.

I find ICAS’ flat-line rather interesting: for two years now, they have not reported any negative outcome from monitoring visits. The Service had scheduled a visit to ICAS in April this year, so I’ll be interested to see the results of that.

How likely is a targeted visit?

Let’s take a closer look at ACCA’s ordering of further visits (graph (iii) here(2)): is this a new behaviour?

The 2015 estimated figures are based on the outcomes reported for the 2014 visits, although of course some could already have occurred in 2014.

ACCA seems to be treading a path all its own: the other RPBs – and now even the Service – don’t seem to favour targeted visits.

Complaints

Has the Complaints Gateway led to more complaints?

It’s hard to tell. The Service’s first-year report on the Complaints Gateway said that, as it had received 941 complaints in its first 12 months – and by comparison, 748 and 578 complaints were made direct to the regulators in 2013 and 2012 respectively – “it may be that this increase in complaints reflects the improvement in accessibility and increased confidence in the simplification of the complaints process”.

However, did the pre-Gateway figures reflect all complaints received by each regulator or only those that made it through the front-line filter? If it is the latter, then the Gateway comparison figure is 699, not 941, which means that fewer complaints were received via the Gateway than previously (or at least for 2013), as this graph (iv) (here(2)) demonstrates.

The stats for 2013 are a mixture: for half of the year, the regulators were receiving the complaints direct and for the second half of the year the Gateway was in operation. It seems to me that the Service has changed its reporting methodology: for the 2013 report, the stats were the total complaints made per regulator, but in 2014 the report refers to the complaints referred to each regulator.

Therefore, I don’t think we can draw any conclusions, as we don’t know on what basis the regulators were reporting complaints before the Gateway. We cannot even say with confidence that the number of complaints received in 2013/14 is significantly higher than in 2012 and earlier, as this graph suggests, because it may be that the regulators were filtering out more complaints than the Gateway is currently.

About all we can say is that marginally fewer complaints were referred from the Gateway for the second half of 2014 than for the first half.

What are the chances of an IP receiving a complaint?

Of course, complaints aren’t something that can be spread evenly across the IP population: some IPs work in a more contentious field, others in high profile work, which may attract more attention than others. The Service’s report mentioned that the IPA is still dealing with 34 complaints from 2012/2013 that relate to the same IVA practice.

This illustrates that, if complaints were spread evenly, half of all IPs would receive one complaint each year – and this figure hasn’t changed a great deal over the past few years.

As I mentioned last year, I do wonder if this graph illustrates the deterrent value of RPB sanctions: given that the Service has no power to order disciplinary sanctions on the back of complaints, perhaps it is not surprising that, year after year, SoS-authorised IPs have clocked up the most complaints. I believe that the IPA’s 2013 peak may have had something to do with the delayed IVA completion issue (as I understand that the IPA licenses the majority of IPs specialising in IVAs). It’s good to see that this is on the way down.

I am also interested in the low number of complaints recorded against ICAS-licensed IPs: maybe this justifies their flat-lined actions on monitoring visits explained above – maybe their IPs are just better behaved! Or does it reflect that individuals involved in Scottish insolvency procedures may have somewhere else to go with their complaints: the Accountant in Bankruptcy? Although the AiB website refers complainants to the RPB (shouldn't this be to the Gateway?), it also states that they can write to the AiB, and it seems to me that the AiB's statutory supervisory role could create a fuzzy line.

How likely is it that a complaint will result in a sanction?

Although, at first glance, this graph (vi) (here(2)) appears to show that the RPBs "perform" similarly when it comes to deciding on sanctions, it also shows that, on average, the IPA issues sanctions on almost twice as many complaints as the RPBs as a whole. Also, it seems that IPA-licensed IPs are seven times more likely to be sanctioned on the back of a complaint than ICAEW-licensed IPs. The ACCA figure seems odd: no sanctions at all were reported for 2014.

Of the 43 complaints sanctions reported in 2014, 35 were issued by the IPA: that’s 82% of all sanctions. That’s a hefty proportion, considering that the IPA licenses only 34% of all appointment-taking IPs. It is no wonder that, at last week’s IPA conference, David Kerr commented on the complaints sanction stats and stressed the need for the RPBs to be working, and disclosing, consistently on complaints-handling.

Of course, this doesn’t reflect the severity of the outcomes: included here is anything from an unpublicised warning (when the RPB discloses them to the Service) to a licence withdrawal. And, despite what I said earlier about the timing of the Service’s visit to the ACCA, I am still tempted to suggest that perhaps the Service’s visits have pushed the regulators – the Insolvency Service’s monitoring team and ACCA – into action, as those two regulators have recorded significant jumps in activity over the past year.

The Service has a busy year planned: full monitoring visits to ICAEW, ICAS, CARB, LSS and SRA (although that may be scaled back given the decision for the SRA to pull out of IP-licensing), and a follow-up visit to ACCA. No visit planned to the IPA? Perhaps that suggests that the Service is looking as closely at these stats as I am.

The Insolvency Service has released two reports on its own IP-monitoring team and one on ACCA’s monitoring, but is the Insolvency Service playing fair? Is it applying double standards and how sensible are its demands of authorising bodies?

The Insolvency Service’s monitoring of the Insolvency Service’s monitoring

No, I’ve not copied-and-pasted by mistake: in April/May 2014, the Insolvency Service carried out a monitoring visit of its own monitoring team, i.e. the team that deals with Secretary of State-authorised IPs (“IPS”).

The report issued on 29 August 2014 identified some "serious weaknesses", leading to a decision to make a follow-up visit three months later. This occurred in January 2015 – not seriously tardy, I guess (although not a great example to the Team, given that late monitoring visits to IPs were the most serious weakness identified in the first visit) – and the report on the follow-up visit has now been released.

The recent report makes no reference to any further visits or follow-up actions, although the summary discloses a number of wriggle-phrases: “IPS has implemented, or made progress against, all the recommendations… IPS has moved towards… IPS has plans in place to address this…” Would the Insolvency Service be satisfied if an RPB had made such “progress” towards goals? Or would the Service be content for an RPB to accept such assurances from an IP who had only “moved towards” rectifying matters?

Catching up on overdue monitoring visits

To be fair, there did seem to be significant progress with the key issue – that as at May 2014 over half of their IPs had not had a visit in the past three years. The report disclosed that, of the 28 IPs that had been identified from the 2014 review as overdue a visit, most had been visited or would be visited by May 2015. The remaining five IPs had been asked to complete a pre-visit questionnaire, and the IPS planned to consider these on a risk basis and “if appropriate, schedule a prompt monitoring visit”.

It is evident from the report, however, that the only visits carried out by the Team since their 2014 review had been to IPs who were already overdue a visit. Thus, I’m wondering, how many more IPs’ three years were up between April/May 2014 and now and is the Team constantly chasing their tails? Of course, we expect SoS-authorisations to go in the future (although the De-regulation Bill provides a run-down period of another year), so is this really something to get excited about? My issue is with the consistency of standards that I expect the Insolvency Service to apply to all licensing/authorising bodies.

“Independent” decision-making

The report makes reference to the introduction of “a layer of independence to its authorisation and monitoring process”. This refers to the fact that the Section Head now decides on actions following monitoring visits and reauthorisations – with the benefit of a copy of the last monitoring report (which seems pointless to me: if the monitor’s findings were not such that they merited withdrawal of the IP’s authorisation, on what basis would they merit withholding reauthorisation up to a year later?). Is the Section Head really independent? I accept that the Insolvency Service structure (and budget) does not provide for the levels of independence possible for RPBs, but, again, I do feel that the Service is applying double standards here, especially given its report on ACCA below.

The Insolvency Service’s report on ACCA

The Service’s review of ACCA revealed “some weaknesses” and it is planning a follow-up visit within three to six months. ACCA has rejected two of the Service’s recommendations.

Early-day monitoring visits

I was surprised to read the Service write so negatively about early monitoring visits. About monitoring visits occurring within the first 12 months of the IP’s licence, it writes: “There is no evidence of these initial visits being conducted in accordance with the PfM [Principles for Monitoring]; instead, these appear to be conducted as courtesy visits”. ACCA has asked the Service to clarify what is intended by the recommendation, given that a full-scope visit is always completed within the IP’s first three years. ACCA points to the PfM’s risk-based approach to early visits and states that it “will consider whether it should discontinue introductory visits in the future, given the Insolvency Service’s comments which suggest they are of little value.”

I know that ACCA is not the only RPB that carries out less-than-full-scope early visits, so I am wondering if we will see a shift from all those RPBs.

Personally, I feel that the Insolvency Service is taking the wrong tack here. When I was at the IPA, I monitored new IPs’ caseloads to see when their first inspection visit looked appropriate. I also took into consideration other factors: were they working in an office with other IPs? If so, what were their track records? Were they hitting the radar of the Complaints Department? What did their self-certifications look like? But often a key question was: was their caseload building at such a rate that a visit would be useful? Very often, new IPs take on very few cases and, on the basis of caseload alone, it is usually around 18 months before a proper visit can be conducted.

Nevertheless, I think that there is value in conducting an early visit. Calling it a “courtesy visit” is a little unfair, I think. ACCA responded that “the purpose of these visits is to assist insolvency practitioners to ensure they have adequate procedures in place to carry out their work”. And that’s the point, isn’t it? It may be too early to see how the IP is really going to perform, but the early days are a good opportunity to see how geared-up the IP is, explore their attitude towards compliance and ethics versus profit, and perhaps even help them. Is it sensible to criticise ACCA for not evidencing that an early-day visit has been conducted in the same way as a full visit? If RPBs are discouraged – or prohibited – from carrying out introductory visits, compliance with the PfM would indicate that the RPB simply needs to record the decision that a full visit in the first 12 months is not necessary and then bump the IP to the 3-year point. Is that better regulation?

Extensive monitoring reports

I have sympathy with ACCA as regards the Insolvency Service’s next criticism. The report explains that ACCA’s monitoring reports describe the main areas of concern, but not the areas examined where no concerns were generated. The Service recommended that “ACCA consider expanding their monitoring reports to include all information obtained during the monitoring process, including areas of no concern to provide a clear audit trail”.

Interestingly, the Insolvency Service’s 2014 report on its own monitoring came up with a similar recommendation, although in 2014 the Service’s recommendation appeared more dogmatic: “Ensuring that monitoring reports include all of the information obtained during the monitoring process, not just in relation to areas of concern; any areas where there are no concerns may be summarised. The reports should also include the bonding information on each case.” My note in the margin of that report simply read: “Why?!” I certainly don’t see why bonding information always needs to be recorded and I struggle to see how all information obtained could be sensibly written down. When I review cases, I scribble pages of notes, summarising key facts and events in the case’s lifecycle, such as key Proposal terms and modifications, mainly so that I can see if these points are followed through over time. As my review questions are answered satisfactorily, I move on; if I had to summarise all this information in my reports, they would double in length but I don’t believe they would be any more revealing or helpful to the reader.

The 2015 follow-up report on the Insolvency Service’s own monitoring states: “IPS had significantly expanded its monitoring reports. These now contain sufficient detail to enable an informed decision to be made on appropriate action following the issue of the report.” Hmm… that doesn’t exactly confirm that the reports now contain “all” information or indeed the bonding information on each case. Does this, along with the Service’s recommendation that ACCA “consider” expanding reports, reflect that they themselves are moderating their original opinion of what should be in reports?

I cheered at ACCA’s response to the recommendation: “ACCA believes that including in the monitoring report areas where there are no concerns risks: expanding the report unnecessarily with no perceived benefit; diluting the overall outcome and reducing focus on the significant weaknesses in the insolvency practitioner’s procedures and the need to make appropriate improvements.” Good for you, ACCA!

I think it’s a bit of a shame that, despite explaining this opinion, ACCA then states that it has amended its standard report template in an attempt to satisfy the Insolvency Service, although I am sure that many of us appreciate the wisdom in meeting our regulators’ demands even if we don’t agree with them.

“Independent” decision-making

Remembering that the IPS had satisfied the Insolvency Service on this matter by passing all monitoring reports through their Section Head, I sucked my teeth at the Service’s next recommendation to ACCA: “That any monitoring report with unsatisfactory findings be considered independently, for example by the Admissions and Licensing Committee, to assess what regulatory action may be necessary”.

Firstly, no IP is perfect; I have not seen a report with no “unsatisfactory findings”, so this suggests that effectively all monitoring reports would need to go through the Committee. To be fair, I come from an IPA background where all reports did go through the Committee – and I thought it was valuable that the Committee see the good with the bad – but it’s a big ask for any Committee (especially if reports become far longer, as seemingly required by the Service) and I am not surprised that some RPBs have sought to make the process more efficient. After all, the majority of IPs visited are so obviously well above the threshold at which some action would be deserved that it makes perfect sense to fast-track these, doesn’t it?

The report stated that “ACCA regrets that it must reject this recommendation as it believes it is an impractical and disproportionate response to the vast majority of visit outcomes”. ACCA’s response makes clear that each report is considered at least by the monitor and a reviewer, who I think can decide on certain actions such as scheduling a follow-up visit: is this not sufficient for at least the top 50% of IPs?

Admittedly, the devil is in deciding what to do with the reports at the margins: at what stage is an issue serious enough to warrant Committee attention? Unfortunately for ACCA, the case that led to this recommendation was not a great example. Although ACCA has done a good job in putting into context each of the breaches identified at this IP visit that ACCA decided fell below the threshold for Committee attention, I have to say that the fees issue alone – even though it was a one-off unusual circumstance (the IP had taken a £5,000 deposit for the costs of liquidating a company, but it was actually placed in administration and the IP drew the deposit for pre-admin costs without complying “fully” with R2.67A) – would have meant, in an IPA context, that it would not only have been considered at length by the Membership & Authorisation Committee, but it would have been an automatic referral also to the Investigation Committee for consideration for disciplinary action.

I am also not persuaded by ACCA’s defence that the IP’s repeat breaches of legislation and/or SIPs resulted in “no actual harm” to the debtor (in one case) or creditors “such that, given the function of the Admissions and Licensing Committee, a referral to it would not have been justified”. In my experience, it is very rare that breaches of statute or SIPs actually result in harm, but is that the only criterion for deciding whether an issue is sufficiently serious to warrant action? You could throw out half the rules and SIPs, if all IPs needed to do was avoid harming stakeholders.

I think that ACCA is on stronger ground as regards another issue that the IP had already rectified. What would be the point of referring this to the Committee? “Withdrawal or suspension of the licence would be disproportionate and it is not clear what conditions would be appropriate to protect the public, particularly as the breach had already been rectified.”

I think that ACCA’s final comments put it nicely: “To recommend that such cases should routinely be referred to the Admissions and Licensing Committee to decide on any regulatory action and timing of the next visit is a poor use of Committee resources, clearly disproportionate to the findings and, in ACCA’s view, contrary to the guidance contained in the Insolvency Service Regulators’ Code.”

Surely the Insolvency Service should be concentrating on outcomes, shouldn’t they? After all, that is what Nick Howard said (in the podcast at http://goo.gl/WUst5M) was his objective as regards the Service’s monitoring of all the RPBs: to ensure that they act consistently in reaching the same outcomes. Admittedly, in this case it does look to me like the IPA (for one) would have put the IP through the wringer and made him sweat a bit more than ACCA appears to have done, but would it have affected the outcome? If the IP took on board all of the ACCA monitor’s points and made the necessary changes (some of which appear to have taken place prior to the visit in any event), does it matter how his report was processed?

And I would add: how does the IPS’ process – of referring reports to the Section Head – meet the Service’s apparent requirement for independence any better?

Complaints-handling

ACCA has evidently had some difficulties in the past in resourcing their complaints-handling adequately, although they do seem to have cracked it more recently. I did smile, though, at the Service’s recommendation that “it would be helpful in future for the Insolvency Service to be kept informed of any significant changes in staffing and resources” – ACCA had increased their staffing for complaints from one member to two. Can you imagine if authorising bodies took such a keen interest in IPs’ staff numbers?!

One of the Service’s other recommendations was that the name of the independent assessor be given to the complainant and the IP “to ensure transparency and openness throughout the process”. This was the second recommendation that ACCA had rejected: “ACCA does not believe naming assessors will add any real value to the process… If assessors are named, there is a danger that they may be passed extraneous material, which risks delays in progressing complaints. There is also the risk of assessors being harassed by members and complainants where their decision is not favourable to them”.

My personal view is that this is another example of the Service trying to meddle with the processes instead of concerning itself with the outcomes. I can see how they might feel that transparency in this matter might help “improve confidence” in the complaints regime, but is it that material?

Single regulator?

What worries me about all this is that the Service appears to be seeking to achieve consistency by ensuring that all authorising bodies’ processes are the same. This is particularly unhelpful if the Service starts with what they think an authorising body should look like and then exerts pressure on each body to squeeze it into that mould, instead of looking objectively at how the body performs before looking to criticise its processes.

The standards already exist: a Memorandum of Understanding and the Principles for Monitoring. The Service should be measuring the bodies against them. The Service’s “Oversight regulation and monitoring in the insolvency profession” document (http://goo.gl/jipcWs) confirms that assessing compliance with the MoU and PfM is fundamental. Thankfully, the MoU and PfM are not so prescriptive that they describe, for example, how much detail should go into monitoring reports.

In this document, the Service also claims to use “an outcomes and principles based approach” in carrying out its oversight role. I’m afraid that its monitoring reports do not do much to support this claim. If the Service wants to be effective in its oversight role, personally I think it needs to be thinking and acting smarter.

The clock is ticking for the reserve power to introduce a single regulator. My problem is that not all that the Service is doing seems to be helping RPBs to achieve their objectives in the best way they think they can. I ask myself: does the Service really want to support better delegated regulation?

Although the newsletter is a bit of a PR statement, a couple of crafty comments have been slipped in.

The newsletter explains that the Service’s “IP regulation function has been strengthened and we have raised the bar on our expectations of authorising bodies”. I started off sceptical but to be fair the Service’s summary of how it carries out its oversight function of the authorising bodies – https://www.gov.uk/government/publications/insolvency-practitioner-regulation-oversight-and-monitoring-of-authorising-bodies – does convey a more intensive Big Brother sense than the Principles for Monitoring alone had done previously. This document puts more emphasis on their risk-based assessments, desk-top monitoring and themed reviews, as well as targeting topical areas of concern, which can only help to provide a better framework in which their physical monitoring visits to the RPBs can sit.

I commend the Service for establishing more intelligent regulatory processes, but two sentences of the newsletter stick in the throat: “We saw the impact that our changing expectations had in a few areas. Things deemed acceptable a few years ago were now being picked up as areas for improvement.” This is a reference to its report on the visit to its own people who monitor SoS-authorised IPs, the Insolvency Practitioner Services (“IPS”): https://www.gov.uk/government/publications/monitoring-activity-reports-of-insolvency-practitioner-authorising-bodies. Having worked in the IPA’s regulatory department from 2005 to 2012, I would like to assure readers that many of the items identified in the Service’s report on IPS have been unacceptable for many years – at least to the IPA during my time and most probably to the other RPBs (I am as certain as I can be of that without having worked at those bodies myself).

I am aghast at the Service’s apparent suggestion that the following recent discoveries at the IPS were acceptable a few years ago:

- A 5-year visit cycle with insufficient risk assessment to justify a gap longer than 3 years;
- Visits to new appointment-takers not carried out within 12 months and no evidence of risk assessment to justify this;
- No evidence that one IP’s receipt of more than 1,000 complaints in the previous year (as disclosed in the pre-visit questionnaire) was raised during the visit, nor was it considered in any detail in the report;
- No evidence of website checks (which the Service demanded of the RPBs many years ago);
- “Little evidence that compliance with SIP16 is being considered”;
- “No evidence that relevant ethical checklists and initial meeting notes from cases had been considered”; and
- “Once a final report has been sent to the IP, there does not appear to be any process whereby the findings of the report are considered further by IPS”.

Still, that’s enough of the past. The Service has now thrown down the gauntlet. I shall be pleased if they now prove they can parry and thrust with intelligence and effectiveness.

Worthy of note is that the newsletter explains that, in future, sanctions handed down to IPs by the RPBs will be published on the Service’s website (presumably more contemporaneously than within its annual reviews).

The minutes report that the IPA will have a final version – of what? Presumably a statutory annual report template? – within “a couple of weeks” and that two Committee members will draft a Dear IP article (there’s a novelty!) to explain that use of the standard is not mandatory.

As a consequence of concerns raised by an adviser about the equity clause, DRF has agreed to “draft a response” – it seems this is only intended to go to the adviser who had written in, although it would seem to me to have wider interest – “to clarify the position, which is that a person will not be expected to go to a subprime lender and the importance of independent financial advice”. It is good to have that assurance, but what exactly does the IVA Protocol require debtors to do in relation to equity? Does the Protocol clause need revising, I wonder.

Resistance to refunding dividends when set-off applied

I see the issue: a creditor receives dividends and then sets off mis-sold PPI compensation against their remaining debt. Consequently, it could be argued that the creditor has been overpaid a dividend and should return (some of) it. The minutes state that “it is a complicated issue and different opinions prevail” (well, there’s a revelation!), although it has been raised with the FCA.

Variations

It seems that the Committee has only just cottoned on to the fact that the Protocol does not allow the supervisor to decide whether a variation meeting should be called, so they are to look at re-wording the standard terms to “give supervisor discretion as to whether variation is appropriate so when one is called it is genuine and in these instances the supervisor will be entitled to get paid”.

I’m sorry if I sound a little despairing at this, not least because of course the cynic may see this as yet another avenue for IPs to make some easy money! It was something that I’d heard about when I was at the IPA – that some IPs were struggling with IVA debtors who wanted, say, to offer a full and final settlement to the creditors that the IP was confident would be rejected by creditors, but under the Protocol terms it seemed that they had no choice but to pass the offer to creditors. I’m just surprised that this issue has not yet been resolved.

Recent pension changes

The minutes simply state: “InsS to enquire with colleagues as to how it is planned to treat these in bankruptcy and feed back”. About time too! Shortly after the April proposals had been first announced, I’d read articles questioning whether the government had thought about how any lump sum – which from next April could be the whole pension pot – would be treated in a bankruptcy. Presumably, legislation will be drafted to protect this pot from a Trustee’s hands, but that depends on the drafter getting it right. The lesson of Raithatha v Williamson comes to mind…

Well, I’m assuming that this is what the Committee minutes refer to, anyway.

Aha, so Dr Judge has been able to spin an increased number of complaints as evidence that the gateway “is meeting the aim of making the complaints process easier to understand and use”! I wonder if, had the number of complaints decreased, his message might have been that insolvency regulation had played a part in raising standards so that there were fewer causes for complaint.

The report mentions that the Service is “continuing dialogue” with the SRA and Law Society of Scotland to try to get them to adopt the gateway.

The Service still seems to be hung up about the effectiveness of the Insolvency Code of Ethics (as I’d mentioned in an earlier post, http://wp.me/p2FU2Z-6I) and have reported their “findings” to the JIC “to assist with its review into this area”.

The Service also seems to have got heavy with the RPBs about complaints on delayed IVA closures due to ongoing PPI refunds. The ICAEW and the IPA “have agreed to take forward all cases for investigation” – because, of course, some complaints are closed at assessment stage on the basis that the complaints reviewer has concluded that there is no case to answer (i.e. it is not that these complaints do not get considered at all) – “where the delay in closing the IVA exceeds six months from the debtor’s final payment”. Does this mean that the regulators’ general view is that any delay under six months is acceptable? Hopefully, this typical Service measure of setting unprincipled boundaries will not result in a formulaic approach to dealing with all complaints about delayed closure of IVAs. And, although the other RPBs may license a smaller proportion of IVA-providing IPs, I wonder what their practices are…

The report also explains that the Service has persuaded the ICAEW to modify its approach a little in relation to complaints resolved by conciliation. Now, such a complaint will still be considered in the context of any regulatory breaches committed by the IP. Years ago, the Service urged the RPBs to consider whether they could make greater use of financial compensation (or even simply requiring an IP to write an apology) in their complaints processes, but there was some resistance because it seemed that the key objective of the regulatory complaints process – to pick up IPs failing to meet standards – was at risk of getting lost: might some IPs be persuaded to agree a swift end to a complaint, if it meant that less attention would be paid to it? To be fair, this has always been an IP’s option: he can always satisfy the complainant before they ever approach the regulator. However, now settling a complaint after it has started on the Gateway path may not be the end of it for the IP, whichever RPB licenses him.

The Statistics

I think that the stats have been more than adequately covered by other commentaries. In any event, I found it difficult to draw any real conclusions from them in isolation, and they don’t add much to the picture presented in the Insolvency Service’s 2013 annual review. That’s not to say, however, that this report has no use; at the very least, it will serve as a reference point for the future.

Ok, the complaints number has increased, but it does seem that the delayed IVA closure due to PPI refunds is an exceptional issue at the moment. Given that the IPA licenses the majority of IPs who carry out IVAs, it is not surprising therefore that the IPA has the largest referred-complaint per IP figure: 0.63, compared to 0.54 over all the authorising bodies (although the SoS is barely a whisker behind at 0.62). My personal expectation, however, is that the Insolvency Service being seen as more involved in the complaints process via the Gateway may alone sustain slightly higher levels of complaints in the longer term, as perceived victims may not be so quick to assume that the RPB/IP relationship stacks the odds so heavily against them receiving a fair hearing.