Blog


Friday, April 15, 2011 14:19 (CET)

Posted By Paul Amery

A Game-Changer



Conscious that I’m stepping into what is likely to be a brief hiatus between this week’s salvo of ETF-related warnings from the G20 Financial Stability Board, the IMF and the Bank for International Settlements, and a likely counterblast of official responses from ETF issuers, here are some initial thoughts.

My first impression is that this week will go down in history as having changed the way in which the ETF industry operates. It’s unlikely that things can go on as before, with a growing number of increasingly disparate fund (and non-fund) structures all being labelled with the same brand name.  How this is all resolved is anyone’s guess, but an obvious conclusion is that the ETF issuer’s own good name is going to be even more important than before. Trust in the issuer will override belief in the product concept. The industry has been surfing on a wave of pro-ETF sentiment for too long, perhaps.

Second, the ETF market’s growth rate will slow, and could even pause or go into reverse. The focus in the BIS paper on regulatory arbitrage by banks—the use of ETFs as a funding mechanism for parent investment banks, with the maturity of the swap being used to disguise the fact that the bank is effectively receiving overnight funding from the ETF while also obtaining relief from key liquidity metrics for the calculation of bank capital requirements—raises serious questions about the whole business model of synthetic ETF replication.

A regulatory tightening in this area could make synthetic replication uneconomic, while even the maintenance of the status quo may make it difficult for synthetic ETF issuers to expand their operations (i.e., for banks to assign much more balance sheet usage for the writing of ETF-related swaps). And, without doubt, after this week’s flurry of ETF-related questions, plenty more risk managers and compliance departments are going to be asking questions about ETF structures—and, I’m guessing, not just at buy-side firms, but within issuers’ parent companies as well.

Third, the physical versus synthetic replication debate has been reignited (perhaps, more broadly, one should speak of the battle between those ETF issuers that come from a fund management background and those that are part of investment banks). iShares is the leading exponent of the physical replication model, while Lyxor and db x-trackers are the leading firms in the synthetic category. In addition, there are now also several hybrids combining elements of both business models (Credit Suisse, for example, and HSBC to a lesser extent).

It was noticeable that iShares was quickest off the mark in responding to the first policy statement this week, that of the FSB on Tuesday. Within 24 hours iShares had put out a press release, arguing that the FSB had hit the nail on the head by pointing out potential conflicts of interest where swap-based ETFs and their derivatives counterparties sit within the same bank. iShares also applauded the FSB’s highlighting of the risks that arise if a bank uses synthetic ETFs as an inexpensive source of funding for illiquid securities.

But did the FSB thereby give the green light for the physical replication model, as iShares implies? Hardly. According to the FSB paper’s authors: “Securities lending…may create similar counterparty and collateral risks to [those incurred by] synthetic ETFs. In addition, securities lending could make the liquidity position of the ETF fragile, by challenging the ability of ETF providers to meet unexpected liquidity demands from investors, particularly if outflows from ETFs become significant under severe stress. A prevalence of securities lending could create a risk of a market squeeze in the underlying securities if ETF providers recalled on-loan securities on a large scale in order to meet redemptions. In addition, the use of ETFs as collateral in a long chain of secured lending and rehypothecation may create operational risks and contribute to the build up of leverage.”

That’s unequivocal. Meanwhile, the claim made by iShares’ European boss, Joe Linhares, in Wednesday’s press release, that the firm “has always been transparent about the revenues generated and the risk framework surrounding its [securities lending] activity” is stretching reality.

iShares does disclose the revenue split between the fund manager (BlackRock) and the iShares funds (it used to be 50/50; from November last year the manager handed a further 10 percentage points of the revenue back to fund investors, creating a 60/40 split in the funds’ favour). But as far as the transparency of the securities lending risk framework is concerned, the manager’s historical practices have left much to be desired.

For example, in the iShares plc annual report and accounts for 2009, on page 57 (under note 19, “securities lending income”) there’s a table of approved lending counterparties. However, while there’s a full list of counterparties that were in use on 29 February 2008, there is no list for 28 February 2009. Why the decrease in disclosure, year-on-year, for what is merely an annual snapshot of iShares ETFs’ lending relationships? “Because it’s not required by the Irish regulator,” I was told by an iShares spokesperson when I asked this question last year. So much for voluntary transparency.

The fund manager has recently started publishing a regular, more comprehensive review of its securities lending activities, in which counterparty names are once more disclosed (though the amounts on loan and revenues generated by each counterparty, valuable pieces of information for anyone wanting to assess risks, are not).

ETF—and securities lending—market practice is still far from the level of transparency that the regulators (and, I assume, most investors) would like to see. On the face of it, those synthetic ETFs that disclose their collateral basket daily are offering a far greater level of openness. In fact, in its ETF paper, the BIS analyses the db x-trackers’ MSCI Emerging Markets ETF on the basis of just this information, which is available on the issuer’s website. No one is yet able to perform an equivalent analysis where lending activities in a physically replicated ETF are concerned, as far as I’m aware.

Finally, the BIS ETF paper in particular looks in detail at the important differences between the varieties of synthetic ETF structure. In the unfunded swap structure, the ETF owns the assets in the collateral basket, while in the funded swap structure (a misnomer, according to the BIS, since no true swap is involved) the ETF investor has only a pledge of collateral from the counterparty. A pledge is merely a promise to repay, and investors in a funded swap ETF could face a lengthy delay if a counterparty fails and an administrator steps in. In the unfunded swap structure, outright ownership of the collateral should allow the fund to liquidate it at will in an emergency. (We covered this subject in some detail in an article in November, “A Swap-based ETF Checklist”.)

But there are additional subtleties to consider. In the funded swap model investors typically benefit from a greater degree of overcollateralisation than when unfunded swaps are involved. Further, points out the BIS, the unfunded swap model gives the bank writing the swap (and transferring assets) leeway to alter its risk-weighted capital charges, while granting a pledge to the ETF doesn’t. This in turn implies economic incentives for the bank which may increase risks to the ETF end-investor (if the bank’s funding structure is compromised as a result of the widespread use of this practice).

How all this works out and which synthetic ETF model is likely to end up as preferred is unclear to me. I haven’t even discussed the multi-swap provider model used by Source and ETF Securities. Will this turn out to be a winner from the debate? I’d be interested to hear experts’ opinion on this—please leave comments at the bottom of this blog, rather than emailing me directly. Use a pseudonym if necessary!

There are plenty of more obvious points of discussion arising from this week’s regulatory onslaught. Perhaps the most interesting of these relate to possible systemic risks resulting from the widespread use of ETFs and a mismatch between investors’ perception of ETF liquidity and market realities. The comments of Srichander Ramaswamy, author of the BIS paper, are particularly interesting in this regard. Could large-scale redemption pressures on an ETF ignite a wider financial crisis? We’ll return to this topic next week.


Friday, April 08, 2011 13:49 (CET)

Posted By Paul Amery

Keep A Steady Hand On The Tiller



As my colleague Dave Nadig pointed out in his blog a couple of days ago, the compilers of the Nasdaq 100 index have got themselves in a muddle over stock concentration limits.

When the owners of the US stock exchange’s popular benchmark wanted to license it for use by exchange-traded fund providers back in the late 1990s — and, as Dave argues, which provider wouldn’t want to make money from his index in this way? — they faced a problem.

In order to keep an index-tracking ETF compliant with the US Internal Revenue Service’s diversification rules, which set a 25 percent upper limit for a single fund constituent, Nasdaq had to rejig its index rules, which until then had set stock weights according to company capitalisation. The reason? One stock — Microsoft — exceeded 25 percent of the benchmark back in 1998, and until they got its weighting down there was no way of launching an ETF on the Nasdaq 100 that wouldn’t be tax-disadvantaged.

The exchange/index compiler got around the problem by imposing “adjustment factors”. These cut Microsoft’s weighting to below 20 percent, reduced the weighting of other large-caps (defined as those representing more than 1 percent of the index at the time) by the same proportion, and also boosted the weighting of the smaller companies in the benchmark.

Now, thirteen years later, Nasdaq’s index committee faces a new problem. As a result of the stock’s underperformance, Microsoft’s index weighting has happily shrunk over the years, so the company no longer threatens to break the 25 percent limit. But a new tech stock darling, Apple, has reached over 20 percent of the Nasdaq 100 and is heading towards the weighting ceiling.

Furthermore, the index weightings are now far from being an accurate reflection of market capitalisation. As a result of the 1998 imposition of adjustment factors, Apple’s index weight was boosted from 0.4 percent to around 0.9 percent, while at the end of 1998 Microsoft’s weight had been cut from an unadjusted 22.5 percent to an adjusted 14.5 percent.

But by mid-2010, Apple and Microsoft, while having similar capitalisations overall, represented 20 percent and 4.3 percent of the index, respectively, purely as a result of the adjustment factors that they’d inherited from the late 1990s. In fact Nasdaq’s description of its index as “modified capitalisation-weighted” didn’t make sense at all.
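For readers who want to see the mechanics, here is a rough sketch in Python, with made-up numbers (these are not Nasdaq’s actual capitalisations or factors), of how fixed adjustment factors, once set, can decouple index weights from market capitalisation:

```python
# Illustrative sketch only: a "modified cap-weighted" index weights each
# stock by capitalisation times a fixed adjustment factor, then normalises.

def adjusted_weights(caps, factors):
    """Weight each stock by capitalisation times its adjustment factor."""
    raw = {s: caps[s] * factors[s] for s in caps}
    total = sum(raw.values())
    return {s: v / total for s, v in raw.items()}

# Hypothetical caps (in $bn) and factors inherited from a 1998-style rebalance
caps = {"A": 230, "B": 225, "C": 545}      # A and B have similar capitalisations
factors = {"A": 1.0, "B": 0.2, "C": 1.0}   # B's factor was cut years ago

w = adjusted_weights(caps, factors)
for s in sorted(w):
    print(s, f"{w[s]:.1%}")   # A 28.0%, B 5.5%, C 66.5%
```

With these inputs, two stocks of near-identical capitalisation end up with wildly different index weights, which is exactly the Apple/Microsoft situation described above.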

Recognising the new imbalance in the index make-up, Nasdaq announced on Tuesday this week that it will conduct an extraordinary rebalancing in early May to bring stock weights back  into line with actual market capitalisations. Apple’s weight will fall from 21 percent to 12 percent, correcting the largest weighting skew.

If all this seems arbitrary and confusing, that’s because it is. The way this particular index has been managed reminds you of a boat’s helmsman who can’t steer. A steady hand on the tiller keeps the boat moving in the same direction and at full speed; wild oversteer takes you off course and loses you ground in a race.

Nasdaq is not the only culprit in offering a benchmark that seems poorly designed for use in stock selection. If you’re new to indexing, you may be surprised to find out that the membership criteria for companies entering the world-famous S&P 500 and Dow Jones Industrial Average indices are also highly subjective.

The Dow’s components are chosen by an “averages committee” comprising the managing editor of The Wall Street Journal, the head of Dow Jones Indexes research and the head of CME Group research.

Selection for the S&P 500 is also at the discretion of an index committee, the goal of which is “to ensure that the S&P 500 remains a leading indicator of US equities, reflecting the risk and return characteristics of the broader large cap universe on an ongoing basis”.

According to one well-founded analysis of the S&P 500 index committee’s stock picking record, the committee members are subject to the same style biases and drift as the average active manager. They boosted the index’s weighting in tech stocks during the bubble of the late nineties, only to remove several of the same names shortly thereafter; and they relaxed a longstanding prohibition on including holding companies in 2001, allowing lots of real estate investment trusts to be added to the index during the greatest real estate bubble in US history.

This hasn’t stopped the index from being used as the underlying benchmark for the largest ETF in the world, the SPDR S&P 500 ETF (which, at the latest count, had US$94 billion under management).

But amid all the euphoria over passive investing and the exchange-traded fund market’s growth rates, when you’re selecting a tracker product it’s worth casting a very sceptical eye over the index being used.

Index providers will license any methodology for money, and exchange-traded product issuers will sell pretty much any concept that can gain assets. But investors should stick to indices that have objective and clearly understandable rules, and above all rules that make investment sense.






Wednesday, April 06, 2011 13:42 (CET)

Posted By Dave Nadig

The Worst Index In The World



The Nasdaq 100 is the index behind one of the largest ETFs in the world—the US-listed PowerShares Q’s (NYSEArca: QQQ). According to yesterday’s Wall St. Journal, over US$300 billion of investor money is managed according to the list of companies in the 100. Why?

When we look at an index at IndexUniverse, we focus on two things: selection, and weight. The 100 is broken on both counts.

To get into the Nasdaq 100, here’s what you have to do:

  1. Happen to have Nasdaq as your primary listing
  2. Not be a financial company (for no particular reason)
  3. Be “seasoned,” which means being on Nasdaq for two years, or being in the top 25 percent of the Nasdaq 100 in terms of market cap

In other words, the list is based entirely on “what’s working now for Nasdaq.” Because of Nasdaq’s history as the home base of the dot-com bubble, the list is peppered with technology names, but also includes companies ranging from Staples to Sears. If a NYSE company wanted in, presumably all they’d have to do is change exchanges and wait six months. A rigorous selection process this ain’t.

But as arbitrary as the inclusion criteria for the 100 are, it’s the weighting scheme that really makes the 100 the worst index ever made by man.

As I wrote about last fall in a blog called “QQQQ Follies,” the index uses a modified market-cap weighting scheme but, in this case, that’s like suggesting that a garbage truck is a “modified” roadster—you could win a case in court on the distinction, and it would still stink.

Before Nasdaq wanted to launch mutual funds and ETFs based on the index—and what index owner doesn’t?—it was pretty vanilla, and Microsoft was over 25 percent of the index (back in 1998). To bring Microsoft down to an investable level (’40 Act funds have difficulties at 25 percent in any one holding), they created a crazy system that turned the 100 into a quasi-equal-weighted index.

Here’s the rule, which was just triggered. A rebalance occurs:

  1. When any security gets over 24 percent; or
  2. When the aggregate of positions of more than 4.5 percent is greater than 48 percent; or
  3. Whenever Nasdaq feels like it—seriously, that’s the trigger this time.

Once triggered, what’s supposed to happen is that the weights of all stocks that are over 1 percent of the index are reduced in tandem until the offense is cleared up; that is, when either the big stock goes to 20 percent, or the aggregate 4.5 percent positions get to 40 percent. The pool of weight that was freed up in that ratcheting down is then distributed, issue by issue, to those stocks under 1 percent of the index. So if a company has a 0.99 percent weight, it gets pushed up to 1 percent. If there’s weight left over, the process works down the list to the stock with a 0.98 percent weight, and so on.
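If you prefer to see that procedure written down, here’s a simplified sketch in Python, using hypothetical weights and handling only the single-big-stock trigger (the real methodology has more cases and, of course, runs over the full list of 100 names):

```python
# Simplified sketch of the redistribution step described above: stocks over
# 1% are scaled down in tandem until the biggest lands on the 20% target,
# and the freed weight tops up sub-1% stocks to 1%, largest first.

def rebalance(weights, big_cap=0.20, small_floor=0.01):
    w = dict(weights)
    large = [s for s in w if w[s] > small_floor]
    # Scale the large stocks so the biggest lands exactly on the 20% target
    biggest = max(large, key=lambda s: w[s])
    scale = big_cap / w[biggest]
    freed = 0.0
    for s in large:
        freed += w[s] * (1 - scale)
        w[s] *= scale
    # Distribute the freed weight to the small stocks in descending order,
    # topping each up to the 1% floor until the pool runs out; in the real
    # index any remainder keeps working down the full list of names.
    for s in sorted((s for s in w if w[s] <= small_floor), key=lambda s: -w[s]):
        top_up = min(small_floor - w[s], freed)
        w[s] += top_up
        freed -= top_up
    return w

# Hypothetical weights; the rest of the index is omitted for brevity
weights = {"AAPL": 0.24, "QCOM": 0.05, "X1": 0.009, "X2": 0.008, "X3": 0.004}
new = rebalance(weights)
```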

It’s an arbitrary and ridiculous way to try and “equal-weight” the bottom end of the index, and honestly, typing it now, it seems too silly to be true. But if you don’t believe me, read the rule book.

But here’s where it gets really ridiculous. As crazy as all this is, at least it’s predictable and in the rule book. But in this case, Nasdaq seems to be throwing the rule book away. As of last night, Apple was only 20.36 percent of the index, and only one other stock was over the 4.5 percent cap—Qualcomm at 4.89 percent. So Nasdaq has just decided to reconstitute the index by fiat, and apparently arbitrarily. Here’s the official rundown.

What it comes down to is a giant reset button: On May 2, the index will just magically change from its current weighting scheme to an actual market-cap weighting scheme, or at least, one that puts the arbitrary list of 100 securities in market-cap order and in market-cap weights.

And then, apparently, Nasdaq will send the 100 out into the wilds again, until the next time one of the true magic rules is triggered. Or they just decide to mess with it again.

This isn’t index management, it’s index mismanagement. Is the new 100 “better” than the old 100? Probably. Is it worth the rather substantial chaos this move will cost the markets for the next month? Doubtful.

The 100 is broken by design. Putting the Band-Aid on now, without fixing the underlying methodology, is just kicking the problem down the road.





Friday, April 01, 2011 11:49 (CET)

Posted By Paul Amery

Shifting Regulations



It’s impossible to open a financial publication these days without sensing the major, often behind-the-scenes conflict that’s going on over the regulation of the banking sector.

Its central front is the head-to-head confrontation between regulators and financials over bank capitalisation and the “too big to fail” (TBTF) issue.

The UK’s Bank of England has been striking a belligerent posture in recent weeks, calling for a doubling or tripling of tier one bank capital from the minimum 7% of risk-weighted assets set out in the Basel III rules.

Mervyn King, the Bank’s governor, recently repeated his call for the break-up of banks into separate investment and retail arms, with a state guarantee on offer only to the restricted, retail part.  Andrew Haldane, executive director for financial stability at the Bank, gave a presentation to the Institute of International and European Affairs in Dublin earlier this year (available here), in which he makes it pretty clear that tackling TBTF (and reducing or eliminating the possibility of the financial sector ever offloading its liabilities on the taxpayer again) is concern number one in Threadneedle Street.

Similar debates are taking place elsewhere, although in the US market the former central bank chairman is striking a stance that’s much more pro-financials, anti-regulation.  In an opinion piece in the Financial Times from earlier this week, Alan Greenspan argues against excessively strict implementation of the Dodd-Frank act.

(Greenspan’s central argument is that financial markets are driven by an international version of Adam Smith’s ‘invisible hand’ and are therefore best left to their own devices. “With notably rare exceptions (2008, for example), the global ‘invisible hand’ has created relatively stable exchange rates, interest rates, prices, and wage rates,” he says.  Perhaps Dublin’s IIEA—Irish taxpayers were yesterday landed with a new bill for their country’s bank clean-up, amounting to a whole year of gross income for each worker in the country—should invite him to defend his thesis in public.)

The response of the banks has been fierce.  In the UK, both HSBC and Barclays have repeated threats to leave the country.  In the US, JP Morgan’s chairman, Jamie Dimon, spoke out on Wednesday against a provision in Dodd-Frank that would require US banks to spin off their over-the-counter (“OTC”) derivatives operations.

Increases to capital requirements, while reducing the overall risk of banks, would also slow down the manic pace of financial sector innovation and put downward pressure on pay (lower leverage in banks means reduced share price volatility and a reduced value for the major part of financial executives’ remuneration that comes in the form of a call option on future performance).

There are other consequences of proposed changes to the regulation of financial derivatives.  As Tracy Alloway described on this site a few weeks ago, the push towards central clearing has implications for collateral costs and therefore for the overall economics of the business of swap providers.  Two-thirds of Europe’s exchange-traded funds are swap-backed.

While swap-backed ETFs have to be collateralised to a minimum 90% of net asset value, the provision of collateral both to a higher percentage and in a higher quality form would impact issuers’ profitability.  But that’s the way the market is moving.  In the words of one panellist at Data Explorers’ Securities Financing Forum, held in March, “all of the derivatives world is moving towards collateralised structures. And people want high-grade collateral, while not everyone has it.”
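The arithmetic behind that 90% figure is simple: as I understand the UCITS rules, a fund’s net exposure to a swap counterparty may not exceed 10% of net asset value, so the collateral has to cover at least 90%. A quick sketch, with hypothetical numbers:

```python
# Illustrative arithmetic only: uncollateralised swap counterparty
# exposure is the shortfall of collateral against NAV, as a share of NAV.

def swap_exposure(nav, collateral_value):
    """Uncollateralised exposure to the swap counterparty, as a share of NAV."""
    return max(0.0, nav - collateral_value) / nav

nav = 100_000_000          # hypothetical fund NAV
collateral = 93_500_000    # market value of the collateral basket

exposure = swap_exposure(nav, collateral)
print(f"{exposure:.1%}")   # 6.5% -- within the 10% limit
breach = swap_exposure(nav, 88_000_000) > 0.10  # 12% would force a collateral reset
```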

Meanwhile, the business of securities financing itself—something key to the operation of a range of exchange-traded and index funds—”was under the radar of regulators for a number of years, but is now paying the price”, according to Kevin McNulty, CEO of the International Securities Lending Association, speaking at the same event.

The battle over financial regulations is one reason we chose the subject as the central theme of our inaugural Journal of Indexes Europe issue (the publication, entitled “Shifting Regulations”, is due out in a couple of weeks and you can register for a free subscription here).

We’ll also be covering the changing regulatory landscape in detail at our forthcoming Inside ETFs Europe conference in Amsterdam on May 5/6. (Admission is free to institutional investors and the event is filling up fast, so please register to join us).  The conference has separate panels devoted to both tax and regulation, with high-level speakers on both subjects.  I, for one, am looking forward to those discussions with particular interest.





Thursday, March 24, 2011 17:53 (CET)

Posted By Paul Amery

Which Tracker?



In the feature article we published yesterday, I described how the London Stock Exchange listings of three Japanese ETFs saw some breaks in market maker activity and exceptionally high dealing spreads in the post-tsunami period. This made trading in these funds both difficult and perilous (you risked buying or selling at a price that was significantly different from a fund’s underlying fair value).

Would it therefore have been better to use a traditional index fund to buy into Japan? Before attempting to answer that, let’s look at the prices recorded on the London Stock Exchange between Monday and Thursday last week in all three of the Japan ETFs we wrote about in our feature: IJPN, LTPX and XMJP.

The scale on the right hand side of the chart gives the percentage price move in these funds during last week from the closing price recorded at the LSE on the previous Friday, March 11.



The reason Lyxor’s fund (LTPX, represented by the magenta line) looks out of sync with the other two ETFs at the left of the chart is that it didn’t trade at all on the London Stock Exchange on Monday, March 14. The exchange recorded a closing price on that day that was unchanged from the closing price on Friday the 11th. On Tuesday, mid-morning UK time, when trading in LTPX resumed, the fund’s price joined those of IJPN and XMJP in a near-15% decline from Friday’s (London) closing level.

The UK retail investor I wrote about in the feature article did get his trade done, despite a delay, late on that Tuesday morning in London, and you can see from the evolution of prices during the remainder of the week that Tuesday turned out to be a good time to buy into Japan’s equity market.

Let’s imagine that another, hypothetical European investor, also noticing the sharp decline in Japan’s share market on Monday and Tuesday last week (the Nikkei 225 index fell 6.18% on March 14 and then an additional 10.55% on March 15), decided to buy into the market using a traditional index fund.

Taking Vanguard’s Japan Stock Index fund as an example, it would have been necessary to send in a subscription agreement to the fund’s administrator by 4pm on Tuesday. The investor would then have been allocated shares in the fund based on the next day’s closing price (in Tokyo) for the Japanese shares underlying the index.

By investing only at the close of Tokyo trading the next day – and it’s impossible for an index fund to get subscription money into the market any earlier – our hypothetical investor would have missed out on the 5% rise in Japanese shares that took place on Wednesday.
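To put a rough number on what that timing difference cost, here’s a back-of-the-envelope calculation using the daily moves quoted above (illustrative only; it ignores spreads, fees and currency effects):

```python
# Comparing the entry level of an ETF bought intraday on Tuesday with an
# index fund subscription filled at the next day's Tokyo close, using the
# approximate daily index moves quoted in the text.

friday_close = 100.0
monday = friday_close * (1 - 0.0618)    # Nikkei 225, March 14
tuesday = monday * (1 - 0.1055)         # March 15
wednesday = tuesday * (1 + 0.05)        # ~5% rebound, March 16

etf_entry = tuesday                     # immediate execution on Tuesday
fund_entry = wednesday                  # next Tokyo close for the index fund

premium = fund_entry / etf_entry - 1
print(f"index-fund buyer pays about {premium:.1%} more")  # roughly 5%
```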

This is all with the benefit of hindsight, of course, but it illustrates that there’s no easy answer to the question of whether to use an index fund or ETF.

With an ETF, you should get immediate execution, and therefore certainty as to the index level you’re investing at, but you clearly risk paying a significant spread to fair value in volatile markets. Furthermore, there’s no easy way to determine what that fair value is, particularly when the market underlying the tracker is closed (as is the case when Japanese equity ETFs are traded in European time). To get a feel of how Japan’s market is going to trade the next day you can look at the Chicago Mercantile Exchange’s Nikkei 225 futures, which trade almost around the clock, as a proxy, but these futures may not give an accurate hedge for other widely tracked Japanese equity indices, such as the TOPIX and the MSCI Japan.

With an index fund (or if you’re trading an ETF at the end-of-day net asset value, as many investors do), you’re eliminating the risk of paying a large dealing spread, but you’re also giving up certainty as to when you will be gaining the market exposure. Place an order to buy a Japan index fund in the morning European time, and you have both a full European trading day and then a full Japanese trading day ahead of you until you buy into the market.

(For the purposes of this discussion I’m ignoring structural differences between index funds and ETFs, such as the possible dilution that may occur within index funds with large cash inflows, whereas ETFs effectively reflect the cost of buying into the underlying shares via their own bid-offer spreads.)

The extent to which you value the extra flexibility that ETFs give you depends on the kind of investor you are. If you’re investing a lump sum in the market each month, you’re probably better off with an index fund, or buying into an ETF at NAV, if that option is there. If you want to use your asset allocation skills more aggressively, then ETFs are the perfect way to do this. But, as we saw last week, you have to pay very careful attention to the price you’re getting and how your trade is executed.


The views expressed by those blogging are for informational purposes only and should not be construed as a recommendation for any security.