"Kee Points" with Jim Kee, Ph.D.

Industrial production jumped pretty dramatically in April, which is a good sign for manufacturing. Industrial production numbers are more timely (reported monthly) than GDP numbers (reported quarterly) and better counted, meaning subject to fewer revisions. However, they are also more volatile (i.e., more "head fakes") and less representative of economies to the extent those economies are service-based. Not surprisingly, then, industrial production numbers are a better proxy for the overall economy in countries that are more manufacturing/industrial based, like many emerging market economies, than in economies that are more service-based, like the US. An old economist's trick is to look in the back of The Economist magazine for monthly industrial production growth numbers for the world's major developed and emerging economies, which gives you a quick global snapshot of how things are looking (quarterly GDP growth is also reported there). Also reported are the weekly and year-to-date equity market performance figures for many of those economies. It is very instructive to see how these two sets of numbers often diverge: industrial production, GDP, etc., are quantity data, which are gathered with a lag and subject to revision. Equity market data, on the other hand, are price data and more forward-looking in general. Prices lead quantities.

An interesting fact: Some years ago, a big story was that the number of mutual funds had actually exceeded the number of publicly traded stocks. Recently, the number of "stock index funds" (indices based upon a variety of criteria, e.g., high dividends, emerging markets, etc.) has exceeded the number of publicly traded stocks. And this from Christian Ledoux, our Director of Equity Research: In 2015 the top ten stocks of the US market (S&P 500) were up double digits, while the other 490 stocks were actually negative on average. In fact, by the February 11th low of last year (2016), the average stock was down 26.7% from its 52-week high. But since the S&P 500 is cap-weighted (larger companies get a bigger weighting), the official index was down only 15% from its high. So the majority of stocks in the S&P 500 had already been through a "stealth bear market" (i.e., down more than 20%).
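To see how cap weighting can produce this kind of divergence, here is a minimal sketch in Python. The market caps and returns below are entirely hypothetical (not actual S&P 500 data); the point is only the mechanics: a few large companies with positive returns can pull the cap-weighted index up even while the average (equal-weighted) stock is down.

```python
# Hypothetical illustration of cap-weighted vs. equal-weighted returns.
# Caps and returns are made up; a few large gainers mask broad declines.

caps = [500, 400, 300, 50, 40, 30, 20, 10, 10, 10]  # market caps ($B), hypothetical
rets = [0.15, 0.12, 0.10, -0.25, -0.22, -0.30, -0.28, -0.26, -0.24, -0.27]

total_cap = sum(caps)
cap_weighted = sum(c / total_cap * r for c, r in zip(caps, rets))
equal_weighted = sum(rets) / len(rets)

print(f"Cap-weighted index return:   {cap_weighted:+.1%}")   # positive
print(f"Equal-weighted (avg stock):  {equal_weighted:+.1%}")  # negative
```

With these made-up numbers the cap-weighted index is up roughly 8% while the average stock is down about 14.5%, the same flavor of divergence Christian describes for 2015.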

Antitrust: President Trump’s nominee to head the US Justice Department’s Antitrust Division, Makan Delrahim (confirmation hearings held May 10), is considered pretty hands-off or permissive when it comes to antitrust enforcement. The impact of antitrust actions on the stock market is pretty under-studied, which I find surprising. George Bittlingmayer of the University of Kansas has probably done and published the most work in this area, though his work is somewhat dated. For example, back in the 1980s his research corroborated the view of economist Irving Fisher, who stressed the importance of antitrust restraint for the stock market boom that occurred during the 1920s. In fact, Bittlingmayer argued that the threat of antitrust enforcement gave rise to the financial panics of the early twentieth century. He would look at the relationship between the number of antitrust case filings on the one hand, and stock market declines (reactions) on the other. A paper of his in 1993 argued that antitrust explains ten to twenty percent of annual stock returns, and later work (2000) suggested that actions filed against Microsoft hurt computer industry firms in general. It would be nice if there were more peer-reviewed research to balance out Bittlingmayer’s views, but I don’t come across much. The point is that his research suggests Delrahim’s appointment would be a tailwind for the market.

Finally, economist William Baumol passed away last week at the age of 95 (WSJ), which is an end-of-an-era event for professional economists. Baumol published in just about every field of economics, but I was most familiar with his textbook Economic Theory and Operations Analysis. Operations research, or “OR,” has had several false starts in economics, not unlike game theory or even behavioral economics (more on that next week). It involves mathematical optimization techniques, many of which evolved from military research. Back in the 1960s there was great enthusiasm for applying these techniques to solve business problems, and many large firms established operations research teams. But by the 1970s the limitations of mathematizing business problems became obvious, and most of those programs were scrapped (they had broad success in physical applications, like refinery operations). I see a resurgence of sorts now in the field of business analytics, a product of the data generated through the use of information technology. [Aside: OR practitioners have typically fallen into one of two camps. The first, characterized by the late Gene Woolsey at the Colorado School of Mines (with whom I studied), asserts that you can only understand a process or operation by actually working every stage of it, from operating the cash register to driving the truck (so to speak). The second, characterized by Eliyahu Goldratt and the Theory of Constraints (TOC), places more emphasis on applying theoretical concepts, in this case finding the constraint(s) in the system and applying resources to that constraint (1920s scholars, long forgotten, called it “finding the neck in the bottle”). I think both are viable ways to skin the cat.]
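The core TOC idea can be sketched in a few lines of Python. The stage names and capacities below are hypothetical, just to show the logic: the system's throughput is whatever its slowest stage can handle, so the "constraint" is simply the stage with the lowest capacity.

```python
# A minimal sketch of the Theory of Constraints idea: throughput is limited
# by the slowest stage. Stage names and capacities are hypothetical.

stages = {
    "cutting": 120,   # units per hour
    "welding": 45,
    "painting": 90,
    "assembly": 60,
}

# The constraint ("neck in the bottle") is the stage with the lowest capacity.
constraint = min(stages, key=stages.get)
throughput = stages[constraint]

print(f"Constraint: {constraint} ({throughput} units/hour)")
```

The managerial punchline follows directly: adding capacity anywhere other than "welding" leaves system throughput at 45 units/hour, while elevating the constraint raises throughput until some other stage binds.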