December 05, 2008

Since Enron, Little Has Changed

This post was co-authored with Malcolm Salter, the James J. Hill Professor Emeritus at Harvard Business School.

Enron’s demise in 2001 and the collapse of some of our most prominent financial institutions this fall share a common root cause: a shocking breakdown in corporate governance resulting from the endorsement of perverse financial incentives by directors, coupled with ineffective monitoring of firm-wide risk.

As Warren Buffett has said, “Executive compensation is the acid test of corporate governance.” Financial incentives determine what objectives an organization pursues, and they drive the way managers conduct a business.

Enron’s board ratified a cocktail of financial incentives and compensation contracts that promoted reckless gambling with shareholder money. Its bonus system, in particular, gave new business developers and commodity traders strong incentives to inflate estimates of future profitability in order to pocket annual bonuses before the actual performance of multi-year transactions was known.

With no mechanism to recapture bonus payments for unprofitable contracts, executives faced little accountability for how they deployed shareholder capital. In addition, short-term quantitative criteria displaced qualitative measures of executive performance. The result was overcompensation, outsized risk-taking, and supreme overconfidence.

In the current subprime crisis, mortgage bankers and some commercial bankers deployed similar incentives in pursuit of short-term gains. Mortgage brokers, for example, were paid on commission and faced no economic penalty for writing poorly performing home loans. Mortgage bankers earned fees for packaging home loans into multi-layered collateralized debt obligations sold to investors, yet they bore no liability for the credit quality of these complex securities.

The result of these perverse incentives was as predictable for these bankers as it was at Enron: executives were rewarded for excessive risk-taking in pursuit of short-term gains. They received outsized cash bonuses for closing deals and selling securities before any evidence of future profitability existed. All this encouraged deception and carelessness in the management of firm-wide risk.

The history of Enron and the unfolding story of the current banking crisis suggest important lessons for boards in designing executive compensation programs:

Awarding performance bonuses based on estimated future cash flows and profits eliminates accountability and invites employees to chase short-term pecuniary gains while risking the company’s viability.

Failing to include provisions for rescinding bonuses if companies later restate their reported performance creates perverse incentives for executives and promotes gaming behavior.

Awarding stock grants without extended holding periods enables executives to benefit from short-term stock price fluctuations, putting corporate insiders in conflict with ordinary shareholders.

Like the Enron board, directors at Lehman, UBS, Wachovia, Washington Mutual, Citigroup, and Fannie Mae did not understand how their compensation systems drove behavior, thereby creating the conditions that led to their firms’ collapse. They neither detected nor deterred the gambling that those compensation plans made inevitable.

Were these boards incompetent, uninformed, or simply intimidated by powerful CEOs? The answer is not entirely clear. Whatever the reason, the outcome was the same: they failed in their fiduciary duty to govern.

If self-governance is to be preserved as a principle of corporate law, several improvements are required to protect against future breakdowns in board governance. Public companies should select only directors who have the time to serve effectively. Commensurate with that time commitment and with increased levels of director liability, director compensation should be raised and tied to the long-term performance of the firm.

Directors should be required to put more of their wealth at risk through investments in company shares so that their interests align with those of shareholders. Holding periods for restricted stock and stock options awarded to directors should extend to ten years or until retirement. In the event of a corporate failure, directors should forfeit the compensation they have earned since joining the board.

Finally, the roles of board governance and management must be clearly delineated and separated from each other. According to a study by Spencer Stuart, 95% of S&P 500 companies currently have “lead” or presiding directors who coordinate the work of all independent directors, up from 36% in 2003.

The non-executive chair, or lead director, provides an independent voice in recruiting new directors, approving board meeting agendas, requesting information on firm-wide risks, evaluating CEO performance, and creating a process for CEO succession. He or she also organizes the independent directors in the event of an unexpected issue, such as a takeover bid, the resignation of the CEO, or a financial crisis. Without clear separation of board governance and corporate management, the entire corporation may be put at risk.

If directors fail to provide clear oversight of executive compensation and risk-taking, they may cede their fiduciary responsibilities to groups of dissident shareholders and, ultimately, to the government. The Enron case resulted in the rushed passage of the Sarbanes-Oxley legislation, a process that took just 31 days and considered only limited input from the business community. Unless boards of directors act immediately to fulfill their fiduciary responsibilities, the same could happen again in 2009.

In our opinion, this would not be in the best interests of free-market capitalism and the growth of the U.S. economy, but it may happen unless boards take their responsibilities very seriously.