Ethics In An Unethical World

By Russ Banham

Corporate Board Member

Fred Davidson was stifling in the unbearable heat of a midsummer’s day in 2002. Temperatures in the corrugated steel warehouse hovered around 100 degrees. Weeks earlier, Energold Drilling, a global drilling solutions company that operates 270 rigs in 24 countries across the Americas, Africa and Asia, had imported several drill rigs into the country. The rigs were now blanketed in a fine layer of dust. For nine hours straight, the customs agent ticked off one supposed problem after another with the rigs’ components—none of them actual regulatory infringements.

It was a test of wills between the two men. “It was not a lot of money he wanted, but in no way were we going to set a precedent,” says Davidson, who is Energold’s CEO and a director on its board. “If I was going to shed a few pounds of weight in that warehouse, he was going to shed a few pounds with me.”

Finally, the customs agent backed away from his unspoken expectation of a payment and released the rigs. “I had sent a clear message that we would never solicit preferential treatment for a bribe,” Davidson says. “Some companies just pay it, figuring it’s the cost of doing business and they won’t get caught. But it’s a slippery slope.”

This slope remains just as slippery in 2019 and is a growing risk for corporate boards of directors at fast-growing global companies. Not only is corruption a financial disaster for the companies they serve, it can dig into their own personal pockets. “Board members, as individuals, may be held civilly and criminally liable if they lack knowledge ‘about the content and operation of the (company’s) compliance and ethics program,’” says Pamela Passman, citing boilerplate from the U.S. Department of Justice. Passman is the CEO of the Center for Responsible Enterprise and Trade (CREATe), a non-governmental organization promoting anti-corruption best practices.

Aside from financial liability, board members have a responsibility to ensure that business strategy and the objectives of the company’s anti-bribery program are aligned, says Passman. “The challenge is that global oversight of bribery and corruption is complex, and there’s a limit to what boards can review,” she adds. The price for companies that downplay these risks has never been higher, says David Montero, author of Kickback, a history of corporate corruption. “Federal law enforcement agencies both in the U.S. and abroad are showing greater determination to crack down on corruption,” he explains. “Fines also are mounting, with the Justice Department pocketing more than $11 billion since 2006. Now, more than ever, board members should be apprised of the corruption risks involved in global expansion.”

A GROWING CONCERN

It has been 42 years since the Foreign Corrupt Practices Act (FCPA) became U.S. law, yet bribery continues to be the “cost of doing business” in many countries worldwide. The United Nations estimates that corruption eats up some 5 percent of the world’s GDP, a shocking figure. The most crooked places on Earth to do business are Somalia, Syria, North Korea, Venezuela, Iraq and Haiti, and the least corrupt are Denmark, New Zealand, Finland and Singapore, according to Transparency International’s 2018 Corruption Perceptions Index, the leading indicator of public sector bribery on a country-by-country basis.

It’s the countries in between that give pause. India ranked 78th of the 180 countries on the list, tied with Ghana and Burkina Faso. Argentina ranked 85th, and China and Serbia tied at 87th. Although Vietnam’s economy is soaring, the country ranked a dismal 117th, tied with Pakistan. What about the U.S.? It ranked 22nd, sandwiched between France and the United Arab Emirates. The U.S. received a score of 71 on a 0-to-100 scale in which 0 is most corrupt and 100 is least. Somalia, at the bottom of the list, scored 10.

To be sure, the FCPA, FCPA-like laws in the UK, Brazil and Canada, and the equally punitive provisions of the OECD Anti-Bribery Convention, ratified by 43 countries, have had an effect. These regulations prohibit companies and their representatives from influencing foreign officials with payments or rewards to obtain preferential treatment in winning or retaining business.

Civil and criminal sanctions for anti-bribery violations are significant and sobering. As directors know, the FCPA authorizes the U.S. Securities and Exchange Commission to bring civil enforcement actions against company officers, directors, employees and stockholders. Those found to have committed a violation must disgorge the ill-gotten gains and pay substantial civil penalties; criminal convictions can also mean prison time. Sixteen companies paid a record $2.89 billion in 2018 to resolve FCPA cases.

“We’ve seen significant reduction in bribery-related crimes by U.S. companies in the past dozen years,” Montero says. “For roughly 30 years, FCPA criminalized commercial bribery overseas, but enforcement was laughable. The Justice Department had one full-time prosecutor, literally the same guy from 1977 to 2005. No one gave the law much thought.”

No longer is this the case. While stricter anti-bribery oversight and enforcement have resulted in fewer companies offering payments for preferential treatment, they have not curtailed the practice of corrupt government officials asking for them. The payments typically are billed as “surcharges” and “commissions” to help companies disguise the fact that they are skirting the law.

“Paying a bribe comes at a cost, but not paying a bribe also comes at a cost—in ‘lost’ contracts, slow licensing timeframes and other unnecessary delays and bureaucratic roadblocks,” says Montero. “Further incentivizing a bribe is the knowledge that a competitor will pay one and get away with it.”

He’s referring to companies in countries that are not signatories to the OECD Anti-Bribery Convention or bound by the FCPA and similar laws. Such companies, says Daniel Wagner, CEO of consultancy Country Risk Solutions, “not only are legally permitted to pay bribes, they’ll often receive a tax deduction for the amount paid, which puts their competitors at a distinct disadvantage.” Among the countries permitting tax deductions for bribes shown to be a necessary part of a transaction are Austria, Belgium, France and Germany.

This preferential treatment is a “competitive injustice” to companies that will not stoop to paying a bribe, says Jim Nelson, president and COO of Parr Instrument Company, a manufacturer and global seller of chemical reactors and pressure vessels in 75 countries. “It hurts us financially and ticks me off personally when I find a competitor that’s not bound to the FCPA paying a bribe without a care in the world,” says Nelson.

“Bribery is a continual problem,” says Jon R. Tabor, chairman and CEO of Allied Mineral Products, a global manufacturer of monolithic refractory products for a myriad of industrial applications. The company has 12 manufacturing plants in eight countries, including China, India, Russia, Chile and Brazil, and a sales presence in more than 100 countries. “We’ve been asked and refused to pay bribes in Russia, China, India, and elsewhere,” Tabor says. “Once you start paying, you get a reputation for it and it never stops.”

Davidson agrees. “You pay just one person and the word gets out, and now you’re expected to pay everyone,” he says. “It’s like quicksand—you put your foot into it, and it gradually sucks you in.”

HUMAN NATURE

Why do companies risk their reputations by committing these crimes? They figure they’ll get away with it, says Montero. “In my research, I discovered that only about 20 percent of companies [that engage in bribery] ever get caught,” he says. “While the fines may look astronomical, they’re minuscule relative to the value of selling in an overseas market.”

The good news is that fewer companies are taking the risk. “The fines are an impediment, but it’s really the risk of imprisonment and debarment from a country [to procure business in the future] that are starting to make a positive difference,” says Patrick Moulette, head of the anti-corruption division of the OECD’s Directorate for Financial and Enterprise Affairs.

Since the OECD Anti-Bribery Convention entered into force in 1999, 560 individuals and 184 business entities have received criminal sanctions for foreign bribery. At least 125 of those individuals were sentenced to prison, with 11 receiving five-year terms. At present, more than 155 criminal proceedings are underway against 146 individuals and nine businesses.

Refusing to pay a bribe does not always mean a company will face hurdles getting its products into a foreign region. In countries like India, where low-level customs agents commonly ask for small bribes to get goods off a dock and into commerce, declining to pay stalls the proceedings only temporarily. “It doesn’t mean you can’t transact business in the country; it just means it will take a bit longer,” says Bill Pollard, a partner at Deloitte Risk and Financial Advisory who specializes in anti-bribery due diligence and post-bribery detection.

Blowing the whistle on a government employee who asks for a payment doesn’t necessarily speed up the delay. Davidson recalls once being asked for a kickback by a customs agent. “I went over the person to his superior and told him what happened,” he says. “Two days later, I got a call from the same customs agent. He doubled the payment. That told me people up the chain were getting a percentage.”

BOARDROOM BEHAVIORS

Despite the prevalence of such practices, beyond a peek at an organization’s anti-bribery policies, board members rarely are apprised of what is going on in the trenches. “Board members need to be confident that not only is a robust program in place, but that it is effective and embedded throughout the organization and particularly in high-risk regions,” warns Passman.

Montero agrees, pointing out that directors must balance their interests in growing business abroad and maintaining the commitment to ethical practices. “Board members should realize that some markets are simply so corrupt that bribery cannot be avoided; in such cases, they should walk away and concentrate on those segments where the company can both thrive and adhere to sound compliance,” he says.

That’s the practice at BlackLine, a public company experiencing rapid global expansion. “There are countries where corruption is standard business practice that we will not do any business in—period,” says Therese Tucker, founder, CEO and board member of the provider of finance and accounting automation software solutions that has sprouted offices in the UK, Australia, France, Germany, Singapore and Japan in the past six years. “Even if the country represents a significant market, it’s just not worth it to us.”

This approach should be de rigueur in all companies, Montero says. “Corrupt behaviors are a short-term solution with a long-term downside—bribes may drive up sales today, but, over the long run, they increase costs, adding to inefficiency and undermining morale,” he explains. Board members should take care not to inadvertently encourage such risky behaviors. “At times, board members will push management too hard to execute deals quickly in specific jurisdictions for competitive reasons—without a discussion of the bribery and corruption risks,” Pollard says.

Directors also need to become more knowledgeable about the risk environment in the geographies eyed for market expansion. In assessing the country risk, the compliance experts agree that Transparency International’s index is a great start. “It’s the best way to determine which markets you can effectively compete in and be the ethical player you want to be,” says Montero.

Due diligence into the third parties the business relies on to service, sell and distribute its products abroad also is recommended. According to the 2018 Anti-Bribery and Corruption Report by corporate investigations firm Kroll, nearly half (45 percent) of companies surveyed rely on third-party partners to enter foreign markets and conduct business abroad. If the third party engages in bribes to obtain or retain business, the company itself could be in violation of FCPA and other anti-bribery regulations.

Given that 58 percent of the survey respondents uncovered legal, ethical or compliance issues in their due diligence to select a third party, the compliance risks are not for the fainthearted. “Third-party risks are the most significant corruption challenge for a company expanding overseas,” says Passman. “The further you move away from relying on your own employees abroad, the higher the bribery risk.”

To strengthen third-party compliance practices, Passman advises boards to follow the framework within the ISO 37001 Anti-Bribery Management Systems Standard, published in 2016 by the International Organization for Standardization. The standard provides for independent verification and audits of third-party partners. If these evaluations indicate prior problems with a third party, the standard requires that those issues be made public to alert other companies.

Board members also can help beef up the provisions of the company’s anti-bribery code of conduct to make sure they are transparent, strict and punitive. Contracts with third parties that violate the code should be terminated, and employees who do so should be subject to disciplinary action, up to and including loss of employment. Violators also should be reported to relevant regulatory and criminal authorities. “The code of conduct should be absolutely clear that local business practices are never a justification for paying a bribe,” says Wagner of Country Risk Solutions.

To ensure BlackLine’s employees understand and appreciate the company’s strict compliance with the FCPA and other anti-bribery regimes, Tucker has an external legal consultant draw up a detailed, 60-page employment contract that includes a boldfaced Code of Conduct. “We have a zero-tolerance policy when it comes to bribery and any form of dishonesty,” she says. “When trust is lost, it cannot be regained.”

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

In Blockchain They Trust

By Russ Banham

CFO

Someday, you may use an app at a supermarket to scan the beef sirloin you plan to buy for dinner and discover the cow’s life journey. Another app will assure you that the pair of handmade Gucci loafers you just bought is authentic. These apps will be connected over the internet to blockchain platforms, each one a digital ecosystem created for a specific industry. And they’re not distant dreams of tech entrepreneurs — these apps are already in development.

A year ago, the CFO/Duke University Business Outlook Survey found that 78% of U.S. finance chiefs said they didn’t know whether or how blockchain would affect their company. Only 3% claimed to even understand it. But many organizations apparently did their homework since then and warmed to the technology’s potential. In Deloitte’s 2019 Global Blockchain Survey (of a more general set of senior executives), more than half (53%) said blockchain had become a critical priority for their organizations; four in 10 said they were willing to invest $5 million or more in blockchain initiatives in the next year.

What has been the catalyst? Companies eager to drive down operating costs, certainly. The other, somewhat surprising, impetus is the lack of trust — between industry competitors, suppliers and customers, and even the manufacturer and the consumer.

Traceable and Accurate

Five years ago, blockchain meant bitcoin, the cryptocurrency whose founding depended on a trading platform in which currency data was confirmable and immutable. Bitcoin’s star has faded, but blockchain has wider value as a network for exchanging data and transacting via “smart contracts,” which execute automatically when prearranged terms and conditions are met. Smart contracts can automate highly manual and semi-manual transactional processes to cut operating expenses and reduce points of friction with customers.

At its most basic, blockchain is a digital ledger that records transactions among a network’s participants and distributes them to members in real time. Every 10 minutes, a transaction is verified as factual and then permanently time-stamped and stored in a “block” similar to a page in a ledger. Once a block of transactions is complete, it is linked to the preceding block to create a chain of records.
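The hash-linked ledger described above can be sketched in a few lines of Python. The block fields, transactions, and hashing scheme here are invented for illustration; real blockchain platforms differ in their record formats and consensus rules.

```python
import hashlib
import json
import time

def make_block(transactions, prev_hash):
    """Bundle verified transactions into a block linked to its predecessor."""
    block = {
        "timestamp": time.time(),       # permanent time stamp
        "transactions": transactions,   # the "page in a ledger"
        "prev_hash": prev_hash,         # link to the preceding block
    }
    # The block's hash covers everything above, including prev_hash,
    # so altering any earlier block breaks every later link.
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

genesis = make_block(["network launch"], prev_hash="0" * 64)
block1 = make_block(["A pays B 10 units"], prev_hash=genesis["hash"])
block2 = make_block(["B pays C 4 units"], prev_hash=block1["hash"])

# Verify the chain: each block must reference its predecessor's hash.
assert block1["prev_hash"] == genesis["hash"]
assert block2["prev_hash"] == block1["hash"]
```

The key property is that each block’s hash covers its predecessor’s hash, so tampering with any earlier block invalidates every link after it.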

Since the data entries provide a secure audit trail, network members are assured the ledgers are beyond reproach (although some, like MIT Technology Review, claim blockchains are hackable). “Blockchain’s initial wave of business transformation is the creation of single sources of truth,” says Jamie Solomon, a managing director for North America at Accenture.

The technology lets distrustful parties come to an agreement without relying on intermediaries. In blockchain-fueled networks, companies can share accurate and verifiable data with each other and with suppliers. Not all data — just information that is of mutual benefit. “Industries have now passed the stage where they want to apply blockchain because it’s cool,” says Paul Brody, global blockchain leader at Ernst & Young. “There is now widespread [recognition] that blockchain lends itself to solving real business problems.”

Taking Steps

One of those problems is overcoming consumer skepticism of companies that claim to sell “ethical” goods. Blockchain, it turns out, offers an uncontestable way to trace a product’s lifecycle. That capability impelled Lukas Pünder, finance chief of handmade shoe brand CANO, to investigate developing a blockchain for the fashion apparel industry.

“We wanted consumers to be able to trace every step in the manufacture of each pair of our shoes — from the origin of the raw materials to the craftspeople in Mexico who use traditional braiding methods,” says Pünder.

Pünder leveraged Oracle’s blockchain technology to create a digital ecosystem for CANO. Customers interested in their purchase’s provenance can use an app on their smartphones to scan a near-field communications chip embedded in the shoes or apparel. The two-year-old company’s complete summer collection will be equipped with the transparency technology.

For the winter collection, to be launched in September, CANO products will use Retraced, a pilot solution for the entire industry. Retraced offers more in-depth information about a product and has a more sophisticated design. Other fashion brands that will adopt the Retraced transparency solution include European makers John W. Shoes, Afew Store, and Jyoti-Fair Works; additional brands will be onboarded after a test phase. In all, roughly 50,000 to 100,000 products will be tracked through Retraced this year.

Trust in a company’s sustainable practices is important, Pünder says. In an industry rocked by allegations of unsafe working conditions and low wages, the apps let consumers know that they’re not purchasing products from unscrupulous sellers. “By leveraging transparency as a core value, a company can achieve desirable brand differentiation,” Pünder says.

Companies like CANO can also discern which suppliers are producing shoddy work, generating lower quality products that customers tend to return. “Ten percent of all fashion items are faulty. Now you can identify exactly which company in the supply chain is responsible,” Pünder says.

Pruning Processes

In what other industry is trust an issue? Insurance. “Policies and claims involve multiple parties, complicated agreements, complex logic, different intermediaries, and many verification points, making them ripe for blockchain,” says EY’s Brody.

More than 30 large global insurers, reinsurers, and insurance brokers joined in 2018 to create a blockchain consortium, The Institutes RiskStream Collaborative.

“There’s great value in members sharing their data for mutual benefit, but the problem in the past has been an immense lack of trust between these entities,” says Christopher McDaniel, RiskStream’s president.

The consortium is developing Canopy, a blockchain that connects the industry in a data-sharing network. An example of its proposed use is the car insurance claims process. At present, if two drivers, each insured by a different company, are in a minor collision, they jot down their driver’s license and car registration information. Each policyholder then calls his or her insurance agent to relay the other party’s information.

Once notified, the insurers start the drawn-out claims administration process, manually preparing a “First Notice of Loss.” A claims adjuster is tasked with gauging the extent of the damage and relative fault for the accident. This process entails numerous and lengthy back-and-forth phone calls and emails between the insurers. They eventually agree on who pays.

In the future, with Canopy, each policyholder would have an app provided by their insurer. The drivers would open a QR code reader and scan each other’s codes. The information would flow to Canopy in real time, giving the insurers the ability to simultaneously verify the drivers’ identifying information. The blockchain platform would trigger a First Notice of Loss without the involvement of agents.
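The claims flow just described is essentially smart-contract logic: an action fires automatically once agreed conditions are met. Here is a minimal sketch, with names and fields invented for illustration (not Canopy’s actual design).

```python
from dataclasses import dataclass, field

@dataclass
class ClaimRecord:
    """Shared record of one accident on a hypothetical claims network."""
    accident_id: str
    scans: dict = field(default_factory=dict)  # driver -> policy info

    def submit_scan(self, driver, policy_no, insurer):
        """Record one driver's scanned policy data, then check the trigger."""
        self.scans[driver] = {"policy": policy_no, "insurer": insurer}
        return self._maybe_trigger_fnol()

    def _maybe_trigger_fnol(self):
        # Smart-contract-style rule: generate a First Notice of Loss only
        # once both parties' verified data is on the shared record.
        if len(self.scans) == 2:
            return {
                "event": "FirstNoticeOfLoss",
                "accident_id": self.accident_id,
                "parties": self.scans,
            }
        return None

record = ClaimRecord("ACC-001")
first = record.submit_scan("driver_a", "P-123", "InsurerA")   # one scan: no event yet
fnol = record.submit_scan("driver_b", "P-456", "InsurerB")    # second scan: FNOL fires
```

The point of the design is that neither insurer’s agents initiate anything; the event is a deterministic consequence of both parties’ data arriving.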

By sharing their policyholder data in Canopy, the two insurers’ processing cycle times would shorten. Agents would be able to devote more time to managing client risks instead of processing information. “You need people to process claims and underwrite policies,” says Matt Lehman, managing director in the insurance practice of Accenture, a solutions provider to RiskStream. “That’s a lot of trapped value.”

Both the proof of insurance and First Notice of Loss capabilities will be technically ready and available to network members in July, but then carriers have to embed them into their own mobile applications, which will take longer.

Further down the line in Canopy’s development, as the vehicle accident information flows to the blockchain platform, it could set off a series of smart contracts to member tow truck firms, car repair shops, rental car agencies, and law enforcement.

The next stage in Canopy’s development calls for members to share data in the interest of developing new products. RiskStream’s McDaniel provides the example of a group of electric bicycles reinsured at a micro-transactional level.

“A primary insurer of electric bicycles could cluster them across different geographies, creating a portfolio of risks that would be traded in an open market,” he says. “Different reinsurers would assume portions of the primary insurer’s risks in real time, automated through prearranged smart contracts.”

“Once you remove the inefficiencies across companies in an industry, all sorts of innovative concepts bubble up, to the benefit of all parties in the blockchain network,” McDaniel adds.

Sean Ringsted, chief digital officer at the large global insurer Chubb (a member of Canopy), cites the value of Canopy’s ongoing work for policyholders. “By improving our operating efficiencies, eliminating duplicative, redundant data flows and questions about where the data comes from and is it accurate, our customers benefit from much easier and less time-consuming claims processes, not to mention more innovative risk-transfer products,” he says.

Farm to Table

Livestock agriculture is another industry experimenting with blockchain. “There’s a growing segment of direct-to-consumer brands that retail only organic, free-range, grass-fed, responsibly raised, and naturally sustainable lamb, beef, chicken, and pork of the highest quality from small farmers,” says Leslie Moore, owner of Farmer Girl Meats, an e-commerce farm-to-table business based in Princeton, Kansas. “The challenge has been proving everything I just said to consumers.”

Moore, a third-generation farmer raised on her family’s grass-fed beef farm in Kansas, left in the 1990s for business school and later a job in branding at a large manufacturer. She returned to the farm with an idea for building a platform that would track relevant data on the farm’s meat products.

Truth and transparency are lacking in today’s meat industry, she says. “Ambiguous language in [U.S. Department of Agriculture] regulations allows imported beef from Paraguay, New Zealand, and Australia to be labeled as ‘Product of the USA,’” says Moore.

The imported grass-fed beef is shipped in what are called primal cuts (the main areas of the animal, which include the loin, rib, round, flank, chuck, sirloin, and brisket). It goes directly to USDA-approved facilities in the United States. The meat is inspected and cut into packaged goods destined for grocery store shelves nationwide.

“An animal born, raised, and harvested in a foreign country can be marketed to consumers as a product of the United States; its true origin is unknown to the buyer,” Moore claims.

Through a partnership with Silicon Valley blockchain startup Citizens Reserve, Moore hopes to alter the paradigm for small livestock producers. The app provides traceability from the birth of an animal to the steak or pork chop on a plate, she says. “Everything that animal encounters over its lifespan becomes part of its story.”

This includes what a cow or pig is fed each day, what kinds of fertilizers or pesticides the farm may use, and whether an animal has been treated with antibiotics, making it no longer antibiotic-free. “That classification results in a lower markup, but if the buyer could see that the medicine was used only topically and not ingested, it could alter economic outcomes for the farmer,” says Moore.

The blockchain platform would give each package of meat a unique digital identity providing “farm-to-plate” lifecycle information so consumers can make more educated buying decisions.

Thane Tokerud, financial controller of Citizens Reserve, says the major benefit of the ecosystem it is developing, called Impact Ranching, is providing traceability.

Farmers and other vendors on the platform could view distribution outlets eager to sell meats from farms that can literally prove their sustainable practices through the use of blockchain, Tokerud explains. Specialty meats have up to a 20% markup, so the additional distribution opportunities can equate to significantly higher margins.

Another advantage, which may not get the promotion the others do, is in product recalls. Regulators and distributors could quickly ascertain the origin of meat sitting on grocery store shelves and pull it if necessary. Walmart and its Sam’s Club division, for example, are planning to implement blockchain technology this year to get real-time, end-to-end traceability of leafy green products.

Impact Ranching, which goes live in 2020 and will have many agricultural industry collaborators, also may obviate farmers’ reliance on the costly third-party certifications required by the USDA. “Since the data in the ecosystem is verifiable and immutable, the information theoretically would allow farmers to self-regulate, reducing the time-consuming bureaucracy they presently confront,” Tokerud says.

Accenture’s Lehman sees a similar benefit for insurers. “Regulation in insurance is complicated, given 50 different states with disparate rules and complex filings,” he says. “If you can create specific real-time views for regulators in Canopy, where they get to see accurate, immutable, and standardized data they know is factual, it will remove a layer of bureaucracy.”

That’s a big ask of regulators — blockchain technology will first have to earn the government’s imprimatur. That may take a while because the applications are still somewhat immature. It’s also unclear how fast, or whether, these industry solutions will produce a return on investment for companies. But industries are pushing forward, confident of blockchain’s potential to bring business partners together and build credibility with consumers.

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

New Accounting Model For Life Insurers Seeks Greater Transparency

By Russ Banham

Forbes

A new accounting standard issued by the Financial Accounting Standards Board (FASB) should make it easier for investors to compare companies that sell long-duration life and health products and annuities with market guarantees. However, companies will need to expend a great deal of effort to comply with the standard.

FASB’s Accounting Standards Update No. 2018-12 is driven by the accounting standard-setting body’s objective to improve the existing recognition, measurement and disclosure requirements for life insurance and annuity contracts that remain in force for an extended time. The standard takes effect in 2021 for public business entities operating on a fiscal year basis, and in 2022 for other entities (early adoption is permitted).

Let’s look first at these changes as they relate to life insurance, a product involving a promise of payment that may not occur for several decades. The assumptions insurers use to measure the liabilities for their policyholders’ future benefits extend decades into the future. Under current accounting rules, those assumptions are locked in at contract inception and held constant over the term of the contract: if a contract was sold in 1980, the assumptions set at the time—the discount rate and estimates of longevity, for example—would generally still be in use 30 years later. This approach takes a long view of the obligation and avoids changes in the short term.

FASB’s updated rules will require insurers to revisit their assumptions for these types of life contracts annually, or earlier if they believe significant changes have occurred. Any changes in the liability resulting from these new assumptions will be recognized immediately; however, FASB has made an important distinction: Changes in the discount rate won’t be recognized in earnings while changes related to other assumptions will be.
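To make the distinction concrete, here is a simplified numeric sketch with invented figures, using a single future payment to stand in for a book of contracts. The effect of an updated mortality assumption flows through earnings, while the effect of a discount-rate change is kept out of earnings (under the new standard it is reported in other comprehensive income).

```python
def pv(payment, rate, years):
    """Present value of a single payment due `years` from now."""
    return payment / (1 + rate) ** years

# Liability as originally measured (illustrative figures).
liability_old = pv(100_000, rate=0.05, years=20)

# Remeasure with an updated mortality assumption (larger expected payment):
# this change is recognized in earnings.
liability_new_mortality = pv(110_000, rate=0.05, years=20)
earnings_impact = liability_new_mortality - liability_old

# Remeasure again with a lower discount rate: this change is excluded
# from earnings and reported in other comprehensive income.
liability_new_rate = pv(110_000, rate=0.04, years=20)
oci_impact = liability_new_rate - liability_new_mortality

print(round(earnings_impact, 2), round(oci_impact, 2))
```

Both remeasurements increase the reported liability, but only the first moves reported earnings, which is the “unnecessary noise” distinction discussed below.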

That’s good news for both investors and companies as it provides a current view of the liability without introducing unnecessary noise into earnings. “Many investors felt the current accounting model was not very transparent, making it difficult for them to understand what was on the books, while companies were worried about earnings volatility caused by movements in interest rates,” explained Edward Chanda, national sector leader for insurance at audit firm KPMG LLP in the United States.

Turning to annuities with market guarantees: under current accounting practices, some types of guarantees are accounted for as insurance and others at fair value. Under the new standard, any such guarantee that is responsive to market changes will be accounted for at fair value. While this will make it easier to compare companies, earnings volatility resulting from changes in fair value associated with these long-term contracts could make it more difficult for companies to explain their earnings to investors.

Clearer Vistas Ahead

Concerns about the current accounting had prompted FASB’s joint efforts with the International Accounting Standards Board to update the accounting standard nearly a decade ago. The two groups eventually parted company, with FASB focused on more targeted reforms to long-duration contracts.

There are several benefits of the new standard. FASB stated that it will improve the timeliness of reporting changes in an insurer’s liability for future policy benefits and provide guidance regarding the rate used to discount future cash flows. It also will improve and enhance the consistency of the accounting for certain market-based options or guarantees associated with deposit (or account balance) contracts. Other benefits include simplifying the amortization of deferred acquisition costs and improving the effectiveness of the required disclosures.

By and large, these changes will make it easier for investors and other stakeholders to compare different insurers on more of an apples-to-apples basis. “The changes are a step in the right direction for investors in terms of providing more consistency between companies and greater transparency about the assumptions,” said Laura Gray, principal and leader of KPMG’s actuarial practice.

The onus is on insurers to provide this transparency. The new standard requires significant changes in insurer accounting processes, since economic events could change the assumptions almost every year. Even more worrisome is the hard work required to ensure their data, systems and processes are compliant within a relatively tight timeframe. “It will require significant investments in people and resources that may be in short supply,” said Gray.

Insurers are bracing for the changes. “While everyone acknowledges there are challenges with the current ‘locked’ approach, there’s always an element of trepidation when it comes to change,” said Chanda. “Companies have been telling their story one way for a long time and now they have to tell it other ways, separating the signals from the noise.”

In planning their transition, companies should form steering committees and working groups to wrestle with the tougher aspects of the new standard, Gray said. “A lot of judgment will be needed to determine future results,” she explained. “But, as people get used to the ‘new normal,’ these judgments around assumption-setting will become better understood.”

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

How Machine Learning Speeds Up Fraud Detection

By Russ Banham

Forbes

In their work to unearth evidence of fraudulent activities, forensic accounting investigators dig through diverse data looking for anomalies that suggest something is just not right. But as the volumes of data collected by companies balloon, this task has become increasingly arduous and time-consuming, and at the largest scales humanly impossible.

The regrettable consequence is the greater chance of a well-thought-out scam slipping through the cracks. A case in point is healthcare fraud, which has been estimated to cost the United States tens of billions of dollars annually.

For forensic accounting investigators, unearthing these crimes manually is an uphill climb. “The fundamental issue is that there is a flawed approach in examining fraud, since fraudsters know the rules that are set up to catch them,” says Justin Bass, chief data science officer at Crowe, the global accounting, consulting and technology firm combining specialized industry expertise with innovative technology solutions.

Bass provides the example of money laundering rules, which require banks to report any cash transactions of more than $10,000 to regulatory authorities. In response, “fraudsters simply break up the cash transactions into smaller amounts,” he explains. “The rules are created to catch these smaller amounts, but then the fraudsters circumvent them with other methods — which leads to the creation of other rules and other subsequent actions by fraudsters to evade those new rules.”
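The cat-and-mouse dynamic Bass describes is easy to illustrate. In the hedged sketch below, the $10,000 threshold is the real reporting rule, but the data, field names, and seven-day window are hypothetical:

```python
from collections import defaultdict

REPORTING_THRESHOLD = 10_000  # single-transaction cash reporting rule

def rule_based_flags(transactions):
    """The static rule: flag only individual cash transactions over the threshold."""
    return [t for t in transactions if t["amount"] > REPORTING_THRESHOLD]

def structuring_flags(transactions, window_days=7):
    """Aggregate sub-threshold deposits per account over a coarse time window;
    flag accounts whose combined sub-threshold cash exceeds the threshold."""
    totals = defaultdict(float)
    for t in transactions:
        if t["amount"] <= REPORTING_THRESHOLD:
            key = (t["account"], t["day"] // window_days)  # account + week bucket
            totals[key] += t["amount"]
    return {key for key, total in totals.items() if total > REPORTING_THRESHOLD}

deposits = [
    {"account": "A-17", "day": 1, "amount": 9_500},
    {"account": "A-17", "day": 2, "amount": 9_800},
    {"account": "B-02", "day": 1, "amount": 4_000},
]
print(rule_based_flags(deposits))   # [] -- the static rule sees nothing
print(structuring_flags(deposits))  # the aggregation catches account A-17
```

The static rule misses the $19,300 split across two days entirely; once the rule aggregates, fraudsters adapt again, which is the escalation loop Bass describes.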

Machine Learning To The Rescue

Now there is a way to outmaneuver fraudsters via the use of machine learning (ML), the subset of artificial intelligence that gives computers the ability to scan a haystack of data in search of the proverbial needle and to progressively improve this capability through continuous learning.

Instead of investigators manually reviewing spreadsheet rows and columns, looking for three or four data elements that together indicate a suspicious transaction, ML can peruse thousands of data elements — instantly.

Applying an algorithm to this massive volume of data to tease out unique interrelationships presents a greater likelihood of detecting anomalies indicating fraud. “Whereas people generally can visualize three or four dimensions when evaluating the accuracy of a purchase order, machines can examine innumerable dimensions to ferret out the truly suspicious activities,” Bass explains.
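Crowe’s model is proprietary, so the sketch below is only a stand-in for the idea of scoring every transaction across many dimensions at once, using a simple per-feature z-score; the data and feature choices are invented for illustration:

```python
import math

def anomaly_scores(rows):
    """Score each row by how far it sits from the mean across ALL features
    simultaneously - the multi-dimensional comparison a human reviewer
    cannot do by scanning spreadsheet columns one at a time."""
    n, dims = len(rows), len(rows[0])
    means = [sum(r[d] for r in rows) / n for d in range(dims)]
    stds = [
        math.sqrt(sum((r[d] - means[d]) ** 2 for r in rows) / n) or 1.0
        for d in range(dims)
    ]
    # A row's score is the sum of its absolute per-feature z-scores.
    return [
        sum(abs((r[d] - means[d]) / stds[d]) for d in range(dims))
        for r in rows
    ]

# Each row: (amount, hour of day, days since vendor onboarded, approvals)
purchase_orders = [
    (1_200, 10, 400, 2),
    (1_150, 11, 380, 2),
    (1_300,  9, 420, 2),
    (9_900,  3,   1, 0),  # large, off-hours, brand-new vendor, no approvals
]
scores = anomaly_scores(purchase_orders)
print(max(range(len(scores)), key=scores.__getitem__))  # index 3 stands out
```

No single feature of the fourth purchase order breaks a rule, but the combination across all four dimensions makes it an obvious outlier, which is the point Bass makes about machines examining innumerable dimensions.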

To that end, Crowe has developed a proprietary ML tool called Crowe Data Anomaly Detection that has allowed the firm’s forensic accounting investigators to focus their efforts on higher-risk cases, reducing the time spent on those that don’t pan out, says Bass, whose team created the fraud-busting solution.

“We let the data tell us where to look, as opposed to us having to look everywhere,” says Tim Bryan, one such investigator and a partner in the Crowe forensic accounting and technology services group.

How It Works

Since the solution is capable of continuous learning, its ability to detect fraud improves by the day, Bryan notes. “Each time the tool is right about an actual anomalous transaction, the information automatically goes into the system, making it smarter. The same applies to when it is wrong, as this false positive also is incorporated.”

To detect the aforementioned money laundering schemes, the data anomaly detection solution examines the underlying data to pinpoint incongruities, clustering like-transactions together. Programmed to identify transactions under $10,000, the tool might highlight, say, if similar sums are deposited in a large number of banks across geographies, instantly detecting this atypical interrelationship. As a result, the customary latency time between when an investigator receives a transaction report and subsequently conducts a hindsight analysis is vastly reduced. “The transaction now comes in and is immediately scored by the tool,” Bryan says.
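A rough sketch of the clustering idea described above flags sub-threshold deposits spread across many banks; the depositor names, the three-bank cutoff, and the data are all hypothetical:

```python
from collections import defaultdict

def flag_spread_deposits(deposits, threshold=10_000, min_banks=3):
    """Cluster sub-threshold deposits by depositor and flag anyone who
    spreads similar sums across an unusual number of distinct banks."""
    banks_used = defaultdict(set)
    for d in deposits:
        if d["amount"] < threshold:
            banks_used[d["depositor"]].add(d["bank"])
    return {who for who, banks in banks_used.items() if len(banks) >= min_banks}

deposits = [
    {"depositor": "X", "bank": "First National",  "amount": 9_400},
    {"depositor": "X", "bank": "Coastal Trust",   "amount": 9_600},
    {"depositor": "X", "bank": "Midtown Savings", "amount": 9_500},
    {"depositor": "Y", "bank": "First National",  "amount": 2_000},
]
print(flag_spread_deposits(deposits))  # depositor X is flagged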

To test the tool’s ability to identify suspicious and possibly fraudulent activity, Crowe recently used the solution to analyze more than 16,000 contracts from a large telecommunications company. “With human analysts, the project took five professionals four months to complete,” says Bryan. “The machine learning tool enabled professionals to focus their time on investigating only the top 5% of transactions within one month, culminating in a 95% reduction in the amount of data the professionals needed to review, saving significant time and costs.”

Turning The Tide

That’s good news for companies (and bad news for fraudsters). “I’m confident we have a game-changer here,” Bass says.

Having successfully tested and used the tool internally, Crowe recently made it available as both a standalone software product and an add-on to clients’ existing accounting systems. The technology doesn’t just assist in detecting potentially fraudulent activities; it also illuminates human errors that could result in accounting mistakes.

“What Justin’s team has developed is what we in forensic accounting call ‘the brains,’” says Bryan. “It is industry agnostic, in the sense that it can be used in the healthcare space to look at fraudulent billing, in insurance to examine suspicious workers’ compensation claims, in manufacturing to look at fraudulent purchasing and in academia for a university to scope out fraud or errors in their expense processes.”

Since the tool enables continuous monitoring, as opposed to a one-time look back at data, Bryan says it presents the vital opportunity to improve the accuracy of financial statements across the board. “The tool finds things we couldn’t find using our rules-based investigatory procedures,” he acknowledges. “Now we’re leveraging technology to do what we’re good at — only much better.”

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

How Energy Companies Are Leading The Way In Cybersecurity

By Russ Banham

Forbes

In today’s increasingly digital world, the secure transmission of sensitive information has become a top priority for both individual citizens and the world’s largest government agencies. Since 90 percent of the U.S.’s power infrastructure is privately held, leading energy companies are adopting cybersecurity practices intended to reduce the impact of any incident that might put energy delivery at risk. However, sometimes these measures fall short.

A universal challenge

On March 19, the computer screens at Norsk Hydro went blank. The giant Norwegian energy and mining company’s IT systems were infected with a new strain of ransomware virus called LockerGoga. The situation was “severe,” Norsk Hydro CFO Eivind Kallevik told a hastily-convened news conference.

The cyberattack had launched in one system the previous night and spread quickly throughout the company’s network, locking up digital files and devices critical to its core operations. As in other ransomware attacks, Norsk Hydro was given a stark choice: pay a ransom to unlock the systems, or pay the price in curtailed production.

In the six years since the notorious ransomware strain CryptoLocker appeared in 2013, such attacks have become business as usual of the worst kind. Every minute, a private company falls victim to a ransomware attack. These invasions cost businesses a staggering sum: more than $8 billion each year.

If the targeted company is vital to critical infrastructure, the impact is even more significant. For instance, if an attack compromises the energy grid — the network of synchronized power providers and consumers connected by transmission and distribution lines — everyone relying on it will suffer the consequences in the form of lost power.

Protection and prevention

Taking preemptive steps to combat this grim possibility, the U.S. House of Representatives recently introduced a bill (H.R. 1975) to establish a Cybersecurity Advisory Committee within the Department of Homeland Security. The 35-member committee of cybersecurity experts would make recommendations on the development and implementation of policies to combat cybercrimes, such as ransomware attacks, against the nation’s critical infrastructure.

The energy industry is also stepping up to protect its assets from the damage caused by a major cyberattack, such as the one successfully launched against Ukraine’s power grid in December 2015. Hackers were able to compromise the IT systems of three energy distribution companies, effectively disrupting the supply of electricity to end consumers.

To prevent a similar attack from occurring on American shores, the Federal Energy Regulatory Commission (FERC) issued a final rule in 2018 lowering the threshold for a “reportable cyber event.” The goal of the rule is to improve data collection to better analyze the risk of a cyberattack for defense and response purposes.

The FERC also directed the North American Electric Reliability Corporation, a nonprofit institution overseeing the reliability of electric grids across North America, to “augment the mandatory reporting of cyber security incidents, including incidents that might facilitate subsequent efforts to harm the reliable operation of the bulk electric system,” according to the rule filing.

In issuing the final rule, the FERC’s then-Chairman Kevin McIntyre emphasized the fluid aspects of challenges to cybersecurity.

“Cyber threats to the bulk power system are ever changing, and they are a matter that commands constant vigilance,” he stated.

New tools on the front lines of the cyber frontier

Today’s energy industry plays a vital role in securing the flow of electricity to businesses and consumers, essentially upholding our modern economy. It’s no wonder, then, that hostile governments, terrorist organizations and private-practice hackers have put the industry in their crosshairs, disrupting the operations of utilities and energy suppliers. The energy sector now rightly recognizes these cyberattacks as a core business risk that poses as much of a threat to large infrastructure as floods or fires.

To help the industry reduce the incidence and severity of these hazards, top energy companies have partnered with government agencies like the Department of Energy—and sometimes even with competitors—to make great headway in improving their cybersecurity practices.

With solutions designed specifically for the energy sector, new innovations make it easier for companies to safeguard vital information and keep operations online. These are not passive endeavors, either. Thanks to Information Sharing and Analysis Centers established by federal law, energy companies can learn from each other, sharing cyber threat indicators and other security information.

New software also assists companies with risk detection, monitoring and incident response by recognizing and understanding the exploits meant to inflict harm. Keeping up with these new attacks or malware through continuous threat monitoring, real-time anomaly detection and immediate malware pattern updates helps companies to stay a step ahead.
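As a rough illustration of the real-time anomaly detection described above, the toy sketch below flags telemetry readings that deviate sharply from an exponentially weighted moving average; the parameters and data are invented, and commercial monitoring tools are far more sophisticated:

```python
def detect_anomalies(readings, alpha=0.3, tolerance=0.5):
    """Flag readings that deviate from an exponentially weighted moving
    average by more than `tolerance` (as a fraction of the average) -
    a toy stand-in for continuous grid-telemetry monitoring."""
    ewma = readings[0]
    flagged = []
    for i, value in enumerate(readings[1:], start=1):
        if abs(value - ewma) > tolerance * ewma:
            flagged.append(i)  # anomalous spike or drop
        else:
            ewma = alpha * value + (1 - alpha) * ewma  # learn from normal data only
    return flagged

# Load readings (MW) with a sudden drop that might indicate tampering
print(detect_anomalies([100, 102, 99, 101, 40, 100, 98]))  # [4]
```

Updating the baseline only on normal readings is a deliberate choice here: it keeps a sustained attack from dragging the baseline toward the anomalous values.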

Meanwhile, the information gaps that attackers take advantage of in weak security measures can be adjusted for by using enhanced intrusion detection and user authentication to identify suspicious activity. As companies look for guidance on security, comprehensive online training and clearer policy on grid defense solutions can provide the information they need.

Maintaining power and establishing industry-wide trust

Companies that develop cybersecurity solutions are responding to this increasing and changing threat. Mitsubishi Heavy Industries (MHI) has partnered with NTT Group to commercialize a jointly developed cybersecurity technology for critical energy infrastructure control systems. Called InteRSePT (Integrated Resilient Security and Proactive Technology), the technology provides real-time monitoring of data flows in a network and helps to detect cyberattacks specifically designed to exploit operating controls.

Unlike conventional technology, which finds it challenging to spot this type of attack, the system discerns potential threats by changing the security remediation rules governing the operations of the target. These rule changes allow for earlier detections of anomalies, which can be screened to vet potential breaches. By rapidly identifying these threats and responding in kind to halt the damage, the system preserves continuous power generation and availability – with no disruption in service.

“Cybersecurity is a focal area for MHI, and we continue to place importance on developing next-generation solutions in this area,” MHI’s Chairman of the Board and former CEO Shunichi Miyanaga recently stated.

MHI is the first company in Asia to join the Charter of Trust for Cybersecurity, which calls for binding rules and standards to build security and trust in the digital realm. Initiated by Siemens during the Munich Security Conference in February 2018, the 17 company members of the trust (including Cisco, Enel Group, Dell Technologies and IBM) have pledged their compliance with minimum binding cybersecurity requirements, to be anchored by binding clauses in each member’s contracts with customers. These requirements are being finalized now and will be introduced on a step-by-step basis.

The ambitious goal of the Charter of Trust for Cybersecurity is to better protect the digital assets of critical infrastructure, ensuring high-quality cybersecurity throughout the networked environment. Since new forms of malware and viruses rapidly proliferate every day, it’s important to encourage energy industry efforts to work together, and with investigators, on cyber prevention and defense. The security of daily life – and all the infrastructure that powers it – depends on this effort.

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

5G is the Road to Tomorrow

By Russ Banham

Perspectives

Few things in life are certain, but one of them appears to be the inevitability of self-driving cars. Although some disparage the prospect, citing safety concerns and the loss of personal freedom, others can’t wait to hop into a Tesla or Waymo and say, “Home, please.”

This latter group may get their wish sooner rather than later as wider availability of mobile 5G network services may shorten the wait. 5G—the fifth generation of wireless technology—promises transmission speeds up to 20 times faster than current 4G platforms, in addition to lower latency (the time lag between the initiation and reception of communications). 5G networks are touted as having latency rates of under a millisecond. This near-instantaneous delivery of information can be crucial to the rapid responsiveness needed by autonomous cars and trucks when confronting an imminent danger like a giant pothole—or a pedestrian.
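A back-of-envelope calculation shows why that latency gap matters; the 50-millisecond figure used below for a 4G round trip is a commonly cited ballpark, not a measured value:

```python
def distance_during_latency(speed_mph, latency_ms):
    """Meters a vehicle travels before a remote command can even arrive."""
    meters_per_second = speed_mph * 1609.344 / 3600  # mph -> m/s
    return meters_per_second * latency_ms / 1000

# A truck at 65 mph:
print(round(distance_during_latency(65, 50), 2))  # ~1.45 m on a ~50 ms 4G link
print(round(distance_during_latency(65, 1), 3))   # ~0.029 m on a ~1 ms 5G link
```

Roughly a meter and a half of blind travel per network round trip adds up quickly when a remote pilot is reacting to a pedestrian in the road, which is why sub-millisecond latency is treated as a prerequisite for remote operation.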

“You’re getting less latency, which is important in an environment where a remotely-piloted vehicle like a truck is getting a substantial volume of collision-avoidance information from sensors onboard the vehicle and from the surrounding environment, such as weather reports, driving conditions, pedestrians in the road, and so on,” says Steve Viscelli, senior fellow at the University of Pennsylvania’s Kleinman Center for Energy Policy. “The speed at which all this data flows to a remote human pilot operating an autonomous truck is fundamental to avoiding a collision.”

Maximum Speed Ahead

Autonomous cars and trucks are those in which automated driving systems (ADS) do some, most, or all of the driving. The National Highway Traffic Safety Administration (NHTSA) outlines six levels of driving automation, from 0 through 5, capturing a progressively increasing use of ADS in driving a vehicle, with Level 4 describing a vehicle capable of performing all driving functions under certain conditions, and Level 5 describing a vehicle capable of performing all driving functions under all conditions.

When Level 4 and 5 vehicles will appear on the road in great numbers has long been a matter of debate, though most experts believe that fully autonomous commercial vehicles like trucks will predate the debut of entirely self-driven cars. While some trucks will involve passive drivers, others will be completely unmanned; in such cases, a combination of autonomous driving technologies and remote piloting by humans will control the vehicle.

“The first autonomous vehicles without humans on board are already widely in use in the construction and agriculture industries, but large numbers of unmanned trucks will be on highways and other roads before we see fully self-driving cars,” says Kartik Tawiri, cofounder and CTO of Starsky Robotics, a leading manufacturer of autonomous trucks.

Sharing this perspective is Viscelli, author of the book, The Big Rig: Trucking and the Decline of the American Dream. “Fully autonomous trucks that are unmanned and remotely piloted will come sooner than autonomous cars,” he explains. “There are just too many economic benefits to be gained from autonomy for the trucking industry to ignore.”

Chief among these is the precipitous decline in people willing to drive trucks long-distance; the American Trucking Associations (ATA) posits an urgent need for 60,000 drivers now and far more in the future. “The biggest problem in trucking is a dire shortage of long-haul drivers,” says Tawiri. “It’s a fun thing to do in your early 20s, but after that no one wants to spend their life in a metal box roaming the country. The turnover of drivers is huge.”

Autonomous trucks would solve the human labor dilemma, assuming legislators and regulators are willing to designate a dedicated lane on highways to accommodate driverless trucks. Aside from addressing the protracted driver shortage, the concept would increase safety: By limiting the use of autonomous trucks to a single lane and restricting non-autonomous vehicles from driving in this lane, the risk of a collision with non-autonomous vehicles is greatly reduced.

A designated lane also fits well with current autonomous technology: Driverless trucks can be remotely controlled through geo-fencing, which involves the use of global positioning (GPS) or radio frequency identification (RFID) to create a virtual perimeter in a prescribed area like a dedicated lane, limiting automotive autonomy to this geographic boundary.
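The virtual-perimeter check at the heart of geo-fencing can be sketched as a standard ray-casting point-in-polygon test; the coordinates below describe an invented corridor, not a real dedicated lane:

```python
def inside_geofence(point, polygon):
    """Ray-casting point-in-polygon test: is a GPS fix (lon, lat) inside the
    virtual perimeter defined by `polygon`, a list of (lon, lat) vertices?"""
    x, y = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Count edge crossings of a ray extending from the point
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Hypothetical dedicated-lane corridor modeled as a simple rectangle
lane = [(-97.10, 32.70), (-97.10, 32.90), (-97.05, 32.90), (-97.05, 32.70)]
print(inside_geofence((-97.07, 32.80), lane))  # True: keep driving
print(inside_geofence((-97.20, 32.80), lane))  # False: trigger a safe stop
```

In a real deployment the perimeter check would be one input among many, with the vehicle commanded to slow or stop safely when a fix falls outside the fence.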

In this regard, the agriculture industry is instructive. “We’re seeing quite a bit of autonomous equipment on farms in India, where tractors geo-fenced into a particular agricultural area drive around freely without anyone on board,” says John Simlett, consulting firm EY’s Future of Mobility leader.

Another factor driving autonomous trucks on the road is online retail. Small trucks and vans delivering consumer goods purchased from online retailers, such as Amazon, will continue to crowd residential streets, but autonomous long-haul trucks plying dedicated highway lanes in the future will transport the goods from ports and rail depots to the smaller vans and trucks.

“We’ll begin to see what the industry calls ‘platooning,’ in which the first truck in a queue of trucks is driven by a human being and the remainder use automated driver support systems, in addition to remote piloting in the first and last miles of travel, to maintain a specific distance behind the leader, accelerating and braking as the computer dictates,” says Viscelli.
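The follower behavior Viscelli describes amounts to a gap-keeping control loop. The sketch below uses a simple proportional controller with invented numbers; real platooning systems model vehicle dynamics and communication delays far more carefully:

```python
def follower_speed(current_speed, gap, target_gap=20.0, gain=0.5):
    """Proportional controller for a platooning follower: speed up (m/s)
    when the gap to the truck ahead grows, brake when it shrinks."""
    return max(0.0, current_speed + gain * (gap - target_gap))

# Follower cruising at 25 m/s as the gap drifts from the 20 m target
print(follower_speed(25.0, gap=24.0))  # 27.0 -- close the widening gap
print(follower_speed(25.0, gap=16.0))  # 23.0 -- brake to restore spacing
print(follower_speed(25.0, gap=20.0))  # 25.0 -- hold speed
```

Because every follower runs the same loop off sensor data and the leader’s broadcast commands, the platoon brakes and accelerates nearly in unison, allowing shorter following distances than human reaction times permit.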

This possibility bodes well for all of us. Highway accidents generally are the most catastrophic, with truck-related fatalities hitting a 29-year high in 2017, rising 9 percent to 4,761 deaths, according to the latest available statistics. According to the NHTSA, autonomous trucks traveling in a dedicated lane away from other vehicles theoretically would enhance safety by removing “human error from the crash equation.”

The Missing Link?

Despite the varied benefits, the year when fully autonomous vehicles will take over the roads remains uncertain. A major stumbling block is safety: there is not yet a clear, mutually agreed-upon understanding among governments and the public of what level of risk is acceptable.

“No critical system of transportation can claim a zero percent level of risk,” Tawiri says. “Decades passed before people agreed on an acceptable level of risk when flying in a plane. We’re in a phase now where we’re trying to define what is acceptable and unacceptable risk.”

This effort has not stopped dozens if not hundreds of autonomous test vehicles from taking to the nation’s roads, most of them unnoticed by the public. So far, 29 states have passed legislation allowing specified uses of self-driving vehicles on state roads. That number is expected to increase this year following the decision by the U.S. Department of Transportation in December 2018 to limit federal oversight of autonomous vehicles, in addition to plans by Congress to reintroduce legislation permitting more than 100,000 autonomous vehicles to be driven by 2022.

5G is expected to accelerate this timetable. At the recent Consumer Electronics Show (CES) earlier this year, the 5G Automotive Association (5GAA), an organization composed of more than 110 automotive, technology and telecommunications companies, unveiled Cooperative Intelligent Transportation Systems (CITS), an all-encompassing autonomous vehicle system comprising vehicle-to-vehicle, vehicle-to-infrastructure, vehicle-to-network, and vehicle-to-pedestrian communications.

Such vehicle-to-everything wireless communications (dubbed V2X) can handle enormous data volumes, reducing latency risks. “5GAA supports the idea that 5G will be the ultimate platform to enable CITS, (as it) will be able to carry mission-critical communications for safer driving,” the group stated. “The impact on road safety alone is sufficiently important to make CITS a priority.”

5GAA has assembled a number of working groups, each tasked with a specific assignment—the development of industry standards, system architecture, business models, go-to-market strategies, and so on. At CES 2019, V2X took home the Innovation Award in the Vehicle Intelligence and Self-Driving Technology category, giving further credence to expectations of a shorter road to tomorrow.

According to EY’s Simlett, 5G networks are expected to reach half the world’s population by 2024. “My perspective is that we will begin to see Level 4 autonomous vehicles on the road in much greater numbers by 2030, with Level 5 vehicles following relatively soon thereafter,” he says.

That’s roughly 10 years from now. Assuming this prediction is close to reality, a speedier schedule for self-driving vehicles is on the near horizon. As NHTSA stated, “Fully automated cars and trucks that drive us, instead of us driving them, are a vision that seems on the verge of becoming a reality.”

Home, please.

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

Leading a Digital Transformation With Bottom-Up Support

By Russ Banham

Perspectives

Across industry sectors today, companies are mobilizing for a fourth industrial revolution, a digital transformation combining physical operations with virtual technologies like predictive analytics, artificial intelligence (AI), and the Internet of Things (IoT) to increase operating efficiencies, reduce costs, and make more insightful, data-driven decisions.

By evolving current business and operating models, the hope is that positive changes can be reaped in the products or services a company offers, and how it interfaces with customers and delivers on these interactions. Not surprisingly, given these varied benefits, nearly every company (94 percent) in an October 2018 global survey by Deloitte has made digital transformation a top strategic objective.

Yet a smaller number (68 percent) of survey respondents see digital transformation as an avenue for profitability. Other studies paint a similarly mixed picture. A global survey on digital transformation by McKinsey & Co. in October 2018 indicates that more than eight in 10 respondents have been pursuing such change initiatives for the past five years, but success has proved elusive, with fewer than one-third reporting an improvement in business performance.

These findings are consistent with a recent study by Dell Technologies indicating that while digital transformation is a corporate priority, progress is slow. The survey reveals that only 5 percent of respondents qualify as “digital leaders,” while an astonishing 91 percent say they continue to confront “persistent barriers” in their transformations.

A follow-up survey by Deloitte in March 2019 strikes the same chord, indicating that only 50 percent of companies considered to be “digitally mature”—or well along in their digital transformation—report net profit margins and revenues above average for their industry. Only 17 percent of “lower maturity organizations”—companies whose digital transformations have just begun—report above average net margins, while 19 percent of these companies report above average revenues. These surveys, while painting a bright future for companies that digitally transform, indicate some steep bumps along the way.

The report suggests several reasons why digital transformation success is lagging, including meager capital budgets, technological immaturity, and less-than-rousing executive sponsorship. “Functional leaders and teams cannot drive effective transformation without support from the organization’s leadership,” the report states. “Maximized business income depends on effective execution.”

David Schatsky, managing director of Deloitte’s U.S. Innovation Team, elaborates on the study findings. “Digital transformation is not just a technology upgrade,” says Schatsky. “It’s a phenomenon that pervades an entire organization, reimagining business processes around data and emerging technologies. When you have an initiative of this scope and breadth, you need senior leaders to set the tone, reinforce the importance of the initiative, and develop plans and processes to execute the project.”

At the Top

Leadership is crucial to every business initiative, of course. The CEO sets the tone at the top of the organization by establishing that a particular change initiative is a strategic priority, creating the organizational conditions for its iterative development, and empowering people to work on the project. What the CEO does not do (nor should do) is pull all the levers.

Typically, a chief digital officer (CDO), chief information officer (CIO), or other IT leader is entrusted to lead a digital transformation on a day-to-day basis. When such digitally savvy leaders are in place to guide the initiative forward, organizations are more likely to wring what they want from the project, the McKinsey survey indicates, stating that companies with a CDO leading the transformation are 1.6 times more likely to achieve success.

Such leaders must be able to convey the crucial importance of the strategic mandate to business unit leaders and functional heads, since they will benefit the most from the transformation and should be engaged in the process. Not surprisingly, eight in 10 respondents to the McKinsey survey pointed out that their digital transformation involved either multiple functions, business units, or the entire enterprise.

Deloitte’s March 2019 report suggests that successfully engaging functional and department leaders in the objectives and implementation of the initiative is a key attribute of effective leadership. “Functional leaders and teams cannot drive effective transformation without support from the organization’s leadership,” the firm states.

Schatsky affirms the value of this top-down/bottom-up commitment. “It seems clear that for digital transformation to succeed, a technically savvy senior level executive with broad purview over the organization—someone who has nurtured relationships with business unit and functional leaders—is required.”

Leading the Charge

Certainly, adroit stewardship and enterprise-wide dedication are needed to tackle the difficulties inherent in a major change initiative, among them organizational resistance, legacy technology and business models, and inaccessible, opaque, and possibly inaccurate data stored in departmental silos, hindering information for decision-making purposes.

These challenges help explain why only 59 percent of organizations have reached what Deloitte calls “median digital maturity,” despite several years of effort to rebuild the business around data. Too many companies are dazzled by the promise of technology without considering the business needs of users first. “Technology is an enabler of transformation; it’s not the driver,” Schatsky notes.

That driver is data, says Henna Karna, chief data officer at the large global insurer and reinsurer AXA XL, where she is leading the company’s Electronic Data Solutions (EDS) team in evolving the enterprise around digital transformation. “To become a disruptive market force, companies must pursue a data-driven approach, not a tech one,” Karna asserts.

She explains that an investment in robotics processing automation or machine learning to improve the efficiency of a particular process in a part of the organization is a tech-led solution, as opposed to a data-driven one.

“Breakthrough innovations come from the audacity to upend traditional business models and other operating paradigms based on what the customer wants and the data is telling you,” says Karna. “For digital transformation to succeed, leaders must make it a fully comprehensive effort involving every business unit, function, and department. That’s what a transformation is, after all.”

Her point resonates: Innovation was named one of the top five strategic priorities by 96 percent of CXOs in a 2017 Deloitte survey. Two-thirds of the respondents envisage innovation as the optimal way to differentiate and grow their companies. “Digital transformation supports an organization’s capacity to innovate, and without innovation, companies today are challenged to grow and thrive in the future,” says Schatsky.

To unlock innovation, organizations must invest in a data architecture that serves the needs of business users, giving them access to data to think differently, experiment, and create new ways of doing things. Karna and the EDS division at AXA XL are building this architecture. Like all transformations, the effort requires tearing down parts of the old operating structure and bypassing accepted bottlenecks.

In this work at AXA XL, organizational buy-in was needed. While it was important to have CEO sponsorship and support of the company’s digital transformation, more important, in many ways, was for business unit and department heads to rally around the initiative. “These are the people who best understand the challenges in accessing transparent data for decisions; they literally feel the pain in their bones,” says Karna. “That makes them passionate advocates for the change initiative.”

The digital transformation, now in its second phase, is being rolled out incrementally across the insurer’s global footprint, and some early wins are evident. “Previously, it would take several days for a risk analyst to offer insights into losses from a particular catastrophe, since the person had to rely on data provided from other parts of the enterprise,” says Karna. “Today, it takes minutes for the analyst to reliably search our data ecosystem and engagement platform [dubbed DEEP] to access what is needed.”

Reaching Maturity

Small wonder that digital transformation has ascended to the top of so many companies’ strategic agendas. Yet the stakes are high as businesses invest in the change initiative. To assist leaders in transforming their business models and attaining digital maturity, Deloitte has developed seven key factors, or “pivots,” as the firm calls them, for achieving a successful digital transformation.

These factors include a flexible and secure technology infrastructure, data mastery, digitally savvy talent networks, and ecosystem engagement. The remaining pivots involve implementing intelligent workflows that make the most of human and technological capabilities, delivering a seamless customer experience enterprise-wide, and adapting the business model to changing market conditions.

All the pivots should be executed in concert across multiple business functions, rather than on a selective basis. “We believe adherence to the pivots will generate tangible results,” Schatsky says.

In this era of rapid innovation, complacency with the company’s legacy business models and the technologies supporting them is counterproductive. “It’s increasingly clear that being a digital enterprise will soon become the equivalent to being in business,” continues Schatsky. “Those that fail to transform—that lack the ability to aggregate and analyze digitized data to improve what they make, how they operate, and how they engage with customers—will fall behind.”

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

How Robots Are Helping Scientists Predict Earthquake Aftershocks

By Russ Banham

Perspectives

Recent advances in computing power, deep learning, and physics simulation software present the possibility of mitigating the impact of an earthquake. This is good news in light of research that predicts a cataclysmic earthquake will strike the Western United States in the near future.

Scientists at Harvard University and the Massachusetts Institute of Technology (MIT) are approaching the urgent need to minimize earthquake damage from separate fronts. Harvard researchers are using deep learning algorithms to posit where earthquake aftershocks are likely to occur, while MIT researchers have modeled, on a supercomputer, a seismic wave diversion structure that draws destructive waves away from protected areas like neighborhoods and downtown business centers.

The ingenuity of the technologies is equaled by their timing. Scientists at the United States Geological Survey (USGS) predict a 99.7 percent chance of a 6.7-magnitude earthquake striking Los Angeles within the next 30 years. That’s the same magnitude as the 1994 Northridge earthquake that killed 72 people, injured more than 10,000 others, and destroyed thousands of homes, buildings, and cars in the surrounding region, causing more than $40 billion in property damage.

The prognosis is even worse for residents of the Pacific Northwest coastal region, home to the 620-mile long Cascadia Subduction Zone, where the Juan de Fuca ocean plate dips under the North American continental plate. Seattle and Portland, both inside the zone, confront an eight to 20 percent chance of experiencing the “Big One,” what seismologists call a full-margin rupture resulting in a magnitude 8.7 to 9.2 earthquake.

Most earthquakes are nowhere near as catastrophic. Of the half a million or so detectable earthquakes that occur across the world each year (about 100,000 of which are felt), roughly 100 cause significant property damage and potential loss of life, notes Robert Haupt, senior technical staff scientist at MIT’s Lincoln Laboratory. “We’re hoping to do our part in reducing the devastating impact of these seismic events,” he says.

A Seismic Muffler

Haupt is a key architect of what Lincoln Lab is calling a “seismic muffler.” The concept calls for drilling a V-shaped array of sloping boreholes hundreds of feet deep on both sides of the structures to be protected, such as power plants, airport runways, and office buildings.

The one- to three-foot-diameter boreholes—which are cased in steel and, viewed from above, look like a set of trench walls—divert hazardous surface waves generated by an earthquake away from the protected asset. By the time this destructive wave force reaches the ground surface, it has dissipated—much as a car’s muffler softens the acoustic energy coming from its combustion engine.

Haupt and his scientific colleagues at the lab have successfully tested a variety of borehole spacing models to dissipate hazardous seismic waves. In this work, they used high-performance supercomputers and physics-based simulation software, such as SPECFEM3D seismic wave propagation software and COMSOL Multiphysics acoustics software. The team uses artificial intelligence (AI) to sift through the enormous data volumes involved in modeling earthquake detection probability. “There’s no way you can plot up all the multiple dimensions using spreadsheets or pen and paper,” Haupt says.

The findings were impressive. “We performed a series of tabletop exercises on 3D supercomputers that indicated a V-shaped array of mufflers can decrease the ground shaking effects of a 7.0-magnitude earthquake to a 5.5-magnitude earthquake and lower,” Haupt explains. “That’s a pretty significant reduction in ground motion.”

He’s not kidding. According to the USGS, a 7.0-magnitude temblor is 177.8 times stronger (energy release) than a 5.5-magnitude quake. Lincoln Lab recently received a patent for its innovative technology and is in licensing talks with several interested parties, whose names Haupt declined to disclose. Field testing is expected to be underway this summer.
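The 177.8 figure follows from the standard relation between magnitude and energy release, in which each whole unit of magnitude corresponds to roughly a 31.6-fold (10^1.5) jump in energy. A quick illustrative check:

```python
# Seismic energy release scales as 10^(1.5 * M), so the energy ratio
# between two magnitudes M1 and M2 is 10^(1.5 * (M1 - M2)).
def energy_ratio(m1: float, m2: float) -> float:
    return 10 ** (1.5 * (m1 - m2))

print(round(energy_ratio(7.0, 5.5), 1))  # 177.8
```

One full magnitude unit (say, 6.0 versus 5.0) works out to about a 31.6x difference in energy, which is why knocking a 7.0 down to a 5.5 is such a dramatic reduction.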

Regarding the expense, Haupt estimates the cost would be less than what real estate developers currently pay to secure a skyscraper with base isolation systems, in which spring-like pads are inserted between a building and its foundation to absorb ground motions. “Based on extensive 3D supercomputer computations, we believe it would protect a lot more buildings at about the same cost,” he says.

Location, Location, Location

Harvard University’s scientists have focused their research on the location of earthquake aftershocks, which follow the main shock and can occur for weeks, months, even years after the primary event. Although smaller in magnitude than the initial temblor, some aftershocks pack a wallop, as was the case with a magnitude-6.7 aftershock recorded in Nepal in 2015.

While scientists are able to calculate the magnitude of aftershocks with some degree of precision, they’ve struggled to predict their location. To improve the odds, Harvard researchers Brendan Meade, a professor of earth and planetary sciences, and Phoebe DeVries, a postdoctoral fellow working in Meade’s lab, collected a massive volume of data from more than 130,000 earthquakes worldwide. Using AI technology, they analyzed this database to discern where the aftershocks had occurred, mapping them across a series of 5-kilometer-square grid cells.

The researchers then compared the findings with a physics-based computer model calculating the stresses and strains of the Earth during the main shock. The model incorporated deep learning algorithms to ferret out specific correlations between the stresses and strains and the aftershock locations.
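The basic idea can be sketched in miniature. The snippet below is purely illustrative and is not the researchers’ model: it trains a one-variable logistic regression (a single-layer stand-in for their deep network) on synthetic “stress change” values, invented here, to classify which grid cells host an aftershock.

```python
import math
import random

random.seed(0)

# Synthetic training data: each "grid cell" gets a single feature standing in
# for peak stress change; cells with higher stress are labeled 1 (aftershock).
cells = [(random.gauss(1.0, 0.3), 1) for _ in range(200)] + \
        [(random.gauss(-1.0, 0.3), 0) for _ in range(200)]

def sigmoid(z: float) -> float:
    z = max(-60.0, min(60.0, z))  # clip to avoid overflow in exp
    return 1.0 / (1.0 + math.exp(-z))

# One-variable logistic regression trained by stochastic gradient descent.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    for x, y in cells:
        p = sigmoid(w * x + b)
        w -= lr * (p - y) * x
        b -= lr * (p - y)

accuracy = sum((sigmoid(w * x + b) > 0.5) == bool(y) for x, y in cells) / len(cells)
print(f"training accuracy: {accuracy:.2f}")
```

The real study used many stress components per cell and a deep network rather than a single weight, but the shape of the problem is the same: physics-derived features in, aftershock probability out.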

The next stage of the research, published in the scientific journal Nature, called for testing the outcomes against 30,000 mainshock and aftershock events. The results were promising enough to encourage additional research. Thanks to AI, as Meade told The Harvard Gazette, “Problems that are dauntingly hard are extremely accessible these days.”

That’s exceedingly good news for anyone living in an earthquake-prone region. And hopefully just in time, too.

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

Andy Fastow and me

Enron’s former CFO and convicted felon Andrew Fastow talks with the CFO writer who first chronicled his “groundbreaking” manipulation of accounting rules.

By Russ Banham

CFO

Twenty years ago, CFO gave Enron finance chief Andrew S. Fastow a CFO Excellence Award in the category of “capital structure management.” In a feature story naming him an award recipient, Fastow said, “Our story is one of a kind.” Little did he know how prophetic those words would soon become.

I remember Fastow well, as I wrote that October 1999 story. It explored his financial wizardry in helping turn a sleepy natural gas pipeline company into a blazing energy trading firm. At its height, Enron was the seventh largest company in America; its market capitalization hit $35 billion. For the story, I interviewed Fastow, Enron’s CEO Kenneth L. Lay, and president and COO Jeffrey K. Skilling. All would soon become notorious.

Two years after the story appeared, Enron became the biggest accounting scandal in American history. The upsurge in market capitalization that Fastow crowed about had been whittled down to nothing. His financial wizardry, as it turned out, was “smoke and mirrors” designed to mask Enron’s true financial performance. The company filed for bankruptcy on December 2, 2001, putting thousands out of work. Most of Enron’s employees had invested their retirement savings in the company’s stock. Other shareholders lost billions.

A U.S. Securities and Exchange Commission investigation followed, as did a criminal investigation by the U.S. Department of Justice. Fastow was charged with 78 counts of fraud for his central role in developing the off-balance-sheet special-purpose entities that led to the company’s collapse. He subsequently entered a plea agreement, forfeited his net worth of $24 million, and served a six-year prison sentence in a federal detention center in Oakdale, Louisiana. He was released from prison in December 2011.

The CFO article on Fastow was the first in-depth piece of journalism to lay out the complex finance and accounting strategies that underpinned Enron’s meteoric rise. In the story, Fastow was lauded. Said one Lehman Brothers analyst, “Thanks to Andy Fastow, Enron has been able to develop all these different businesses, which require huge amounts of capital, without diluting the stock price or deteriorating its credit quality—both of which actually have gone up. He has invented a groundbreaking strategy.”

What I had failed to capture, in a story meant to celebrate Fastow as a CFO wunderkind, was his shrewd manipulation of the accounting rules, unknown to me and to the impartial panel of CFOs who selected him for the award. He had been the chief engineer of the deals that made Enron’s financial performance and balance sheet appear much stronger than they were.

When Enron blew up in 2001, some of the fallout struck me as the writer of the article. I received anonymous emailed death threats, perhaps from embittered employees and shareholders. Although I was simply the messenger, I still felt guilt and shame.

He’s Back

I write this prologue for a reason. Over the past two years, Fastow has been on the public speaking circuit. A few months ago, I reached out to him on LinkedIn to request an interview. In our subsequent discussions via Skype, two of his comments stood out.

One was his assertion that the factors causing the collapse of Enron are in play at other companies. The other was his contention that the Sarbanes-Oxley Act, enacted in 2002 to prevent another Enron debacle, will not stop one from happening.

During a Skype interview, he set up his laptop so I could watch a video of his keynote speech on trust and ethics in front of 2,000 people at the “In the Black: Accounting & Finance Innovation Summit” in Las Vegas, sponsored by BlackLine.

He had given presentations over the past two years to university business students and organizations of certified fraud examiners, but these were his people—finance and accounting professionals. As one finance executive in the audience later told me, “I wanted to know what the world’s greatest CFO criminal mastermind could possibly have to say about ethics and trust.”

A few minutes into the speech, Fastow walked offstage and came back. In his right hand, he was holding his CFO Excellence trophy. In his left hand was his prison identification card. He then raised both arms and said, “How is it possible to go from a CFO of the year to federal prison for doing the same deals?”

The thesis of Fastow’s presentation is rules versus principles: the argument that someone can follow the rulebook and still fail to do the right thing. That, by his own account, was Fastow’s wrongdoing. “I found every way I could to technically comply with the [accounting] rules,” he told the assembled. “But what I did was unethical and unprincipled. And it caused harm to people. For that, I deserved to go to prison.”

“Legal Fraud”

In our conversations, Fastow stressed that all his structured transactions were approved by Enron’s accountants, senior management, and board of directors; internal and external attorneys; bank attorneys; and its audit firm, Arthur Andersen. (That Arthur Andersen approved them is not saying much—after the Enron debacle, its auditing business shut down.) “How is it possible to have all these smart people approve these deals and end up committing the greatest fraud in corporate history?” Fastow asked me.

He then answered his own question. “The fundamental problem is this: Virtually all the safeguards that have been built into the system are compliance and legal procedures to catch rulebreakers. But rulebreakers are only part of the problem. The more insidious and dangerous problem is the rule ‘users’—the rule exploiters who find the loopholes.”

Fastow was perhaps the world’s best rule user of his time. All he needed were accounting assumptions and structured finance to transform the appearance of Enron.

There is some truth to what Fastow says about using loopholes. Bethany McLean, co-author (with Peter Elkind) of the book “The Smartest Guys in the Room: The Amazing Rise and Scandalous Fall of Enron,” has stated on more than one occasion that what Fastow perpetrated was “legal fraud,” an oxymoron suggesting that a CFO can follow the rules and exploit them to such an extent that the end result is fraud.

“My just asking the question, ‘Am I following the rules?’ was insufficient,” Fastow said. “I should have also been asking the question whether or not my behavior was ethical. I may have been trying to stay within the rules, but I was also, most definitely, trying to be misleading.”

I asked Fastow if he believes other CFOs ever feel compelled to exploit the rules. He responded with an analogy about Bill Belichick, head coach of the New England Patriots, who says he uses obscure football rules to his team’s competitive advantage and to make sure it wins.

If I understand Fastow correctly, he believes human nature leads some people to do whatever they can to win, including bending the rules. The easier that is, the greater the chance of doing it.

Enron’s use of mark-to-market (or fair-value) accounting, instead of the historical cost method, allowed it to book the projected future profits of long-term deals as current income, even when those deals ultimately produced losses. “Fair value accounting is a good example of where ethics come into play, as it provides you with all these gray areas that allow for creative flexibility,” Fastow said.
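To see why that treatment flatters the income statement, consider a hypothetical 10-year energy contract. Every figure below, including the 8 percent discount rate, is invented for illustration: under historical cost, profit is recognized as it is earned each year; under mark-to-market, the present value of all projected future profits can be booked the day the deal is signed.

```python
# Hypothetical 10-year deal projected to earn $10M per year.
# The figures and the 8% discount rate are invented for illustration.
annual_profit = 10.0   # $ millions per year
years = 10
discount = 0.08

# Historical cost: recognize profit only as it is earned.
year_one_historical = annual_profit

# Mark-to-market: book the present value of all projected profits up front.
year_one_mtm = sum(annual_profit / (1 + discount) ** t for t in range(1, years + 1))

print(f"Year-1 profit, historical cost: ${year_one_historical:.1f}M")
print(f"Year-1 profit, mark-to-market: ${year_one_mtm:.1f}M")
```

Roughly $67 million of income lands in year one instead of $10 million, and if the projections never materialize, the gap becomes exactly the kind of gray area Fastow describes.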

Such “creative flexibility” is in play today at many companies, he added, asserting, “All CFOs believe they are really good at identifying, processing, and managing risk, but the reality is that the brain sees what it wants to see. We process risk in a biased way.”

Pressed to elaborate, Fastow offered this example to me, a resident of Los Angeles: “You realize that virtually every seismologist agrees that California is 1,000 years overdue for a catastrophic earthquake. You’re sitting on a major fault line. And yet you don’t wake up every morning worrying about your family dying in a massive quake or your net worth being obliterated. You’re processing risk in a biased way—‘it won’t happen to me, and even if it does, it won’t be that bad.’ Your brain knows the answer you want—that you want to live in L.A. So, it dismisses or minimizes the risk.”

Making company results appear better than they are is actually not dissimilar. “CFOs know the answer they want—hitting the quarterly target,” he said. “So, their brains minimize risk in order to get there. My personal belief is that almost every CFO wants to be ethical and do the right thing, but the problem is identifying you’re in an ethical risk-creating situation to begin with.”

In other words, you can commit fraud and still be technically within the rules. What you can’t do, though, is successfully argue in a court of law that because you didn’t break the rules you are not guilty. Like Fastow, disgraced former CEO Bernie Ebbers of WorldCom learned this the hard way, when the Second Circuit Court of Appeals rejected his argument that the government had to prove violations of generally accepted accounting principles (GAAP) for his conviction on fraud and conspiracy to stand.

As the court stated in 2006, “To be sure, GAAP may have relevance in that a defendant’s good faith attempt to comply with GAAP … may negate the government’s claim of an intent to deceive. [However,] if the government proves that a defendant was responsible for financial reports that intentionally and materially misled investors, the [securities fraud] statute is satisfied.” Simply put, the intent to mislead investors signifies a criminal purpose, irrespective of accounting rule loopholes.

What It’s Worth

It’s not unusual for former convicts to leverage their “expertise” on the other side of the law. Bank robber Willie Sutton spent his last years consulting with banks on theft-deterrent techniques. Fastow is giving talks about rules vs. principles and consulting with corporate management and non-executive board directors about corporate culture and unrecognized risks—the “dangers of the gray areas.”

These gray areas pervade all regulations, claimed Fastow, including the Sarbanes-Oxley legislation. “SOX is only asking ‘Are you following the rules?’” Fastow said. By now his point is clear: Ethics in corporate governance, an even timelier subject these days, is crucial.

Asked what boards of directors can do to feel confident that they have a clear picture of financial results, Fastow touted the data transparency provided by finance and accounting software. He further noted the possible use of an artificial intelligence (AI) tool developed by software provider KeenCorp that analyzes employee emails for evidence of negative tension in a company.

In 2016, KeenCorp analyzed several years’ worth of emails sent by Enron’s top 150 executives. Not surprisingly, as Enron approached insolvency, index scores fell precipitously, signaling high levels of tension among executives. Yet two years earlier, on June 28, 1999, when Enron seemingly was on top of the world, equally low scores were posted. The firm reached out to Fastow in 2016 for an explanation.

It turns out that on that date in 1999, Fastow had spent hours talking with Enron’s board and senior management about “LJM,” the name given to the complex transactions he’d designed to hide the company’s poorly performing assets and spruce up its financial statements. The software’s analysis of emails that day intuited negative tension about the deals.

“The algorithm pinpointed the day when the most existential decision was made,” Fastow said. (Fastow has since become an investor in KeenCorp, it should be noted.)

What is one to make of Andy Fastow today? It’s a difficult question. I reached out to three finance and accounting professionals who attended the “In the Black” summit. They said he appeared humbled by his ethical lapses and had something useful to offer. However, none of them had personally suffered from Fastow’s criminal manipulations. And that’s exactly what they were.

His victims might be pleased to know that Fastow appears condemned to do penance forever. Despite it all, though, his family held together, offering him a measure of solace. As someone linked to him in perpetuity, I’m happy for that.

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

How Carbon Capture Tech Is Easing Industry’s Green Transition

By Russ Banham

Forbes

Scientists on the Intergovernmental Panel on Climate Change (IPCC), the body commissioned by the United Nations to provide guidance to global leaders on the economic and humanitarian impacts of climate change, reviewed more than 6,000 scientific studies before reaching a devastating conclusion: Greenhouse gas emissions (above all carbon dioxide, or CO2) must be reduced by 45 percent by 2030, and 100 percent by 2050, to avert a 2.7-degree Fahrenheit (1.5-degree Celsius) increase in global temperatures from pre-industrial levels.

With nearly 200 countries on board, the Paris Agreement offers hope for reducing the production of greenhouse gases in the future. But more needs to be done today to avoid the dire fate the IPCC projects. As we work toward a decarbonized energy future, the widespread use of carbon capture technology will likely be an indispensable strategy in our toolkit.

Seize, store and sell

The potential of this technology to reduce greenhouse gas emissions is “considerable,” the IPCC states, estimating that carbon capture can trap 85 to 90 percent of the CO2 emissions produced from the use of fossil fuels in industrial processes and electricity generation, effectively preventing that CO2 from entering the atmosphere.

The IPCC is not alone in its support for increasing use of this promising technology. In a 2016 report titled “20 Years of Carbon Capture and Storage,” the International Energy Agency (IEA) suggests that wide-scale use of carbon capture would result in a 19 percent reduction in global CO2 emissions by 2050. However, this projection assumes the creation of approximately 3,400 carbon capture plants before that date. More action is needed, and quickly, to achieve these numbers; only 17 large-scale carbon capture plants exist in the world today, capturing roughly 40 million metric tons of carbon dioxide each year, a scant 0.1 percent of total global emissions.
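The 0.1 percent figure is easy to verify. Taking annual global CO2 emissions to be roughly 37 billion metric tons (an assumed round number based on recent totals, not a figure from the report):

```python
captured = 40e6          # metric tons of CO2 captured per year by the 17 plants
global_emissions = 37e9  # approximate annual global CO2 emissions (assumed)

share = captured / global_emissions
print(f"{share:.1%}")  # 0.1%
```

Scaling from 17 plants to the roughly 3,400 the IEA projection assumes is a 200-fold expansion, which underscores how far deployment lags the scenario.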

In and out

The IEA recently noted that carbon emissions from advanced economies rose for the first time in five years in 2018. Given the economic challenge of substantially reducing fossil fuel use in electricity generation and industrial processes, carbon capture appears to be a practical, immediate solution.

Carbon capture and storage consists of three main parts: capturing, transporting and storing the CO2, the latter either underground in depleted oil and gas fields or in deep saline aquifer formations. The first leg of this journey entails separating CO2 from the other gases produced in industrial processes and electricity generation. The CO2 can then be transported to safe storage via pipeline, road tanker or ship.

The Petra Nova Project is the world’s largest carbon capture project at a coal power plant and is located just outside of Houston. The plant can capture about 1.4 million metric tons of CO2 each year.

In 2017, Mitsubishi Heavy Industries (MHI) Group, a carbon capture pioneer, along with its consortium partner, TIC, completed construction of Petra Nova’s post-combustion carbon capture and compression system, which captures more than 90 percent of the CO2 from a flue gas stream.

To provide a revenue stream for carbon capture, the CO2 is compressed, transported and pumped underground into an oil formation to increase overall oil production.

Just one of many

Carbon capture alone will not curtail the progressive warming of the planet, but it is a step that can be taken now — free from political wrangling. In recognition of the technology’s value, the U.S. Congress introduced bipartisan legislation in early 2018 to provide tax credits to companies that capture or reuse their CO2. President Trump signed the legislation into law.

Known as the 45Q tax credit, it provides carbon-producing companies a credit of up to $50 per ton over a 12-year period after the start of operation, depending on how the CO2 is used.
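The economics are easy to rough out. Using Petra Nova’s reported 1.4 million metric tons per year and the top $50-per-ton rate for all twelve years (a simplification; the actual credit varies by how the CO2 is used and phases in over time):

```python
tons_per_year = 1.4e6   # Petra Nova's reported annual CO2 capture
credit_per_ton = 50.0   # top 45Q rate, applied uniformly as a simplification
years = 12              # credit period after the start of operation

total_credit = tons_per_year * credit_per_ton * years
print(f"${total_credit / 1e6:,.0f} million over {years} years")  # $840 million over 12 years
```

Even as a back-of-the-envelope figure, a potential credit in the hundreds of millions of dollars per plant shows why the incentive matters for scaling the technology.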

Even Congress could appreciate that a financial incentive was needed for industry to rapidly scale up carbon capture technology. Now the onus is on companies and the government to continue to find ways to realize more carbon capture projects.

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.