Fix It Before It Breaks: How Smart Machines Are the New Quality Control

By Russ Banham

Dell Perspectives

It wasn’t long ago that “quality control” meant expert personnel whose task was to measure, weigh, touch, and scrutinize finished goods for evidence of possible defects. While many manufacturers still rely on such expertise, more are turning to automated “smart machines” in the Industrial Internet of Things (IIoT) for these appraisals.

Machine intelligence is essential for success in the digital future. According to a 2017 study by management consultancy Bain, the IIoT market will generate approximately $85 billion annually by 2020. To remain competitive in today’s rapidly changing business environment, manufacturers must digitally transform how they make and distribute products.

A 2017 report by Deloitte affirms this view. “The concept of adopting and implementing a smart factory solution can feel complicated, even insurmountable,” the report stated. “However, rapid technology changes and trends have made the shift toward a more flexible, adaptive production system almost an imperative for manufacturers who wish to either remain competitive or disrupt their competition.”

A smart machine in a manufacturing context is a piece of factory equipment embedded with IoT sensors that monitor performance and communicate issues over the internet to manufacturing control rooms, and even to other machines, to drive faster and smarter decisions. The benefits include instant alerts when a machine is wearing down and needs maintenance or repair; self-correcting machine adjustments to address variations in tolerance; and real-time insight into a manufacturing slowdown at a major supplier, enabling production to shift rapidly to other suppliers.

“The primary value of smart machines is their ability to produce products of the highest quality,” said Dean Bartles, president of the National Tooling and Machine Association, a trade group representing the precision manufacturing industry. “The sensors inside the machines basically do what seasoned quality professionals have long done, only much, much better.”

High product quality is good news for a manufacturer’s top and bottom lines, as it results in more satisfied customers and fewer defects and production scrap (leftover materials that add to costs). “With smart sensors measuring vibration, temperatures, moisture and dimensions, you’re able to tell pretty quickly when something is off,” said Bartles, an industrial engineer and former vice president and general manager at three General Dynamics manufacturing plants.

“The machine itself tracks the ‘drift’ in the tooling as it occurs,” he explained. “Some smart machines can even intervene on their own to bring the ideal configurations back into line—self-correcting without the need for human intervention.”

Intelligent Factories

Self-correction can be seen, for example, in an injection-molding machine, in which molten materials like metals and plastics are injected into a mold to fabricate a part or finished good: everything from surgical devices and electrical circuit boards to more prosaic toys and automotive intake manifolds. Once the mold is made, products are produced on a constant, uninterrupted basis. However, the process isn’t perfect; defects like blistering, cavities, and contamination by foreign materials are common problems.

By embedding visual, temperature, and weight sensors into the molding machine, imperfections can be identified as soon as they appear. If the defect is determined to be the presence of cavities—caused by an inadequate volume of metal or plastic being pumped into the mold—the data analytics will direct a computer inside the machine to increase the volume to the correct level.
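
In software terms, the feedback loop is straightforward. The Python sketch below is a minimal illustration of the pattern (measure, compare against a tolerance band, nudge the setting back toward target); the setpoint, threshold and correction rule are assumptions invented for this example, not any manufacturer’s actual control logic.

```python
import random

SETPOINT_CC = 120.0   # target shot volume in cubic centimeters (illustrative)
TOLERANCE_CC = 2.0    # acceptable deviation before the machine self-corrects

def read_fill_volume() -> float:
    """Stand-in for an embedded IoT sensor; here we simply simulate drift."""
    return SETPOINT_CC + random.uniform(-5.0, 5.0)

def control_step(volume_setting: float) -> float:
    """One monitoring cycle: measure, compare to tolerance, self-correct."""
    measured = read_fill_volume()
    deviation = SETPOINT_CC - measured
    if abs(deviation) <= TOLERANCE_CC:
        return volume_setting                  # within tolerance; no action
    # Cavities imply under-filling, so nudge the setting toward the setpoint.
    corrected = volume_setting + 0.5 * deviation   # damped correction step
    print(f"alert: shot volume off by {deviation:+.1f} cc; "
          f"new setting {corrected:.1f} cc")
    return corrected

setting = SETPOINT_CC
for _ in range(10):                            # a few simulated control cycles
    setting = control_step(setting)
```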

Companies like Marvin Windows and Doors are at the outset of developing tomorrow’s smart factories today. At a company plant in Oregon that manufactures different-sized wood pieces, Marvin has installed computer numerically controlled (CNC) machines that use laser sensors to give operators an inside look at a board’s knot and grain structure before cutting it into smaller pieces. The sensors help ensure maximum yield: in this case, the largest usable piece of wood from a single block of lumber.

“The sensor inputs data into a computer inside the machine that analyzes the visual image,” said Jim Macaulay, CFO of Marvin Windows and Doors. “Based on this information, the machine knows the optimal cut to make, increasing production yield with less human intervention.” Previously, Marvin had relied on the eyes and experience of shop foremen to identify possible defects.

Internet-connected sensors have also been embedded inside other factory equipment at Marvin plants to measure temperature, vibration, moisture, and other conditions, Macaulay noted. If a motor inside a machine exceeds a specific temperature threshold, this information travels over the internet to a central location for corrective action. And by connecting the factory equipment together in the IIoT, a problem can be self-corrected, with one machine picking up a troubled machine’s production tasks. “As a result, we’re able to reduce the chance that one machine’s failure will result in a production stoppage,” said Macaulay.
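
A hypothetical sketch of that monitor-and-failover pattern appears below. The machine names, temperature threshold and task queues are invented for illustration; a real IIoT deployment would involve message brokers, device protocols and far more safeguards.

```python
TEMP_LIMIT_C = 90.0   # illustrative motor-temperature threshold

machines = {
    "saw_1": {"temp_c": 72.0, "queue": ["cut_A", "cut_B"]},
    "saw_2": {"temp_c": 95.5, "queue": ["cut_C"]},   # running hot
}

def monitor_and_failover(fleet: dict) -> None:
    """Reassign work from any machine over the limit to a healthy peer."""
    healthy = [name for name, s in fleet.items() if s["temp_c"] <= TEMP_LIMIT_C]
    for name, state in fleet.items():
        if state["temp_c"] > TEMP_LIMIT_C and healthy:
            backup = healthy[0]
            print(f"{name} exceeds {TEMP_LIMIT_C} C; shifting work to {backup}")
            fleet[backup]["queue"].extend(state["queue"])
            state["queue"].clear()

monitor_and_failover(machines)
print(machines)   # saw_2's tasks now sit in saw_1's queue
```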

Precision Is Paramount

Other manufacturers are beginning to seize similar value. “By using sensors to measure and report on diverse conditions, and then collecting all this information in one place for analysis using algorithms, manufacturers are able to draw rapid conclusions on remedial actions,” said Alex Reed, cofounder and CEO of Fluence Analytics, a manufacturer of industrial monitoring solutions that produce continuous data streams.

The algorithms ferret out correlations and non-correlations in the diverse data produced by different sensors, indicating machine wear and product quality issues well before they result in batch failures. While many smart machines can self-correct a problem, more complex manufacturing processes still require human beings to intervene.

“It’s not as straightforward as it often is portrayed to be, though we are definitely headed in a direction where the use of AI [artificial intelligence] and machine learning will direct a correction in a machine based on the findings of the analytics,” Reed said.

Both he and Bartles are members of the Smart Manufacturing Leadership Coalition (SMLC), a nonprofit group composed of major companies like Rockwell, General Motors, and Owens Corning that works to develop the world’s first open technology platform for smart manufacturing. The hope is that midsize and smaller manufacturers can use the open-source platform in developing their IIoT strategies.

Abetting these aims is the increasing sophistication of sensors at lower price points. “In our work, we use spectroscopic, infrared and optical sensors to determine viscosity at the molecular level,” said Reed. “For example, we’re able to discern the composition of a material like a polymer. If the composition is off even a little, it can result in [product] failure, waste, and production downtimes. Nowadays, virtually everything in the supply chain begins with sensors.”

Bartles shares this opinion: “Sensors are the modern equivalent of a ‘red flag,’ giving you insights into possible manufacturing hiccups so your supply chain doesn’t fall behind schedule. More and more OEMs [original equipment manufacturers] are receiving sensor-produced data over the internet from their key suppliers’ machines. This information is extremely insightful for decision-making purposes.”

He recalled how this remarkable capability contrasts sharply with his earlier career at General Dynamics. “Like other manufacturers, we’d receive daily status updates from our suppliers on part counts. But suppliers sometimes don’t tell you the truth,” Bartles said. “Ideally, you want real-time accurate information. This way you know if you need to turn the knob off on one supplier that’s having trouble [in order] to turn the knob on at another supplier.”

Smart machines also offer a more advanced way to trace and track the product quality of all suppliers linked in the supply chain, ensuring each component of a finished product has been validated against its quality specifications, from raw material through the varied production stages to customer delivery.

No Turning Back Now

Given these myriad benefits and the wider profit margins that can accrue, production experts anticipate growing demand for smart machines from manufacturers of all sizes.

“We’re not yet at the point where this is widely adopted and delivering massive value; in many manufacturing environments, work still needs to be done by quality control experts,” said Reed. “However, these individuals tend to hail from older generations and are soon to leave the workforce; this puts the onus on companies to invest in the IIoT sooner rather than later. There’s no question that manufacturing is migrating from qualitative assessments by people to quantitative assessments by machines.”

As in most of business, early movers and their fast followers generally have a leg up on competitors. While smart factories may seem like a fantasy torn from a Buck Rogers novella, they’re how most things will be made now and into the future.

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

How Blockchain Is Disrupting 3 Industries

By Russ Banham

Forbes

Blockchain burst into the mainstream five years ago as a secure platform for Bitcoin transactions, but the technology’s use today extends well beyond cryptocurrency, transforming entire industry ecosystems.

Healthcare, banking and insurance are just three industries that anticipate tens of billions of dollars in cost savings from blockchain’s permanent decentralized ledger. Banks, for example, expect blockchain to save them more than $27 billion on cross-border settlement transactions by 2030, according to a 2018 study.

At its most basic, a blockchain is a distributed digital ledger with built-in security that records transactions among the network participants in real time. Periodically (every 10 minutes, in bitcoin’s design), these transactions are verified, permanently time-stamped and stored in a block that is encrypted and inextricably linked to the preceding block, creating a blockchain.
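
For the technically curious, the hash-linking is easy to show in a few lines of Python. This is a bare-bones illustration of the structure just described, omitting the consensus rules, peer-to-peer networking and cryptographic hardening that production blockchains add:

```python
import hashlib
import json
import time

def make_block(transactions: list, prev_hash: str) -> dict:
    """Verify-and-store, simplified: time-stamp transactions and chain them."""
    block = {
        "timestamp": time.time(),     # permanent time stamp
        "transactions": transactions,
        "prev_hash": prev_hash,       # cryptographic link to preceding block
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block(["alice pays bob 5"], prev_hash="0" * 64)
second = make_block(["bob pays carol 2"], prev_hash=genesis["hash"])

# Altering any transaction in the genesis block would change its hash and
# break the link stored in the second block, which is why entries are
# trackable and effectively irrevocable.
assert second["prev_hash"] == genesis["hash"]
```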

Participants don’t need to trust that the ledger has not been tampered with because entries are trackable and irrevocable. Another advantage of blockchain technology is business efficiency. Participants can execute smart contracts without a central controlling authority. The contracts trigger when pre-arranged terms and conditions are met.

These benefits and others — like data quality and transparency — have made blockchain the go-to technology for digital transformation, explains David Uhryniak, blockchain competency leader at the accounting, consulting and technology firm Crowe. “Blockchain is the underlying technology that fosters the required trust to enable companies and entire industries to transform around data and successfully implement other transformative technologies like artificial intelligence (AI) and the Internet of Things (IoT),” he says. It’s already disrupting multiple industries in tangible ways.

In the property-casualty insurance industry, real progress is being made to create ecosystems that speed up the automobile insurance claims process — and the potential payout is huge.

Take, for example, the Institutes RiskBlock Alliance, a collaborative experiment in which dozens of insurance companies plan to share specified automobile policyholder data in a blockchain network. Other participants with access to the secure environment include third parties like car repair shops, tow truck companies, state motor vehicle departments and law enforcement agencies.

This type of ecosystem would streamline tedious and costly processes. Consider the following scenario: After a minor two-car collision, sensors in the vehicles would send alerts to the blockchain network, triggering pre-arranged smart contracts among the parties to dispatch tow trucks, which would take the cars to designated repair shops. At the same time, other sensors measuring the speed and braking of both vehicles, as well as data on weather and road conditions, could send this information to the blockchain, whereupon it would be instantly determined which party is likely at fault.
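
The “pre-arranged smart contracts” in this scenario are, at bottom, condition-action rules. The sketch below is a hypothetical rendering of that logic; the field names, thresholds and actions are invented for the example and do not represent RiskBlock Alliance code:

```python
def collision_contract(event: dict) -> list:
    """Pre-arranged terms: a minor collision triggers automatic actions."""
    actions = []
    if event["impact_detected"] and event["speed_mph"] < 25:
        actions.append(f"dispatch tow trucks to {event['location']}")
        actions.append("route vehicles to designated repair shops")
        actions.append("notify both insurers and open claims")
    # Telemetry recorded on the blockchain supports an initial fault call.
    if event["vehicle_a_braking"] and not event["vehicle_b_braking"]:
        actions.append("flag vehicle B as likely at fault, pending review")
    return actions

sensor_event = {
    "impact_detected": True, "speed_mph": 18, "location": "5th & Main",
    "vehicle_a_braking": True, "vehicle_b_braking": False,
}
for step in collision_contract(sensor_event):
    print(step)
```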

All that data is a goldmine from an analysis standpoint. “A smart contract between the two insurers of the vehicles may eliminate the need for a claims adjuster to go to the scene of the accident,” says Uhryniak. “The claims process would automatically spring into motion, possibly with the claim being filed and paid that day or the next one.”

That’s just one industry. Uhryniak cites four key benefits of blockchain that have transformational potential across business sectors — enhanced transparency, revenue, efficiency and engagement. “There’s friction within every process related to the various interactions with counterparties, customers, regulators and even the gathering of data,” says Uhryniak. These inefficiencies are costly, he explains, but blockchain networks obviate these challenges.

A case in point is the healthcare sector, an industry that Uhryniak projects will be a vastly different enterprise in the next five to 10 years. Today, payers like health insurers each have specific contract terms and conditions that must be met in order for a procedure or treatment to be deemed medically necessary and, therefore, covered by the insurer’s health plan. These terms and conditions could be turned into smart contracts and added to blockchains, eliminating the inefficiencies involved in verifying permissible insurance coverages, Uhryniak says.

Another benefit of this transparency in healthcare is streamlined and reliable billing. “Blockchain helps ensure a patient isn’t charged for the same medical procedure by two different physicians or hospitals,” Uhryniak says. “The technology verifies the accuracy of each transaction, which becomes a permanent, immutable record.”

Health insurers benefit in other ways. For instance, an insurer can verify the necessity of medical treatments prior to execution and re-validate that they were in fact performed. Blockchain also addresses the problem of rising insurance claim denials. According to a 2017 analysis by Crowe of more than 300 hospitals, 9.6 percent of all medical insurance claims are denied. Used as a billing tool, blockchain could instantly ferret out whether or not an insurer’s claim matches its specified contract terms and conditions, reducing delays and human error.

Banks are also poised for blockchain-enabled transformation. Only 3 percent of bank executives surveyed by Crowe expect “minimal change” from blockchain technology in the next 10 years, with the remaining 97 percent anticipating modest to significant change. In addition to streamlining the mortgage approval and closing processes, blockchain technology promises similar process enhancements in the disbursement of funds for commercial loans and syndicated loans, Uhryniak says.

As businesses initiate blockchain strategies, Uhryniak advises that they pursue a measured approach that begins with an evaluation of how the technology will provide a competitive advantage. Firms like Crowe can help with this assessment by guiding companies to identify the right blockchain platform for their needs and in some cases may be able to build this infrastructure as a “proof of concept” prior to implementation.

For companies, adapting to the confluence of technologies involved in tomorrow’s ecosystems requires thoughtful consideration, and, in many cases, raises concerns about risk. As the number of IoT devices proliferates toward a projected 125 billion worldwide by 2030, and remarkable tools like machine learning grow smarter through exponentially expanding computing power, blockchain at the moment is the only technology offering a comparable system of trust in data. It’s no wonder so many industries have taken notice.

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

Hiring America’s Heroes

By Russ Banham

Chief Executive

America’s military veterans are some of the most skilled people on the planet, able to lead a project team through extraordinary challenges or deliver superior outcomes on mission-driven tasks. More than one million veterans will exit the U.S. Armed Forces over the next five years. This diverse talent pool has highly sought-after competencies, including discipline and flexibility as well as planning, technical, communications and problem-solving skills. And that’s the short list.

Yet, more than one million U.S. veterans remain unemployed, somehow slipping through the recruitment net. Research suggests companies struggle to access this talent pool, despite recognition of its potential. In a recent study by Chief Executive and the State of Indiana of nearly 300 U.S.-based CEOs, 57 percent reported that their company considered hiring veterans, yet only 17 percent had implemented a program to support those efforts.

The good news? A growing number of U.S. companies are creating initiatives to more closely align military training experiences with employment openings and business needs. And the efforts are paying off. “Veterans are disciplined and accountable; they take ownership of their work, are very proactive in finding solutions to varied challenges, and don’t make excuses,” says Larry Hughes, vice president of training and diversity at 7-Eleven and a former Army officer who commanded two company units as a field artillery officer during his five-year service. “They also have advanced technical training and strong cross-cultural experiences. And they’re team builders who know how to resolve conflicts, motivate people and get the best out of them.”

On the pages that follow, we share some practical tips from companies and CEOs making a difference in the lives of veterans—while also making the most of a great opportunity.

Getting Started

Kevin Ryan founded the Service Brewing Company, a small brewery with a taproom in Savannah, Georgia, in 2014. Of the company’s 24 investors, 20 are veterans, and the majority of its 13 employees are veterans as well; one is currently deployed with the National Guard, and another is a former military spouse. “We’re always looking for veterans to add to our team,” says Ryan, a 1996 West Point graduate who subsequently served as an Army infantry officer.

In recruiting, Ryan aligned with two local military bases (Ft. Stewart and Hunter AAF) and Georgia Tech’s Veteran Education Program. He also reaches out to student veterans at Georgia Southern, as well as at the Association of the U.S. Army, the Military Officers Association of America, the Mighty Eighth Air Force Museum and many other organizations. “Soldiers don’t often get to go to job fairs or have the ability to network successfully, so we need to get out in front of them,” he says.

Other companies employ a similar strategy. At 7-Eleven, field personnel nurture close relationships with military base transition office staff members. “We advise on-base soldiers on resumé building and job interview tactics, host entrepreneurial boot camps and invite exiting service members to attend our seminars on franchising opportunities,” says Hughes. “We’re also a regular presence at military hiring fairs.”

The company has hired more than 300 veterans and military spouses as field consultants in the past year, tripling the number of these hires since 2014. The position is a gateway to other jobs in the organization.

Companies interested in hiring military veterans and spouses can draw on a wealth of resources geared toward assisting veterans. Local Veteran Service Organizations, Student Veterans of America chapters at colleges and universities, and websites like Hero 2 Hired, Veterans Job Bank and Vetsuccess.gov are all good channels for proactively recruiting ex-military men and women. Companies can also seek out career fairs focused on veteran recruitment and programs like Google’s “Jobs for Veterans” initiative.

Once hired, veterans and military spouses are given the support they need and deserve to make the most of their talents. La Quinta welcomes military hires with a special veteran or military spouse pin to wear on their uniforms or business attire. Through the hotel’s guest loyalty program, five million points were donated to veteran-focused organizations like Operation Homefront and Armed Services YMCA. “Putting people first is embedded in our culture, and those who have a passion for people and service fall in line with these core values,” says Derek Blake, La Quinta vice president of marketing and military programs.

Starbucks provides veterans with a unique benefit: the ability to gift their Starbucks College Achievement Plan to a child or spouse. The program funds tuition for an online bachelor’s degree at Arizona State University, with 150 degree programs to choose from. Starbucks also offers veteran employees Military Mondays, a program developed with the William and Mary Law School to provide free legal counseling to service members at its stores. “Military Mondays is now scaling nationally and growing to include other critical services such as financial literacy training and investment counseling,” says Christopher Miller, Starbucks veterans and military affairs manager.

Citi, in partnership with Bring Them Homes, has been instrumental in providing transitional, supportive, temporary, and permanent housing for veterans and their families. “To date, the program has supported the creation of more than 3,500 affordable housing units,” says Ruth Christopherson, a Citi senior vice president and retired colonel, U.S. Air National Guard.

Matching Skills

7-Eleven, which joined other U.S. companies in a 2012 pledge to hire one million veterans by 2020, is well on its way toward achieving the goal. The company has hired more than 300 veterans and military spouses in the past four years alone. To align the resumes of veterans with needed business skill sets, the company has created a presentation called “Military 101” that translates military assignments into corresponding business tasks.

“It ensures our recruiting team has a firm understanding of how military experiences and skill sets translate into roles within our team, and enables our transitioning veterans to be set up for success,” says Dave Strachan, chief of staff and a former Army officer. 7-Eleven CEO Joseph DePinto also is a former Army field artillery officer and West Point graduate.

Other companies tout the extraordinary range of abilities that soldiers attain over their own military careers. “People don’t think of veterans as having finance, operations, HR, IT or project management skills in a business context,” says retired U.S. Army Brigadier General Carol Eggert, a recipient of the Legion of Merit, a Bronze Star and a Purple Heart and head of Comcast NBCUniversal’s eight-person Military and Veteran Affairs organization (see sidebar). She says that misconception is fueled by a lack of understanding of the breadth and scope of leadership positions in the military.

Many companies are doing just that, creating an array of programs designed to match military community skill sets with business needs. For example, Citi, cofounder of the Veterans on Wall Street recruitment initiative and corporate sponsor of Military.com’s mobile app, launched Citi Salutes to centralize its 17 military veteran employee networks under the oversight of an executive steering committee. The firm also created a Veterans Recruiting Toolbox for recruiters.

Dow Chemical implemented a program where four or more years of military service meet the company’s minimum job requirements. The company also is running a pilot Military Engagement Program, in which a current employee-veteran coaches service members and military spouses through its hiring process.

Many companies, including 7-Eleven, Starbucks and Comcast, are corporate partners in the Hiring Our Heroes fellowship program. The 12-week operations management internship is designed to provide the skills needed to succeed in the civilian workforce. “We make an offer of employment to fellows who complete the program,” says Strachan, citing 7-Eleven’s recent hiring of a dozen graduates.

Smoothing Transitions

For many veterans, their first job in the private sector can be disorienting. The management structure is different, the vocabulary of business is arcane and the processes are atypical. Easing the transition of this talented group of employees improves the chances of retaining them. A 2016 survey by the U.S. Chamber of Commerce Foundation found that 44 percent of veterans left their first post-military job within a year.

At Black Knight, a fast-growing company of 5,000 employees, veterans fill job vacancies at a 10 percent rate. For good reason, too, since the company pledges full wage continuation and medical and dental benefits to employees called up for active duty in the Reserves or National Guard. Returning employees are placed in the same position, or another position they might have attained had they remained continuously employed. “We’re ensuring their career paths remain productive and promising,” Circelli says. “You need to make hiring veterans a priority and then have the dedication to fulfill that commitment.”

At commercial real estate giant Cushman & Wakefield, a military transition roadmap helps veterans acclimate to the corporate environment. Deloitte sponsors the Career Opportunity Redefinition and Exploration Leadership Program, helping veterans and active duty service members identify their unique strengths to better direct their careers. Every Deloitte business has a partner, principal or managing director as a Champion for Military and Veterans. GE partnered with the U.S. Army Reserve Medical Command in a pioneering externship program providing eight months of biomed and imaging training to Army Reserve biomedical technicians.

Another Opportunity

According to research compiled by Blue Star Families (BSF), 43 percent of military spouses are unemployed, compared to 25.5 percent of civilian spouses. Eggert suggests employers shun this talent pool for outdated reasons. “Employers know they often need to relocate,” Eggert explains. “This makes no sense in an era where Millennials are job-hopping every three or four years.” Comcast not only proactively recruits military spouses, but also helps those forced to relocate find jobs elsewhere in the organization or with other employers through its partnerships with different veterans coalitions.

Booz Allen Hamilton welcomes military spouse employees with personal emails from other military spouses at the vice president level and has developed a specialized handbook for their use. And Starbucks is a member of the Defense Department’s Military Spouse Employment Partnership program, composed of more than 360 employers vetted and recognized by the Defense Department as portable career options.

Certainly, companies looking for skilled, hard-working and motivated employees would benefit from giving more thought and effort to hiring veterans and military spouses. “Every branch of service espouses specific core values like loyalty, dedication, respect and integrity,” says Eggert. “Military personnel live by these values, forging people with remarkable character, self-reliance, tenacity to get the job done and leadership.”

Service Brewing’s Kevin Ryan is certainly happy he’s hired so many vets. “One of the first things the military teaches you is to take orders—you’re given a task and you do it,” says Ryan. “Working in a brewery is a physically demanding job. You’re pulling and pushing and shoveling all day long, and then putting on your best face to pour a draft for a customer. I’ve never heard a single complaint.”

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

How Carbon Capture Tech Is Easing Industry’s Green Transition

By Russ Banham

Forbes

Scientists on the Intergovernmental Panel on Climate Change (IPCC), the body commissioned by the United Nations to advise global leaders on the economic and humanitarian impacts of climate change, reviewed more than 6,000 scientific studies before reaching a devastating conclusion: Greenhouse gas emissions (above all carbon dioxide, or CO2) must be cut by 45 percent by 2030, and by 100 percent by 2050, to hold the increase in global temperatures to 2.7 degrees Fahrenheit above pre-industrial levels.

With nearly 200 countries on board, the Paris Agreement offers hope for reducing the production of greenhouse gases in the future. But more needs to be done today to avoid the dire fate the IPCC projects. As we work toward a decarbonized energy future, the widespread use of carbon capture technology will likely be an indispensable strategy in our toolkit.

Seize, store and sell

The potential of this technology to reduce greenhouse gas emissions is “considerable,” the IPCC states, estimating that carbon capture can trap 85 to 90 percent of the CO2 emissions produced from the use of fossil fuels in industrial processes and electricity generation, effectively preventing that CO2 from entering the atmosphere.

The IPCC is not alone in its support for increasing use of this promising technology. In a 2016 report titled “20 Years of Carbon Capture and Storage,” the International Energy Agency (IEA) suggests that wide-scale use of carbon capture would result in a 19 percent reduction in global CO2 emissions by 2050. However, this projection assumes the creation of approximately 3,400 carbon capture plants before that date. More action is needed, and quickly, to achieve these numbers; only 17 large-scale carbon capture plants exist in the world today, capturing roughly 40 million metric tons of carbon dioxide each year, a scant 0.1 percent of total global emissions.
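
That 0.1 percent figure survives a quick back-of-the-envelope check. With global CO2 emissions running on the order of 37 billion metric tons per year (an approximate, widely cited figure):

$$\frac{40\ \text{million tons captured}}{37{,}000\ \text{million tons emitted}} \approx 0.001 \approx 0.1\%$$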

In and out

The IEA recently noted that carbon emissions from advanced economies rose for the first time in five years in 2018. Given the economic challenge of substantially reducing fossil fuel use in electricity generation and industrial processes, carbon capture appears to be a practical, immediate solution.

Carbon capture and storage consists of three main parts: capturing, transporting and storing the CO2, the latter either underground in depleted oil and gas fields or in deep saline aquifer formations. The first leg of this journey entails separating CO2 from the other gases produced in industrial processes and electricity generation. The CO2 can then be transported to safe storage via pipeline, road tanker or ship.

The Petra Nova Project is the world’s largest carbon capture project at a coal power plant and is located just outside of Houston. The plant can capture about 1.4 million metric tons of CO2 each year.

In 2017, Mitsubishi Heavy Industries (MHI) Group, a carbon capture pioneer, along with its consortium partner, TIC, completed construction of Petra Nova’s post-combustion carbon capture and compression system, which captures more than 90 percent of the CO2 from a flue gas stream.

To provide a revenue stream for carbon capture, the CO2 is compressed, transported and pumped underground into an oil formation to increase overall oil production.

Just one of many

Carbon capture alone will not curtail the progressive warming of the planet, but it is a step that can be taken now — free from political wrangling. In recognition of the technology’s value, the U.S. Congress introduced bipartisan legislation in early 2018 to provide tax credits to companies that capture or reuse their CO2. President Trump signed the legislation into law.

Known as the 45Q tax credit, it provides carbon-producing companies a credit of up to $50 per ton over a 12-year period after the start of operation, depending on how the CO2 is used.
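
To get a sense of the credit’s scale, apply it, purely illustratively, to a plant capturing CO2 at Petra Nova’s reported rate of about 1.4 million metric tons per year. At the maximum $50-per-ton rate:

$$1.4\ \text{million tons/year} \times \$50/\text{ton} = \$70\ \text{million per year},$$

or up to roughly $840 million over the 12-year window. Actual amounts would be lower for CO2 put to productive use, such as enhanced oil recovery, which the statute credits at a reduced rate.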

Even Congress could appreciate that a financial incentive was needed for industry to rapidly scale up carbon capture technology. Now the onus is on companies and the government to continue to find ways to realize more carbon capture projects.

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

Visions of Commercial Transportation: The Future of Commercial Auto Insurance Is Here

By Russ Banham

Carrier Management

For several years now, the commercial auto segment has been among the property/casualty insurance industry’s worst performing lines. Wounded by a sharp uptick in claims frequency and severity, and helped little by inadequate premium increases, the segment’s average direct loss ratio and combined ratio hover around 66 and 110, respectively.
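
For readers outside the industry, those two figures are worth unpacking. In simplified form (setting aside refinements such as how loss adjustment expenses are allocated), the combined ratio is:

$$\text{combined ratio} = \frac{\text{incurred losses} + \text{underwriting expenses}}{\text{earned premium}} \times 100$$

A combined ratio near 110 means the line pays out roughly $1.10 in losses and expenses for every $1.00 of premium collected, an underwriting loss of about 10 cents on the dollar.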

The market’s distress is attributed to a variety of higher loss exposures, including increasingly distracted and fatigued truck drivers and the high cost of repairing vehicles embedded with expensive collision avoidance cameras, sensors and telematics devices. The good news is that these same technologies are expected to positively alter the status quo, positioning commercial auto for a market rebound within the next five to 10 years.

Transportation experts project that semi-autonomous and fully autonomous electric trucks using crash avoidance sensors, cameras, telematics and deep learning forms of artificial intelligence will guide the way toward unparalleled safety. While semi-autonomous trucks equipped with lane correction and automatic braking systems are already on the road, fully autonomous/driverless vehicles are being tested across the country and the world.

Over the next decade, these trucks will gradually make their way into the nation’s transportation network, initially for long-haul delivery of goods on highways. Tests currently are being conducted in so-called “platooning,” which involves a convoy of trucks lined up in a row on a dedicated highway lane set aside specifically for their use. Only the first truck in the queue is driven by a human being; the remainder use automated driver support systems to maintain a specific distance behind the leader, accelerating and braking as dictated.
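
In control terms, each following truck runs a distance-keeping loop. The toy Python sketch below illustrates the idea; the target gap and controller gain are assumed values, and real platooning controllers are vastly more sophisticated:

```python
TARGET_GAP_M = 20.0   # desired following distance, an assumed figure
KP = 0.5              # proportional gain on the gap error (assumed)

def follower_accel(gap_m: float, lead_speed: float, my_speed: float) -> float:
    """Commanded acceleration (m/s^2) for a truck holding station in a platoon."""
    gap_error = gap_m - TARGET_GAP_M        # positive means too far back
    speed_error = lead_speed - my_speed     # positive means leader pulling away
    return KP * gap_error + speed_error

# One simulated step: follower trails by 25 m at 24 m/s; leader runs 25 m/s.
print(follower_accel(25.0, 25.0, 24.0))     # positive: accelerate to close gap
```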

Since highway accidents result in the most severe commercial auto insurance claims (due to a greater risk of multiple fatalities), a future in which autonomous trucks travel along a dedicated lane bodes well for commercial fleet operators, their insurers and the driving public at large. In 2017, truck-related fatalities reached their highest level in 29 years, up 9 percent to 4,761 deaths. (Source: “Trucking Fatalities Reach Highest Level in 29 Years,” Trucks.com, Oct. 4, 2018, reporting on the U.S. Department of Transportation’s National Highway Traffic Safety Administration analysis, “2017 Fatal Motor Vehicle Crashes: Overview,” page 3)

Accident frequency rates are a different matter. In our Amazon-fueled era of ecommerce, more trucks and delivery vans are congesting shorter-distance routes, particularly in the so-called “last mile” of transport.

“Accidents are increasing because of crowded roadways and streets, driver inexperience, and the rush to deliver goods in a certain time frame,” said Steve Viscelli, senior fellow at the University of Pennsylvania’s Kleinman Center for Energy Policy. “What used to be an experienced UPS driver delivering a package here and there is now 20 or 30 part-time people in vans delivering dozens of small packages in very short time frames.” He’s referring to Amazon Flex and Uber Freight Plus—applications that link drivers with shippers to deliver smaller packaged goods.

These drivers are subject to both traditional collisions and non-accident-related injuries. “There’s been an increase in basic ‘slips and falls’ from drivers bringing bulky packages into a residence,” Viscelli said. “Drivers also double park and fail to use proper techniques for removing heavy packages, hurting their backs. Of course, all that congestion and the need to make so many deliveries in a certain time frame also results in a greater number of collisions.”

Despite higher claims activity, technology is seen as an eventual enabler for safer practices, reducing both accident frequency and severity. Telematics drawing data from onboard cameras and other sensors will alert fleet safety managers in real time to road hazards, weather conditions, vehicle malfunctions, traffic congestion and optimal alternate routes.

As insurers and commercial fleets—large semi tractor-trailers, intermediate-sized trucks, and smaller pickups and vans—begin to share their respective data, expectations are for truly significant decreases in accident-related claims. “In five to 10 years, we will see a period of fast-maturing safety,” said Randy Mancini, vice president of commercial lines at regional insurer Penn National. “Beyond that we will see a transportation system that looks little like the one we’ve long known.”

Handing Off the Baton

This future is already beginning to materialize, evident in subtle changes in the intermodal delivery of raw materials and packaged goods. “Historically, the system has relied on rail, semi tractor-trailers, and smaller trucks and vans,” said Viscelli, author of the book, “The Big Rig: Trucking and the Decline of the American Dream.” “In the expanding ‘I want it now’ ecommerce environment of the future, rail will take on a larger role transporting goods from shipping ports to large warehouse-like depots alongside major highways.”

Fully autonomous trucks will then pick up these goods for transport over long distances in a dedicated highway lane to another depot, Viscelli added. “From there, a semi-autonomous, intermediate-sized truck will transport the stuff on shorter distances to an Amazon or other fulfillment center,” he said. “Smaller trucks and vans driven by for-hire drivers will then handle the last mile of delivery to homes and businesses, as they do now.”

This scenario, assuming it comes to fruition, will result in fewer accidents on highways. “If lawmakers and regulators set aside a designated lane exclusively for self-driving trucks, the chance of a passenger car cutting off a truck is vastly reduced—to the benefit of commercial fleets, drivers and insurers,” said Frank Palmer, senior expert in McKinsey & Company’s insurance practice.

Frank Netcoh, head of middle-market auto for The Hartford, shares this perspective. “If you have a fully autonomous truck on a big stretch of highway designated specifically for these vehicles, the platooning concept would reduce the possibility of making a mistake,” Netcoh said. “It’s hard to get into a collision when no ‘driver’ is screwing up.”

What about the safety of shorter-duration transport vehicles driven predominantly by people?

The experts are optimistic that safety systems warning drivers about impending hazards and automatically correcting the vehicle will reduce the frequency of rear-end collisions, unsafe lane departures and rollovers (caused by the unequal distribution of weight inside a truck).

“These crash avoidance technologies are already in use today, but the real ‘game changer’ in my mind is telematics—the movement of information from onboard cameras and sensors to fleet safety managers,” said Brian Fielkow, CEO of JETCO Delivery, a logistics business that specializes in regional trucking, heavy haul and national freight.

Depending on the telematics device, a two-way camera films both driver behaviors and the road ahead. If a truck is in an accident, the data can indicate what the driver was doing just prior to the collision—such as reading a text, eating or drinking, or blankly looking ahead with shoulders slumping and eyelids shuttering from fatigue (the camera captures a driver’s body position, in addition to his or her face). This information can then be used for evidence-based training purposes. “It’s like watching game films—a learning tool,” said Fielkow. “Even the best driver isn’t a perfect driver.”

Front-looking cameras, on the other hand, can discern the different factors leading up to an accident or a near-miss. “A trigger event like a hard brake will inform the telematics to send video depicting the road scene a few seconds prior,” Fielkow explained.
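
That trigger-plus-lookback behavior is typically built on a rolling buffer. The sketch below illustrates the pattern; the frame rate, lookback window and braking threshold are assumed values, not any vendor’s firmware:

```python
from collections import deque

FPS = 10                 # assumed camera frame rate
LOOKBACK_SECONDS = 5     # seconds of video to keep before any trigger
HARD_BRAKE_G = -0.5      # assumed longitudinal deceleration threshold

frames = deque(maxlen=FPS * LOOKBACK_SECONDS)   # rolling ring buffer

def on_sample(frame, accel_g):
    """Buffer every frame; on a hard brake, flush the preceding seconds."""
    frames.append(frame)
    if accel_g <= HARD_BRAKE_G:
        # Trigger event: return the video leading up to the hard brake
        # for upload to the fleet safety manager.
        return list(frames)
    return None

# Simulate 60 uneventful frames, then a hard brake.
for i in range(60):
    on_sample(f"frame_{i}", accel_g=0.0)
clip = on_sample("frame_60", accel_g=-0.7)
print(len(clip))   # 50 frames: the 5 seconds up to and including the event
```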

In accidents in which a truck rear-ends a vehicle, the video can be the difference between a very expensive claim and no claim at all. “Traditional investigations of an accident almost always attribute the cause of the collision to the vehicle in the rear,” said Fielkow. “However, the video may indicate that the driver of the rear-ended vehicle was swerving in and out of traffic and actually cut off the truck, altering liability.”

“Regrettably, people who get into an accident with a truck see that big name sprawled across the side and tend to blame the company,” said Jennifer Haroon, Nauto chief operating officer and formerly head of business operations at Waymo (Google’s self-driving car project). “Our video evidence exonerates a lot of drivers.”

Gizmos and Gadgets

Telematics devices are becoming increasingly sophisticated and less expensive, encouraging wider use by fleet operators. The visual sensors, for instance, are smaller, even as the semiconductors within them grow more powerful. Sandeep Pandya, president of Netradyne, another maker of computer vision cameras, attributes the enhancements to ongoing technological advancements in smartphones.

“We’re in an age of powerful computing capabilities and ubiquitous connectivity via 4G broadband cellular technology,” he said. “Using deep learning, algorithms can automatically draw insights from visual images, accelerometers, gyroscopes and other sensors…Mass quantities of other data like the weather and road conditions also can be analyzed to guide safer driver behaviors.”

That’s uplifting for both truck fleets and their drivers, an employment category where demand far outstrips supply. The American Trucking Associations estimates that the transportation industry needs to hire nearly 900,000 additional drivers to meet rising demand for service. Few young people want to drive a truck for a living. (The average age of truck drivers is 55, according to some sources; the ATA puts the average age of just over-the-road, for-hire drivers a little lower, at 49.)

“Professional truck drivers drive 50 hours a week and 150,000 miles per year on average; they take great pride in what they do,” said Pandya. “Almost the entirety of their work is uneventful, but it’s the 2 percent in which they confront hazardous conditions that defines them. Following an accident, drivers feel vilified, making it hard for the industry to retain them. Telematics is a way to determine the best of the best for retention purposes.”

Netradyne’s cameras, he explained, will ferret out positive driving behaviors as well as negative ones. “You’re now able to know which drivers are the most safety conscious,” said Pandya. “This gives a truck company the opportunity to address its driver shortage problems; instead of spending $10,000 to recruit a new driver, it can put it toward a bonus rewarding its best drivers.”

Filming a truck driver’s entire time at the wheel creates privacy issues, hence both Netradyne and Nauto have designed their telematics devices to produce short video streams of the triggering safety event, such as images of what the driver was doing immediately preceding the accident. This information is automatically provided to fleet safety managers to reduce driver distractedness and isolate factors causing drivers to become fatigued.

Interestingly, the introduction of electric trucks like Tesla’s first semi tractor-trailer, Tesla Semi, is expected to reduce driver weariness. “Electric vehicles produce less noise and vibration and are easier to drive, helping drivers remain vigilant for longer periods of time,” explained Chris Nordh, senior director of advanced vehicle technologies at Ryder System Inc., a provider of rental trucks and fleet management systems. “Reports indicate that drivers are not as exhausted at the end of the day.”

Electric trucks also are able to withstand more of the damage caused by a collision, said Nordh’s colleague Amy Wagner, Ryder vice president of global product insurance and risk management. “Crash data indicates that electric vehicles are the safest vehicles on the road in the event of an accident,” said Wagner. “Electric vehicles are built on what is called a ‘skateboard platform,’ where the batteries are built right into the frame. This creates a vehicle with a low center of gravity that is also a massive piece of metal. Not surprisingly, we’re at the forefront of bringing more commercial electric vehicles into our fleet.”

Beginnings of a Long-Term Partnership

Insurers are bullish on these varied technological and intermodal transport developments. A case in point is Penn National, which provides financial grants to truck insureds interested in implementing telematics and other safety equipment. “We’ll split the cost of the technology 50-50 up to $2,500, in return for feedback on what the data indicates about claims reductions,” said Mancini. “We can then learn from this data in terms of our underwriting and pricing.”

Other insurers perceive similar value in having policyholders share their experiential data. “It gives us the opportunity to provide more insightful consultation to fleet safety managers,” said Tony Fenton, vice president of Underwriting and Product for Commercial Auto, Casualty and New Product Development at Nationwide. “It puts our loss control/risk management professionals in more of an ‘I know’ environment.”

Netcoh from The Hartford agrees, noting that the insurer’s focus in 2019 is to more closely engage with its truck insureds to better understand their use of telematics. “We’re looking to dovetail their work with the research conducted by our safety risk engineers to improve loss costs,” he explained. “Telematics has yet to evolve into a really close data-sharing relationship.”

That’s true. However, Fenton predicts that within the next five to 10 years, such sharing of data will be business as usual. “I can see a day when insureds provide us information on how many drivers are on the road, who these drivers are and how they are driving, and we provide to them information on the location of the vehicle in relation to the accident history of a particular stretch of highway based on weather conditions, traffic congestion and driver time at the wheel,” he said. “As more data becomes available and is shared, insurers can rate a client based on real-time exposures, as opposed to traditional rating programs driven by the type or class of vehicle.”

Wagner from Ryder concurs, pointing to the opportunity for insurers, truck companies and diverse other parties like repair shops to engage in an ecosystem-like software platform. “When you start bringing all these diverse sources of data together, you can create driver scorecards as a means of positive reinforcement to compel better driving behaviors,” she said. “Those scorecards can then make their way into lower insurance rates.”

Need for New Types of Insurance?

As commercial fleets incorporate more autonomous safety features into their trucks, traditional commercial auto policies may need to accommodate new causes of liability. For instance, a collision may not be the result of driver error but in fact be caused by a sensor failure, software glitch or a cyber attack.

Insurer Chubb is presently mulling this new risk landscape. “We have to be prepared to address these new liabilities,” explained Dave Brown, executive vice president and transportation leader in Chubb’s Major Accounts Division.

There are two schools of thought on how insurance policies may need to change. “One is whether or not the current commercial auto liability policy is still the appropriate mechanism, assuming some adjustments,” Brown explained. “The other surrounds the use of product liability policies as an alternative or conjunct to commercial auto.”

While Brown is “on the fence,” he said, regarding which concept is best, he believes the longstanding commercial auto policy has a lot of positives to commend it. “For one thing, it allows for claims to be efficiently administered to rapidly take care of injured victims and make them whole financially, whereas product liability claims are highly litigious—consuming time and costs,” he explained. “If a loss is attributed to a sensor or software defect, my thinking is that the commercial auto policy should be the mechanism for the adjudication and payment of the claim, with this insurer subrogating the product manufacturer’s insurer for the loss.”

Down the line, the experts believe that commercial fleets will inevitably become safer. “The next wave is for all these remarkable cameras and sensors to send their data over the Internet to be analyzed with other types of data like historical loss patterns, real-time weather conditions, traffic congestion and even the Department of Transportation’s data on roadside inspections,” said Fielkow. “Really, the sky’s the limit here. Algorithms can then dig through these massive data volumes to unearth specific areas for continuous safety improvements.”

As this happens, Fielkow can see a day when commercial auto insurers and truck carriers work closely together to help each other become smarter about risk management and loss prevention, he said. That’s good news for both parties and for all of us who regularly venture onto a highway.

Russ Banham is a Pulitzer-nominated business journalist and author of 29 books.

Quake Queller: A New Technology Called the Seismic Muffler Brings New Hope for Reducing Earthquake Damage

By Russ Banham

Leader’s Edge

On Jan. 17, 1994, a 6.7-magnitude earthquake struck the San Fernando Valley region of Los Angeles, killing 72 people, injuring more than 10,000, and causing an estimated $40 billion in widespread property damage. Thousands of homes, buildings and cars were destroyed in what remains one of the costliest catastrophes in U.S. history.

The Northridge Earthquake, named for its apparent epicenter (later determined to be the nearby community of Reseda), stunned seismologists with its ferocity. Catastrophe prediction models had pegged a 6.7-magnitude earthquake as a one-in-500-year event. Now, just 24 years after Northridge, scientists at the U.S. Geological Survey predict a 99.7% chance of another 6.7-magnitude temblor in Los Angeles within the next 30 years.

Much worse is the possibility of a mammoth earthquake striking coastal residents of the Pacific Northwest along the 620-mile Cascadia Subduction Zone, where the Juan de Fuca ocean plate dips under the North American continental plate. The fault zone encompasses the cities of Seattle and Portland, which confront an 8% to 20% chance of experiencing a magnitude-8.0 or higher quake in the next 50 years.

Such doom-and-gloom projections are daunting for anyone living along the western coastline of the United States. The risk is also of great economic consequence to the global insurance and reinsurance industries, which absorb the financial brunt of earthquakes along with local, state and federal taxpayers. The Northridge Earthquake alone caused insured losses estimated at $25.6 billion in 2017 dollars, more than the industry had collected in earthquake premiums over the prior 30 years. According to the Federal Emergency Management Agency, earthquake damage losses add up to $4.4 billion annually nationwide. Across the planet, earthquake losses in 2016 alone surpassed $53 billion.

Limiting the Impact

On average, roughly 500,000 detectable earthquakes occur each year, of which 100,000 can be felt and 100 cause significant property damage. For millennia, people living in earthquake-prone regions have tried to limit the impact of temblors on buildings. Pliny the Elder’s history of ancient Greece includes a reference to the use of sheepskin between the ground and the foundation of a temple to permit the structure to slip and slide with less damage during a temblor. This ancient prevention technique is actually a primitive version of base isolation, a current protection technology in which spring-like flexible pads are inserted between a building and its foundation in the ground to absorb devastating ground motions.

Now another earthquake protection technology has been developed to do something similar, albeit in a way that stretches the bounds of credulity. OK, it blows the mind.

Developed by scientists at the Massachusetts Institute of Technology’s Lincoln Laboratory, it’s called a seismic muffler, at least for the time being. The concept calls for drilling a V-shaped array of boreholes hundreds of feet deep that slope away from the protected asset, such as a building, an airport runway or a power plant. The array of boreholes one to three feet in diameter is similar in shape and dimension to a set of trench walls.

Cased in steel or a comparable composite material to maintain the structural integrity of the underlying soil and rock, the boreholes divert hazardous surface waves generated by an earthquake away from the protected asset. The bottom aperture of the borehole array allows only higher-frequency, lower-energy seismic waves traveling from the depths of the Earth to enter and propagate. By the time this wave energy reaches the ground surface, it dissipates in much the same way the sounds emanating from a car’s combustion engine are softened by an acoustic muffler.

Tabletop exercises by MIT’s Lincoln Lab, using 3-D supercomputing calculations, indicate the V-shaped array of mufflers can decrease the ground-shaking effects of a 7.0-magnitude earthquake to those of a 5.5-magnitude earthquake or lower. That’s a vast improvement, given the logarithmic Richter magnitude scale: a magnitude-7.0 quake produces 10 times the ground-motion amplitude of a magnitude-6.0 quake and 100 times that of a magnitude-5.0 quake.
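
In equation form: each whole-number step on the scale corresponds to a tenfold increase in measured ground-motion amplitude, so the simulated reduction from magnitude 7.0 to 5.5 works out to

$$\frac{A_{7.0}}{A_{5.5}} = 10^{7.0 - 5.5} = 10^{1.5} \approx 32,$$

roughly a 32-fold cut in shaking amplitude. The drop in radiated energy, which scales as $10^{1.5M}$, is larger still: about $10^{1.5 \times 1.5} \approx 178$ times.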

Will a seismic muffler work in practice as it does in the lab? The answer appears to be yes. A patent has been issued for the technology, and a 20-page scientific research paper on the muffler was peer reviewed and published in the November 2018 Bulletin of the Seismological Society of America. “We have already received licensing interest in the technology,” says Robert Haupt, the Lincoln Lab staff scientist leading the development of the new earthquake protection system.

Too Good to Be True?

“Wow” factor aside, the technology has yet to be tested in the field. However, the boreholes in their V-shaped array should perform as intended, diverting surface energy away from the protected asset. Questions certainly remain, including the effect of the reflected waves on neighboring structures, the impact of drilling thousands of boreholes, and the overall cost compared to existing technologies such as base isolation, which is limited to new construction. The effectiveness of boreholes may be greater than that of base isolation, since the total surface wave energy is diverted. But more analysis, including cost analysis, would be needed to determine the relative effectiveness of each approach for a particular property.

Putting aside these questions for the moment, we sent a description of the new technology to several structural engineers, a leading catastrophe risk modeler, two state insurance regulators, two large reinsurers, and the California Earthquake Authority. Collectively, their interest was piqued, but they were only guardedly optimistic. As Dave Jones, then California insurance commissioner (his term ended in 2018), puts it, “The question comes down to what is realistic and affordable. While this appears promising, will it prove to be practical and affordable?”

“I read the piece you sent with an open mind, and it seems perfectly plausible to me,” says Keith Porter, a research professor in the Department of Civil, Environmental and Architectural Engineering at the University of Colorado. “That’s not saying I would recommend its use tomorrow, because it is not quite there yet, much less available. But I get the concept. The fact that it is peer reviewed in such a reputable journal gives further credence to its scientific validity and usefulness.” Porter holds a PhD in structural engineering from Stanford University.

Certainly, there are obstacles in the way of deploying the technology, including the need to obtain site access permits to bore thousands of holes. But Haupt believes the benefits of the solution outweigh its impediments.

“As long as you’re able to drill the boreholes at a distance of 300 meters or so from the asset, you can protect all kinds of structures—from a nuclear power plant to an entire neighborhood of residential homes,” he says. “If a community like Beverly Hills wanted to invest in putting this in to protect their homes on an aggregate basis, the destructive ground motion from an earthquake would be significantly reduced.”

This possibility caught the attention of Janiele Maffei, chief mitigation officer at the California Earthquake Authority, a privately funded, publicly managed organization that sells California earthquake insurance policies through participating insurance companies. Maffei, a registered structural engineer, is responsible for directing the authority’s statewide residential earthquake retrofit program.

“That’s a very interesting possibility, since we’ve been looking solely at mitigation on a building-by-building basis,” Maffei says. “The possibility of protecting more than one structure at a time is an exciting thing, particularly in California, which bears two thirds of the nation’s earthquake risks.”

Maffei cautions that her opinion is tempered by financial reality—that is, the cost of the new technology. Other experts agree. “This all sounds very interesting and promising, but we need to consider the real-world implications of the technology, chiefly its scalability and cost-effectiveness,” says Erdem Karaca, Swiss Re’s head of catastrophe perils in the Americas. (Not incidentally, Karaca holds a PhD in civil engineering from MIT.) “We’re talking thousands of boreholes drilled hundreds of meters deep to protect a hospital or a power plant. Is this safe? And how much will it cost?”

Haupt says the number of boreholes and their dimensions depend on the application. “Say you wanted to protect a kilometer-long airport runway,” he says. “The depth of the boreholes would be approximately 50 meters, the diameter about one foot, and the number of boreholes around 5,000 on each side of the runway.”

He further estimates it would require about 5,000 boreholes to protect a hospital, about 10,000 for a nuclear power plant, 40,000 for a 10-kilometer-long oil and gas pipeline, and 50,000 to 200,000 for a residential community (depending, of course, on its size). That sure sounds like a lot of boreholes, but Haupt counters that drilling with modern technology is “relatively straightforward.”

But what about the cost of all that drilling, compared to the expense of base isolation? According to various estimates, it costs $2,000 to $3,000 to drill a one-foot-diameter well 300 feet into the ground. And that’s just one borehole. Nevertheless, Haupt maintains the aggregate cost of seismic mufflers is much less than comparable base isolation expenditures.

“To build a tall skyscraper today using base isolation costs tens of millions of dollars per building,” he explains. “Based on general calculations from our extensive 3-D supercomputer computations, we estimate we could protect many more buildings at the same cost. So, yes, it would be cost effective.”
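Those figures allow a rough sanity check. The tally below is illustrative only, multiplying the per-borehole drilling estimate by the borehole counts quoted above; it ignores engineering, permitting, and maintenance costs:

```python
# Illustrative cost arithmetic only, using the figures quoted above:
# roughly $2,000-$3,000 per borehole, and Haupt's per-application counts.
low, high = 2_000, 3_000          # quoted drilling cost per borehole (USD)

sites = {
    "hospital": 5_000,
    "nuclear power plant": 10_000,
    "10-km pipeline": 40_000,
}

for site, holes in sites.items():
    print(f"{site}: ${holes * low / 1e6:.0f}M - ${holes * high / 1e6:.0f}M")
# hospital: $10M - $15M -- versus the "tens of millions of dollars"
# per building that Haupt cites for base isolation.
```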

A more rigorous cost-benefit analysis will be available following the lab’s field-testing, Haupt says, noting that the lab is looking to drum up a combination of government and private-sector funding to produce a more comprehensive systems analysis. If the test findings are consistent with previous experiments and current cost estimates, former California insurance commissioner Jones says, the technology “could add a new level of protection for Californians. Anything we can do to reduce the potential for loss of life or property from damaging earthquakes we would support.”

Questions on Efficacy

Not all experts are optimistic about the efficacy of the technology. Robert Muir-Wood, the chief research officer at catastrophe modeling firm RMS, who holds a PhD in Earth sciences from Cambridge University, is dubious on several fronts. “This is interesting, ingenious and certainly a novel idea, but my gut reaction is that it’s science fiction,” Muir-Wood says. “Earthquakes are rich in many frequencies of vibration, and this procedure by design cannot anticipate what these frequencies will be prior to an event. It may muffle some frequencies but not all of them. Consequently, I don’t think it will work.”

Apprised of the criticism, Haupt responds that Muir-Wood is correct about earthquake vibration frequencies, which he likens to the frequencies in a broadband spectrum.

“He’s right that an earthquake does not produce a single tone of vibration but many different tones at once,” Haupt says. “But he may be unaware that our system is a broadband defense. By having multiple boreholes surrounding the protected asset, the surface waves approaching the borehole system are reflected or diverted, and any energy coming from below the earth into the aperture is dissipated regardless of frequency. So there is no frequency dependence.”

Karaca, from Swiss Re, raises another concern: whether the V-shaped array of seismic mufflers might divert the energy of an earthquake toward neighboring structures. “Since the technology is designed to deflect surface waves, which are the most damaging aspect of an earthquake, that energy has to go somewhere,” he says. “My question is: where?”

“That’s an excellent query,” Haupt says. “He’s right that the energy will be diverted. However, the array pattern is designed to promote seismic wave self-interference, diverting the destructive effects of the waves. In other words, we’ve designed it in such a way that neighboring structures would not experience anything greater than the ground shaking already produced by the earthquake.”

Now What?

All in all, the unique earthquake protection technology appears to present a viable alternative to base isolation, although Haupt prefers to call it a “supplement” to current mitigations. The next step, he says, “is to go outside.”

The lab is undoubtedly eager to undertake real-Earth scenario testing, which Haupt believes will confirm the findings of the detailed 3-D supercomputer models demonstrating the technology’s effectiveness. Physical tests to date have involved drilling boreholes into thick blocks of plastic topped with scaled-down structures. To approximate different earthquake magnitudes, the blocks were shaken and the effects measured by an accelerometer. “We’re confident that the supercomputer modeling is accurate, but to really prove this works, we need to scale up the testing and experiment with actual boreholes drilled in the earth at different depths, densities, and so on,” Haupt says.

As at all academic laboratories, budgets are tight when it comes to large-scale tests. Is this something the insurance and reinsurance industries might be interested in funding, given the potential for a long-term return in decreased damage losses?

Maffei, from the California Earthquake Authority, is sanguine about the possibility. “Any technology that would mitigate the impact of an earthquake deserves monetary means for further testing,” she says. “When base isolation was explored in the aftermath of the Northridge Earthquake, it was tested first on a single structure. The results were encouraging, guiding its use in additional buildings. Little by little, it has proven itself.”

Richard Quill, expert risk research analyst at Allianz, shares this perspective. “We insure some of the earthquake risks affecting nuclear power plants and large oil refineries,” says Quill, who analyzes and coordinates the large reinsurer’s response to natural catastrophes. “If this technology is proven to work, it would obviously reduce earthquake damage risks to these facilities. From an insurance perspective, it is all very interesting. We’ve invested in the past in new risk-mitigation technologies, including how to make automobiles safer. But we would need more evidence.”

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

How We Can Make Our Power Grids More Stable And Resilient

By Russ Banham

Forbes

It’s a classic supply-demand imbalance. Global energy demand is forecast to rise over the next 20 years, but fossil fuels — currently the principal source of power generation — cannot meet growing needs.

That’s the upshot of the latest global energy report from the International Energy Agency (IEA). While fossil fuels will remain the primary source of electrical power for the foreseeable future, renewable energy sources like wind and solar must increasingly complement them.

That’s good news for the planet. These varied wellsprings of electricity, however, put added pressure on the country’s aging electrical grid — a challenge that demands its own set of solutions.

A Massive And Ungainly System

Today’s grid is a massive feat of human engineering. Composed of more than 3,200 electric distribution utilities, nearly 10,000 generating units and roughly 300,000 miles of high-voltage transmission lines, it takes the shape of several large, interconnected systems spanning the 48 contiguous states.

The grid, originally built in the 1890s, has long been a work in progress. Initially, it consisted of more than 4,000 individual electric utilities operating in isolation. After World War II, these utilities began to connect their respective transmission systems. As new technological advancements emerged in each subsequent decade, they were effectively bolted onto this creaking foundation.

The expansion held, but unfortunately, the grid’s patchwork quality makes it more vulnerable to power outages. When one part of the connected grid needs repair, everyone is at risk of losing electricity. Outdated engineering, antiquated transmission lines and alarming increases in catastrophic weather events and cyberattacks also contribute to an increasing number of outages. When the grid buckles, vulnerabilities emerge — for businesses, for consumers and for the security and economic stability of the country itself.

Several innovative efforts are underway to modernize, strengthen and stabilize the grid. A case in point is the ongoing development of smart grids, using digital communications technologies, such as smart meters and smart appliances, along with other energy-efficient resources. Other promising advances include the deployment of microgrids and autonomous power plants.

Such ingenious approaches to traditional energy generation are sorely needed, given the complexity and instability of the current grid, the emergence of increasingly diverse sources of electricity and the rising demand for power worldwide.

Hungry For Power

The IEA estimates that electricity demand worldwide will increase 60 percent by 2040, with developing countries accounting for much of the demand. By 2040, the IEA report states, the Asia-Pacific region alone will consume 46 percent of the world’s energy. In more developed countries, the IEA expects demand to increase about 30 percent through 2040, due in part to the rising use of digital and data technologies by businesses. Electricity markets are particularly vulnerable to the demand brought on by technological change, such as the increasing digitization of the economy and the rise of electric vehicles (EVs).

Despite growing global concern over the climate-changing effects of greenhouse gas emissions, coal is expected to remain the world’s largest source of electricity. However, projections indicate that coal will generate only 25 percent of the world’s total power supply — a 38 percent decrease from its current share. Renewable sources of energy that aren’t depleted by use, such as water, wind and solar power, will make up the difference.

Integrating renewables with traditional fossil fuel energy sources is crucial to optimizing power plant processes. But the inherent variability of wind and solar energy production is a challenge: winter typically brings strong winds, while sunlight is at its height in the summer. For now, not all excess capacity can be put to use, and grid operators continue to seek new concepts and approaches to keep a variable supply in consistent, steady balance with fluctuating demand.

Stabilizing The Grid

Microgrids address current grid constraints by circumventing them altogether. When outages require repairs to the larger central system, these secondary grids can operate on their own using local energy generation. This method not only provides a backup source of power when the main supply falters, but also empowers small communities to boost their energy independence and avoid relying on a system of unmanageable scale.

Smart grids, such as the one MHI built and successfully tested in Japan in 2011, represent another encouraging development. In 2017, MHPS launched MHPS-Tomoni, a digital solutions platform that discovers ways to optimize power plant operations.

MHPS-Tomoni focuses on providing complete energy solutions in partnership with customers (“tomoni” means “together with” in Japanese). The platform comprises diverse applications that harness and analyze big data to enhance plant performance, improve electricity availability, lower costs and benefit the environment.

Thousands of sensors capture massive volumes of power plant data every second, and MHPS-Tomoni leverages this data to give operators insights for maximizing overall plant performance.
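The article doesn’t describe MHPS-Tomoni’s internals, so the following is only a generic sketch of the kind of per-sensor drift check such analytics platforms run on streaming readings; every tag name, window size, and threshold here is a made-up assumption:

```python
# Hypothetical sketch only: a toy rolling-baseline drift check of the kind
# a plant-analytics platform might run on streaming sensor data.
# Tag names, window size, and threshold are invented for illustration.
from collections import deque
from statistics import mean, stdev

WINDOW = 60          # number of recent readings kept per sensor
THRESHOLD = 3.0      # alert when a reading is > 3 sigma from the baseline

history: dict[str, deque] = {}

def ingest(tag: str, value: float) -> bool:
    """Record one reading; return True if it looks anomalous."""
    readings = history.setdefault(tag, deque(maxlen=WINDOW))
    anomalous = False
    if len(readings) >= 10:                 # need a baseline first
        mu, sigma = mean(readings), stdev(readings)
        anomalous = sigma > 0 and abs(value - mu) > THRESHOLD * sigma
    readings.append(value)
    return anomalous

# e.g. ingest("turbine_bearing_temp_C", 74.2)  # hypothetical tag name
```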

“The initiative is an important step as we seek to build the cognitive power plant of the future,” said Kenji Ando, MHPS president and CEO.

As these examples illustrate, industry leaders are taking key steps to gird the grid for the anticipated rise in global power demand and the move toward a wider array of energy sources.

Such developments will come as welcome news to the close to two billion people worldwide who still lack reliable sources of electricity. For the rest of us, they’ll streamline access and output, doing so with less waste and a smaller impact on the planet.

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.

Doubling Down On ESG: Proxy Season With A Social Conscience

By Russ Banham

Corporate Board Member

As the 2019 proxy season commences, shareholder proposals continue to be ripped from the headlines. Board diversity, climate change, the opioid epidemic and gun violence will top the agenda in corporate boardrooms in 2019—with directors called upon to ensure that sophisticated policies are in place to address them.

Given the intense media attention surrounding all-male boards, mass shootings and severe hurricanes, floods and wildfires, such shareholder activism is not terribly surprising. What is unusual is the push by large investors to effect ESG-related changes by companies well before the annual meeting.

“The uptick in engagement is encouraging companies to take actions that previously were accomplished via the more antagonistic shareholder proposal process,” says Courteney Keatinge, director of ESG research at proxy advisory firm Glass Lewis & Co. “The increase in dialogue appears to be diminishing the number of shareholder proposals that go to vote each year.”

Investors have made their positions on ESG clear over the past two years: depending on the type of company, such issues are viewed as material to long-term financial performance.

“Investors are looking at ESG in a more nuanced manner,” Keatinge says. “ESG can have this ‘warm and fuzzy’ connotation—like planting trees and promoting nonviolence. But these are not ‘feel good’ issues; they’re legitimate operational issues.”

In other words, something that is good for society can also be good for the bottom line. “This is not the tail wagging the dog,” says Anne Sheehan, former director of corporate governance at CalSTRS (California State Teachers’ Retirement System) and a senior advisor at global advisory-focused investment bank PJT Partners. “Companies should have a firm grasp of what risks they need to assess and the investments they need to make in response to our changing climate or other environmental and sustainability risks… there is business value in these decisions as much as social value.”

Clout Matters

That sentiment is reflective of the position on ESG taken by large institutional investors like BlackRock, Vanguard and State Street Global Advisors (SSGA). Together, the three firms are the largest shareholder in 40 percent of all publicly listed firms in the U.S. and 88 percent of the S&P 500.

Over the past few years, the firms have been proactively posting their voting decisions online prior to annual meetings, giving individual investors the opportunity to see where they stand.

“Previously, if shareholders wanted to effect change, they submitted resolutions that went to vote,” says Sheila Hooda, CEO of Alpha Advisory Partners and a member of two boards (public company Virtus Investment Partners and Fortune 300 insurer Mutual of Omaha). “Now, with increased institutional engagement, companies are encouraged to take [early] actions that would have been handled through the shareholder proposal process in the past.”

BlackRock CEO Larry Fink made headlines when his annual letter to CEOs suggested ESG would factor into the $6.3 trillion investment firm’s decisions with statements like, “A company’s ability to manage environmental, social, and governance matters demonstrates the leadership and good governance that is so essential to sustainable growth, which is why we are increasingly integrating these issues into our investment process.”

This position has been proliferating across the investment universe, with Vanguard, State Street and other large investors helping to make ESG the “new normal” in shareholder demands. Among these is SSGA, the world’s third-largest asset manager, with nearly $2.8 trillion in assets under management. Matthew DiGuiseppe, vice president and head of Americas on SSGA’s Asset Stewardship Team, acknowledges that board discussions and evaluations have shifted toward corporate social responsibility: “thinking effectively about the impact of a company’s culture on long-term financial performance and strategy.”

Keatinge agrees. “Board members must undertake a materiality assessment of what I call the ‘non-financial metrics’ to identify issues of material importance to a company’s long-term financial performance,” she says. “In doing these assessments, they may find that climate change or gun violence or the opioid crisis—depending on their business profile—are, in fact, material. If this is the case, board members must take pains to do something about it.”

Bye-Bye, Old Boys Network

While gun violence and opioids affect the performance of only a few companies, the gender diversity of boards affects most companies. A June 2018 survey by consulting firm Aon indicates that 68 percent of 223 institutional investors have expressed concerns over board gender diversity in their proxy voting.

Large institutional investors have put a bull’s-eye on all-male boards. It’s been nearly two years since SSGA erected “Fearless Girl,” the bronze statue of a young girl squaring off against Wall Street’s symbolic giant bull, in New York’s Financial District in 2017. The firm’s message to the then-787 companies with all-male boards was crystal clear: Add women to your ranks now.

More than 300 companies have since heeded the demand. The remainder suffered the firm’s rebuke: SSGA voted against the chairs of the board nominating committees entrusted with selecting new members.

Unquestionably, board diversity is gaining momentum as a priority. “With California breaking ground as the first state to require publicly traded firms to place at least one woman on their board of directors by the end of 2019, we believe board diversity will be a prominent issue among longer-term and activist shareholders alike,” says Dana S. Grosser, spokeswoman for Vanguard, which tallies more than $5.1 trillion in investment assets.

“Boards without women will increasingly be perceived as outliers,” says Marc Goldstein, executive director and head of U.S. research at proxy advisory firm Institutional Shareholder Services (ISS). “Our surveys indicate that far fewer investors are willing to admit that a lack of board diversity is not a problem. Boards have to get it right.”

ISS’s 2018 survey reveals that more than 80 percent of investors consider all-male boards to be problematic, up from 69 percent the prior year. Small wonder many interviewees anticipate that broader representation of women in both leadership and governance ranks will guide much of the proxy discussions in 2019. At present, the U.S. ranks 14th among global markets in boardroom gender diversity, according to a 2018 ISS U.S. Board Study of boardroom diversity.

That may be changing, now that U.S. boards face growing pressure to voluntarily disclose more information about each director, such as “skill matrices” covering age, tenure, varied experiences, expertise and outside commitments. “It’s important for investors and shareholders to have more transparency about who sits on the board and whether that person’s diverse background, experiences and skill sets are aligned with the long-term strategic needs of the enterprise,” says Friso van der Oord, director of research at the National Association of Corporate Directors (NACD).

“Board members have to pick the best candidate—period,” says Stephen Kasnet, vice chair and lead director at both Granite Point Mortgage Trust and Two Harbors Investment Corporation. “If they’re required to pick one or more women, the challenge is to pick not the most capable women, but the most capable directors.”

Changing Climate

Also on the ESG agenda is climate change. EY’s survey of investors indicates that nearly eight in ten (79 percent) believe climate change is a significant risk, with 48 percent stating that enhanced reporting of these risks is a priority.

ISS has long supported shareholder proposals requesting that companies disclose information on their climate change risks. The subject reached a turning point in 2017, when shareholders at Exxon, Occidental and PPL Corp. passed landmark resolutions requiring disclosure of the risks that climate change posed to their businesses. Goldstein projects that similar resolutions will be in the offing this year, “as investors succeed in achieving heightened disclosures of climate change risks on a company-by-company basis,” he says.

ISS recently aligned its policy with recommendations by the Task Force on Climate-related Financial Disclosures. The Task Force, chaired by Michael Bloomberg, calls for greater transparency and board oversight of climate change exposures. The effort paid off. “During the 2018 proxy season, more than 65 resolutions regarding climate change were submitted by U.S. investors, of which 17 involved risk assessments based on the so-called two-degree scenario,” Goldstein notes.

Two-degree scenario proposals are an outgrowth of the Intergovernmental Panel on Climate Change’s determination that two degrees Celsius above pre-industrial levels is the upper limit of global warming the environment can tolerate.

The proposals generally call for greater disclosure of companies’ preparedness for climate change and efforts to account for climate-related risks and opportunities. Despite rising investor interest, proactive measures by companies led to the withdrawal of the majority of two-degree scenario resolutions in 2018, says Keatinge. “We had expected an onslaught of such resolutions, but they were withdrawn or negotiated away,” she says. “Ultimately, only five went to a vote, and of those five, only two received majority support.”

That’s because many companies had already committed to enhancing disclosure. “This is heartening, and the credit is due to the investors,” Keatinge adds.

Ron Schneider, director of corporate governance services at DFIN, agrees, noting that this represents a seismic shift in investor perspective. “When we saw shareholder proposals on climate change in the past, they didn’t get the support of the largest indexed investors,” Schneider says. “While these investors recognized that the climate was changing, they had difficulty connecting the impact to long-term financial performance and shareholder value. That has now changed.”

Sheehan from PJT Partners cites the value of water conservation to a soft drink manufacturer. “If a company is a water-intensive business, conserving water through processes that allow for the reuse of water, for instance, will have positive bottom line impact,” she explains. “That’s also good from an ESG standpoint. For a soft drink maker, these are not two separate silos.”

Social Sensibility

The opioid crisis, firearms and pay equity are also on investors’ ESG agenda. For example, following the February 2018 mass shooting in Parkland, Florida, multiple shareholder proposals asked that retail companies stop selling guns or part ways with the National Rifle Association. In response, Walmart and Dick’s Sporting Goods placed new restrictions on gun sales, while others like MetLife, Delta Air Lines and FedEx moved to cut ties with or discontinue preferential treatment of the NRA.

“For some companies, gun violence is an issue of material importance,” says Keatinge. “An example is Sturm Ruger, whose shareholders at the 2018 annual meeting approved a proposal for the company to detail its plans to track violence associated with its firearms, disclose its research on ‘smart’ gun technology, and assess the risks that gun violence poses to its reputation and long-term financial performance.” Sturm Ruger has until February to produce a report addressing the shareholders’ concerns.

In 2018, a number of first-time shareholder proposals involved the opioid crisis. Keatinge and others anticipate more of the same this go-round. “Shareholders at Depomed (a distributor of the opioid painkiller Nucynta, now known as Assertio Therapeutics) voted in the majority to approve policies enhancing the board’s governance of how continuing to distribute opioids affects the bottom line,” Keatinge says.

These varied actions demonstrate the potential for investors and shareholders to, at a minimum, raise the level of dialogue around ESG issues—and possibly effect change.

It’s The Job Title Defining 21st Century Companies — But What Does CDO Mean?

By Russ Banham

Forbes

Today, data production is estimated at 2.5 quintillion bytes per day. To get a sense of this expanse, 2.5 quintillion pennies, laid out flat, would cover the Earth’s surface — five times. This figure is only set to surge with developments in IoT: a 2017 IDC report predicted a ten-fold increase by 2025, with 30 percent of that data requiring real-time processing.
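That analogy can be sanity-checked with back-of-envelope arithmetic. In the sketch below, the penny dimensions and Earth’s land area are our own approximations, not figures from the article, and the claim only works out if “surface” is read as land area:

```python
# Back-of-envelope check of the penny analogy above.
# Penny diameter and Earth's land area are our own approximations,
# not figures from the article.
import math

PENNY_DIAMETER_M = 0.01905                            # 19.05 mm US cent
penny_area = math.pi * (PENNY_DIAMETER_M / 2) ** 2    # ~2.85e-4 m^2

PENNIES = 2.5e18                            # 2.5 quintillion
LAND_AREA_M2 = 1.49e14                      # Earth's land surface, approx.

coverage = PENNIES * penny_area / LAND_AREA_M2
print(f"{coverage:.1f}x")                   # ~4.8x -- roughly "five times"
```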

All of this data represents untapped business value, enabling innovations that make company operations more efficient, online interactions more seamless, and recruitment and hiring more aligned with business strategy. Given this exponential increase in business data, experts say it is wise for companies to hire for specific roles to manage its development and application.

“If you look at the meteoric rise in data from a business strategy perspective, you need to develop ways to collect data, cleanse and archive it, and apply analytics to it to extract insightful information,” said Jim Johnson, senior vice president of technology staffing services at Robert Half, a global job search and staffing firm. “This isn’t just ‘bits and bytes.’ You need people with the right skill sets who understand the data and digital transformation strategy to bring it to fruition.”

‘Jobs of Great Importance’

Entrusted to garner these benefits are CDOs — yet here is where things get complicated. The acronym encompasses two types of executive leadership roles — chief data officers and chief digital officers — each unique, yet both focused on data oversight, and both in high demand as companies transform their operations to understand, analyze, and monetize data for business purposes.

“They are both focused on data, realizing its core value as a business asset,” Johnson said. “But fundamentally they have different corporate missions.”

This explains the recent hiring spree for both types of CDOs. According to a 2018 Forrester survey, more than half of companies have appointed a chief data officer, and 18 percent are preparing to do so in the future. (Another survey by NewVantage Partners pegs the hiring percentage higher, with nearly two-thirds of companies hiring for the role — a considerable leap from the 12 percent that employed such executives in the firm’s 2012 survey.) Similarly, a PwC study indicated that 60 percent of chief digital officers were hired between 2015 and mid-2017.

“Every company is going through a digital and data transformation or darn well should be,” said Johnson, who helps companies hire for both types of CDOs. “This compels them to find people to lead these initiatives.”

The question is, what do these roles look like? (And does your organization need one?)

Chief Digital Officers: A Business-Savvy Bunch

In essence, chief digital officers own customer-related analytics and metrics. What’s more, they are responsible for harnessing these insights across the organization, developing digital approaches to solve business problems. An example in a manufacturing context is analyzing supplier data to determine ways to enhance supply chain efficiencies.

Chief digital officers typically have backgrounds in sales, marketing, or customer service, as well as knowledge of how to use innovative technologies to enhance business functions. These leaders generally partner with sales and marketing executives to streamline and improve the customer experience across mobile and digital channels, and at times will guide the development of marketing and advertising strategies.

These responsibilities, however, are evolving. “Chief digital officers are now being asked to take on the additional role of business information analyst,” Johnson said. “Given these newer job tasks, the best chief digital officers need to have significant strategic, financial, operational, and interpersonal communications skill sets, in addition to technological chops.”

Chief Data Officers: Data Science to Boot

By contrast, the chief data officer is charged with managing the access, analysis, and dissemination of data to effect better business outcomes. Doing so requires transforming the enterprise around data to identify new product opportunities, enhance customer experience, improve regulatory compliance, spot competitive challenges earlier, and make operations leaner and more efficient.

“Anything that involves data can have an impact on financial performance,” Johnson explained. “We’re now at the point where technology is beginning to catch up to the massive volumes of structured and unstructured data being generated. What used to be stored in a database is now stored in the cloud, where it can be sliced and diced into insightful bits of knowledge.”

Given the focus on data and analytics, these CDOs often have backgrounds in mathematics, computer modeling, and data science. Typically, they lead the Data and Analytics department that is focused on achieving the corporate vision for enterprise-wide data access, use, and governance.

A recent Forbes Insights study shows this is no easy task — almost 80 percent of executive survey respondents said 40 percent or less of their organization’s data is available for sharing across the company. “It’s up to the CDO to make data available for use by all parts of the organization — to identify business opportunities and solve business problems,” said Johnson.

One example is the wide-ranging use of data in the insurance industry, where data drawn from image-recognition software embedded in drones is improving the underwriting of homes and office structures, and where data from policyholders’ claims is helping customers mitigate potential losses.

An Evolving Role

The responsibilities of both CDOs are expected to become broader, as newer technologies like deep learning, blockchain, and the IoT burgeon. Business disruption is a constant today, requiring companies to stay on top of the latest digital and data trends — and the technologies that power them — to remain competitive.

This responsibility falls to both types of CDO, albeit in different ways.

“Many CEOs perceive CDOs as ‘change agents’ — digital and data transformation specialists,” said Johnson. “Now and in the near future, it’s a job that holds great importance.”

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author writing frequently about the intersection of business and technology.

Five Ways to Recruit and Retain Generation Z

By Russ Banham

Perspectives

They’re fresh-faced, smart, and headed your way. Gen Z, true digital natives born post-1996, are entering the workforce with skills and expectations that require rethinking current hiring and workforce paradigms.

For one thing, these entry-level workers are not all college graduates; many Gen Z’ers are bypassing higher education to obtain prized internships and an early career start. The downside is a steeper learning curve due to their abridged educations, made more problematic by possible shortcomings in social skills, since technology has shaped much of their thinking and intellectual curiosity. As a recent Deloitte report stated, “a shortfall in highly cognitive social skills such as problem solving, critical thinking, and communication (could be) evident.”

Companies are challenged to position this young talent in a workforce comprising multiple generations, given postponed retirements for Baby Boomers and the continuing presence of Gen X’ers and Millennials. That’s a lot of generations with different mindsets, cultures, and technological proficiencies to manage, and it requires new job and workplace considerations.

A good start is knowing what Gen Z’ers want. According to a 2017 Medium survey of 5,000 Gen Z’ers, they seek an empowering work environment with autonomy. Other studies, including a Dell Technologies survey of more than 12,000 Gen Z’ers in 17 countries, indicate that 80 percent aspire to work with cutting-edge technology: nearly all (91 percent) said technology would influence their job choice, and 80 percent feel technology and automation will help create a more equitable work environment.

Interestingly, they also appear to favor face-to-face communications in their work interactions, despite the stereotype that Gen Z’ers are antisocial. Still, who (or rather, what) they work for is important. For the most part, Gen Z wants employment at companies that are socially conscious, with transparent stands on environmental, social, and governance issues.

As employers prepare to recruit, retain, and retrain Gen Z’ers, here are five moves to consider.

1. Help Them Chart Their Career Paths

Gen Z’ers often have a singular set of talents, such as social media proficiency. However, softer skills are more difficult to tease out of a resume or glean from in-person interviews.

To nurture both skill sets, recruiters should consider the value of tools that combine neuroscience games and artificial intelligence (AI) to evaluate a job applicant’s cognitive and emotional characteristics. Tools like Pymetrics, which uses AI to match talent, not only ferret out these traits but also match employees with tasks they can perform well now. This gives employers time to target appropriate on-the-job training and upskilling to fill talent voids.

At AXA XL, for example, the large global insurer recently developed a self-assessment data analytics tool to help employees chart their own career paths. “As part of the onboarding experience, we request new hires score their proficiencies, from one to five, in 87 distinct skills,” said Henna Karna, AXA XL chief data officer. “With that data, we’re able to glean skill set gaps and their motivation areas for career growth, which then informs the need for related training and upskilling.” (The new tool will be rolled out in 2019.)
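AXA XL’s tool itself isn’t public, but the core idea Karna describes (self-scores from one to five across a set of skills, compared against targets to surface gaps) can be sketched minimally. The skill names, targets, and scores below are invented for illustration:

```python
# Hypothetical sketch of a skills-gap calculation like the one described:
# new hires self-score 1-5 per skill; gaps against a role's target levels
# point to training needs. All names and numbers here are invented.

self_scores = {"data analysis": 2, "negotiation": 4, "python": 1}
role_targets = {"data analysis": 4, "negotiation": 3, "python": 3}

gaps = {
    skill: target - self_scores.get(skill, 0)
    for skill, target in role_targets.items()
    if target > self_scores.get(skill, 0)
}

# Largest gaps first -> candidates for targeted upskilling.
for skill, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{skill}: {gap} level(s) below target")
```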

2. Consider Shared Workspaces

As technology natives, Gen Z’ers are adept multitaskers, able to monitor inputs from diverse sources simultaneously. Office environments that inhibit their independent use of technology to solve problems are thus a bad fit. In this regard, many employers are moving toward shared workspaces, or hot desking.

Rather than confine a Gen Z’er to a single desk, allow them to roam freely and plug in at any free desk. By designing a workspace with workstations dotting the landscape and giving Gen Z’ers (and everyone else) the opportunity to grab an open desk, they’re able to work with others on the fly, wherever inspiration strikes.

“The cubicle was created in the 1960s, and it provides a functional purpose. But work has changed drastically. The management styles and tools we use have evolved,” explained Dawn Longacre, Dell’s global workplace strategist. “Work is much more collaborative today, so we need to break down the walls.”

Shared workspaces allow for more spontaneous engagement with colleagues, the option to secure advice and counsel, and perhaps the ability to align interests with unexpected mentorship opportunities.

At Dell, leaders have converted virtually every space into a communal work environment, outfitting building terraces with chaise lounges, patio seating, and Wi-Fi so employees can work outdoors and soak up some vitamin D. Indoors, company cafeterias have been revamped with coffee bar tables, picnic tables, benches, lounge areas, two-seater booths for private conversations, and Wi-Fi hookups.

3. Offer Empathy Workshops

While Baby Boomers grew up in hierarchical, military-style “command and control” work environments, Gen Z’ers tend to flourish in environments where they can execute tasks and solve problems on an independent basis.

“Tenured managers often are used to telling the people under their supervision what they need to do, as opposed to listening to what they’re interested in and good at doing,” said Cecile Alper-Leroux, a seasoned economic anthropologist who is vice president of human capital innovation at Ultimate Software, a leading provider of human capital management solutions.

“Gen Z’ers, on the other hand, are extremely good at independently solving whatever it is that managers need them to solve; they’ve been doing this with their smartphones since they were kids,” she said. “What they’re really interested in is a collaborative relationship with their managers, where they can have an exchange of ideas and dialogue.”

These different styles require a fair amount of give and take across the generations, and that’s where empathy workshops come in. “Empathy or openness workshops, in which managers and staff share their career interests, challenges, and goals, offer a way to bridge generational differences,” Alper-Leroux said. “They also present the opportunity for everyone to participate in open communication, furthering workplace relationships and the goal of a more engaged workforce.”

4. Develop a Unique Innovation Sabbatical

Given Gen Z’ers’ interest in an equitable work-life balance, employers should consider the pluses of innovation sabbaticals, whereby employees are given time off from their regular jobs to develop other projects. Free from their daily constraints, independent-minded Gen Z’ers can generate out-of-the-box innovations, fueling their own desire to innovate and, perhaps, enhancing corporate performance.

While this may sound cutting-edge, innovation sabbaticals have been around for decades. 3M, a conglomerate of health care, consumer goods, and other businesses, is credited with pioneering the concept, calling it the “15 percent program.” The name references the 15 percent of paid time given to employees to pursue so-called outrageous ideas. Google followed suit with its “20 percent program,” which hatched such breakthrough concepts as Gmail and Google News.

A similar practice is in play at BlackLine, a publicly traded provider of financial and accounting software automation solutions. “We encourage employees to identify projects outside of their core responsibilities to broaden their experience,” said Susan Otto, BlackLine’s chief people officer. “This not only widens their perspective of the company, it also increases their understanding of cross-functional dependencies, and allows them to work on initiatives important to them and the company.”

5. Promote Cross-Pollination

A successful way to flatten any Gen Z learning curve is cross-pollination — assigning them to different parts of the business to get a crash course in how everything comes together.

BlackLine is in the process of developing a work “study-abroad” program designed to provide short-term international opportunities and expand global growth. “As opportunities are presented, we’ll look internally to offer employees the chance to step out of their current roles for a period of time, to take on a different job assignment in one of the 10 countries where we currently operate,” Otto said. “Not only can they absorb a country’s unique culture and business practices, they will acquire a better appreciation for the work their colleagues perform abroad, aligning everyone in a truly unified enterprise.”

Her last point resonates. With so many different generations and work behaviors crowding companies, employee alignment in pursuit of the business vision is crucial to long-term financial performance.

Russ Banham is a Pulitzer-nominated financial journalist and best-selling author.