The Risk Implications of Smart Technology

Artificial intelligence, drones, and the Internet of Things are some of the most exciting developments happening in the tech world, but with these advancements come new and unforeseen risks for companies, governments, and private citizens.

Last week, The Risk Institute hosted a continuing professional development session on the risk implications of smart technologies with experts from DHL, Cisco, EY and The Ohio State University.

“Ten years ago, people thought of drones as something used by the military on covert operations. Today, drones are available for a couple bucks and can fit in your pocket,” said Jim Gregory, Director of the Aerospace Research Center at The Ohio State University.

Gregory believes that the possibilities for drones are far-reaching, from package delivery to search and rescue. But for the economics to work out, the legal landscape needs to shift: under current FAA regulations, most of these use cases are still illegal.

The most significant risks involving drones are:

  • Loss of control link
  • Collision with another aircraft
  • Collision with people or property on the ground
  • Emergent behavior of autonomous systems

Many of those risks can be mitigated with "redundancies on redundancies," which, according to Gregory, means robust control links and never letting any one system become so critical that its failure brings down the entire system. These redundancies are just as essential in artificial intelligence.

“Artificial intelligence is not a singular concept,” said Chris Aiken, Executive Director, Advisory Services. “It’s a science-based, multi-disciplinary combination of software and computations presented in a human-like manner.”

Just as the cotton gin spurred the first industrial revolution, many experts believe that artificial intelligence will fundamentally shift the workforce. They are also quick to point out, however, that AI is not necessarily smarter than humans; it's just different.

“The real power of AI is to augment and amplify human intelligence and performance,” said Aiken.

And world leaders are taking notice: competition among countries like China, Russia, Canada, and the United States is heating up in a global race to dominate AI.

But where does the value of artificial intelligence really lie? According to Aiken, the real value of AI exists in five areas:

  1. Revealing insights
  2. Optimizing performance
  3. Harnessing automation
  4. Enhancing experience
  5. Sustaining trust

As with any disruptive technology, it’s valuable to consider the predominant ethical, legal, risk and social issues associated with it. In the case of AI, companies should:

  • Start any project by examining the ethical and legal impacts
  • Evaluate the consequences on jobs
  • Communicate to win employee approval

Building trust between AI and the users and parties it affects is imperative to the success of both the technology and the business. Taking a holistic, human-centered approach, focusing on outcomes, and being pragmatic and ethical are common-sense steps toward building that trust.

The Risk Institute remains committed to leading the conversation on risk in partnership with our member organizations. We examined the risk impacts of artificial intelligence in the risk function in our 2018 Survey on Integrated Risk Management. The findings might surprise you.

FCPA & Ethics highlighted at latest Risk Series

Ethics is more relevant than ever. In public and private entities alike, ethical decision making seems to have taken a back seat to short-term gain, be it profit or popular opinion. Companies like Volkswagen, Wells Fargo, and Equifax have seen share prices in freefall while simultaneously dealing with public relations nightmares.

The Risk Institute hosted its first continuing professional development session of the academic year on October 11, 2017, on the Foreign Corrupt Practices Act (FCPA) and ethical decision making. Speakers included David Freel, a professor at The Ohio State University; Eric Lebson, a vice president at the Crumpton Group; Vlad Kapustin from New York Life; and Bill Foale, an investigator from EY.

Corruption accounts for more than 5 percent of global GDP, or more than $2.6 trillion. Most of that corruption occurs in developed countries with the approval of senior management, which suggests that the tone at the top and organizational culture need work.

Organizational culture is the set of shared beliefs and expectations that influence thinking and behavior; it's the glue that holds a company together. According to Prof. Freel, local companies such as Nationwide and Cardinal Health are best-practice examples of excellent, ethical organizational culture.

And ethics is important to consumers, too. Current data suggest that consumers are more likely to do business with companies they perceive to have strong moral standards.

Since a company's ethics is a priority for consumers, one might assume that strides have been made across industries to clearly define ethical behavior for employees, provide training, improve whistleblower policies, and so on.

Unfortunately, that’s not the case.

Over the last 30 years, there has been virtually no change in anti-corruption policies. According to Eric Lebson, "It's difficult to get a company that has never experienced an FCPA incident to take action."

An FCPA investigation can be crippling. On average, an FCPA investigation lasts 3.7 years; 92.42 percent of defendants settle with the SEC, and 76.44 percent settle with the DOJ.

Bill Foale encouraged executives to empower their audience and make compliance second nature. Many anti-corruption policies are dense and jargon-laden, and therefore difficult for even a native English speaker to comprehend. Foale suggests asking the following questions about your anti-corruption policies:

  • Is the material understandable?
  • Is the information relatable to the audience's responsibilities? That is, not just a list of "do nots," but practical tips and examples?
  • Is the language accessible? Keep in mind that many of your employees may not be native English speakers.
  • Is there a resource available for questions/assistance?

Many ethical challenges, such as transparency, privacy, self-interest, and data protection, lie ahead. But with proper preparation, any organization can avoid ethical conflicts.

For more on this topic and many others, visit fisher.osu.edu/risk. Risk Series V continues on November 14 with a conversation on Mergers & Acquisition Risk. M&A is a high stakes game and getting it right matters. Join The Risk Institute and our experts from academia and industry for a lively discussion about the delicate balance of risk and reward in M&A.