Flight Rules for Risk Management - Top Takeaways from NASA's Norman Knight

By Jeffery Wacker


A new year brings with it many unknowns, and this year is no exception. It remains to be seen how the pandemic will impact the industry in the year ahead, and companies are still evolving strategies and plans to address the increased uncertainty.

Norman Knight, Deputy Director of Flight Operations at NASA, is no stranger to the unknown. He specializes in human space flight and has seen just about everything that can happen in orbit. He spoke at SFNet's first-ever online Annual Convention in November 2020 to share what he has learned from his decades at NASA.

Knight shared advice about building a culture of technical truth, openly discussing mistakes to improve systems, and planning for the unimaginable. Here are a few flight rules he suggested for companies looking to manage risk and uncertainty in the year ahead.

Talk Openly About Risk

For Knight, risk management is a matter of life and death. The mission control room is about the size of a football field, and every inch contains tools necessary to protect astronaut safety.

Flying people into space comes with a level of unavoidable risk, but Knight and his team do everything they can to identify and minimize those risks. He breaks down each risk by its impact: What would happen if this particular system failed? What's the backup plan? And what is the workaround?

From there, he and his team openly and candidly talk about the possible challenges and flaws in their plan.

"What we're most interested in is having a culture of technical truth," Knight said. "That is talking with no agenda about every possible risk and problem. Tell us the bad news."

Building a culture of technical truth is especially important, Knight said, because companies can often learn more from failures than from successes. A culture where people are afraid to criticize plans or mention past mistakes can stifle growth.

"At NASA, we spend most of the time talking about failures. Some are small, but some are very large," Knight says. Whether a failure is big or small, his team talks through it to identify what could have been done differently."

"We track mistakes—not for retribution, but to identify flaws in training or procedures. We want people to be open about mistakes so we can improve our process."

By talking openly about possible weaknesses and risks, his team knows how much risk they are taking—both on a daily basis and in the aggregate over a six-month stay on the Space Station.

Plan for the Unexpected

No plan survives first contact, and even the most thorough risk analysis is likely to miss something.

"I've seen things that I think would defy physics," Knight said of what he has seen in orbit. The key is to balance preparation with an ability to think quickly and pivot in real time.

NASA handles this by developing a set of flight rules: procedures and processes that help manage risk in real time. These processes are tested, developed, and re-examined constantly, but Knight pointed out that they are still not sufficient on their own. Equally important is knowing when to deviate from the flight rules.

"We are constantly doing simulations," Knight said. He'll have his team run simulations where the flight rules don't have the answers, so they need to think on their feet. "You don't need a simulator to run simulations," he added—they spend much of their time doing "paper simulations" or talking through potential issues around tables.

Companies looking to plan for the year ahead could benefit from doing the same: sitting down with their teams to talk about what processes and procedures they have, and what those existing processes might not cover. It is a near certainty that the year ahead will have some unexpected twists. Practicing how to handle the unexpected could pay off for companies in every industry.

People First

"Looking back over my 30-year career, what I see as probably the most important piece of risk management goes back to that human element," Knight said. "We can have algorithms that can generate risk profiles for us, tell us what the system risks are, tell us what the aggregate risks are… those are great. When you get into a real time environment, where things are happening, things are changing, those algorithms are only so good."

When push comes to shove, he said, he ends up falling back on his people: their critical thinking skills, their leadership ability, and their ability to metabolize new information without ever having the full data set on hand.

For a time, NASA was moving toward automated systems, but people lost the ability to really understand what the computers were doing, Knight explained. Risk increases when people are making decisions they don't understand. NASA ended up taking a more balanced approach to make sure that employees saw the computers, algorithms, and processes as tools. Most of their flights are automated, but every astronaut is trained to fly manually as needed.

"It's the human element that gets you out of those pinches that you can't make up," Knight said, citing COVID-19 as an example. He and his boss were talking about how, this time last year, they would have never guessed they would be planning for the first commercial space flight amidst a global pandemic.

"If you asked me tenmonths ago how we'd get through this with a success, I would not know. But we have a great team here," Knight said. He and his team took baby steps day by day to build a new plan in the face of the pandemic.

He recommended everyone focus on their people and their culture to help prepare for the year ahead.

"While we're NASA, and you're in finance… there are threads of commonality in leadership, in risk management, in how you approach and assess problems and find solutions, build teams that are resilient… it doesn’t matter what industry you're in. There's commonality there, that benefits us all."

 


About the Author

Jeffery Wacker is head of U.S. Asset Based Lending Originations, TD Bank.