Government Data Chiefs Find Varied Measures for Analytics Value

WASHINGTON—For Michael Wood, who recently retired as executive director of the Recovery Accountability and Transparency Board (RATB), the success of his team’s analytics work has been a difficult thing to measure.

The RATB was created by the American Recovery and Reinvestment Act (ARRA) of 2009, better known as the stimulus program, to provide transparency and accountability in the disbursement of more than $800 billion in funds. It built the Recovery Operations Center, which used data analytics to detect possible fraud. Overall, the program detected very little fraud, likely the result of a deterrence effect. But it is hard to measure success when the evidence of effectiveness is the absence of something.

“We’re sort of like the Secret Service,” he said. “Their successes aren’t obvious. Their failures are extremely obvious.”

The measures of success for analytics projects can differ from one federal agency to another. While fraud detected or deterred signals progress at one agency, directors of other federal government analytics projects cited traditional ROI, time-to-answer metrics, and cultural shifts as signs of progress.

Speaking during a panel discussion at the Predictive Analytics World Government conference here on September 18, the executives pointed to gains in experience over time. For example, although the RATB is winding down its work on the stimulus funds, Wood said, the board is now shifting to monitoring the disbursement of $64 billion in Hurricane Sandy relief.

Timely Answers as a Measure of Value
Greg Elin, chief data officer at the U.S. Federal Communications Commission, used the term “time to value” as a metric of success. Although “we’re all still trying to determine how best to measure success,” he said, today he judges it by how quickly he can answer someone’s question. Ideally, the answer should come immediately, or at least the same day, he said. “The faster we can answer a question, the better we are doing with data,” he said. “Saying, ‘We’ll get back to you,’ represents a failure.”
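To make the idea concrete, here is a minimal sketch, not from the panel itself, of how a “time to value” metric along the lines Elin describes could be computed. The request log and its timestamps are hypothetical illustrations.

```python
from datetime import datetime
from statistics import median

# Hypothetical log of data questions: (asked, answered) timestamps.
request_log = [
    (datetime(2013, 9, 3, 9, 15), datetime(2013, 9, 3, 9, 20)),   # answered in minutes
    (datetime(2013, 9, 4, 14, 0), datetime(2013, 9, 4, 16, 30)),  # answered same day
    (datetime(2013, 9, 5, 11, 0), datetime(2013, 9, 9, 10, 0)),   # "we'll get back to you"
]

# Time to answer each question, in hours.
hours_to_answer = [
    (answered - asked).total_seconds() / 3600
    for asked, answered in request_log
]

# Two simple summaries: typical latency, and how often the answer
# arrived the same day (the bar Elin sets for success).
median_hours = median(hours_to_answer)
same_day_rate = sum(
    answered.date() == asked.date() for asked, answered in request_log
) / len(request_log)

print(f"Median time to answer: {median_hours:.1f} hours")
print(f"Same-day answer rate: {same_day_rate:.0%}")
```

On this framing, every “we’ll get back to you” shows up directly as a drop in the same-day rate, which is what makes the metric easy to track over time.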

Elin said he also determines success by the extent to which the culture in the organization is shifting. Are FCC staff starting to use data in new, interesting and valuable ways without coming to Elin first?

Bryan Jones, the director of the Countermeasures and Performance Evaluation unit of the U.S. Postal Service Office of Inspector General, said “it’s a mistake to look at one metric for success.” Instead, he looks for different types of success with different stakeholders.

There are traditional metrics, such as monetary ROI, but other measures can be just as important. User adoption is one: the more people who successfully adopt your analytics, the more likely they are to talk about how much they like it and to promote its use within the organization. Another is “perceived success with executives.” “Someone has to go tell my boss that my group is doing really interesting things,” said Jones. “He needs to hear that from someone other than me.”

Dean Silverman, senior advisor to the commissioner and director of the Office of Compliance Analytics at the U.S. Internal Revenue Service, moderated the panel. Noting how difficult it can be to gauge the success of analytics, he wondered how an analytics group could possibly show its progress over the years. “Do we need to have some metric by which we can show a 2 percent per year improvement?” Or perhaps the gauge is less quantitative. Silverman asked panelists to project three years into the future and describe what they would like to have accomplished that would fulfill the promise and prove the value of their data analytics.

Ahead: Data-Driven, Rather Than Tradition-Driven Decisions
Panelists agreed that organizational resistance to the use of data analytics should ease over the next several years as retiring federal workers are replaced by a new generation comfortable with new ways of looking at data. “Where I see all of our organizations going is that they will make decisions based on what is, based on real information, rather than the way we’ve always done things,” said Jones of the Postal Service.

Wood, the outgoing Recovery Board official, said he’d like to see a shift from reliance on a data analytics center to an open platform that would give users direct access to tools and data. The center would then focus on managing the data and helping users find the right data.

For example, the RATB is now helping the state of New Jersey gain access to audits of the recipients of Hurricane Sandy relief money. Organizations that receive federal grants above a certain amount are required to get an independent audit, he explained. “That typically goes into a data warehouse, where no one ever looks at it,” Wood said.

The RATB is starting to pull that data and analyze it. “We’re attempting to tell the state of New Jersey what audits have been done at the various localities and grant recipients,” he said. This analysis gives the state a better handle on where the funds are going and whether they are being properly spent, he added.

Tam Harbert is a freelance writer based in Washington, D.C. She can be contacted through her website.

By: Tam Harbert
Originally published at data-informed
