Machine Learning Times

By: Adam Mazmanian
Originally published at FCW

 

The Securities and Exchange Commission is using predictive analytics to evaluate risks facing the brokerage industry regulated by the agency and, potentially, to pinpoint firms headed for trouble.

Beginning this financial quarter, the SEC’s newly created Division of Economic and Risk Analysis (DERA) is applying a risk assessment model that processes real-time data, financial statements and other information to give investigators a heads-up on firms tilting toward default. Giulio Girardi, an economist at DERA’s Office of Quantitative Research, explained the parameters of the program at the Predictive Analytics World Government conference in Washington, D.C., on Sept. 18.

To evaluate risk, the SEC divides the 4,500 registered broker-dealers it regulates into seven groups for comparative purposes. Each of the seven groups is given a risk score in three categories: finances and operations, workforce and structure, and supervision. Girardi would not disclose how the firms were categorized, but said that the 4,500 firms comprised a heterogeneous group and that the divisions were necessary to allow valid comparisons.

Firms are ranked under a system similar to a mortgage scorecard. The system is designed to send up red flags for investigators, but investigations are not automatically triggered. Girardi explained that it might help investigators know which doors to knock on and what questions to ask when they do.
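The SEC has not published the model itself, so the sketch below is only a rough illustration of how a scorecard of this general shape might work: each firm’s raw metrics are normalized against its own peer group, combined into a weighted composite, and flagged for human follow-up when the composite crosses a cutoff. The category weights, threshold, firm names and numbers are all hypothetical.

from dataclasses import dataclass

# Hypothetical weights and cutoff; the SEC's actual scorecard is not public.
WEIGHTS = {
    "finances_and_operations": 0.5,
    "workforce_and_structure": 0.3,
    "supervision": 0.2,
}
FLAG_THRESHOLD = 0.75

@dataclass
class Firm:
    name: str
    peer_group: int   # one of the seven comparison groups
    metrics: dict     # category -> raw risk metric (higher = riskier)

def normalize_within_group(peers, category):
    """Min-max scale one category's raw metrics to [0, 1] within a peer group,
    so firms are only compared against similar firms (crude, for illustration)."""
    values = [f.metrics[category] for f in peers]
    lo, hi = min(values), max(values)
    spread = (hi - lo) or 1.0
    return {f.name: (f.metrics[category] - lo) / spread for f in peers}

def flag_firms(firms):
    """Compute a weighted composite score for each firm within its peer group
    and return the firms that cross the threshold."""
    flagged = []
    for group in sorted({f.peer_group for f in firms}):
        peers = [f for f in firms if f.peer_group == group]
        normalized = {cat: normalize_within_group(peers, cat) for cat in WEIGHTS}
        for f in peers:
            composite = sum(w * normalized[cat][f.name] for cat, w in WEIGHTS.items())
            if composite >= FLAG_THRESHOLD:
                flagged.append((f.name, round(composite, 2)))
    return flagged

# Made-up example data for one peer group.
firms = [
    Firm("Broker A", 1, {"finances_and_operations": 0.20, "workforce_and_structure": 0.30, "supervision": 0.10}),
    Firm("Broker B", 1, {"finances_and_operations": 0.25, "workforce_and_structure": 0.35, "supervision": 0.40}),
    Firm("Broker C", 1, {"finances_and_operations": 0.90, "workforce_and_structure": 0.80, "supervision": 0.50}),
]

print(flag_firms(firms))   # -> [('Broker C', 1.0)]

Nothing in the sketch opens an investigation; as with the SEC’s system, a flag only tells examiners where to look.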

The analytic model is informed by historical data on firms that have defaulted. For example, the model could be run on the defunct firm MF Global over the months leading up to its meltdown to see whether its risk would have been accurately predicted. “That’s the kind of analysis we can use to back-test this model,” Girardi said.
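That kind of back-test amounts to a walk-forward check: replay a failed firm’s filings from the months before its collapse and ask whether the model would have flagged it in time. The function below is a hypothetical illustration of that idea; the metrics, scoring rule, threshold and monthly figures are invented, and only MF Global’s October 2011 failure date is real.

from datetime import date

def backtest(firm_history, score_fn, flag_threshold, default_date, lookback_months=6):
    """Replay a defaulted firm's periodic filings over the months before its
    failure and report when (if ever) the model would have flagged it.

    firm_history: list of (as_of_date, metrics_dict) pairs, sorted by date
    score_fn:     callable producing a composite risk score from a metrics dict
    """
    flags = []
    for as_of, metrics in firm_history:
        months_out = (default_date.year - as_of.year) * 12 + (default_date.month - as_of.month)
        if 0 < months_out <= lookback_months:
            score = score_fn(metrics)
            if score >= flag_threshold:
                flags.append((as_of, round(score, 2)))
    return flags   # an empty list means the model would have missed this failure

# Invented monthly snapshots for a firm that failed on 2011-10-31 (illustrative only).
history = [
    (date(2011, 5, 31), {"leverage": 28.0, "net_capital_ratio": 0.09}),
    (date(2011, 7, 31), {"leverage": 33.0, "net_capital_ratio": 0.06}),
    (date(2011, 9, 30), {"leverage": 40.0, "net_capital_ratio": 0.03}),
]

# Hypothetical scoring rule standing in for the real model.
score = lambda m: 0.02 * m["leverage"] + 5 * (0.1 - m["net_capital_ratio"])

print(backtest(history, score, flag_threshold=0.9, default_date=date(2011, 10, 31)))
# -> [(datetime.date(2011, 9, 30), 1.15)]  i.e. a flag one month before the failure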

More and more agencies are using predictive analytics for fraud prevention, risk assessment and improving business practices. The Recovery Accountability and Transparency Board tracked spending on the economic stimulus package enacted in the wake of the 2008 financial meltdown and has won plaudits on the left and the right for its work. The inspector general at the U.S. Postal Service has leveraged mountains of data on mail routes and employee disability claims, along with private-sector data from heavy users such as Netflix and Gamefly, to create models for investigating health care fraud and mail theft.

Evangelists of predictive analytics say these are the early days for applying the process to the federal enterprise. “The ability to do fraud detection is the table stakes,” said Federal Communications Commission Chief Data Officer Greg Elin. “It’s a great thing to be able to determine billions in fraud,” he said. “The scarier thing is finding out whether it’s the right thing to spend money on a program.”

In the future, analytics could be baked into ordinary citizens’ interactions with government. Dean Silverman, senior adviser to the commissioner of the Internal Revenue Service, said his agency had floated the possibility of creating a real-time online tax system that would raise red flags for filers using the basic tax return forms. He said that a variety of barriers, including the complexity of the tax code and the voluntary nature of tax filing, prevented such a system from being developed.

To those barriers can be added the institutional hindrances that keep government IT systems from rapidly adopting new business models.

“We can learn about patterns or things to watch for. But whether or not those can be operationalized in government [depends on] procurement, the rulemaking process, and IT infrastructure that is not agile,” Elin said.
