The Commonwealth Bank of Australia (CBA) has announced that it will deploy new artificial intelligence and machine learning technology to proactively detect financial abuse perpetrated through its platforms.
CBA general manager for community and customer vulnerability Justin Tsuei said that technology-facilitated abuse is a serious problem for the bank, describing it as “completely unacceptable behaviour”.
“The new model, which uses advanced AI and machine learning techniques, allows us to provide a more targeted and proactive response than ever before,” Mr Tsuei explained.
The bank’s new AI model complements the less sophisticated block filter that CBA implemented in 2020, which blocked transaction descriptions containing language the bank identified as threatening, harassing or abusive.
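CBA has not published how its 2020 block filter works; as an illustration only, a filter of this kind can be sketched as a word-boundary match against a block list. The terms and function name below are invented for the example.

```python
import re

# Hypothetical block list -- CBA has not disclosed its actual flagged terms;
# these placeholders stand in for whatever language the bank screens for.
BLOCKED_TERMS = {"threat", "hurt", "payback"}

def is_blocked(description: str) -> bool:
    """Return True if a transaction description contains a blocked term.

    A simple word-boundary keyword match, sketching the kind of filter
    the article describes; the bank's real system is not public.
    """
    words = re.findall(r"[a-z']+", description.lower())
    return any(word in BLOCKED_TERMS for word in words)

print(is_blocked("here is your rent money"))    # benign description passes
print(is_blocked("this is payback, watch out"))  # flagged term is caught
```

A filter like this can only catch exact terms on its list, which is the limitation the AI model described below is meant to overcome.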
“We want to ensure our customers feel safe when they are using our platforms, and it’s our responsibility to do everything we can to provide the right measures of protection across our channels,” Mr Tsuei said.
CBA framed the move as an extension of the bank’s broader strategy of deploying AI-powered technology and innovation across its digital channels.
“As Australia’s leading digital bank, we are continuously looking for new and better ways to improve our products, channels and services,” Mr Tsuei said.
CBA estimated that over 100,000 abusive transaction descriptions were blocked in the three months from May to the end of July 2021.
“With this new model in place, not only are we able to proactively detect possible instances of abuse in transaction descriptions, but we can do so at an incredible scale,” Mr Tsuei said.
The bank revealed that the new AI-powered detection model identified 229 senders of potentially serious abuse across both NetBank and the CommBank app, whose cases were then manually reviewed by the bank.
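The article does not describe how the model scores messages or escalates senders for review; purely as an illustrative sketch, a detection pipeline of this shape can aggregate per-description scores by sender and escalate those above a threshold. The term weights, threshold and names below are all invented assumptions, not CBA's method.

```python
from collections import defaultdict

# Invented, illustrative term weights -- a real model would be trained on
# labelled data rather than hand-assigned scores.
TERM_WEIGHTS = {"payback": 0.9, "watch": 0.4, "sorry": -0.2}

REVIEW_THRESHOLD = 1.0  # hypothetical cut-off for escalation to manual review

def score_description(description: str) -> float:
    """Sum per-term weights -- a stand-in for a trained classifier's score."""
    return sum(TERM_WEIGHTS.get(w, 0.0) for w in description.lower().split())

def flag_senders(transactions):
    """Aggregate scores per sender and return senders above the threshold.

    `transactions` is an iterable of (sender_id, description) pairs, so
    repeated low-level abuse accumulates rather than being judged one
    message at a time.
    """
    totals = defaultdict(float)
    for sender, description in transactions:
        totals[sender] += score_description(description)
    return sorted(s for s, total in totals.items() if total >= REVIEW_THRESHOLD)

txns = [
    ("A", "payback watch yourself"),  # repeated hostile language
    ("A", "payback again"),
    ("B", "sorry late with rent"),
]
print(flag_senders(txns))  # only sender "A" crosses the review threshold
```

Aggregating by sender rather than by message is one plausible way to surface the small number of serious cases, such as the 229 senders the bank reported, for human review.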
In extreme circumstances, CBA said that it would go so far as to terminate a customer’s banking relationship if the customer continued to breach the bank’s policies by engaging in abusive, threatening or harassing behaviour via transaction descriptions.
“The use of AI technology and machine learning techniques to help us address a serious issue like technology-facilitated abuse demonstrates how we can use innovative technology to create a safer banking experience for all customers, especially for those in vulnerable circumstances like victim-survivors of domestic and family violence,” Mr Tsuei said.