The application of artificial intelligence (AI) and machine-learning technologies has emerged as a hot topic in discussions on digital transformation and innovation in the financial services industry. ChatGPT is the latest such application to gain traction.
Developed by OpenAI, ChatGPT is a language model that generates human-like responses to questions on a variety of topics. One of its uses in the financial services industry is as a chatbot for banks. Chatbots are computer programs that can simulate human conversation for a variety of applications. Customers can use chatbots to get information like account balances, credit card usage, interest rates and exchange rates. ChatGPT can enhance chatbot capabilities by leveraging natural language-processing technology to provide customers with high-quality, fast and friendly responses.
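The routing logic behind such a chatbot can be sketched in a few lines. This is a hypothetical illustration only: the intent keywords, canned responses and the fallback behavior are assumptions, not any bank's actual implementation, and a production system would layer a language model and human escalation on top.

```python
# Hypothetical sketch of a minimal banking-chatbot router.
# Keywords and responses below are illustrative assumptions.
INTENT_RESPONSES = {
    "balance": "Your current account balance is shown in the mobile app under 'Accounts'.",
    "interest": "Our current savings interest rate is published on the bank's rates page.",
    "exchange": "Today's exchange rates are listed on the bank's rates page.",
}

def route_query(message: str) -> str:
    """Answer routine questions from a fixed table; escalate everything else."""
    text = message.lower()
    for keyword, canned_answer in INTENT_RESPONSES.items():
        if keyword in text:
            return canned_answer
    # Unmatched queries would be passed to a language model or a human agent.
    return "Let me connect you with an agent who can help."

print(route_query("What is my account balance?"))
```

A model such as ChatGPT would replace the keyword table with natural-language understanding, but the control flow — answer the routine, escalate the rest — is the same.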
In theory, using chatbots to manage routine customer inquiries can improve a bank’s operational efficiency. It can also boost customer satisfaction by consistently providing instant services 24 hours a day, seven days a week. Customers can benefit from a more convenient banking experience, while banks can optimize resource allocation.
Nonetheless, social and ethical concerns arise from chatbots' lack of human empathy and consequent inability to respond to customers' emotions. Furthermore, when banks explore the feasibility of adopting this technology, there are legal challenges that all stakeholders, including customers and the relevant authorities, must take into account.
One challenge concerns ChatGPT's legal status in Indonesia as a private Electronic System Provider (ESP) that hosts user-generated content. According to Communications and Information Minister Regulation No. 5/2020, private ESPs that allow users to provide, display, upload or exchange information and electronic documents fall under this category. Further, private ESPs that operate a portal, website or application used for delivering paid digital content must comply with registration requirements under the prevailing regulations.
Given the significance of compliance risks for banking institutions, banks must ensure their adherence to all applicable laws. When banks use third-party services, they must also pay attention to the third party's compliance with prevailing regulations.
Another challenge is related to technology limitations. As disclosed on the first page of the ChatGPT display, it may occasionally generate incorrect information, and produce harmful instructions or biased content. Using technology with such limitations can undoubtedly expose banks to significant risks. If a bank's chatbot provides incorrect or misleading information, not only will customers suffer, but so will the bank's reputation.
In addition to the reputational consequences, providing misleading information can have serious legal ramifications for the bank. Law No. 4/2023 on the development and strengthening of the financial sector requires banks to provide customers with clear, accurate, honest, easily accessible and non-misleading information. When these provisions are violated, banks must bear responsibility for losses caused by errors, negligence and illegal acts.
Another challenge is related to data privacy and security. When a chatbot collects and processes valuable customer data, such as financial information and transaction history, it can be vulnerable to cyberattacks and data breaches. To protect this sensitive data, banks must implement strong data-protection measures. Additionally, when leveraging third-party technology, banks must ensure that it adheres to the highest standards of security. This is critical because any vulnerability could be exploited by malicious parties.
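One concrete data-protection measure is to redact sensitive identifiers before a customer message ever leaves the bank's systems for a third-party service. The sketch below is a hypothetical illustration: the regex patterns and placeholder tokens are assumptions, and real deployments would use far more robust detection (structured field validation, tokenization services) than simple pattern matching.

```python
import re

# Hypothetical sketch: redact sensitive identifiers before forwarding
# a message to any third-party service. Patterns and placeholder
# tokens are illustrative assumptions, not a real bank's rules.
PATTERNS = {
    "CARD_NUMBER": re.compile(r"\b\d{16}\b"),     # 16-digit card numbers
    "ACCOUNT_NUMBER": re.compile(r"\b\d{10}\b"),  # 10-digit account numbers
}

def redact(message: str) -> str:
    """Replace card and account numbers with placeholder tokens."""
    for label, pattern in PATTERNS.items():
        message = pattern.sub(f"[{label}]", message)
    return message

print(redact("My card 1234567890123456 was declined"))
```

Redaction at the boundary limits what any downstream system, compromised or not, can leak.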
Moreover, according to Law No. 27/2022 on personal data protection, data controllers must adhere to the regulations governing the processing of personal data, particularly when processing individual customer data. This covers various processes such as data collection, analysis, storage, as well as presentation, transfer, dissemination, disclosure and destruction. The recent data leakage incident involving ChatGPT's service highlights the importance for banks, as data controllers, to exercise caution in utilizing technology, considering the risks it poses to data security.
Reportedly, ChatGPT was temporarily unavailable on March 20. This took place after some users discovered that the titles of other people's conversation histories were visible in their accounts. Although only the titles and not the entire conversations were visible, this incident demonstrates the possibility of leaks when users share information or ask questions via ChatGPT. Further investigation revealed a more serious security issue, namely the potential leakage of personal data of several ChatGPT Plus customers.
If a leakage incident were to occur with a bank's chatbot service, it would be considered a failure to protect personal data, with profound consequences. As controllers of the data, banks are obligated to provide written notice to the relevant customers in the event of such a breach. In some cases, banks must also announce it to the public. This notification should include the personal data that was disclosed, when and how it occurred, and the measures taken to address and recover from the incident.
Moreover, the bank would be liable if the leak results in customer losses. Therefore, data leakage carries not only reputational risk but also legal and compliance risks, as well as the potential for significant financial losses to the bank.
As a result, in addition to evaluating the opportunities and benefits of ChatGPT technology, banks must also consider appropriate risk-mitigation measures. First, banks must ensure that the implementation of ChatGPT technology in Indonesia complies with all applicable laws and regulations. Second, to ensure the accuracy of the information provided, the technology must be well-integrated with existing systems and equipped with effective supervision and control mechanisms. Third, banks must adopt stringent security standards when processing customer data and ensure that the technology used meets those standards. Fourth, banks must provide training to employees and increase customer literacy regarding the use of this technology.
The potential for ChatGPT technology to revolutionize the banking industry by enhancing customer interactions and improving the overall banking experience is significant. However, banks must be mindful of the associated risks and challenges and implement suitable measures to mitigate them. Regardless of the application and technology, ensuring a customer’s seamless and superior banking journey requires striking a delicate balance between innovation and adherence to regulatory requirements.
***
The writer is head of Legal and Corporate Secretariat at Bank DBS Indonesia. The views in this article are his own.