She highlights the importance of collaboration, regulation, and ethics-by-design in ensuring responsible innovation. ABBL is actively engaging stakeholders across sectors and at the EU level to foster a secure, ethical, and innovation-driven financial ecosystem.
From facilitating working groups to launching pioneering initiatives like federated AI in anti-money laundering (AML), Luxembourg is setting the pace for sustainable digital transformation in finance.
What are the main barriers to adopting AI, HPC, and quantum in finance, and how can cross-sector collaboration help?
The adoption of emerging technologies in finance is challenged by several factors:
- Compliance complexity: Financial institutions must navigate evolving regulatory frameworks such as the AI Act, DORA, and GDPR.
- Skills gap: There is a pressing need for expertise in AI, cybersecurity, and data science.
- Infrastructure investment: Implementing these technologies requires significant capital and technological readiness.
Cross-sector collaboration is essential to overcoming these barriers. It promotes knowledge exchange, supports the co-development of standards, and enables pooled investments and shared infrastructure—particularly relevant in a collaborative environment like Luxembourg.
At ABBL, we actively foster this collaboration by coordinating working groups, advocating for innovation-friendly regulation, and contributing to European-level discussions to shape practical, responsible frameworks.
How can we ensure these technologies align with ethical standards and EU regulations like the AI Act?
Ethical AI is about creating systems that are fair, transparent, and aligned with the public interest. At ABBL, we believe the foundation lies in embedding compliance and ethics from the design stage. We focus on three pillars:
- Governance: Robust internal controls, oversight by risk and compliance teams, and board-level understanding of AI applications are essential. We encourage our members to integrate AI and data ethics into their digital governance frameworks.
- Transparency and documentation: Institutions must document the full AI lifecycle—from data sourcing to risk evaluation. This is a regulatory necessity under the AI Act and a key factor in building public trust.
- Regulatory collaboration: ABBL engages closely with national and EU regulators to ensure that the financial sector’s voice is considered. We advocate for clarity in the classification of AI systems and practical compliance obligations under the AI Act.
Can you hint at any standout use cases in finance that will be discussed at the SCynergy event—particularly in fraud detection, risk modeling, or trading?
Yes, a standout example is our recent collaboration with the University of Luxembourg’s Interdisciplinary Centre for Security, Reliability and Trust (SnT). Together, we launched a project exploring federated AI for transaction monitoring in anti-money laundering (AML) efforts.
Traditional rule-based systems often yield high false-positive rates and struggle to keep pace with dynamic fraud patterns. Federated learning, a privacy-preserving AI technique, allows institutions to train a shared model without exchanging sensitive client data. This can significantly improve AML efficiency while supporting GDPR compliance.
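To make the idea concrete, here is a minimal sketch of federated averaging (FedAvg), the canonical federated-learning scheme: each bank trains a simple fraud-scoring model on its own data, and only the model weights, never the transactions, are sent to a coordinator for averaging. The function names (`local_update`, `federated_average`) and the three-bank simulation are illustrative assumptions; the source does not describe the actual architecture of the ABBL/SnT project.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One bank trains locally; raw transaction data never leaves the bank."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1 / (1 + np.exp(-X @ w))   # logistic-regression fraud scores
        grad = X.T @ (preds - y) / len(y)  # gradient of the log-loss
        w -= lr * grad
    return w

def federated_average(local_weights, sizes):
    """Coordinator combines weight updates, weighted by each bank's data volume."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(local_weights, sizes))

# Simulate three banks holding private transaction features (e.g. amount, velocity).
rng = np.random.default_rng(0)
true_w = np.array([1.5, -2.0, 0.5])        # hypothetical "ground truth" pattern
banks = []
for n in (500, 300, 200):
    X = rng.normal(size=(n, 3))
    y = (X @ true_w > 0).astype(float)     # synthetic suspicious/clean labels
    banks.append((X, y))

# Federated training rounds: only weights travel between banks and coordinator.
global_w = np.zeros(3)
for _ in range(20):
    updates = [local_update(global_w, X, y) for X, y in banks]
    global_w = federated_average(updates, [len(y) for _, y in banks])

print(np.sign(global_w))
```

The averaged model recovers the shared fraud pattern even though no bank ever sees another's data, which is the property that makes the approach attractive for cross-institution AML while keeping client data local.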
Our proof-of-concept showed the feasibility of this approach and its alignment with Luxembourg’s collaborative ecosystem. The next step is real-world deployment, which could position Luxembourg as a leader in responsible AI adoption in finance and highlight how shared innovation can enhance both compliance and operational effectiveness.