
Gambling AI One Year into the EU AI Act

Emma Shilling, Int.Dip (AML), PC.dp

When I first wrote about the growing role of artificial intelligence (AI) in the online gambling sector, the conversation was focused squarely on innovation: how AI could enhance the user experience, improve risk management, and support responsible gambling. In just a short time, the conversation has shifted. While the benefits of AI remain clear, the regulatory issues have become more complex and more urgent.

 

The EU’s Artificial Intelligence Act (EU AI Act), formally adopted and now entering its implementation phase, has introduced the world’s first broad legal framework governing AI systems. This is not just a European concern; its ripple effects are global, especially in sectors like gambling, which are inherently data-driven, digitally native, and highly regulated.


As we mark over a year since the EU AI Act was agreed upon, it’s time to revisit how the online gambling sector must respond.


A Quick Recap of AI’s Role in Online Gambling

 

AI technologies have become embedded across nearly every layer of the online gambling experience. Some of the most impactful applications include:

 

Personalisation algorithms 

Recommend games, tailor marketing content, and customise interfaces to improve user engagement.

Behavioural analytics 

Monitor user activity to detect problematic gambling patterns and flag potentially at-risk players.

Fraud detection systems 

Identify suspicious betting activity or account behaviour.

Dynamic odds generation 

Use real-time risk trading models for sportsbooks.

Automated customer service 

Deliver support through AI chatbots and virtual assistants, improving responsiveness and efficiency.

 

Each of these innovations brings benefits in speed, efficiency, and customer satisfaction. But they also raise serious questions about transparency, bias, consent, and control, all of which the EU AI Act is designed to address.

 

Understanding the EU AI Act

 

The EU AI Act introduces a risk-based framework that classifies AI systems into four main categories: unacceptable risk, high risk, limited risk, and minimal risk. The level of regulatory obligation increases with the degree of potential harm an AI system may pose to safety, rights, or public trust.

 

The key pillars of the EU AI Act are:

 

High-Risk Systems 

Must meet strict requirements for transparency, human oversight, data governance, and cybersecurity.

Transparency Obligations 

Players must be informed when interacting with AI systems, such as recommendation engines, chatbots, or automated affordability checks.

Prohibited Uses 

Include certain manipulative or exploitative AI applications, such as systems that unduly influence vulnerable players, exploit behavioural addictions, or covertly profile users without their knowledge or consent.

Post-Market Monitoring 

AI systems deployed by operators must be regularly reviewed to ensure they remain compliant, particularly if used to influence user behaviour or manage risk in real time.

 

The Act applies to both providers (those who develop AI systems) and deployers (those who use them), including organisations located outside the EU that serve EU-based users, a reality for most online gambling platforms.

 

What This Means for Online Gambling Operators

 

For gambling companies, the impact of the EU AI Act is not hypothetical; it is immediate and concrete. Several AI systems used within the sector fall into the high-risk category, and compliance will require changes across legal, technical, and operational domains.

 

Responsible Gambling Tools

 

Many operators use AI to monitor player activity and flag harmful behaviour. While this is crucial for player protection, such systems can have significant implications for an individual's access to services. These tools are likely to be classified as high-risk because they can influence decisions related to self-exclusion, affordability checks, or account restrictions. Operators will need to ensure (see the sketch after this list):

 

  • Full explainability of how decisions are made

  • Human oversight of outcomes

  • Clear communication channels for user appeals or complaints
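
As a purely illustrative example, the sketch below shows one way a behavioural-analytics flag could be routed through a named human reviewer, with the model's reasons and the reviewer's rationale recorded so that outcomes can be explained and appealed. The class and function names are assumptions for illustration only; the Act does not mandate any particular implementation.

```python
# Hypothetical sketch: routing an AI-generated risk flag through human review.
# RiskFlag, ReviewDecision and review_flag are illustrative names, not a real
# operator API; the EU AI Act does not prescribe this implementation.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class RiskFlag:
    player_id: str
    model_version: str
    score: float                 # model output, e.g. an estimated risk score
    reasons: list[str]           # human-readable factors behind the score (explainability)
    raised_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


@dataclass
class ReviewDecision:
    flag: RiskFlag
    reviewer_id: str             # a named human remains accountable for the outcome
    action: str                  # e.g. "no_action", "affordability_check", "restrict_account"
    rationale: str               # recorded so the player can be told why, and can appeal


def review_flag(flag: RiskFlag, reviewer_id: str, action: str, rationale: str) -> ReviewDecision:
    """The model only proposes; no restriction is applied without a recorded human decision."""
    decision = ReviewDecision(flag=flag, reviewer_id=reviewer_id, action=action, rationale=rationale)
    # An audit entry like this supports post-market monitoring and user appeals.
    print(f"[audit] {flag.player_id}: {action} by {reviewer_id}; model reasons: {flag.reasons}")
    return decision


if __name__ == "__main__":
    flag = RiskFlag(
        player_id="player-123",
        model_version="rg-model-2.4",
        score=0.87,
        reasons=["night-time session length up sharply", "deposit frequency doubled in 14 days"],
    )
    review_flag(flag, reviewer_id="rg-analyst-7", action="affordability_check",
                rationale="Sustained escalation across two behavioural markers")
```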


Player Profiling and Personalisation


AI-driven personalisation, while enhancing the user experience, can also be seen as a form of psychological targeting. Under the Act, systems that use sensitive behavioural data to influence player choices may be subject to enhanced scrutiny.

 

Companies will need to evaluate whether their recommendation systems meet transparency obligations and avoid exploitative practices.


AI Chatbots and Customer Interfaces


Where AI systems engage directly with users, for example, chatbots providing betting advice or account support, operators must disclose that users are interacting with a machine. This may require changes to user interfaces, updates to terms and conditions, and clearer notices throughout the customer journey.
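
As a small illustration only, one way to make sure users are told they are talking to a machine is to attach a disclosure notice to chatbot replies. The wording and helper below are assumptions, not a format prescribed by the Act or by any vendor.

```python
# Hypothetical sketch: prepend an AI disclosure to the first chatbot reply in a session.
# The notice text and function are illustrative assumptions, not prescribed wording.
AI_DISCLOSURE = "You are chatting with an automated assistant, not a human agent."


def with_disclosure(reply: str, first_message_in_session: bool) -> str:
    """Attach the disclosure at the start of a session; later replies pass through unchanged."""
    return f"{AI_DISCLOSURE}\n\n{reply}" if first_message_in_session else reply


print(with_disclosure("Your withdrawal was processed on 12 May.", first_message_in_session=True))
```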

 

Third-Party Tools and Supplier Risk


Many gambling operators rely on third-party AI solutions for data analysis, risk modelling, or customer interaction. Under the EU AI Act, compliance responsibility is shared between providers and deployers. This means gambling companies must:

 

  • Vet third-party AI providers for compliance readiness

  • Ensure contractual terms reflect regulatory obligations

  • Maintain audit trails and system documentation

 

Where to Start

 

Navigating this new regulatory terrain can feel overwhelming, but early action can turn compliance into a strategic advantage. Here are steps operators should be taking now:


Map Your AI Systems

Create a full inventory of where and how AI is used in your organisation. Categorise these systems according to the EU AI Act’s risk tiers; a minimal sketch of such a register appears after these steps.

Conduct Risk Assessments

For any system that may fall into the high-risk category, undertake a comprehensive risk assessment that includes impact on users, data governance, and model explainability.

Reinforce Governance Structures

Create cross-functional teams that bring together legal, compliance, technical, and product leadership. Ensure there is clear accountability for AI oversight.

Review Data Practices

The EU AI Act reinforces the importance of high-quality, unbiased training data. Review how data is collected, cleaned, labelled, and stored, and whether user consent is adequately managed.

Establish Human Oversight Mechanisms

Even for automated systems, there must be clear human involvement at key decision points, especially where outcomes affect users’ rights or access.
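
To make the “Map Your AI Systems” step above more concrete, here is a minimal, hypothetical sketch of an AI system register with provisional risk tiers. The systems listed and the tiers assigned are illustrative assumptions only; actual classification requires case-by-case legal analysis of each system’s purpose and effect.

```python
# Hypothetical sketch of an AI system register mapped to the Act's four risk tiers.
# System names, suppliers and tier assignments are illustrative assumptions.
from enum import Enum


class RiskTier(Enum):
    UNACCEPTABLE = "unacceptable"
    HIGH = "high"
    LIMITED = "limited"
    MINIMAL = "minimal"


# One entry per system: what it does, who supplies it, and its provisional tier.
ai_system_register = [
    {"name": "rg-behaviour-monitor", "purpose": "Flag potentially at-risk players",
     "supplier": "in-house", "tier": RiskTier.HIGH},
    {"name": "game-recommender", "purpose": "Personalise lobby and marketing content",
     "supplier": "third-party", "tier": RiskTier.LIMITED},
    {"name": "support-chatbot", "purpose": "First-line customer support",
     "supplier": "third-party", "tier": RiskTier.LIMITED},
    {"name": "spam-filter", "purpose": "Filter inbound support email",
     "supplier": "in-house", "tier": RiskTier.MINIMAL},
]

# High-risk entries drive the rest of the programme: risk assessments, human-oversight
# design, documentation, and supplier due diligence.
high_risk = [s["name"] for s in ai_system_register if s["tier"] is RiskTier.HIGH]
print("Systems needing full high-risk controls:", high_risk)
```

In practice a register like this would also record the data sources, legal basis, and named owner for each system, feeding directly into the risk assessments and governance structures described in the steps above.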

 

A Call for Sector Leadership

 

The online gambling industry has always operated at the intersection of technology and regulation. The EU AI Act is not a disruption; it is a call to evolve. Operators that treat it as a box-ticking exercise may find themselves struggling to keep pace, while those that embed these principles into their product development and user care strategies will be better positioned to win trust and foster long-term growth.

 

AI holds enormous potential to make gambling safer, smarter, and more sustainable. But this potential can only be realised if innovation is balanced with responsibility.

 

One year into the EU AI Act, the message is clear: the future of gambling is not just about what AI can do, but what it should do and how transparently and ethically it does it.
