BNP Paribas has launched an internal "LLM as a Service" platform to accelerate the industrialization of generative AI use cases across the Group, moving them from pilots into production at scale. Designed and operated by the bank’s IT teams, the platform offers secure, centralized access to large language models (LLMs) for all BNP Paribas business entities. It forms a core part of the bank’s broader tech strategy to enhance customer personalization and operational performance through AI.
Hosted entirely within the bank’s own data centers, the infrastructure features GPU-enabled systems and supports open-source models, as well as those from Mistral AI, a partner of BNP Paribas. Future iterations will also include models trained on internal data to meet entity-specific needs.
A shared framework for rapid innovation
The platform allows different business units to access and integrate LLMs into internal tools and workflows via a standardized interface. This shared approach enables scalability, reduces duplication of effort, and ensures data security and compliance with regulatory standards.
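The article does not publish the platform's actual API, but a minimal sketch can illustrate what consuming such a standardized interface could look like from a business unit's application. Everything here is an assumption for illustration: the gateway URL, the model name, the bearer-token environment variable, and the OpenAI-style request and response schema.

```python
# Hypothetical sketch: calling a shared internal "LLM as a Service" gateway
# through one standardized interface. The endpoint, model name, token variable
# and payload schema are illustrative assumptions, not the platform's real API.
import os
import requests

LLM_GATEWAY = "https://llm-gateway.internal.example/v1/chat/completions"  # assumed URL

def ask_llm(prompt: str, model: str = "mistral-large-internal") -> str:
    """Send a prompt to the shared gateway and return the model's reply."""
    response = requests.post(
        LLM_GATEWAY,
        headers={
            # Credentials would be issued per entity/application by the platform team.
            "Authorization": f"Bearer {os.environ['LLM_GATEWAY_TOKEN']}",
        },
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "temperature": 0.2,
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_llm("Summarise this client email in two sentences: ..."))
```

The appeal of a single interface of this kind is that swapping the underlying model, whether open source or from Mistral AI, becomes a configuration change on the platform side rather than a code change in every consuming application.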
Several generative AI projects are already underway or in production within the Group, including virtual assistants, document generation tools, and advanced document search capabilities. The platform streamlines the deployment of such tools, offering a unified and secure environment that helps teams focus on business outcomes rather than technical implementation.
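The article does not say how the document search tools are built. One common way to deliver such a capability on top of a central LLM platform is embedding-based retrieval; the sketch below shows that pattern under the same hypothetical gateway assumptions as above (the embeddings endpoint and model name are illustrative, not BNP Paribas's implementation).

```python
# Hypothetical sketch of embedding-based document search served by the shared
# gateway. The /v1/embeddings endpoint and model name are assumptions.
import os
import numpy as np
import requests

EMBEDDINGS_URL = "https://llm-gateway.internal.example/v1/embeddings"  # assumed URL

def embed(texts: list[str]) -> np.ndarray:
    """Embed a batch of texts via the (hypothetical) central embeddings endpoint."""
    response = requests.post(
        EMBEDDINGS_URL,
        headers={"Authorization": f"Bearer {os.environ['LLM_GATEWAY_TOKEN']}"},
        json={"model": "internal-embedding-model", "input": texts},
        timeout=30,
    )
    response.raise_for_status()
    return np.array([item["embedding"] for item in response.json()["data"]])

def search(query: str, documents: list[str], top_k: int = 3) -> list[str]:
    """Rank documents by cosine similarity to the query and return the best matches."""
    doc_vecs = embed(documents)
    query_vec = embed([query])[0]
    scores = doc_vecs @ query_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)
    )
    best = np.argsort(scores)[::-1][:top_k]
    return [documents[i] for i in best]
```

In this pattern the retrieved passages would typically be passed back to the chat endpoint as context, which is how document search and virtual assistants can share the same centralized infrastructure.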
Balancing flexibility and control
BNP Paribas has begun rolling out the service across selected units, including Hello bank! and its general inspection department. A phased expansion will follow, with adjustments made based on operational feedback and local business requirements. The platform also enhances developer productivity by integrating code generation, DevOps tools, and AI technologies within a centralized ecosystem.
The initiative positions the bank to expand its AI capabilities while maintaining strong oversight, cost control, and data protection.