
Singapore has updated its national large language model (“<span class="news-text_medium">LLM</span>”) initiative, shifting from Meta’s model family to a new architecture based on <span class="news-text_medium">Alibaba Cloud’s Qwen</span>. The latest model, <span class="news-text_medium">Qwen-Sea-Lion-v4</span>, reflects a strategic collaboration between AI Singapore (“<span class="news-text_medium">AISG</span>”) and Alibaba Cloud, signalling Singapore’s continued investment in regionally tuned, high-performance AI systems.
Singapore launched its national multimodal AI model programme in December 2023, supported by SGD 70 million (USD 51 million) in government funding. The initiative aims to create locally relevant, open-source foundation models optimised for Southeast Asian languages, cultural contexts and regulatory needs.
The earlier versions of Singapore’s “Sea-Lion” model were built on Meta’s LLM architecture. However, AISG has now confirmed a transition to Alibaba Cloud’s Qwen family of models for future development.
AISG’s latest release, Qwen-Sea-Lion-v4, is built on the Qwen3-32B foundation model, which supports 119 languages and dialects and was trained on 36 trillion tokens. As part of the collaboration, the model is now accessible through the AI Singapore website and Hugging Face, supporting both developers and research institutions.
According to recent assessments, Qwen-Sea-Lion-v4 currently ranks first among open-source models under 200 billion parameters on the Southeast Asian Holistic Evaluation of Language Models (SEA-HELM) benchmark. This result reinforces Singapore’s ambition to be a leading hub for regionally optimised AI models in Asia. The transition to Qwen aligns with Singapore’s broader strategy of diversifying foundational model architectures, strengthening regional capabilities and ensuring that national AI systems remain competitive with global state-of-the-art models.
Singapore has replaced Meta’s architecture with Alibaba Cloud’s Qwen models for its national AI programme. The new Qwen-Sea-Lion-v4 model, built on Qwen3-32B and further trained with Southeast Asian language data, now leads regional benchmarks for open-source models under 200 billion parameters. Developed jointly by AI Singapore and Alibaba Cloud, the model is available on the AISG website and Hugging Face. The shift reflects Singapore’s ongoing investment in regionally optimised AI systems under its SGD 70 million national AI initiative.