Zalo, a Vietnamese tech company, surprised experts by developing a Large Language Model (LLM) with 7 billion parameters in just six months.
By late 2023, Zalo’s LLM had outperformed OpenAI’s ChatGPT, reaching 150% of its score on the Vietnamese Multitask Language Understanding (VMLU) benchmark suite.
Zalo’s engineers faced several challenges during LLM training, including a lack of computing infrastructure, scarce training data, and limited resources.
Vietnamese engineers struggled with inadequate computing resources compared with global competitors such as OpenAI and Meta, which had access to thousands of Nvidia GPUs.
Training an LLM in Vietnamese was especially difficult because Vietnamese is a “low-resource” language, with far less digitized data available than languages like English and Chinese.
Zalo invested heavily in building a powerful computing infrastructure, acquiring an Nvidia DGX H100 system equipped with eight H100 GPUs.
Through strategic data collection and training methods, Zalo overcame the data scarcity problem, a key factor in the LLM’s success.
Zalo aims to leverage its LLM technology to develop smart chatbots for customer support and productivity tools.
The integration of the Kiki Giao Thong app into the Zalo platform, featuring accurate Q&As on Vietnamese traffic laws, has garnered praise from users.
Zalo’s development team views challenges as opportunities for growth and innovation in AI solutions tailored to Vietnam’s needs.
Despite these notable achievements, ongoing efforts are required, as AI development in Vietnam still faces significant challenges.
Ranked 59th globally in AI readiness, Vietnam continues to enhance its AI capabilities and has climbed in the ASEAN regional rankings.
Zalo, a trailblazer in AI since 2017, operates four AI labs with 80 researchers and advanced computing capabilities, including a DGX H100 server system with eight H100 GPUs.
Zalo’s AI products, such as the Kiki voice assistant, FaceID, and AI Avatar, have made their mark in the tech industry.