Liquid AI announced a multi-faceted partnership with Shopify to license and deploy Liquid AI's Liquid Foundation Models (LFMs) across quality-sensitive workflows on Shopify's platform, including search and other multimodal use cases where quality and latency matter. The first production deployment is a sub-20ms text model that enhances search. The agreement follows Shopify's participation in Liquid AI's $250 million Series A round in December 2024, and formalizes deep co-development already underway between the companies.
As part of the partnership, Shopify and Liquid AI have co-developed a generative recommender system based on a novel HSTU architecture. In controlled testing, the model outperformed Shopify's prior recommendation stack, delivering higher conversion rates from recommendations. Liquid's LFMs are designed for sub-20ms, multimodal, quality-preserving inference.
On specific production-like tasks, LFMs with roughly 50% fewer parameters have outperformed popular open-source models such as Qwen3, Gemma3, and Llama 3, while delivering 2-10x faster inference, enabling real-time shopping experiences at platform scale. The partnership includes a multi-purpose license for LFMs across low-latency, quality-sensitive Shopify workloads, ongoing R&D collaboration, and a shared roadmap. While today's deployment is a sub-20ms text model for search, the companies are evaluating multimodal models for additional products and use cases, including customer profiles, agents, and product classification.
Financial terms were not disclosed.