KEY TAKEAWAYS
- Theta Labs partners with Liner to power Liner's AI search engine with Theta EdgeCloud's decentralized GPU infrastructure.
- Liner, a leader in generative AI-powered search, will use EdgeCloud to scale a service that already reaches more than 10 million users, primarily in academia.
- The collaboration supports Liner’s growth, backed by $29 million in Series B funding, and strengthens its presence in the U.S. market.
- Liner combines GPT-4, GPT-3.5 Turbo, and its own fine-tuned Liner 7B model to deliver hyper-personalized search results drawn from trusted academic sources.
Theta Labs has announced Liner as its latest enterprise customer for Theta EdgeCloud. Liner, a global leader in generative AI-powered search solutions, is ranked in the top 10 of Andreessen Horowitz's list of the Top 50 Generative AI web services.
Liner plans to utilize Theta EdgeCloud to scale its innovative AI search engine, which serves over 10 million students and researchers. This collaboration aims to enhance Liner’s capabilities by leveraging EdgeCloud’s decentralized GPU infrastructure.
Expanding Reach in Academia
Since its launch in June, Theta EdgeCloud has gained significant traction in U.S. and Korean academia. Institutions such as the University of Oregon, Korea University, Seoul Women’s University, KAIST, and Yonsei University are utilizing its infrastructure to advance AI research in various fields.
Liner’s primary user base consists of students and researchers, aligning well with EdgeCloud’s existing academic customers. This shared focus enables Theta to provide robust infrastructure for high-quality AI search engines.
Liner’s Growth and Technological Advancements
Liner recently secured $29 million in Series B funding led by INTERVEST and Atinum Investment, with support from Samsung Venture Investment and others. This funding solidifies Liner’s position as a leader in specialized information retrieval.
The startup is rapidly expanding in the United States, its largest market, with over 10 million registered users from prestigious universities, including UC Berkeley, Texas A&M, and the University of Southern California. According to founder and CEO Luke Jinu Kim, two-thirds of Liner's paying users are in U.S. academia.
Liner’s AI search engine operates on a combination of GPT-4, GPT-3.5 Turbo, and its own fine-tuned Liner 7B model. This technology is designed for hyper-personalized information retrieval, delivering precise, context-aware answers from trusted sources like journals and papers.
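To make the multi-model setup concrete, the sketch below shows one generic way a retrieval-augmented search service could route a query between a lightweight in-house model and larger general-purpose models, and assemble an answer from trusted sources. This is a minimal illustration only: the function names, the routing rule, and the retrieval step are assumptions for this article and are not Liner's actual implementation.

```python
# Hypothetical sketch of a multi-model, retrieval-augmented answer flow.
# The routing rule, retrieval stub, and model names used for selection are
# illustrative assumptions, not Liner's real pipeline.

from dataclasses import dataclass


@dataclass
class Source:
    title: str
    snippet: str
    url: str


def retrieve_trusted_sources(query: str, limit: int = 3) -> list[Source]:
    """Stand-in for retrieval over an index of journals and papers."""
    # A real system would query a vector or keyword index of academic
    # documents; here we return placeholder results.
    return [
        Source(f"Paper {i} on {query}", "(placeholder abstract)", f"https://example.org/{i}")
        for i in range(1, limit + 1)
    ]


def pick_model(query: str, sources: list[Source]) -> str:
    """Toy routing rule: shorter, simpler queries go to a lighter model."""
    words = len(query.split())
    if words < 8 and len(sources) <= 3:
        return "liner-7b"       # assumed identifier for the fine-tuned in-house model
    if words < 20:
        return "gpt-3.5-turbo"
    return "gpt-4"


def answer(query: str, user_profile: dict) -> dict:
    """Build a personalized, source-grounded prompt and pick a model for it."""
    sources = retrieve_trusted_sources(query)
    model = pick_model(query, sources)
    prompt = (
        f"User interests: {user_profile.get('interests', [])}\n"
        f"Question: {query}\n"
        "Context:\n"
        + "\n".join(f"- {s.title}: {s.snippet}" for s in sources)
    )
    # The actual model call is omitted; this returns the routing decision,
    # the assembled prompt, and the citations that would accompany the answer.
    return {"model": model, "prompt": prompt, "citations": [s.url for s in sources]}


if __name__ == "__main__":
    result = answer(
        "effects of sleep on memory consolidation",
        {"interests": ["cognitive science"]},
    )
    print(result["model"], result["citations"])
```

The point of the routing step is cost and latency control: routine lookups can stay on a small fine-tuned model, while harder questions fall through to larger models.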
Through its partnership with Theta, Liner aims to accelerate its AI inference tasks by utilizing EdgeCloud’s high-performance hybrid decentralized GPU resources, enabling faster and more efficient search results.
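As a rough illustration of what spreading inference across a pool of GPU workers can look like, the snippet below round-robins requests over several worker endpoints. The endpoint URLs and the selection policy are assumptions made for this sketch; they do not describe Theta EdgeCloud's actual API.

```python
# Hypothetical sketch: distribute inference requests across several GPU
# worker endpoints. Endpoint URLs and the round-robin policy are assumed
# for illustration and are not Theta EdgeCloud's real interface.

import itertools
import json
import urllib.request

# Assumed worker endpoints; a real deployment would discover these dynamically.
GPU_ENDPOINTS = [
    "http://gpu-node-1.example.internal:8000/v1/infer",
    "http://gpu-node-2.example.internal:8000/v1/infer",
    "http://gpu-node-3.example.internal:8000/v1/infer",
]

_next_endpoint = itertools.cycle(GPU_ENDPOINTS)


def run_inference(payload: dict, timeout: float = 10.0) -> dict:
    """Send one inference request to the next worker in round-robin order."""
    endpoint = next(_next_endpoint)
    request = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=timeout) as response:
        return json.loads(response.read())
```

In practice, a decentralized GPU network would add worker discovery, health checks, and load-aware scheduling on top of a simple rotation like this.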
Disclaimer: The views expressed in this article are those of the authors and do not necessarily reflect the official policy of CoinsHolder. Content, including that generated with the help of AI, is for informational purposes only and is not intended as legal, financial, or professional advice. Readers should do their own research before taking any action related to the company and bear full responsibility for their decisions.