AI-Powered Platform Redefines Intelligent Assistance
The surge of generative AI is unfolding in increasingly practical terms, revealing a landscape rich with possibilities. While truly revolutionary applications may still be on the horizon, an unmistakable wave of optimism about the future of artificial intelligence can be observed among professionals across sectors. This is particularly evident in business-to-business (B2B) environments, where multi-modal AI systems are acting as powerful catalysts, propelling industries such as finance, manufacturing, healthcare, and telecommunications toward smarter, data-driven operations. AI agents have emerged as intelligent allies, steadily embedding themselves in core processes and expanding the reach of AI solutions within enterprises.
After a season of hype followed by an inevitable correction, generative AI now seems poised to deliver on its promises. This outlook is supported by data from established research organizations. Gartner, for instance, predicts that by 2025 more than half of large enterprises will fully embrace AI assistants in their operations, while MarketsandMarkets anticipates that the global market for AI assistants will exceed $28.5 billion by 2028, an annual growth rate of 43%. Such projections reinforce the notion that the era in which AI is expected to deliver tangible results is imminent.
However, this promising outlook does not imply a smooth road ahead. Their role as pioneers of practical AI application inevitably exposes smart assistants to unprecedented challenges. For enterprise clients, safeguarding data while ensuring efficient operations ranks foremost among their priorities. Minimizing development and maintenance costs is also crucial, yet off-the-shelf AI assistants often fall short of the stringent demands of the B2B sector. The need for tailored solutions that address specific pain points is therefore increasingly pressing.
Recognizing this necessity, the industry has made the exploration of new pathways for intelligent assistants, built on modern AI application architectures, a central concern. This context has fostered the emergence of customized, integrated AI systems designed for enterprise clients. A notable recent development is the collaboration between openGauss and DingTalk, a leading enterprise communication platform in China, to create a new vector-based AI integrated machine. The solution not only addresses data security and storage-efficiency challenges but also significantly lowers the barrier for businesses to innovate with AI, opening new pathways for intelligent assistants to thrive in complex environments.
For businesses, the journey toward fully harnessing the potential of AI is not merely a fleeting desire. Instead, it stems from a deep-seated need arising from their past experiences with automation and digital transformation. These enterprises are determined to integrate their substantial history with a forward-looking approach that embraces intelligence. As a case in point, DingTalk exemplifies how enterprise applications that grow alongside their clients can successfully navigate this new wave of AI enthusiasm.
Since its inception in 2015, DingTalk has adopted a pioneering spirit, playing a significant role in the digital and intelligent evolution of enterprises in China. Initially launched as a single-purpose collaboration tool, it has since evolved into a "super application" serving over 700 million users and 25 million organizations worldwide. Its mantra, "Let progress happen," encapsulates its commitment to empowering businesses through effective digital solutions.
In response to the opportunities and challenges posed by generative AI, DingTalk has concentrated on three core scenarios: collaboration, communication, and knowledge management. By positioning itself as both a proving ground for large-model applications and a testing ground for innovation, DingTalk aims to weave its AI tools and solutions seamlessly into everyday business activities.
As a leader in the enterprise market, DingTalk is acutely aware of its customers' evolving needs for intelligent solutions; activating their digital assets and constructing bespoke AI solutions is therefore seen as the way forward. Through continued exploration, DingTalk seeks to harness different types of digital assets, whether knowledge, data, software, or models, transforming how businesses run their operations. Examples include ChatBL for natural-language inquiries and ChatMemo for internal knowledge documentation, both aimed at raising efficiency and quality across business functions.
This is a challenging arena, but the new wave of intelligent assistants has opened significant opportunities for building enterprise-specific AI solutions. As individual team members or entire teams come to rely on a cadre of AI assistants for tasks across research and development, sales, marketing, and human resources, the landscape of office work may change radically. This evolution, however, places new demands on foundational software and architecture, demands that AI integrated machines must now address.
Promisingly, the rise of AI integrated machines signals a bright future. At a stage when generative AI is consuming enormous computational resources, AI integrated machines have become remarkably popular, with configurations covering a broad range of scenarios: public-cloud AI-generated content (AIGC) services, private deployments for large enterprises and government institutions, and affordable AI modules for edge computing and long-tail scenarios, facilitating diverse applications of large models across many industries.
As the deployment and implementation of AI accelerate, robust foundational software becomes imperative. Challenges remain, particularly in how to strengthen the databases that underpin these AI applications. Existing AI integrated machines typically have to ship both a vector database and a relational database, which complicates their ability to handle diverse workloads.
Collaboration stands out as a way to overcome these obstacles. The AI integrated machine released by DingTalk and openGauss serves as a benchmark: it merges relational and vector databases into a single engine, a combination that promises significant progress for enterprise-specific, privately deployed intelligent assistants.
In concrete terms, the new machine supports up to 32 DDR4 memory channels and 8 AI chips, and is fully compatible with Kunpeng processors, Ascend accelerators, the openGauss database, and a range of popular domestic and international models. It generates one hundred Chinese characters in under 30 milliseconds. Customers can also customize large models for their particular needs while retaining an efficient operational experience.
One major priority for enterprise-level clients is data security and efficient data storage, which DingTalk and openGauss address through a cutting-edge AI application architecture. By supporting local deployment inside a company's internal network, the solution keeps sensitive data on premises and provides strong protection for digital assets. The collaboration also delivers a comprehensive private digital-asset storage solution that uses the DataVec engine to integrate relational and vector databases deeply, achieving a 20% lead over comparable products in retrieval performance thanks to efficient search algorithms and strong computational capabilities.
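To make the idea concrete, the sketch below shows how document metadata and embeddings might live side by side in a single table, so one engine serves both relational and vector workloads. The table name, columns, psycopg2 driver, and the pgvector-style vector(768) type are illustrative assumptions for this example, not DataVec's documented interface.

```python
# Hypothetical sketch: relational metadata and embeddings stored in one
# table so a single engine handles both workloads. Table/column names,
# the psycopg2 driver, and the pgvector-style "vector" type are assumed
# for illustration and may differ from DataVec's actual SQL surface.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS doc_chunks (
    chunk_id   BIGSERIAL PRIMARY KEY,
    department TEXT      NOT NULL,   -- scalar label used later for filtering
    created_at TIMESTAMP NOT NULL,
    content    TEXT      NOT NULL,
    embedding  vector(768)           -- assumed vector column type
);
"""

def ingest_chunk(conn, department, created_at, content, embedding):
    """Store one text chunk together with its embedding in the same row."""
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO doc_chunks (department, created_at, content, embedding) "
            "VALUES (%s, %s, %s, %s)",
            (department, created_at, content, str(embedding)),
        )
    conn.commit()
```

Because each embedding sits in the same row as its metadata, a local deployment has only one database to secure, back up, and audit.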
Beyond raw performance, enterprises also want technologies that lower the barrier to AI development and application. Across the many complex business scenarios, pure vector retrieval typically demands bespoke development and manual tagging, driving up development and maintenance costs. openGauss's DataVec engine addresses this with hybrid querying: mixed scalar and vector searches that dramatically reduce the need for custom development. Multi-label filtering expressed in standard SQL syntax further lowers onboarding costs and improves query accuracy by more than 30%, establishing a solid foundation for large-scale AI innovation.
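As a rough illustration of what such a hybrid query could look like, the snippet below combines scalar predicates (a multi-label department filter and a recency condition) with a vector-similarity ranking in a single SQL statement. The `<->` distance operator follows pgvector conventions and, like the schema sketched above, is an assumption about the SQL surface rather than a confirmed part of DataVec's dialect.

```python
# Hypothetical hybrid query: scalar filters and vector ranking in one
# standard-SQL statement, so no separate vector store or manual tagging
# pipeline is required. The "<->" distance operator is a pgvector-style
# assumption, not a documented DataVec operator.
HYBRID_QUERY = """
SELECT chunk_id, content
FROM doc_chunks
WHERE department = ANY(%s)                      -- multi-label scalar filter
  AND created_at >= now() - interval '90 days'  -- recency filter
ORDER BY embedding <-> %s::vector               -- vector similarity ranking
LIMIT 10;
"""

def hybrid_search(conn, departments, query_embedding):
    """Return the ten most similar recent chunks within the given departments."""
    with conn.cursor() as cur:
        cur.execute(HYBRID_QUERY, (departments, str(query_embedding)))
        return cur.fetchall()

# Example call (assumes an open psycopg2 connection and a 768-dim query vector):
# rows = hybrid_search(conn, ["sales", "marketing"], query_vec)
```

Keeping the filter and the similarity ranking in one statement is what lets the engine prune by labels before or during the vector search, which is where the claimed accuracy and cost benefits would come from.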
As AI integrated machines continue to evolve, the joint initiative between DingTalk and openGauss shows how coordinated software and hardware can help elevate enterprise applications into AI super applications.
Over the next two years, the partnership between DingTalk and openGauss is set to deepen, with plans to introduce heterogeneous accelerators that could raise vector-indexing performance more than threefold. A KVCache-based "search instead of computation" approach aims to improve long-sequence prompt storage in the vector database by a factor of one hundred, and ultra-large shared memory across nodes is expected to enable sub-millisecond queries over billions of entries.
Undoubtedly, these technological advances will substantially improve the efficiency and performance of AI applications while propelling industries toward new levels of intelligence. Viewed more broadly, the confluence of open-source principles and homegrown, self-sufficient innovation will unleash immense potential. Since its inception more than four years ago, openGauss has adhered to the tenets of "co-construction, sharing, and collaborative governance," progressively uniting innovative forces within its community, which now counts over 850 corporate members and more than 7,600 contributors. This fertile open-source ecosystem is well positioned to catalyze a flourishing of AI applications.
Encouragingly, building a thriving ecosystem through open hardware and invigorating innovation through open-source software is becoming a consensus within the industry. If the DingTalk and openGauss AI integrated machine is merely the overture to the rise of intelligent agents, then the main movement, the explosion of AI applications, is yet to unfold.