Unlocking Knowledge Management at Work with Generative AI Beyond ChatGPT

After ChatGPT and generative AI took the world by storm nearly a year ago, our inboxes flooded with questions, curiosity, and concerns about its impact on business. Almost overnight, inquiries about Turnberry’s AI solutions surged. For over two decades, our Data & Insights and Digital Modernization practices have been delivering data-driven solutions to clients, but witnessing the wave of questions surrounding AI-driven products and their future at work was eye-opening.

Alongside the excitement surrounding AI’s potential, our company was navigating growing pains from recent acquisitions. While we gained unprecedented access to internal knowledge, expertise, and solution accelerators, we lacked a seamless way to search and leverage the vast amounts of information stored across various document repositories. This led to our journey of discovering how generative AI could help solve our internal knowledge management challenges.

Our efforts culminated in the development of a retrieval-augmented generation (RAG)-based web tool that is now transforming how we access and use information at Turnberry. The tool has become a pivotal step in modernizing our knowledge management and in shaping how we will work with AI going forward. While we considered using an established platform like Lucidworks’ Fusion, we opted to build our own RAG solution from scratch to gain firsthand experience.

Where We Started

This RAG platform was born in Turnberry Labs, our internal innovation incubator. We saw the need for a more efficient knowledge management system and set out to create one. Our dual goals were to revolutionize knowledge management and to explore the intricacies of building a real-world RAG application from the ground up. We experimented with technologies such as LangChain, Azure Cognitive Search, and OpenAI’s large language models, integrating them into our solution while adhering to best practices in SDLC, CI/CD, and cloud-native deployment.
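
For illustration, here is a minimal sketch of how such a stack might be wired together with LangChain, Azure OpenAI, and Azure Cognitive Search. The endpoint, key, deployment, and index names are placeholders rather than our actual configuration, credentials and API versions are assumed to come from environment variables, and exact class names can vary across LangChain versions.

```python
# Sketch only: Azure OpenAI for embeddings and chat, Azure Cognitive Search as
# the vector store, glued together with LangChain. All names are placeholders.
from langchain_openai import AzureChatOpenAI, AzureOpenAIEmbeddings
from langchain_community.vectorstores.azuresearch import AzureSearch

embeddings = AzureOpenAIEmbeddings(azure_deployment="text-embedding-ada-002")
llm = AzureChatOpenAI(azure_deployment="gpt-4", temperature=0)

# Vector index over the internal document repository.
vector_store = AzureSearch(
    azure_search_endpoint="https://<search-service>.search.windows.net",
    azure_search_key="<search-admin-key>",
    index_name="internal-knowledge",
    embedding_function=embeddings.embed_query,
)

def answer(question: str) -> str:
    # Retrieve the most relevant internal chunks, then have the model answer
    # using only that context.
    docs = vector_store.similarity_search(question, k=4)
    context = "\n\n".join(d.page_content for d in docs)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return llm.invoke(prompt).content
```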

Much like ChatGPT, Turnberry’s proprietary RAG platform allows users to ask natural-language questions and receive conversational answers. Unlike ChatGPT, however, which draws only on publicly available information, our platform searches our internal content on solutions, services, and expertise. It uses this information to provide detailed, context-aware responses, complete with citations and direct links to the source documents.

Supporting User Interaction

Our platform also includes content curation by user-defined topics, allowing relevant documents to be included in the RAG process without manual intervention. When content changes, the platform automatically updates, ensuring that the RAG solution always has the latest information.
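
As an illustration of that automatic refresh, the sketch below re-indexes only documents modified since the last run. The repository client, chunker, and index objects are hypothetical stand-ins, not the platform’s actual components.

```python
# Illustrative sketch: keep a RAG index in sync with a document repository by
# re-embedding only items changed since the last run. The repo, index, and
# chunker interfaces here are hypothetical stand-ins.
from datetime import datetime, timezone

def refresh_index(repo, index, chunker, last_run: datetime) -> datetime:
    """Re-chunk and upsert documents modified since `last_run`."""
    now = datetime.now(timezone.utc)
    for doc in repo.list_documents(modified_since=last_run):
        # Drop stale chunks for this document, then re-chunk and upsert.
        index.delete(filter={"source_id": doc.id})
        for chunk in chunker.split(doc.text):
            index.upsert(
                text=chunk,
                metadata={"source_id": doc.id, "url": doc.url, "topic": doc.topic},
            )
    return now  # persisted as the high-water mark for the next run
```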

RAG Process: Retrieval, Augmentation, and Generation

At its core, a RAG solution operates in three steps (a short code sketch follows the list):

  1. Retrieval: The model queries a dataset of documents to find relevant information, typically using a dense vector search.
  2. Augmentation: The retrieved documents are presented to the language model as context, allowing the model to reference them in its responses.
  3. Generation: The model generates a response based on the combined input of the original query and the retrieved information.
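
The sketch below walks through those three steps end to end using the OpenAI Python client and a toy in-memory corpus searched by cosine similarity. The model names and sample documents are placeholders; a production system would use a managed vector index over chunked documents.

```python
# A minimal retrieval-augmentation-generation loop over an in-memory corpus.
# Model names and the toy corpus are placeholders.
import numpy as np
from openai import OpenAI

client = OpenAI()
corpus = [
    {"title": "Cloud Migration Playbook", "text": "Placeholder summary of the migration accelerator."},
    {"title": "Data Governance Offering", "text": "Placeholder summary of the governance framework."},
]

def embed(texts):
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

doc_vectors = embed([d["text"] for d in corpus])

def answer(question: str, k: int = 2) -> str:
    # 1. Retrieval: dense vector search for the most similar documents.
    q = embed([question])[0]
    scores = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
    top = np.argsort(scores)[::-1][:k]

    # 2. Augmentation: present the retrieved documents to the model as context.
    context = "\n\n".join(f"[{i+1}] {corpus[i]['title']}: {corpus[i]['text']}" for i in top)

    # 3. Generation: answer from the combined query and context, citing sources.
    messages = [
        {"role": "system", "content": "Answer using only the provided context and cite sources as [n]."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ]
    resp = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    return resp.choices[0].message.content
```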

What We’ve Learned

Turnberry’s RAG platform has helped employees spend less time searching for content and more time using it to create new client solutions. We’ve created a form of “semantic guided discovery,” enhancing the way we access and use our vast repository of internal documentation. With this tool in place, our consultants improve their work with each new engagement, drawing on everything from initial documentation to code repositories.

We’ve realized that many of our clients also struggle with internal knowledge management. Tools like our RAG platform can bridge that gap, enabling quicker access to knowledge and more efficient decision-making.

What’s Next

We plan to continue evolving our RAG platform, refining its content handling and enhancing its AI capabilities. We’ll continue to partner with industry leaders like Lucidworks while also pushing the boundaries of what we can do with generative AI.

As we grow our internal capabilities and extend the tool’s benefits to our clients, we remain committed to harnessing AI for innovative solutions and efficient knowledge management. Our goal is to transform how businesses approach problem-solving and knowledge sharing in the workplace.

AI-driven tools like RAG are paving the way for a new era of augmented decision-making, personalized learning, enhanced productivity, and predictive capabilities. With each new project, Turnberry adds to its knowledge base, benefiting not only our organization but also our clients. We’re excited to continue innovating in this space and delivering new AI solutions to drive success.
