Applying Generative AI to Sales
As consumer Generative AI applications surge in popularity and ignite public debates, enterprise organizations are steadily identifying innovative ways to apply Large Language Models (LLMs) across various business functions. In a recent Wall Street Journal interview, Cohere’s President and Chief Operating Officer, Martin Kon, shed light on the dichotomy between consumer and enterprise sectors, predicting the emergence of two distinct markets.
The first encompasses a mass market for general productivity applications, while the second pertains to “enterprise-specific, strategic, differentiated, proprietary capabilities that will need to be developed inside, for and with the enterprise.” Building on this concept, the objective of this series is to dive into the various ways organizations are applying Generative AI to different business functions and industries.
The first domain we will focus on is how sales organizations can leverage LLMs to distinguish their go-to-market (GTM) approach. During a fireside chat with Greylock Partners, Stripe’s Chief Revenue Officer Mike Clayville (AWS Worldwide Field VP at the time) discussed the “Darwinism of sales,” emphasizing the need for sales organizations to constantly evolve to secure and maintain a competitive edge.
From an evolutionary perspective, recent advancements in LLMs provide a generational opportunity for revenue and technology leaders to differentiate their core systems and selling motions. However, evolution is a double-edged sword, and while these advancements present exceptional opportunities, failure to act will compound over time as competitors operationalize these capabilities within their sales processes. This article will explore three practical areas where revenue leaders can begin to apply Generative AI within the GTM organization.
Fish where the fish are.
In the aforementioned Greylock discussion, Mike Clayville compares a sales force to a manufacturing plant, each possessing a finite capacity and yield that needs to be meticulously managed in an ever-evolving marketplace. Building on this analogy, Clayville asserts that the greatest waste of selling capacity results from targeting the wrong people, emphasizing the necessity to “fish where the fish are” and execute a highly targeted selling motion. This idea provides a perfect use case for the application of LLMs because so much of the data from customer conversations exists as unstructured text. This text is often buried within CRM systems, notes, meeting transcripts, and emails and is rarely analyzed. However, with recent advances in LLMs, organizations can train foundation models to surface these data sets and pattern-match across customer conversations to better understand who they should be targeting.
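To make the targeting idea concrete, here is a minimal sketch of ranking prospects by how closely their conversation notes resemble language from historically won deals. A simple bag-of-words cosine similarity stands in for the LLM embeddings a real system would use, and all account names and notes are hypothetical:

```python
from collections import Counter
from math import sqrt


def vectorize(text: str) -> Counter:
    # Crude bag-of-words vector; a production system would use
    # LLM embeddings rather than raw token counts.
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def score_prospects(won_deal_notes: list[str], prospect_notes: dict[str, str]) -> list[tuple[str, float]]:
    # Build a single profile from past wins, then rank each prospect
    # by how closely its conversation notes resemble that profile.
    won_profile = vectorize(" ".join(won_deal_notes))
    scores = {name: cosine(won_profile, vectorize(notes)) for name, notes in prospect_notes.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

A prospect whose notes mention the same drivers that appeared in won deals (say, a new app needing stronger authentication) would rank above one whose notes share no such signals, which is the "fish where the fish are" shortlist a sales team could act on.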
Building on this notion, LLMs can also be used to analyze why customers buy in order to recognize potential sales triggers and recommend the right moments to engage. For instance, consider an account executive at a cybersecurity company. They have a discussion with a security director at a potential retail customer and discover that, while their platform isn’t an immediate priority, it may be relevant to discuss in Q2 of next year as the retailer is launching a new consumer app that will require stronger authentication measures (a potential trigger). This critical piece of information is what Clayville calls a “sales intersection” and needs to be acted upon with precision and intention. However, the right follow-up at the right time doesn’t always happen. Maybe the rep moved to a new role or the territories changed at the turn of the fiscal year, and that insight is lost, buried in the depths of a CRM field.
Losing track of that insight carries a material opportunity cost: competitors are having the same discussions and may have followed up with a targeted message two quarters before the retailer’s application was due to go live. This cost is amplified at a macro level, especially within large, global sales forces engaging in thousands of these discussions daily. LLMs can significantly reduce these misses by powering an execution engine that classifies and captures triggers, enabling sellers to engage at the right time with relevant messaging. With the proper technical foundations in place, a process can be developed where:
- Models are trained on historical customer data, condensing conversations and pinpointing buying triggers. Simultaneously, they monitor systems like emails, call transcripts, and notes, ensuring triggers and valuable insights aren’t missed.
- When a sales intersection is identified, these models generate a comprehensive snapshot of the event, including use case, ideal engagement time, and resource recommendations. This insight is smoothly integrated into the workflow, prompting the creation of a future task within the CRM system at the account level.
- When the time is right, the task alert prompts the sales rep to engage with the prospect or customer, armed with the model-generated insight, ensuring a relevant and timely conversation.
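The three steps above can be sketched end to end. In this illustration a keyword lookup stands in for the LLM classification pass, and the trigger taxonomy, account names, and CRM task shape are all hypothetical:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical trigger taxonomy; in a real deployment an LLM would
# classify transcripts against it instead of this keyword match.
TRIGGERS = {
    "new consumer app": "Authentication use case",
    "renewal": "Contract renewal",
}


@dataclass
class SalesIntersection:
    account: str
    use_case: str
    engage_on: date
    evidence: str


def detect_intersections(account: str, transcript: str, engage_on: date) -> list[SalesIntersection]:
    # Step 1: scan a conversation for buying triggers and produce a
    # snapshot of each "sales intersection" that was found.
    found = []
    for phrase, use_case in TRIGGERS.items():
        if phrase in transcript.lower():
            found.append(SalesIntersection(account, use_case, engage_on, phrase))
    return found


def schedule_tasks(intersections: list[SalesIntersection], crm_tasks: list[dict]) -> None:
    # Steps 2-3: persist each intersection as a future, account-level
    # CRM task so the alert fires when the engagement window opens.
    for ix in intersections:
        crm_tasks.append({
            "account": ix.account,
            "due": ix.engage_on.isoformat(),
            "note": f"Engage: {ix.use_case} (evidence: '{ix.evidence}')",
        })
```

Running the retailer example through this pipeline would yield a single task on the account, due at the Q2 window, carrying the model-generated context the rep needs to open a relevant conversation.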
By analyzing, summarizing, and classifying customer conversations at scale, these models can offer valuable insights not only to the sales force but also to product and engineering teams. For instance, they can be trained to shed light on customer feedback, popular feature requests, and unique capabilities that differentiate the company’s offerings. This creates a scenario where the application of LLMs delivers cross-departmental value that contributes to a unified goal: increased product adoption, which in turn drives revenue.
Reduce Operational Burdens.
In a recent edition of The Prompt, Philip Moyer, Google Cloud’s Global AI VP, explores how leaders can utilize Generative AI to reduce operational burdens. Moyer observes, “most businesses have repetitive drudgery, with highly-paid information workers doing the same task over and over again with unstructured content.” Within the context of the GTM organization, these repetitive tasks span all roles—from direct sellers and managers to revenue operations leaders—and can impede the sales organization’s capacity to focus on higher-value activities. Moyer also emphasizes that these tasks are typically not enjoyed and tend to be error-prone, further diminishing the ROI of the time it takes to complete them. For instance, take the task of sellers maintaining weekly updates and notes on customer conversations in a standardized manner. Although this activity may be tedious and not have an immediate impact on the customer, neglecting it will negatively impact execution over time. Lack of documentation will lead to inaccurate forecasts, missed opportunities to brief an executive before a critical meeting, or overlooked “sales intersections,” as explored earlier. However, by integrating LLMs directly into the CRM system, these scenarios can be avoided, enabling a process that:
- Captures unstructured information through call-transcript and email-to-CRM integrations.
- Integrates that data into LLMs that are trained to summarize key points and next steps.
- Populates the generated information into the appropriate CRM field on a weekly basis.
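A minimal sketch of that weekly loop might look like the following. A keyword filter stands in for the LLM summarization call, and the account name, note text, and CRM field layout are all hypothetical:

```python
from datetime import date


def summarize(notes: list[str]) -> str:
    # Stand-in for an LLM summarization call: keep only lines that read
    # like key points or next steps. A real pipeline would prompt a model
    # to condense the week's unstructured notes instead.
    keep = [n for n in notes if "next step" in n.lower() or "decision" in n.lower()]
    return " ".join(keep) or "No key updates captured this week."


def weekly_crm_update(crm: dict, account: str, raw_notes: list[str], week_of: date) -> None:
    # Write the generated summary into a per-account, per-week CRM field,
    # mirroring the "populate on a weekly basis" step above.
    crm.setdefault(account, {})[week_of.isoformat()] = summarize(raw_notes)
```

Because the summary lands in a standardized field keyed by account and week, downstream consumers such as forecasts and executive briefings can read it without any manual note-keeping by the rep.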
By integrating LLMs into core systems, organizations can keep their business-process layer intact, avoiding long, expensive application migrations. Better yet, major CRM providers like Microsoft are building Generative AI capabilities into the application layer, offering many of these foundation models as new features.
Ready to discover more?
Contact us and we’ll set up a video call to discuss your requirements in detail.