AI Suggestions

Rizer uses AI to analyze your lost deals and suggest how to handle them. Instead of manually figuring out why each deal was lost and when to follow up, the AI looks at deal notes, communications, and patterns to give you a starting point.

This article explains how AI suggestions work, where you’ll find them, and how to get the most value from them.

What the AI Does

When you recycle a deal, there’s information to gather: Why was it lost? When should you follow up? Was there a competitor involved? Did a missing feature cause the loss? Were there process issues that contributed?

Answering these questions manually for every deal takes time. You have to read through notes, remember conversations, and make judgment calls. Multiply that by dozens or hundreds of lost deals, and it becomes a real burden.

The AI handles the initial analysis. It reads through what’s available — deal notes, activity history, any communications logged in HubSpot — and suggests answers. You review the suggestions, adjust what needs adjusting, and move on.

Think of it as a first draft. The AI does the research and proposes answers. You provide the final judgment.

Where AI Suggestions Appear

You’ll encounter AI suggestions in several places throughout Rizer.

In the Recycling Form

When you recycle a deal — whether from HubSpot or from Rizer — the form opens with AI suggestions pre-filled.

[Screenshot: Recycling form showing AI-suggested recycle reason and callback date with reasoning displayed]

The AI suggests:

Recycle reason — Why the deal was lost. The AI reads deal notes looking for clues: mentions of budget, timing, competitors, feature requests, or other signals that indicate what happened.

Callback date — When to follow up. The AI considers the recycle reason, any timing mentioned in the notes, and patterns from similar deals in your history.

Competitor information — If the AI detects a competitor mentioned in deal notes, it suggests which competitor won and sometimes which product.

Missing feature — If the AI finds references to features you don’t have, it suggests linking the deal to those tracked features.

Each suggestion appears with a brief explanation of why the AI chose that answer. This helps you evaluate whether the suggestion makes sense.

On the Dashboard

The main dashboard includes an AI insights section showing patterns across your recycled deals:

  • Common loss reasons trending up or down
  • Suggested timing improvements based on what’s working
  • Competitor trends
  • Deals that might need attention

These insights help you see the bigger picture, not just individual deals.

In the Ready for Callback List

When deals reach Ready for callback, AI suggestions appear alongside each deal:

  • Suggested approach for re-engagement
  • What’s changed since the deal was lost
  • Relevant context to consider before reaching out

This helps your team prepare for re-engagement conversations without having to dig through old notes.

During Onboarding

When you first set up Rizer, the AI analyzes your HubSpot deal history to pre-populate:

Competitors — The AI scans deal notes and communications across your lost deals to identify competitors that come up repeatedly. Instead of manually listing every competitor you lose to, you get a suggested list to review and refine.

Missing features — Similarly, the AI identifies feature requests mentioned across your deal history. Features that caused multiple losses appear on a list for you to review, saving setup time.

[Screenshot: Onboarding competitors step showing AI-suggested competitor list with edit options]

How the AI Generates Suggestions

Understanding what goes into AI suggestions helps you evaluate their quality and know when to trust them versus override them.

What the AI Analyzes

The AI looks at several sources of information:

Deal notes — Notes logged on the deal in HubSpot are the primary source. The more detailed your notes, the better the suggestions.

Activity history — Emails, calls, meetings, and tasks associated with the deal. The AI looks for patterns in communication and any content that explains what happened.

Deal properties — Standard HubSpot fields like close date, amount, stage history, and any custom properties you’ve configured.

Your website — During setup, you provide your company website. The AI uses this to understand your products, positioning, and industry context.

Historical patterns — The AI learns from your deal history. If deals lost to “no budget” in your organization typically convert after 6 months, it factors that into callback suggestions.

Similar deals — The AI compares this deal to others with similar characteristics. What recycle reasons worked well for similar deals? What callback timing led to successful re-engagement?
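
If it helps to picture these inputs concretely, here is a small Python sketch of the kind of record the analysis draws on. The class and field names are hypothetical, chosen only to mirror the list above — they are not Rizer's actual data model.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class DealSnapshot:
        """Hypothetical bundle of the data sources listed above."""
        notes: list[str]          # deal notes logged in HubSpot
        activities: list[dict]    # emails, calls, meetings, tasks
        properties: dict          # close date, amount, stage history, custom fields
        website_context: str      # product and industry context from your website
        similar_deals: list[dict] = field(default_factory=list)  # historical patterns

    example = DealSnapshot(
        notes=["Liked the demo but needs Salesforce integration. Budget resets in January."],
        activities=[{"type": "call", "date": date(2024, 3, 1), "summary": "Discovery call"}],
        properties={"amount": 25000, "closedate": "2024-04-15", "dealstage": "closedlost"},
        website_context="B2B SaaS, mid-market sales teams",
    )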

What Makes Suggestions Better

AI suggestion quality depends heavily on the quality of your data:

Detailed deal notes help a lot. A note saying “Lost – will follow up later” gives the AI almost nothing to work with. A note saying “Prospect liked the demo but said they need Salesforce integration before they can move forward. Budget resets in January.” gives the AI clear signals for both recycle reason and timing.

Logged communications help. If emails and call notes are captured in HubSpot, the AI has more context to analyze.

Consistent terminology helps. If your team uses consistent language — always saying “Competitor X” instead of sometimes “X” and sometimes “X Corp” — the AI picks up patterns more reliably.

Historical data helps. The more deals in your history, the better the AI understands patterns specific to your business.

What the AI Can’t Do

The AI has limitations:

It can’t read minds. If the real reason for a loss wasn’t documented anywhere, the AI has to guess based on incomplete information.

It can’t access conversations outside HubSpot. Slack messages, verbal conversations, and anything not logged in HubSpot aren’t visible to the AI.

It can’t predict the future. Callback date suggestions are based on patterns, not certainty. Circumstances vary.

It can’t override your judgment. You know things the AI doesn’t. Maybe you had a conversation yesterday that changes everything. The AI’s suggestions are a starting point, not a final answer.

Working with AI Suggestions

Here’s how to interact with AI suggestions effectively.

Reviewing Suggestions

When the recycling form opens with AI suggestions:

  1. Read the suggestion. What did the AI choose?
  2. Read the reasoning. Why did it choose that? Does the explanation make sense?
  3. Consider your own knowledge. Do you know something the AI doesn’t?
  4. Decide. Accept the suggestion as-is, or adjust it.

Most of the time, this takes just a few seconds. Quick scan, makes sense, move on. Occasionally you’ll catch something the AI got wrong and fix it.

Accepting Suggestions

To accept an AI suggestion, simply leave it in place. When you save the recycling form, whatever values are in the fields — whether AI-suggested or manually entered — become the actual values.

You don’t need to explicitly “accept” anything. The suggestion becomes real when you save.

Overriding Suggestions

To override an AI suggestion:

  1. Click on the field (recycle reason, callback date, etc.)
  2. Select or enter a different value
  3. The field updates to show your choice
  4. An indicator shows the field was modified from the AI suggestion

Your override is saved when you submit the form. The AI doesn’t argue or ask for confirmation — your judgment wins.

[Screenshot: Recycling form field showing “AI modified” indicator after user changed the value]

When to Override

Override AI suggestions when:

You have direct knowledge the AI doesn’t. Maybe you had an unlogged conversation where the prospect said exactly why they’re not moving forward. Your knowledge trumps the AI’s analysis of incomplete notes.

The suggestion doesn’t match reality. Sometimes the AI misreads notes or picks up on something irrelevant. If the suggestion clearly doesn’t fit what happened, change it.

You want to test different timing. The AI suggests 6 months, but you have a hunch that 3 months is better for this particular deal. Go with your instinct.

The reasoning doesn’t hold up. The AI shows its work. If the reasoning seems off — maybe it’s latching onto a mention that wasn’t actually relevant — that’s a sign to override.

When to Trust the AI

Trust AI suggestions when:

You don’t have strong prior knowledge. If you’re recycling a deal you weren’t personally involved with, the AI’s analysis of the notes is probably as good as your guess.

The reasoning makes sense. If the AI says “suggesting ‘Missing feature’ because notes mention ‘need Salesforce integration'” and that matches what you know, trust it.

You’re processing high volume. If you’re working through a backlog of deals, spending 30 seconds per deal is more practical than deep-diving into each one. The AI helps you move efficiently.

The stakes are moderate. For routine deals, AI suggestions are usually good enough. Save your careful manual analysis for high-value opportunities.

AI-Suggested Recycle Reasons

The AI looks for signals in deal notes and history to identify why a deal was lost.

How It Works

The AI scans for keywords, phrases, and patterns that indicate specific loss reasons:

  • Mentions of “budget,” “cost,” “expensive,” or “price” suggest pricing-related reasons
  • Mentions of specific competitors suggest competitive loss
  • Mentions of features you don’t have suggest missing feature as the reason
  • Mentions of “timing,” “not ready,” or “next year” suggest timing issues
  • Lack of recent activity combined with no clear resolution suggests the buyer stopped responding

The AI weighs multiple signals. A deal with both “too expensive” and “missing feature” mentions gets analyzed to determine which was the primary driver.
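
As a rough illustration of the idea — not Rizer's actual model, which weighs far more context than keyword matches — a simple signal pass over the notes might look like this:

    # Illustrative only: count loss-reason signals in deal notes.
    SIGNALS = {
        "Pricing / budget": ["budget", "cost", "expensive", "price"],
        "Lost to competitor": ["went with", "chose", "competitor"],
        "Missing feature": ["integration", "feature", "doesn't support"],
        "Timing": ["timing", "not ready", "next year"],
    }

    def score_reasons(notes: list[str]) -> dict[str, int]:
        """Count how many signal phrases appear for each candidate reason."""
        text = " ".join(notes).lower()
        scores = {reason: sum(phrase in text for phrase in phrases)
                  for reason, phrases in SIGNALS.items()}
        return {reason: hits for reason, hits in scores.items() if hits}

    print(score_reasons(["Too expensive for them; went with Competitor X."]))
    # {'Pricing / budget': 1, 'Lost to competitor': 2}

When nothing scores at all, a generic fallback such as "Buyer stopped responding" is the natural default — which is the "No signals" pattern described further down.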

Reviewing the Reasoning

Each suggested recycle reason comes with a brief explanation:

“Suggesting ‘No available budget’ because notes mention: ‘loves the product but no budget until Q1’”

“Suggesting ‘Better price by competitor’ because notes reference: ‘went with Competitor X at lower price point’”

“Suggesting ‘Buyer stopped responding’ because: no activity in final 6 weeks of deal, no close reason documented”

This transparency helps you evaluate the suggestion quickly.

Common Suggestion Patterns

Clear signals → High confidence suggestions

When notes explicitly state what happened, the AI gets it right most of the time. “Lost to Competitor X on price” gives the AI everything it needs.

Mixed signals → Best guess suggestions

When notes mention multiple factors, the AI picks what seems primary. It might be right, or you might need to adjust.

No signals → Generic suggestions

When notes are sparse, the AI falls back to “Buyer stopped responding” or “No feedback provided” — the catch-all reasons for deals that died without clear explanation.

AI-Suggested Callback Dates

The AI suggests when to follow up based on the recycle reason and any timing clues in the notes.

How It Works

The AI considers several factors:

Recycle reason defaults — Each reason has a typical callback period. “No available budget” defaults to 6 months (budget cycle timing). “Not the right time” might default to 3 months.

Explicit timing in notes — If notes say “check back in January” or “revisit after Q1,” the AI picks up on that and adjusts the suggestion.

Historical patterns — If deals with this recycle reason in your organization tend to convert after a certain period, the AI factors that in.

Feature timing — If the deal is linked to a missing feature with an expected ship date, the AI might suggest a callback around that date.
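
A simplified sketch of how those factors might combine is below. The default periods, names, and date handling are illustrative assumptions, not Rizer's actual logic:

    import re
    from datetime import date, timedelta

    # Hypothetical default callback periods per recycle reason, in days.
    REASON_DEFAULTS = {
        "No available budget": 180,
        "Not the right time": 90,
        "Better price by competitor": 180,
    }

    def suggest_callback(reason: str, notes: str, closed: date,
                         feature_ship_date: date | None = None) -> date:
        """Prefer explicit timing in notes, then a linked feature's ship date,
        then the default period for the recycle reason."""
        if re.search(r"(check back|revisit|follow up)", notes, re.IGNORECASE):
            # A real system would resolve the mentioned timing to an actual date;
            # here we just shorten the wait to show that explicit timing wins.
            return closed + timedelta(days=60)
        if feature_ship_date:
            return feature_ship_date
        return closed + timedelta(days=REASON_DEFAULTS.get(reason, 90))

    print(suggest_callback("No available budget",
                           "No budget this year, revisit after Q1.", date(2024, 10, 1)))
    # 2024-11-30 -- explicit timing found, so the suggestion lands earlier than the default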

Reviewing Date Suggestions

The reasoning for date suggestions looks like:

“Suggesting June 15 because: notes mention ‘new fiscal year starts in June'”

“Suggesting 6 months (December 1) based on: typical callback period for ‘No available budget'”

“Suggesting when feature ships because: deal linked to ‘Salesforce integration’ (expected Q3)”

If specific timing was mentioned, the AI cites it. If not, it defaults to the standard period for that recycle reason.

Adjusting Dates

Override date suggestions freely. The AI works from patterns and documented timing, but you might know:

  • The prospect mentioned timing verbally that wasn’t logged
  • Industry-specific cycles that affect this deal
  • Relationship context that suggests sooner or later follow-up
  • Recent events that change the timeline

Click the date field, pick a different date, and move on.

AI-Suggested Competitor Information

When the AI detects competitor mentions in deal notes, it suggests competitor details.

How It Works

The AI scans for:

  • Names of competitors you’ve configured in Rizer
  • Variations and common misspellings of those names
  • Phrases like “went with [competitor]” or “chose [competitor]”
  • References to competitive evaluations or bake-offs

When it finds a match, it suggests:

  • Switched to: Competitor (vs. in-house, no solution, or unknown)
  • Competitor name — The specific competitor detected
  • Competitor product — If mentioned and mapped in your settings
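
A minimal sketch of that detection, assuming a hypothetical set of configured competitors with name variations (the names below are made up):

    import re

    # Hypothetical configuration: canonical competitor names and common variations.
    COMPETITORS = {
        "Globex": ["globex", "globex corp"],
        "Acme CRM": ["acme crm", "acmecrm"],
    }

    def detect_competitor(notes: str) -> dict | None:
        """Return the configured competitor whose name or variation appears in the notes."""
        text = notes.lower()
        for canonical, variants in COMPETITORS.items():
            for variant in variants:
                if re.search(rf"\b{re.escape(variant)}\b", text):
                    return {"switched_to": "Competitor",
                            "competitor": canonical,
                            "evidence": variant}
        return None

    print(detect_competitor("Ultimately went with Globex Corp's enterprise product."))
    # {'switched_to': 'Competitor', 'competitor': 'Globex', 'evidence': 'globex'}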

Reviewing Competitor Suggestions

The AI shows what it found:

“Detected competitor mention: ‘ultimately decided to go with Competitor X’s enterprise product'”

Review this and confirm it matches what happened. Sometimes the AI picks up on a competitor that was mentioned but didn’t actually win — maybe they were in the evaluation but the prospect chose a different path.

AI-Suggested Missing Features

If deal notes mention features you don’t have, the AI suggests linking to tracked missing features.

How It Works

The AI compares feature names in your missing features list against mentions in deal notes. When it finds a match:

  • It suggests linking the deal to that feature
  • It might suggest an importance rating based on how the feature was discussed

For example, if “Salesforce integration” is in your missing features list and the notes say “need Salesforce integration before we can proceed,” the AI connects them.
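
Conceptually, that comparison is a lookup of your configured feature names (plus any aliases that tend to show up in notes) against the note text. A rough sketch, with hypothetical feature names:

    # Hypothetical missing-features list, with aliases that tend to appear in notes.
    MISSING_FEATURES = {
        "Salesforce integration": ["salesforce", "sfdc", "sfdc sync"],
        "Single sign-on": ["sso", "saml", "single sign-on"],
    }

    def match_features(notes: str) -> list[str]:
        """Return tracked features whose name or alias appears in the deal notes."""
        text = notes.lower()
        return [feature for feature, aliases in MISSING_FEATURES.items()
                if any(alias in text for alias in aliases)]

    print(match_features("Can't move forward without SFDC sync capability."))
    # ['Salesforce integration']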

Reviewing Feature Suggestions

The AI shows its reasoning:

“Suggesting link to ‘Salesforce integration’ because notes mention: ‘can’t move forward without SFDC sync capability'”

Confirm the feature mentioned in notes is actually the one in your list. Sometimes similar-sounding features are different things.

AI-Identified Sales Execution Issues

Beyond recycle reasons and timing, the AI identifies potential process problems that contributed to the loss.

What It Looks For

Timing gaps — Long periods without activity. If there was a demo on March 1 and the next touchpoint was April 15, that gap might have cost you momentum.

Stakeholder coverage — Multiple contacts involved but engagement concentrated on just one. If three people were in meetings but only one got follow-up, decision-makers may have been neglected.

Response delays — Slow responses to prospect inquiries. If they asked a question and it took a week to answer, that’s a flag.

Missing steps — Expected sales process steps that didn’t happen. If deals typically include a technical review and this one skipped it, that might explain the loss.
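
The timing-gap check is the easiest of these to picture. A rough sketch, assuming activity dates are pulled from HubSpot (the 14-day threshold is an arbitrary example, not Rizer's actual rule):

    from datetime import date

    def find_timing_gaps(activity_dates: list[date], threshold_days: int = 14) -> list[str]:
        """Flag long silent stretches between consecutive activities on a deal."""
        ordered = sorted(activity_dates)
        issues = []
        for earlier, later in zip(ordered, ordered[1:]):
            gap = (later - earlier).days
            if gap > threshold_days:
                issues.append(f"Potential issue: {gap}-day gap between {earlier} and {later}")
        return issues

    print(find_timing_gaps([date(2024, 3, 1), date(2024, 4, 15), date(2024, 4, 20)]))
    # ['Potential issue: 45-day gap between 2024-03-01 and 2024-04-15']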

How These Appear

Sales execution issues show up in the recycling form as observations:

“Potential issue: 14-day gap between discovery call and follow-up”

“Potential issue: Only 1 of 3 stakeholders received direct outreach”

You can acknowledge these, dismiss them, or add notes. They’re informational — helping you understand what might have gone wrong and learn from it.

Using This Information

Sales execution issues feed into reporting. If the same issues appear repeatedly across lost deals, there’s a process problem worth addressing. Maybe your team needs to follow up faster, engage more stakeholders, or add a step that’s being skipped.

Managers can use this data for coaching. If one rep’s deals consistently show long response times while others don’t, that’s a specific behavior to work on.

Improving AI Suggestions Over Time

AI suggestions get better as you use Rizer. Here’s how to help.

Correct Wrong Suggestions

When you override an AI suggestion, Rizer learns. If the AI keeps suggesting “Buyer stopped responding” for deals that were actually lost to competitors, your corrections teach it to look for different signals.

You don’t need to do anything special — just make the corrections as you recycle deals. The learning happens automatically.

Keep Deal Notes Detailed

The single biggest factor in AI quality is note quality. Encourage your team to document:

  • Why deals were lost (in plain language)
  • What competitors were involved
  • What features or capabilities came up
  • Any timing mentioned by the prospect
  • Key conversations and decisions

Good notes now mean better AI suggestions later — for this deal and for future deals with similar patterns.

Configure Competitors and Features Completely

The AI can only suggest competitors and features it knows about. If a competitor isn’t in your list, the AI can’t suggest it. Take time to:

  • Add all relevant competitors during setup
  • Add missing features as they come up
  • Keep names consistent and recognizable

The more complete your configuration, the better the AI’s suggestions.

Review AI Performance Periodically

Every few months, take a look at how AI suggestions are working:

  • Are you overriding suggestions frequently? That might indicate the AI needs better data, or your notes need more detail.
  • Are certain recycle reasons always wrong? Maybe the AI is misinterpreting common phrases in your notes.
  • Are callback dates working out? If deals keep becoming ready at the wrong time, the AI's timing suggestions may need richer historical data to improve.

You can’t directly tune the AI, but you can improve the inputs (notes, configuration) that drive its suggestions.

AI Suggestions During Onboarding

When you first set up Rizer, the AI does a bulk analysis of your HubSpot history to jump-start your configuration.

Competitor Discovery

The AI scans all your closed-lost deals looking for competitor mentions. It builds a suggested list of competitors you’re losing to, ranked by how often they appear.
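
Once candidate names have been pulled out of the notes (however that extraction happens), the ranking itself is essentially a frequency count across your closed-lost deals. An illustrative sketch with made-up names:

    from collections import Counter

    def rank_competitors(lost_deal_notes: list[str], candidates: list[str]) -> list[tuple[str, int]]:
        """Rank candidate competitor names by how many lost deals mention them."""
        counts: Counter[str] = Counter()
        for notes in lost_deal_notes:
            text = notes.lower()
            counts.update(name for name in candidates if name.lower() in text)
        return counts.most_common()

    deals = [
        "Went with Globex on price.",
        "Chose Globex after a bake-off.",
        "Decided to build in-house instead.",
        "Evaluated Acme CRM but stayed with their current tool.",
    ]
    print(rank_competitors(deals, ["Globex", "Acme CRM"]))
    # [('Globex', 2), ('Acme CRM', 1)]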

During the competitors setup step:

  1. Review the AI-suggested list
  2. Remove false positives (things that aren’t actually competitors)
  3. Edit names for consistency
  4. Add any competitors the AI missed
  5. Map competitor products to your products

This saves significant setup time compared to manually building a competitor list from scratch.

Missing Feature Discovery

Similarly, the AI scans for feature requests and capability gaps mentioned across your lost deals. It suggests a list of missing features that caused losses.

During the missing features setup step:

  1. Review the suggested features
  2. Remove things that aren’t actually features or aren’t relevant
  3. Edit names for clarity
  4. Set the current status (unplanned, planned, in progress, completed)
  5. Add any features the AI missed

Again, this gives you a running start. You’re refining a suggested list rather than creating one from nothing.

What If the AI Doesn’t Find Much?

If your deal notes are sparse or don’t contain clear competitor/feature mentions, the AI won’t have much to suggest. That’s okay — you can:

  • Add competitors and features manually
  • Focus on improving note quality going forward
  • Let the AI learn as better-documented deals come through

The onboarding AI is helpful but not required. Manual setup works fine.

Common Questions About AI

How accurate are the suggestions?

It varies based on data quality. With detailed deal notes, suggestions are right most of the time. With sparse notes, the AI often falls back to generic answers that may need adjustment.

For most teams, AI suggestions are accurate enough to save significant time. You’ll override maybe 20-30% of suggestions, which is still much faster than filling everything in manually.

Does the AI access confidential information?

The AI only analyzes data already in HubSpot and Rizer:

  • Deal notes and activities
  • Contact and company information
  • Your product catalog
  • Your public website

It doesn’t access anything outside these systems, and your data isn’t shared with other Rizer customers or external parties. The analysis happens within Rizer’s secure infrastructure.

Can I turn off AI suggestions?

You can’t disable AI suggestions entirely, but you can ignore them. The suggestions are just pre-filled values — change them to whatever you want. If you prefer to fill in everything manually, just overwrite the suggestions each time.

What if the AI suggests something inappropriate?

The AI occasionally makes mistakes — suggesting a competitor that wasn’t involved, or a recycle reason that doesn’t fit. Just override it. There’s no penalty for changing suggestions, and your corrections help improve future suggestions.

If the AI is consistently wrong about something specific, check whether your notes or configuration might be causing confusion. Clear up the ambiguous data, and suggestions should improve.

How long does it take for the AI to generate suggestions?

Suggestions typically appear within a few seconds of opening the recycling form. You might see a brief loading indicator while the AI analyzes the deal.

For complex deals with lots of notes and history, analysis might take 10-15 seconds. If it takes longer than that, there may be a temporary system issue — try refreshing.

Why did the AI suggest something different for similar deals?

Even similar deals have different notes and history. Small differences in language or what was documented can lead to different suggestions. The AI looks at each deal individually, not just at surface-level similarities.

If two deals seem identical but got different suggestions, compare the notes. You’ll usually find something that explains the difference.
