Building a Repeatable Signal-Based Outbound Motion
A devtool's journey with Common Room: learnings, pitfalls, and further opportunities.
If your reading list looks anything like mine, your LinkedIn feed is probably overflowing with posts about signal-based warm outbound. It's the current darling of the B2B SaaS world, promising to transform your outbound motion from cold calls into perfectly timed, personalized outreach that magically converts.
However, there's a huge gap between the hype we all see and actually making this work in practice.
After nine months of implementing signal-based outbound at Prefect using Common Room, I've seen the good, bad, and ugly of what really happens when you try to take the hype and make it reality. It’s messy but so, so fun. Here was our journey.
Why Signal-Based Outbound Makes Sense for PLS
Product-led growth (PLG) is all about positioning the product as the main driver of everything from acquisition through monetization. But the reality is, not every product is perfectly suited for a pure PLG motion. Many companies are finding better success with a hybrid approach: product-led sales (PLS).
PLS acknowledges that while product usage can drive growth, strategic sales involvement helps accelerate conversion, especially for larger accounts. The challenge is identifying which self-serve users are genuinely sales-ready versus those needing more nurturing.
That's where signal-based outbound becomes crucial. By capturing the signals potential ICPs leave across platforms, we can:
Identify users showing genuine buying intent
Engage them with personalized outreach at the perfect moment
Nurture those not yet ready for sales conversations
Build a repeatable, scalable outbound process that isn't just spray-and-pray
The signal-based approach is particularly appealing for developer tool companies, especially those with open source products. OSS users are notoriously hard to identify and show up in places far beyond the product itself. They are still, however, product users who could be in a buying cycle.
Prefect’s OSS server usage has grown exponentially since the launch of Prefect 3.0. However, businesses are built on revenue, not free usage. Turning that OSS usage into sales cycles was key.
This is where the urgency to implement a signal-based motion came from. Doing so successfully? There’s much more to it than meets the eye.
The Expectation vs. Reality Gap
The expectation: Connect a signal tool, watch the high-intent leads roll in, and watch your sales team close deals like there’s no tomorrow.
The reality: We signed the annual agreement with Common Room with a lot of excitement, only to be drowning in signals for days.
What we identified as the key to success with signal-based outreach:
Parse the signal from the noise, and iterate until you find it
Centralize automation and use it for personalization
Common Pitfalls (And How We Survived Them)
1️⃣ Drowning in Signal Noise
What happened: At first, we started outbounding on ALL signals from anyone matching our ICP (company size, title).
The reality check: We wasted sales time outbounding to people who weren’t in a buying cycle, and lost some trust in the process. In reality, they needed far more nurturing.
The fix: We got ruthlessly selective about which signals actually indicated buying intent versus casual interest. Not all signals are created equal, and what works for one company won't necessarily work for another (a bit more on this later). By adding more focus, we were able to also increase the hit rate within our sales team. More focus means more time for meaningful connections, follow up, research, etc.
2️⃣ Not Using AI for Personalization
What happened: Our early outreach was generic, with basic personalization that’s at this point easy to spot from a mile away. It felt like a waste since it didn't reflect the depth of signals we were capturing.
The reality check: Signal-rich outreach needs signal-rich personalization to be effective. Otherwise, you’re just finding another source to spray and pray.
The fix: We unlocked a whole new level when we created a few outreach themes and developed a system that summarized behavior patterns and job titles to map each person to our hypothesis about their specific problem. Each problem was matched to a specific outreach theme and thus more targeted messaging. (Huge shoutout to Daniel Adayev on my team who implemented this whole thing.)
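For illustration, here's a minimal sketch of what that kind of mapping can look like. The theme names, signal strings, and matching rules are all invented for the example; the real system Daniel built lived in our automation layer and was more nuanced.

```python
# Hypothetical sketch: map a person's signals + job title to an outreach theme.
# Theme names, signal types, and rules below are illustrative, not our real config.
from dataclasses import dataclass

@dataclass
class Person:
    title: str
    signals: list[str]  # e.g. ["visited_pricing", "asked_in_slack_about_retries"]

THEMES = {
    "scaling_orchestration": {"titles": ["platform", "infrastructure"],
                              "signals": ["visited_pricing", "starred_repo"]},
    "reliability_pain":      {"titles": ["data engineer", "analytics"],
                              "signals": ["asked_in_slack_about_retries", "read_error_docs"]},
}

def pick_theme(person: Person) -> str | None:
    """Return the first theme whose title keywords and signals both match."""
    title = person.title.lower()
    for theme, rules in THEMES.items():
        title_match = any(kw in title for kw in rules["titles"])
        signal_match = any(sig in person.signals for sig in rules["signals"])
        if title_match and signal_match:
            return theme
    return None  # no hypothesis -> default nurture, not custom outreach

print(pick_theme(Person("Senior Platform Engineer", ["visited_pricing"])))
```

The point is simply that each outbound email starts from an explicit hypothesis about the person's problem instead of a generic template.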
Surprisingly, we didn't have much success with Common Room's Roomie AI, as it sometimes hallucinated and the copy wasn't quite crisp enough for our developer audience. Maybe we didn't prompt it well enough, but we found that our own custom approach worked better. Note: Roomie AI has gone through several iterations since we tried it, so it's probably still worth experimenting with.
3️⃣ Scattered Enrichment Chaos
What happened: Data was everywhere, in different tools, with different formats. The quality was also quite questionable.
The reality check: Common Room was a great aggregator of signals, but lacked some automation needed to be the processor of signals too.
The fix: We centralized everything with webhooks. The winning flow: Common Room → Pipedream → SFDC. We used Pipedream in place of a marketing workflow automation tool; there, we combined Common Room enrichment with Apollo data to develop a full view of each lead.
This is where the magic happened - the workflow had all the necessary information to determine whether people should be:
Sent to sales for custom outreach
Dropped into automated nurture sequences or campaigns
Without this middleware layer, our signal-based outbound would have been a chaotic mess of disconnected tools, impossible to align with the lead qualification expectations we had set outside of Common Room itself.
Pipedream ended up being the middleware layer for all leads so we could be consistent across lead sources.
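To make the middleware idea concrete, here's a rough sketch of the shape of such a step, written as a standalone Python function rather than an actual Pipedream code step. The webhook field names, the enrich_with_apollo helper, and the routing rule are assumptions for illustration, not Common Room's or Apollo's documented contracts.

```python
# Rough sketch of a middleware step: Common Room webhook in, routed lead out.
# Payload field names, the Apollo helper, and the routing rule are assumptions.
import os
import requests

def enrich_with_apollo(email: str) -> dict:
    """Hypothetical helper wrapping an Apollo person-enrichment call."""
    resp = requests.post(
        "https://api.apollo.io/v1/people/match",  # check Apollo's docs for the current endpoint
        json={"api_key": os.environ["APOLLO_API_KEY"], "email": email},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("person", {})

def handle_common_room_event(payload: dict) -> dict:
    """Merge signal data with enrichment and decide where the lead goes."""
    email = payload.get("email")           # assumed field name
    segment = payload.get("segment", "")   # assumed field name
    enriched = enrich_with_apollo(email) if email else {}

    lead = {
        "email": email,
        "title": enriched.get("title"),
        "company": enriched.get("organization", {}).get("name"),
        "segment": segment,
    }

    # Routing: segment names here are placeholders for whatever convention you adopt.
    if "hot" in segment:
        lead["route"] = "sales_outreach"    # e.g. create/assign an SFDC lead
    else:
        lead["route"] = "nurture_sequence"  # e.g. hand off to marketing automation
    return lead
```

The value of the middleware layer is that every lead source funnels through the same function, so qualification logic lives in exactly one place.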
4️⃣ Tossing People Into Tools Without Context
What happened: As much as I appreciate Common Room, it's not the most intuitive tool to navigate. Segments are hidden in mysterious places, and workflows have quirks that make you question your sanity (see below for more quirks).
The fix: We took the tool complexity out of the equation for our sales team. We gave them two options:
A one-stop shop where outreach candidates were clearly surfaced within Common Room itself
SFDC leads they could work from their familiar environment
We chose option #2 so they wouldn't have to learn yet another UI, but could have gone with #1 had we spent more energy in enablement.
5️⃣ Surface-Level Success Metrics
What happened: Initially, we only tracked how many qualified leads came from signal-based outbound. Sounds reasonable, but it missed some nuance.
The reality check: Not all engagement would result in meetings, so we needed to measure custom sales outreach separately from nurturing our community.
The fix: We developed a more nuanced approach:
Ultimate goal: Qualified leads (duh)
Along the way: Sequence engagement metrics, community growth
Continuous improvement: Signal refinement to reduce noise
We realized that not everyone showing interest is ready to buy right now. Some need weeks or months of nurturing before they're ready to engage with sales, and that should show up in our metrics.
What I Would've Done Differently
If I could go back in time, here's what I'd change:
1️⃣ Multiple Touchpoints > Single Outreach
When we saw people engage, we funneled them to sales, and if they didn't respond, there was no follow up. The "one perfect email" myth is just that - a myth.
We should have built more sophisticated nurture journeys from day one, continuing to nurture people even after the sales touchpoint, combining:
Email sequences triggered by signals
Retargeting ads to reinforce the message
Social engagement to build familiarity
Content that actually addressed their specific pain points
This would have kept our brand top of mind when they eventually entered a buying cycle, instead of being forgotten after that initial outreach.
2️⃣ Automated Retargeting and More Touchpoints
Nearly everything triggered from Common Room was handled manually - primarily manual sales outreach. We automated some sequences but didn't see them perform well, so we continued with manual outreach and never really got nurture to perform as well as we would’ve liked.
We launched retargeting campaigns toward the end of my implementation, and that's where I'd expect to see more holistic value. The combination of signal-based qualification with automated retargeting would have given us much better results than either strategy alone.
I’d expect retargeting to evolve toward AI personalization, with specific messaging based on the signals observed.
3️⃣ Better Experimentation with Signal Prioritization
We made assumptions about which signals were truly "hot" and didn't course-correct effectively. We did eventually, but not fast enough. The mental model we landed on in the end was:
HOT signals: Immediate sales outreach (likely in buying cycle and active)
WARM signals: Nurture sequence with educational content (ICP but more passive)
COOL signals: Brand awareness campaigns (brand interest, probably never a buyer)
No matter how much research you do, the first implementation of what “hot” means has a 99% chance of being wrong - and that’s OK. Measuring the success of those signals early is key to being able to iterate effectively.
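If it helps to see the tiering written down rather than kept in people's heads, a simple lookup table is plenty to start with. The signal names and their tier assignments below are made-up examples; the real ones should come out of your own iteration.

```python
# Minimal sketch of signal tiering: hot -> sales, warm -> nurture, cool -> brand awareness.
# Signal names and their tiers are invented examples; iterate on your own data.
SIGNAL_TIERS = {
    "visited_pricing_page":   "hot",
    "booked_docs_deep_dive":  "hot",
    "joined_community_slack": "warm",
    "starred_github_repo":    "warm",
    "liked_linkedin_post":    "cool",
}

TIER_ACTIONS = {
    "hot":  "immediate_sales_outreach",
    "warm": "nurture_sequence",
    "cool": "brand_awareness_audience",
}

def route_signal(signal: str) -> str:
    tier = SIGNAL_TIERS.get(signal, "cool")  # default low so sales time isn't wasted
    return TIER_ACTIONS[tier]

assert route_signal("visited_pricing_page") == "immediate_sales_outreach"
assert route_signal("liked_linkedin_post") == "brand_awareness_audience"
```

The point isn't the dictionary itself; it's that the mapping is explicit and cheap to change the moment you learn a signal isn't as hot as you thought.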
4️⃣ Clearer Goals Beyond Conversion
Not everyone is in a buying cycle, so we should have established clearer goals for specific types of signals and developed a plan for continued nurture. Some signals indicate future potential rather than immediate intent, and our motion should have reflected this reality. We measured engagement with our nurture campaigns and community growth, but those metrics had a fairly fuzzy mapping to what we were actually trying to achieve.
5️⃣ Personalization That Actually Feels Personal
Shoutout to Letterdrop that does this on LinkedIn super well - whenever I visit their website, I get a targeted LinkedIn message on the topic I was just reading about. A little creepy? Maybe. Effective? Also maybe. It felt like a real human who was trying to help.
We never reached this level of sophistication in our motion. Most of our outreach remained at the "I noticed you're a platform engineer and here’s how other platform teams used our product" level rather than "I noticed you spent time asking about X, which tells me you might be struggling with Y. Here’s an article to help you with that."
The irony isn't lost on me - we were gathering rich, contextual signals but then watering them down in our outreach. Real personalization means using the specificity of the signal to craft an equally specific message about the exact problem they're trying to solve.
When done right, it feels like someone who's been paying attention and genuinely wants to help.
An Aside: Weird Common Room Quirks
A short list of slightly bizarre but somewhat expected quirks any tool would have. This section is only for deep Common Room users.
Workflows can’t be edited. Once you start a workflow, you must recreate it from scratch to change it - there’s no way to edit it in place. This is because Common Room can’t avoid re-running organizations/people through the new version of a workflow. Solution: keep business logic out of workflows and inside segments. Then changing the segment effectively changes the workflow.
Webhooks don’t contain all the information. Webhooks sent from Common Room don’t contain everything you see in the UI. For instance, URL data from web visits is nowhere to be found in webhooks. Solution: create specifically named segments to differentiate behaviors and react to the segment name downstream; alternatively, use features outside of webhooks (like team alerts) to send data to Slack and pick it up with a downstream automation tool. (Second credit to Daniel, who came up with this approach.)
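As a small illustration of the segment-naming workaround: if you encode the behavior you care about in the segment name itself, the downstream automation can parse it back out of the webhook. The naming convention here is invented for the example, not anything Common Room prescribes.

```python
# Sketch of recovering context a webhook doesn't carry by encoding it in the segment name.
# The "source--behavior--tier" convention is made up for illustration.
def parse_segment_name(name: str) -> dict:
    """Split a segment name like 'web--visited-pricing--hot' into fields."""
    parts = name.split("--")
    if len(parts) != 3:
        return {"source": "unknown", "behavior": name, "tier": "cool"}
    source, behavior, tier = parts
    return {"source": source, "behavior": behavior, "tier": tier}

print(parse_segment_name("web--visited-pricing--hot"))
# {'source': 'web', 'behavior': 'visited-pricing', 'tier': 'hot'}
```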
There are probably more quirks that aren’t top of mind; this section will likely grow over time.
The Bottom Line
Signal-based outbound is powerful when done right, but the path to "done right" isn’t the most paved.
The mindset that helped us most was viewing our outbound as a series of experiments, not as a machine from the get-go. This perspective shift helped us stop chasing the most scalable way forward and start embracing the messy reality of figuring out what actually works for our specific audience.
If you're just starting your signal-based journey, my advice is simple: start with very small use cases, test to a fault, and remember that the perfect signal strategy doesn't exist - only the one that works for your specific customers.
Thanks for reading! I talk all things marketing, growth, analytics, and tech so please reach out. I’d love to hear from you.