Software CMO shares best practices and sage advice for a rewarding, ROI-generating AI journey
Eighteen months ago, the team I lead at software company Unanet began exploring what artificial intelligence could do in marketing-specific use cases.
Now, a journey that began with modest expectations, unanswered questions, a lot of early-stage due diligence, and a methodically plotted plan is rewarding not only our team but the entire organization. We have supported and augmented the creative process, cut the time we spend on repetitive tasks, elevated the quality of our work, and freed ourselves up to focus on the higher-value ideation, execution, and strategic work that human beings do best.
I talk in terms of “we” because after quite a bit of upfront information-gathering on my own (thank you, Marketing AI Institute and others!), I brought my team into the AI dialogue early to get their buy-in and input. That involved listening to their concerns, soliciting their ideas for how and where to deploy AI, and explaining the “why” behind AI — the rationale and objectives for exploring it, how AI capabilities could benefit them in their work, and the critical role they would be playing in implementing, testing, and assessing an exciting and potentially game-changing technology.
My goal from the outset was to engage the team in finding ways to apply AI that aligned with our business goals and would enhance outcomes for us individually, as a team, and as a company. We leaned on our team’s skills, industry expertise, and understanding of our organizational goals to guide how we explored and implemented AI. Instead of seeking out the shiniest new AI tool and trying to shoehorn it into our operation, we focused on identifying use cases that would deliver real value and ROI to the team and the business, then finding the most suitable AI tools for those use cases. Just as important was ensuring that everyone on the team understood AI wasn’t there to replace them but to support them in their day-to-day work, so they treated it as a constructive tool, not a threat. Concerns about AI marginalizing the work we do as people are valid; putting those concerns to rest within my team was a top priority early in our AI initiative.
From initial due diligence to now, that AI journey has been eye-opening and rewarding. We’ve experienced nothing less than a paradigm shift in how we approach AI. Originally, we viewed it mostly as a tool to integrate around the edges to help us improve efficiency. Now, we see it as a core part of strategy development and execution, where we approach many challenges and tasks through the lens of how AI can help fuel creativity, reinvent processes, and augment, not replace, the higher-value work we do. AI has become an intrinsic part of our day-to-day work. Following initial proof-of-concept evaluations in several targeted areas like content creation and video editing, we’re consistently finding new applications for AI that support, sharpen, and enhance our work. And with them, our teamwide comfort with, and confidence in, AI keeps growing.
Along the way, we’ve picked up plenty of valuable insights about what it takes for a marketing team to successfully leverage AI — insights other marketing teams may find helpful in their own AI journeys. Ours began with an open mind and a commitment to following a methodical implementation process. Where it ends — who knows? Here’s a look at how that process unfolded:
Step 1: Identify specific areas of your marketing operation where AI can provide value by meeting a need or solving a problem. We sought AI use cases that would enhance the expertise of our team, make strategies more on-point, and elevate the creativity that fuels campaigns. We divided our use cases into six categories: campaigns and content; image and creative design; video development; predictions and insight; strategy; and product marketing. We then identified specific areas like copywriting and video production as the best initial use cases in which to test AI.
Step 2: Assess the available tools. Next, we began to evaluate the AI products that could work in the potential use cases we identified, assessing the quality of the platform and the provider (the latter is just as important as the former) based on the following:
- Business reputation.
- Features and capabilities that map to the needs of our team and our business.
- The strength of its AI roadmap relative to our goals for deploying AI.
- The strength of its support, including whether it provided a dedicated customer success manager along with robust training resources.
- The strength of its security and compliance measures and policies.
Meanwhile, as we evaluated products, and before proof-of-concept testing the most promising ones, we had our legal and IT security teams vet them to ensure they would comply with data protection laws as well as our own internal standards and requirements for AI ethics and governance (establishing and socializing these internal policies is another important early step in the process). We needed assurance that whatever tools we invested in would preserve the privacy and trust of our customers and the confidentiality of our internal data.
Step 3: Start piloting. Here’s where things get fun: Seeing how AI performs in an actual pilot use case. We took a “pilot before purchase” approach, which enabled us to test several solutions, compare them side-by-side, and get a feel for each vendor’s approach to support before committing to a specific solution and provider. We wanted a solution that would enable us to capture quick wins, that readily integrated with our existing martech stack, and that had capabilities we could scale to other marketing use cases (to lower the marginal cost associated with the AI investment).
Having chosen a generative AI-driven copywriting solution for our initial pilot, our team received training on the software to ensure we were using it correctly. We’ve learned that maximizing the value of AI software doesn’t necessarily require deep technical expertise, but it does require training on tools powered by generative AI, machine learning, and the like.
Throughout the pilot, members of our team also made a point of documenting and sharing how we used the AI capabilities: our successes, our challenges, and the lessons we learned. We had plenty of back-and-forth with the provider’s support team, and we stayed alert for opportunities to apply AI in other use cases, knowing that a pilot project can surface compelling applications we hadn’t considered.
As for other best practices for piloting AI: We started small and didn’t try to do too much too soon. We set up controlled experiments to test the software in a low-pressure environment where we could capture quick wins or, failing that, fail fast with minimal consequences and learn from our missteps. I encouraged people to keep experimenting with the software, knowing some experiments would turn out better than others.
Step 4: Track impact. We committed to measuring the performance of our new AI software against a set of predefined KPIs, like time savings, quality improvement, and, of course, campaign performance improvement. We gathered and analyzed user feedback and documented outcomes and best practices in order to refine our specific use cases and our broader AI rollout strategy.
Like any technology, AI has to prove its value in a real business context. For us, it has done just that. We’ve sliced the time to create blog posts from existing content in half. We’ve developed new capabilities to quickly roll out email programs and to easily tailor existing campaign assets to specific verticals and customer personas. We’ve cut ad production time from two hours to 30 minutes. Because of wins like these — and the ROI they are delivering — we’ve invested in additional AI capabilities that we’re deploying in a growing number of use cases.
The results have inspired confidence across our team in the AI tools themselves and in our approach to implementing and using them. As a leader, I’ve found that this confidence is critical to building and sustaining a positive mindset about AI across a team.
Step 5: Continuous learning. As successful as our AI journey has been, there’s always more to learn, new tools and use cases to consider, and new value to capture. To those ends, we’re developing ongoing, role-specific AI education and training for team members, providing people on the team with custom functional learning paths that reflect the different ways they use AI in their work. We regularly celebrate not just the wins but also the failed experiments that lead to learning.
Now, deep into the second year of our AI journey, all our work on the front end is paying off, often in ways we didn’t imagine when we first dipped our toe into the AI waters. The same can be true of your own organization’s AI journey. With a measured approach, an explorer’s mindset, and a commitment to finding use cases that align with your business strategy and goals, AI can indeed transform the work you do as a marketer and a marketing team.