The modern analytics tech stack features a range of software tools to help nonprofit, advocacy, and political organizations better leverage the data they collect and make more informed, data-driven decisions. This post addresses how data analytics teams can help others in these organizations use these tools more efficiently and collaborate more effectively.
Your goal as an analytics team is to empower your organization to make data-driven decisions. That means creating a feedback loop between your team and the rest of the organization, starting with a collaborative relationship with the teams collecting your first-party data.
Other teams within your organization are doing outreach — whether that’s organizing, fundraising, digital, etc. — and collecting a lot of valuable data as they do it. In addition, they are also building universes or audiences, reporting on their programs, and trying to decide how to optimize their time.
The modern analytics tech stack includes an ever-growing number of outreach tools, coupled with software that helps organizations make the most of the data these teams are already collecting and creates a more collaborative ecosystem. In this post, I talk about the users you should be thinking of when you’re investing in tools; how this software is being used; tips for more effective collaboration; and how you can make sure your tools and teammates work together.
Most organizations start with a customer relationship management tool (CRM). Common examples include Salesforce, Blackbaud, EveryAction, VAN, ROI, and Engaging Networks.
The purpose of a CRM is to give end users at an organization an easy way to log and track person-level interactions. It’s not designed for bulk storage, complex querying, or omnichannel analysis: it’s for users to retain the data the organization finds valuable, and to track information on the folks your larger team is interacting with, whether that’s donors, volunteers, voters, whomever.
CRM users are typically non-technical staffers. They’re fundraisers, organizers, program managers, and other staff representing your organization on the frontlines, having 1:1 conversations with supporters and trying their best to track the right email or phone number for a particular person. Usability is critical: if it isn’t easy to update contact information or view a donor’s most recent gift, staff are not going to do it, and your organization’s investment in that interaction will lose value as you miss the opportunity to retain and further use that data point.
It’s important to adopt a reporting mindset when onboarding new users to the CRM. If they’re seeing the source of truth for how successful they are (like a report measuring progress-to-goal for a campaign), and they understand where in the tool it’s being pulled from, they’re going to be more incentivized to capture relevant information in the right places. For example, an organizing director on a campaign might say “We want you (organizers) to track your 1:1 meetings with volunteers in VAN.” If the organizing director works with the data director to include 1:1 meetings in the report organizers are looking at, they’re going to be incentivized to retain those 1:1 interactions in VAN, knowing if they don’t, the report won’t reflect hard work they’ve already done.
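To make the reporting side concrete, here’s a minimal sketch of a progress-to-goal rollup, assuming a hypothetical export of organizer contact history (say, pulled from VAN into a CSV) with columns like "organizer" and "contact_type"; the column names and the goal number are illustrative, not a real VAN layout.

```python
import pandas as pd

# Hypothetical export of organizer activity -- the column names are
# illustrative, not an actual VAN schema.
contacts = pd.read_csv("organizer_contact_history.csv")

# Keep only the interactions we agreed to count as 1:1 meetings.
one_on_ones = contacts[contacts["contact_type"] == "1:1 Meeting"]

# Roll up meetings per organizer and compare against a goal.
progress = (
    one_on_ones.groupby("organizer")
    .size()
    .rename("meetings_held")
    .reset_index()
)
progress["goal"] = 20  # illustrative weekly goal
progress["pct_to_goal"] = progress["meetings_held"] / progress["goal"]

print(progress.sort_values("pct_to_goal", ascending=False))
```

If organizers can see that the meetings they logged yesterday show up in this report today, the incentive to keep logging takes care of itself.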
Why should you care? Well, what does that 1:1 mean? It is probably safe to assume that person is a.) a supporter who is b.) willing to spend their time with a campaign staffer, and thus likely to volunteer. Additionally, the next time your communications or marketing team asks for some quick stats on how the program is going, you now have clean metrics showing the quantitative output of one part of the organizing program.
Adjacent to the CRM are digital organizing and outreach tools to personalize your organization’s calls to action. These tools gather and retain information collected from a particular 1:1 medium (texting, email, phone calls, door-knocking, etc.), and their users are non-technical volunteers, organizers, and digital staffers trained to have conversations, retain information, and get through a script.
You want to get to a place where this data isn’t just living in these tools, or being exported as CSVs and manually moved back into a CRM. If you’re moving data in and out by hand, you’re more likely to have broader, staler universes, because manually creating 30 lists takes an unreasonable amount of time. The goal should be to automate list-loading and results-extraction from these tools, and that requires compatibility with your data platform and analytics workbench (the workbench is where the analytics team’s work actually happens; I’ll cover it in greater depth later in this post).
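Here’s a minimal sketch of what that automation can look like, assuming a warehouse reachable through SQLAlchemy and an outreach tool that takes bulk CSV imports and produces a results export; the connection string, schema, and file names are all placeholders, and in practice the hand-off is usually an API call or SFTP drop specific to your tool.

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string -- point this at your actual warehouse.
engine = create_engine("postgresql+psycopg2://user:password@warehouse-host:5439/dbname")

# 1. Pull tonight's texting universe out of the warehouse.
universe = pd.read_sql(
    "SELECT person_id, first_name, phone FROM universes.volunteer_recruitment_texts",
    engine,
)

# 2. Write it in whatever layout the outreach tool's bulk importer expects.
universe.to_csv("texting_list_upload.csv", index=False)
# (In practice this step is an API call or SFTP drop, depending on the tool.)

# 3. Later, pull the results export back and load it into the warehouse
#    so the rest of the stack can see who replied.
results = pd.read_csv("texting_results_export.csv")
results.to_sql("texting_results", engine, schema="outreach", if_exists="append", index=False)
```

Once a job like this runs on a schedule, refreshing 30 lists costs the same effort as refreshing one.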
In addition, ask yourself “What else can I do with these outreach tools?” How can you use information from your CRM to customize volunteer recruitment texts, for example? It starts by leveraging the engagement history stored in your CRM to tweak your script to something like “We’re grateful to count you as a donor to our organization. Can we count on you to increase your impact by volunteering as well?” Think about how you can tailor your message for different audiences, because as you get more efficient with the analytics workbench, you’re going to free up time to customize messages even further.
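As a toy example of that kind of tailoring, the function below picks a script variant based on a hypothetical is_donor flag pulled from the CRM; the field name and the copy are illustrative.

```python
def recruitment_script(first_name: str, is_donor: bool) -> str:
    """Pick an opening text based on the engagement history stored in the CRM."""
    if is_donor:
        return (
            f"Hi {first_name}! We're grateful to count you as a donor to our "
            "organization. Can we count on you to increase your impact by "
            "volunteering as well?"
        )
    return (
        f"Hi {first_name}! We're recruiting volunteers for this weekend's "
        "canvass. Can we count you in?"
    )

# Example: one row from a CRM export with a hypothetical is_donor field.
print(recruitment_script("Maria", is_donor=True))
```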
Next we have one-to-many tools, like those for email blasts, social media, broadcast SMS, and so on. These tools are less personalized and cast a wider net, and they too are most frequently used by non-technical users — digital teams, comms teams, marketing teams, etc.
One big thing to keep in mind here is message testing. You’re sending messages to really large groups, and it’s important to make the most of that wide net and the money you’re spending.
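For instance, here’s a back-of-the-envelope way to read a subject-line test, using only the standard library; the counts are made up, and in practice you’d pull them from your email tool’s reporting.

```python
from math import sqrt

# Made-up results from a 50/50 subject-line split.
sent_a, opened_a = 25_000, 5_400
sent_b, opened_b = 25_000, 5_950

rate_a = opened_a / sent_a
rate_b = opened_b / sent_b

# Two-proportion z-test: is the difference bigger than chance would explain?
pooled = (opened_a + opened_b) / (sent_a + sent_b)
se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
z = (rate_b - rate_a) / se

print(f"Open rate A: {rate_a:.1%}, B: {rate_b:.1%}, z = {z:.2f}")
# |z| > 1.96 is the usual threshold for significance at the 95% level.
```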
It’s also important to think about what you can feed into the tool to make the most of insights these audiences have already baked in. Facebook Audiences is a good example: what sort of information can you feed into Facebook Advertising platforms to best inform the tool and best create your audiences, and what insights can you take back?
Often we see teams emphasizing the universe they’re creating, but they aren’t necessarily getting all the reactions back or getting all the data out of the tool, because they’re thinking “Well, I’ve already run the ad, so why would I need that information?” But information pulled out of these tools can be used for other opportunities, and with other tools. How many people were reached by your ad? Are people opening the emails you send? Those are helpful things to know. What time were they on their phone, and is that a good time to call or text? And how can you retain those insights to create groups for other parts of the program?
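Retaining that kind of insight can be as simple as aggregating the engagement events you already export. The sketch below assumes a hypothetical email-events file with email and opened_at columns, and rolls opens up by hour of day to suggest contact windows; the file and column names are illustrative.

```python
import pandas as pd

# Hypothetical export of email open events -- column names are illustrative.
events = pd.read_csv("email_open_events.csv", parse_dates=["opened_at"])

# When are supporters actually engaging? Opens by hour of day.
opens_by_hour = (
    events["opened_at"].dt.hour.value_counts().sort_index().rename("opens")
)
print(opens_by_hour)

# Retain the best window per person for other parts of the program
# (e.g., as an input for choosing call or text times).
best_hour = (
    events.assign(hour=events["opened_at"].dt.hour)
    .groupby("email")["hour"]
    .agg(lambda h: h.mode().iat[0])
    .rename("best_contact_hour")
)
best_hour.to_csv("best_contact_hour.csv")
```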
Now we cross over into tools that are commonly owned by the data and analytics team, where a dedicated data staffer role starts to emerge. Visualization tools can be a good entry point for those staffers: you might find people doing some crossover into learning how to use Tableau, for example. They might not necessarily be data scientists or know R or Python, but they can figure out Tableau, and it’s an entryway for them.
One thing to keep in mind here is that these tools are fun. People like the outputs, and they like that the tools make inaccessibly large datasets digestible. (If only I had a dime for every time someone asked me for a heat map choropleth just because they liked seeing the info on a map!)
It’s really important that we use these tools to surface themes in the data. But it’s also important to make sure that you and a report’s stakeholders are aligned on its purpose, and on what decisions will be made from it, before doing the work; otherwise you risk wasting time and resources. Even if the point of the work is exploratory, that’s helpful context for the analytics team to have before investing in building it.
The last thing to keep in mind is your update cadence. People often talk about how they want constant updates. But how often are people in your organization really going to make decisions on this data? How often are they really going to be able to change course? You want to make sure your analytics staffers are most focused on the analysis and the quality of the work, not on arbitrarily making things as fast as possible, especially if the folks asking for that cadence aren’t aware of the investment it would take to make it possible. It can require days or weeks of engineering time to optimize a pipeline such that it can run every 30 minutes vs. every two hours. If you have stakeholders who are adamant about updating more than twice daily, make sure they’re aware of the tradeoffs.
The data warehouse is the last piece of the analytics puzzle. It’s designed to store a variety of datasets, and it gets you one step closer to breaking down silos, because everything’s in one place.
When implemented strategically, a data warehouse sits at the heart of an organization’s tech stack, centralizing data from all the sources mentioned above and providing the foundation for omnichannel campaigns and programmatic efforts. It’s used by technical users (analysts, data engineers, data scientists, and so on) with the interest and ability to take advantage of everything being in one place. But it’s just for storage, so you want to make sure it’s compatible with the tools you’re moving data in and out of, and you want to make sure it’s secure.
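As a rough illustration of what centralizing buys you, the query below unions contact history from a few hypothetical source tables into one person-level stream; every schema and table name here is a placeholder rather than a prescribed layout.

```python
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string -- point this at your actual warehouse.
engine = create_engine("postgresql+psycopg2://user:password@warehouse-host:5439/dbname")

# Placeholder schemas/tables standing in for your CRM, texting, and email syncs.
all_contacts_sql = """
    SELECT person_id, 'crm_call' AS channel, contact_date AS contacted_at FROM crm.contact_history
    UNION ALL
    SELECT person_id, 'text'     AS channel, sent_at      AS contacted_at FROM texting.messages
    UNION ALL
    SELECT person_id, 'email'    AS channel, sent_at      AS contacted_at FROM email.sends
"""
all_contacts = pd.read_sql(all_contacts_sql, engine)

# One row per person per day per channel -- the raw material for
# suppressing duplicate outreach and reporting across programs.
touches = (
    all_contacts.assign(contact_day=pd.to_datetime(all_contacts["contacted_at"]).dt.date)
    .drop_duplicates(["person_id", "channel", "contact_day"])
)
print(touches.groupby("channel").size())
```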
The analytics workbench sits on top of the data warehouse, and it’s designed for analysts and data scientists to unearth insights and make the most of having your data in one place. It also provides the tooling to automate work like the list-loading and results-extraction described above.
Some Civis Platform users might not know that we provide you an Amazon Redshift cluster. Platform, which is itself an analytics workbench, sits on top of that cluster and all the data it stores, so the compatibility between the warehouse and the workbench is seamless, which is exactly what you want.
Keep in mind that analysts should be more than the code they can write. Giving them access to the tools being used around the organization helps them optimize their work and presents opportunities to leverage different data sources for different parts of the program. They’ll be better able to gut-check their work and innovate even further if they understand the program’s needs and goals, and can test how the front-end experience translates to what they’re seeing in the back end.
The modern analytics tech stack encompasses so many products and tools. It’s a user-friendly CRM; it’s compatible outreach tools; and when it’s coupled with an effective analytics workbench that sits on top of a compatible data warehouse, it enables you to break down silos and create a holistic view of your organization, your supporters, and your outreach programs.
The workbench allows this 360-degree view by enabling what we like to call “all contacts,” meaning the ability to identify all the ways in which your organization has interacted with a given person, and what the results were. It’s a dataset that allows you to understand the results of the efforts being made by all the teams, and to consolidate all those different kinds of first-party data. It not only helps you to cut lists, but also to reduce overlap — people are only getting one call a day now, instead of one from each team — and your organization is saving money.

At the end of the day, having a lot of data is a GOOD thing. Having tools designed exactly for the kind of outreach program your organization is running is also a GOOD thing! And a tech stack, managed by people who value the data your organization collects in the process, turns an overwhelming and siloed experience into a goldmine.
Now you’re able to start building real data science products from these tools. Not only are teams equipped to do their outreach in a data-driven way, but you, the analytics team, have completed the feedback loop — you’ve improved collaboration and communication, reduced overlap, and informed your models. The rest of the organization can’t wait to see what you do next.