If you manage a community fund—whether it's a revolving loan fund, a grant program, or an impact investment pool—you've likely asked yourself: Is our money actually helping the local economy? Many funds track outputs like 'loans disbursed' or 'jobs created,' but few dig into whether those jobs are quality jobs, whether businesses stay local, or whether dollars recirculate. This guide offers a 30-minute checklist for conducting a local economy audit. It's designed for busy practitioners who need a practical, repeatable framework without hiring an economist. By the end, you'll have a clear picture of your fund's real impact and where to adjust. This overview reflects widely shared professional practices as of April 2026; verify critical details against current official guidance where applicable.
Why Conduct a Local Economy Audit?
Community funds exist to strengthen local economies, but without a structured audit, it's easy to confuse activity with impact. A local economy audit helps you answer fundamental questions: Are we creating jobs that pay a living wage? Are we retaining businesses beyond the grant period? Is our money multiplying through local supply chains? These questions matter because a loan that creates ten low-wage, part-time jobs may have less real value than a grant that helps one local manufacturer expand and hire three full-time workers with benefits. The audit also surfaces unintended consequences—for example, a fund that prioritizes retail businesses might inadvertently push up commercial rents, displacing long-standing services. By spending just 30 minutes with a checklist, you can identify these patterns and make data-informed adjustments. Moreover, funders and board members increasingly demand evidence of systemic change, not just anecdotal success stories. A consistent audit process builds credibility and helps you communicate impact in terms that resonate with stakeholders. It also reveals gaps in your data collection—perhaps you've never tracked business survival rates or average wage levels. Once you know what you're missing, you can improve your application and reporting forms. The audit isn't a one-time exercise; it's a habit that turns your fund from a passive check-writer into an active economic development partner.
What the Audit Reveals About Your Fund
A typical audit will surface three key insights: first, whether your fund is reaching the intended population (e.g., low-income neighborhoods, minority-owned businesses); second, whether the businesses you support are creating sustainable, quality jobs; and third, whether your investments are generating local multiplier effects—meaning dollars spent at a local business stay in the community through wages and local procurement. For example, one community fund in the Midwest discovered that 70% of its loan recipients were in retail, but those businesses had a high failure rate within two years. By adjusting criteria to favor manufacturing and professional services, the fund improved job retention by 40% over three years. Another fund found that its grant recipients were more likely to buy supplies from outside the region, limiting local impact. The audit prompted them to add a local sourcing requirement to grant agreements.
Who Should Use This Audit
This checklist is for community fund managers, board members, program officers, and impact investors who want a quick, credible assessment. It's not a substitute for a full economic impact study, but it's a starting point for funds of any size. If you're a solo administrator with a small revolving loan fund, you can complete it in 30 minutes. If you're part of a larger community foundation, you might delegate sections to different team members and then discuss findings together. The audit is also useful for funds that are considering new strategies—for instance, shifting from grants to loans, or from general business support to sector-specific programs. By running the audit before and after a strategic change, you can measure whether the shift improved outcomes.
The Core Metrics: What to Measure in 30 Minutes
To keep the audit short, focus on five metrics that capture the most important dimensions of local economic health: job quality, business retention, local sourcing, wage levels, and multiplier effect. These metrics are chosen because they are relatively easy to collect (often from existing records or short surveys) and they provide a balanced view of economic resilience. Job quality goes beyond headcount—it includes wages, benefits, and full-time status. Business retention measures how many funded businesses are still operating after one, two, and three years. Local sourcing tracks the percentage of inputs (supplies, services) that businesses purchase within the community. Wage levels compare funded business wages to the local median or living wage. Multiplier effect estimates how many times a dollar spent at a local business recirculates—this can be approximated using a simple ratio (e.g., for every $100 spent, $60 stays local). These metrics are not perfect, but they are practical. Many funds find that just collecting these five data points reveals patterns they hadn't noticed, such as a cluster of businesses that create many jobs but pay below the living wage, or a sector with high retention but low local sourcing. The checklist below walks you through how to gather and interpret each metric.
Job Quality Score
To calculate job quality, assign points for each job created: 1 point for part-time (under 30 hours), 2 points for full-time, and an additional point for offering health insurance or a retirement plan. Sum the points and divide by the number of jobs to get an average quality score. Compare this to your fund's benchmark—if you don't have one, a score of 2.5 or higher is a good target (meaning most jobs are full-time with benefits). Many funds are surprised to find their average score below 2.0, indicating many part-time or benefit-free roles. This metric is especially important for funds focused on poverty reduction, because part-time jobs rarely lift families out of poverty.
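The scoring rule above can be sketched as a short function. This is an illustrative sketch, not a prescribed implementation; the list-of-dicts input format and the `full_time`/`has_benefits` field names are assumptions.

```python
def job_quality_score(jobs):
    """Average job quality score: 1 point for part-time (under 30 hours),
    2 points for full-time, plus 1 point if the job offers health
    insurance or a retirement plan."""
    if not jobs:
        return 0.0
    total = 0
    for job in jobs:
        total += 2 if job["full_time"] else 1
        if job["has_benefits"]:
            total += 1
    return total / len(jobs)

# Two full-time jobs with benefits plus one part-time job without:
# (3 + 3 + 1) / 3, roughly 2.33 — just below the 2.5 target.
```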
Business Retention Rate
Track the percentage of funded businesses still operating after one year, two years, and three years. You can pull this from your own records or use a simple follow-up email. A retention rate below 60% at three years may indicate that your fund is supporting businesses with weak business models or that market conditions are challenging. Compare rates across sectors to see if some types of businesses are more resilient. For example, one fund found that restaurants had a 50% three-year retention, while professional services had 80%. This insight led them to offer more technical assistance to restaurant owners.
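The cohort calculation above is simple enough to automate once your tracker has a funding year and a status for each business. In this sketch, the `year_funded` and `status` fields are assumed column names; businesses with an 'unknown' status count as not retained, which keeps the estimate conservative.

```python
def retention_rate(businesses, cohort_year):
    """Share of businesses funded in cohort_year that are still active.
    Counts every funded business in the cohort; 'unknown' status is
    treated as not retained (a deliberately conservative choice)."""
    cohort = [b for b in businesses if b["year_funded"] == cohort_year]
    if not cohort:
        return None  # no businesses funded that year
    active = sum(1 for b in cohort if b["status"] == "active")
    return active / len(cohort)
```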
Local Sourcing Index
Ask funded businesses what percentage of their supplies and services are purchased locally (within your defined region—usually the same county or metro area). Many funds find that businesses that source locally have higher survival rates because they are embedded in the community. Aim for an average of 50% or more. If your fund's portfolio averages 20%, consider adding a local sourcing requirement or preference in your application criteria. You can also offer a small bonus or lower interest rate for businesses that commit to local sourcing.
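A portfolio average for local sourcing can be computed the same way, skipping businesses that never reported a figure — and noting that gap as an audit finding in itself. The `local_sourcing_pct` field name is an assumption for illustration.

```python
def avg_local_sourcing(businesses):
    """Average reported local-sourcing percentage (0-100) across the
    portfolio, excluding businesses that did not report a figure."""
    reported = [b["local_sourcing_pct"] for b in businesses
                if b.get("local_sourcing_pct") is not None]
    if not reported:
        return None  # no data yet — a gap worth recording
    return sum(reported) / len(reported)
```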
Data Sources: Where to Find the Information Quickly
You don't need to commission a survey to complete the audit. Most data can be gathered from existing sources: your own application forms, follow-up reports, and publicly available data from government agencies or economic development organizations. For job quality and business retention, your internal records should be sufficient—just ensure you have a system to track each funded business's status annually. For wage levels, you can compare your businesses' reported wages to the local median wage published by the Bureau of Labor Statistics or your state's labor department. For local sourcing, you may need to ask businesses directly, but you can embed a few questions in your regular check-in calls or emails. The multiplier effect is trickier, but you can use a rough estimate: many community development practitioners use a multiplier of 1.5 to 2.0 for local businesses (meaning each dollar spent locally generates $1.50 to $2.00 in local economic activity). To refine this, you can look at input-output models available from your regional planning commission or university extension service. If those aren't accessible, a simple survey question—'what percentage of your revenue stays in the local economy?'—can give you a proxy. The key is to be consistent: use the same data sources and methods each time you run the audit so you can track trends. If you find gaps in your data, that's a useful finding in itself—it tells you what to improve in your reporting system.
Internal Records You Already Have
Most funds have application forms that ask for number of jobs, wages, and business type. The challenge is that this data often sits in spreadsheets or PDFs and isn't aggregated. Spend 10 minutes to create a simple tracker with columns for business name, sector, loan/grant amount, jobs created, average wage, and retention status. Once you have this, you can calculate averages and rates easily. If you have a CRM or portfolio management tool, see if you can export the data. Many funds are surprised to find that they have more data than they think—they just haven't looked at it holistically.
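If your records live in a spreadsheet, a CSV export plus Python's standard-library `csv` module is enough to aggregate the tracker columns described above. The column names and sample rows below are hypothetical, purely for illustration.

```python
import csv
import io

# Hypothetical CSV export from a spreadsheet or CRM; column names are
# illustrative, not a required schema.
raw = """name,sector,amount,jobs_created,avg_wage,status
Main St Bakery,retail,25000,4,14.50,active
Hilltop Machine,manufacturing,60000,3,22.00,active
Corner Cafe,retail,15000,5,12.00,closed
"""

rows = list(csv.DictReader(io.StringIO(raw)))
active = [r for r in rows if r["status"] == "active"]
total_jobs = sum(int(r["jobs_created"]) for r in rows)
retention = len(active) / len(rows)

print(f"{len(rows)} businesses, {total_jobs} jobs, retention {retention:.0%}")
```

In practice you would read the file with `open()` instead of the inline string; the point is that averages and rates fall out of the tracker in a few lines once the data is in one place.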
Public Data Sources
For wage comparisons, the Bureau of Labor Statistics (bls.gov) publishes county-level median wages by occupation. For local sourcing benchmarks, your local economic development office may have data on the percentage of businesses that source locally. Some states also have 'multiplier' estimates for different industries. If you're in a rural area, your USDA Rural Development office may have relevant data. These sources are free and often updated annually. The challenge is that public data may lag by a year or two, but it's still useful for trend comparison.
Common Pitfalls and How to Avoid Them
Even with a clear checklist, audits can go wrong. One common pitfall is focusing only on job numbers without considering quality. A fund might celebrate '100 jobs created' without noticing that most are part-time or seasonal. To avoid this, always calculate the job quality score alongside raw numbers. A second pitfall is ignoring businesses that closed. Many funds only track success stories and forget to follow up on failures, which skews the picture. Make sure your retention rate includes all funded businesses, not just those that reported recently. A third pitfall is using inconsistent definitions—for example, one year defining 'local' as within the city limits, the next year as within the county. This makes year-over-year comparisons meaningless. Agree on a geographic boundary and stick with it. A fourth is survivorship bias: if you only survey businesses that are still active, you miss learning from failures. Try to interview or survey a few businesses that closed to understand why—this can inform your future funding criteria. A fifth is over-interpreting small numbers. If you've funded only five businesses, a 60% retention rate is just three businesses; a single closure can swing the rate dramatically. In such cases, focus on qualitative insights from those businesses rather than making big strategic decisions based on shaky percentages. Finally, don't neglect the 'counterfactual'—what would have happened without your funding? While a full counterfactual analysis is too complex for a 30-minute audit, you can ask businesses: 'Would you have started/expanded without our support?' This gives you a rough sense of additionality. Many funds discover that a significant portion of their recipients would have proceeded anyway, which raises questions about whether the funding is truly catalytic.
The Trap of Anecdotal Success
It's natural to highlight your fund's best stories—the bakery that grew from 2 to 20 employees, the manufacturer that started exporting. But if you only report these, you miss the broader picture. One fund we know celebrated a single high-growth tech company while ignoring that most of its loan recipients were struggling. The audit revealed that the overall portfolio had a negative return on investment. By balancing anecdotes with systematic data, you can avoid misleading yourself and your stakeholders. The audit doesn't replace stories; it contextualizes them.
Over-Reliance on Self-Reported Data
Businesses may exaggerate job numbers or wages to look good. If possible, verify a sample of wage data against payroll records or tax filings. For many funds, this is too invasive, but you can at least ask for documentation for larger grants or loans. Another approach is to triangulate with public data: if a business claims an average wage of $25/hour but the industry median in your area is $15, ask for an explanation. Even without verification, being aware of potential bias helps you interpret the numbers with appropriate caution.
Step-by-Step Guide: Running Your 30-Minute Audit
Set a timer for 30 minutes and follow these steps. You'll need a spreadsheet or paper with your business list, and access to public wage data. If you have a colleague, split steps to save time.

Step 1 (5 minutes): Pull up your list of funded businesses for the past three years. For each, note the sector, loan/grant amount, and status (active, closed, or unknown).

Step 2 (5 minutes): Calculate job quality scores for all businesses that reported jobs. If you don't have wage or benefit data, use the best approximation you have (e.g., full-time/part-time).

Step 3 (5 minutes): Calculate retention rates for each year cohort. For example, of the businesses funded in 2023, how many are still operating in 2026?

Step 4 (5 minutes): Look up the local median wage for the sectors you fund most. Compare your businesses' reported wages to that median.

Step 5 (5 minutes): Estimate local sourcing by reviewing any data you have or sending a quick email to a sample of businesses (if you don't have time, just note 'data not available' and plan to collect it next quarter).

Step 6 (5 minutes): Apply a rough multiplier. For each sector, use a standard multiplier (1.5 for retail, 2.0 for manufacturing, 1.8 for services) and multiply by the average wage or revenue to get a sense of local economic impact.

Write down your findings and one or two key observations. That's it. The goal is not perfection but a consistent snapshot. Over time, you'll refine your data and see trends. If you find that you're spending more than 30 minutes, you're probably overcomplicating it—narrow your focus to the most important metrics for your fund's goals.
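The rough multiplier calculation in Step 6 can be sketched in a few lines. The multipliers below are the illustrative figures from the step above, not official estimates — refine them with regional input-output data where available — and the `annual_revenue` field name is an assumption.

```python
# Illustrative sector multipliers (from Step 6); treat as rough assumptions.
SECTOR_MULTIPLIERS = {"retail": 1.5, "manufacturing": 2.0, "services": 1.8}

def estimated_local_impact(businesses):
    """Rough local economic impact: each business's reported annual
    revenue times its sector multiplier, summed across the portfolio.
    Unknown sectors fall back to the most conservative multiplier."""
    total = 0.0
    for b in businesses:
        total += b["annual_revenue"] * SECTOR_MULTIPLIERS.get(b["sector"], 1.5)
    return total
```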
Pre-Audit Preparation (5 Minutes)
Before you start, decide on your geographic boundary (e.g., county, city, or multi-county region). Also decide which years to include—typically the last three fiscal years. If you have a large portfolio, you can sample 20-30 businesses instead of all of them. Make sure you have a consistent definition of 'job' (e.g., any W-2 employee working 20+ hours per week). Write these definitions down so you can repeat them in future audits. If you have multiple funding streams (grants vs. loans), you might run separate audits for each, as their impacts may differ.
Calculating Key Ratios (10 Minutes)
For job quality, sum the scores and divide by the number of jobs. For retention, divide the number of active businesses by the total funded in each cohort. For wage comparison, calculate the percentage of businesses paying above the local median. For local sourcing, calculate the average percentage reported. For multiplier, multiply the total revenue or wages of your businesses by the estimated multiplier. These ratios are your baseline. If any ratio surprises you, note it and investigate later. For example, if only 30% of businesses pay above the median wage, that's a red flag worth exploring.
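The wage-comparison ratio above — the share of businesses paying above the local median — is one more short calculation. As with the other sketches, the `avg_wage` field name is assumed, and businesses without wage data are left out of the denominator rather than guessed at.

```python
def pct_above_median(businesses, local_median_wage):
    """Share of businesses whose reported average wage exceeds the local
    median. Businesses without wage data are excluded — another data gap
    to note in the audit."""
    with_wages = [b for b in businesses if b.get("avg_wage") is not None]
    if not with_wages:
        return None
    above = sum(1 for b in with_wages if b["avg_wage"] > local_median_wage)
    return above / len(with_wages)
```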
Interpreting Results (10 Minutes)
Look for patterns: Are certain sectors underperforming on retention? Are smaller loans associated with better job quality? Is there a trade-off between job numbers and wages? Write down three key findings. For example: 'Our retail sector has high job creation but low wages and retention; our manufacturing sector has moderate job numbers but high wages and retention.' This insight can guide your future funding priorities. Also note data gaps: 'We don't have benefit data for 40% of businesses; we need to collect this next year.' Finally, decide on one action to take based on the audit—perhaps adjusting loan terms for a specific sector, or adding a local sourcing requirement.
Comparing Fund Types: How Different Strategies Affect Local Impact
Not all community funds are the same, and the audit will look different depending on whether you run a revolving loan fund, a grant program, a microfinance initiative, or an impact investment fund. Each has distinct strengths and limitations when it comes to local economic development. For example, revolving loan funds often have higher business retention because they require repayment and ongoing contact, but they may be less accessible to very early-stage businesses. Grant programs can support riskier ventures but may have less accountability for outcomes. Microfinance funds typically target very small businesses but may struggle to track job quality. Impact investment funds can bring larger capital but often expect financial returns that may conflict with deep community impact. The table below summarizes key differences. Understanding where your fund type sits helps you interpret your audit results in context. For instance, a microfinance fund with low average wages might be acceptable if it's reaching very low-income entrepreneurs who would otherwise have no access to capital—but you should still track whether those businesses eventually grow wages. A grant program with low retention might be fine if it's funding startups that are inherently risky, but you should then ask: are the few successes generating enough impact to justify the failures? The audit helps you have these nuanced conversations.
| Fund Type | Typical Focus | Strengths | Weaknesses | Key Audit Metric |
|---|---|---|---|---|
| Revolving Loan Fund | Small business loans | High accountability, repayment creates recycling | Lower risk tolerance, may miss neediest | Retention rate, default rate |
| Grant Program | Non-repayable funding | Can support high-risk ventures, flexible | Less sustainability, harder to track outcomes | Job quality, local sourcing |
| Microfinance | Very small loans (under $10k) | Reaches underserved, builds credit | High administrative cost per loan, low wage impact | Wage growth over time |
| Impact Investment Fund | Equity or debt with social goals | Large capital, potential for scaling | Return expectations may limit risk-taking | Multiplier effect, additionality |
When to Choose One Type Over Another
If your goal is to create stable, long-term jobs, a revolving loan fund with a focus on established businesses may work best. If your goal is to support startups and innovation, a grant program with technical assistance is more appropriate. If you're targeting very low-income entrepreneurs, microfinance combined with training can be effective. If you have large pools of capital and want to scale proven models, impact investment may be the way. Most funds actually blend these approaches—for example, a community foundation might have a grant program for startups and a loan fund for growing businesses. Your audit should consider each stream separately and then look at the combined portfolio to see if you're balancing risk and impact effectively.
Real-World Examples: What Other Community Funds Discovered
These anonymized examples illustrate how the audit can lead to meaningful changes. Example 1: A rural community fund in the Appalachian region had been making small grants to retail businesses for years. The audit revealed that 80% of these businesses closed within two years, and the few that survived paid wages below the poverty line. The fund shifted its focus to supporting local food producers and small manufacturers, which had higher retention and better wages. Over three years, the fund's average job quality score rose from 1.8 to 2.7, and retention improved to 70%. Example 2: An urban revolving loan fund targeting minority-owned businesses discovered through the audit that its loan recipients had excellent retention but were not creating many new jobs—most were solo entrepreneurs. The fund added a requirement for loan recipients to hire at least one employee within 12 months, and offered a wage subsidy to make it feasible. Within two years, the average number of jobs per loan doubled. Example 3: A community foundation's impact investment fund found that its investments in local real estate development had a high multiplier effect (because construction workers spent wages locally) but low job quality (construction jobs are often temporary). The fund balanced its portfolio by also investing in a manufacturing cooperative that offered permanent, well-paid jobs. These examples show that the audit doesn't just measure impact—it drives strategy. The key is to use the findings to ask 'so what?' and then experiment with changes.
Lessons from a Failed Fund
One community fund that did not conduct regular audits ultimately shut down after five years. It had focused on making small loans to home-based businesses, but never tracked whether those businesses grew. When a new board member asked for impact data, they found that most loans had defaulted and few businesses had expanded. The fund had been relying on anecdotal success stories that masked the overall poor performance. A simple audit would have revealed the problem early and allowed for course correction. This cautionary tale underscores that even a 30-minute check can prevent fund failure.