Farmers Market Metric Program Update from FMC Researcher, Darlene Wolnik

      Posted On: September 10, 2015

[Photo: Darlene Wolnik at the Williamsburg Farmers Market]

Indicators for Impact: Farmers Markets as Leaders in Collaborative Food System Data Collection and Analysis

Update: Market Observations 2014-2015, by Darlene Wolnik, FMC Senior Researcher, September 2015

Indicators for Impact: Farmers Markets as Leaders in Collaborative Food System Data Collection and Analysis is a project in which the University of Wisconsin-Madison, in collaboration with the Farmers Market Coalition, is identifying data measures and collection tools and helping markets and their partners integrate them into a long-term measurement strategy through training and technical assistance.

Nine markets across the U.S. are currently testing data collection and reporting protocols. The project team gathered metrics used over the last decade in farmers market and food system research and prioritized those that the market community itself could easily gather. The metrics were grouped into one or more of four types of benefit: economic (e.g., sales or job creation), ecological (e.g., land stewardship), social (e.g., new relationships), and human (e.g., skills gained or knowledge transferred).

The project’s principal investigator, Alfonso Morales, Assistant Professor at the University of Wisconsin-Madison, is leading the research for this three-year study funded by the USDA’s Agriculture and Food Research Initiative (AFRI).

The nine markets are:

  • Athens Farmers Market, Athens OH
  • Chillicothe Farmers Market, Chillicothe OH
  • Crossroads Farmers Market, Takoma Park MD
  • Hernando Farmers Market, Hernando MS
  • Oxford City Market, Oxford MS
  • Ruston Farmers Market, Ruston LA
  • Spotsylvania Farmers Market, Spotsylvania VA
  • Williamson Farmers Market, Williamson WV
  • Williamsburg Farmers Market, Williamsburg VA

As Year One of data collection winds down, it is time to begin looking at the data the markets collected in order to prepare for Year Two. Over the last year, I traveled to all nine sites and talked with the markets about their data needs, their issues with data collection, and their differing levels of capacity and context. I visited six of the markets on a day when they were collecting data for the Indicators for Impact project.

Some market characteristics:

  • The population in the markets’ municipal areas ranges from 3,090 to 125,684.
  • The markets range in age from two to over 40 years.
  • Only one of the markets operates year-round; however, some of the others also operate for part of the winter season.
  • Only one market has a full-time person overseeing it, but that person also has other responsibilities, including fundraising and strategic planning for the organization.
  • One market has no paid staff; the board does all of the market management and organizational development.
  • Three do not have formal boards overseeing the market.
  • Two of the organizations manage more than one market day per week during the summer season.

Project details

The markets selected their own metrics and their own data collection dates. They were also given four common metrics to collect, chosen by the project team (a sample record sketch follows the list). The four common metrics are:

  • Number of visitors
  • Total vendor sales
  • Average distance in miles traveled from farm to market
  • Total agricultural acres owned, leased, or managed by market vendors
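
To make the shape of that data concrete, here is a minimal sketch of how one market day’s common metrics might be recorded. The field names and sample values are hypothetical illustrations, not the project’s actual upload schema.

    # Hypothetical record for one market day's four common metrics.
    # Field names and sample values are illustrative only; this update
    # does not specify the project's actual upload schema.
    from dataclasses import dataclass

    @dataclass
    class CommonMetrics:
        market: str
        date: str                        # market day, YYYY-MM-DD
        visitor_count: int               # number of visitors
        total_vendor_sales: float        # total vendor sales, in dollars
        avg_farm_distance_miles: float   # average farm-to-market distance
        total_vendor_acres: float        # acres owned, leased, or managed

    record = CommonMetrics(
        market="Example Farmers Market",
        date="2015-07-11",
        visitor_count=1240,
        total_vendor_sales=18500.00,
        avg_farm_distance_miles=27.5,
        total_vendor_acres=640.0,
    )
    print(record)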

Four markets opted for mobile data collection and received project-supplied iPad minis. The University of Wisconsin-Madison team created a data upload site where markets enter and view data. Visitor surveys and vendor or program data could be entered on market day using the iPads, but since the majority of markets did not choose mobile data collection, most data entry was completed at a later date. In at least one case, the market’s Wi-Fi was not strong enough to use the iPads for data collection as planned. Market feedback thus far suggests that data entry is being done by market managers rather than by volunteers.

Two of the markets tried two different visitor-counting methods on the same day. The primary protocol, written by the University of Wisconsin-Madison, asked markets to count every visitor entering during the second 20-minute interval of each hour and then multiply by three to estimate that hour’s total. One market also did a “full count” (counting every adult and every child entering the market) to find out whether the two methods produced significantly different numbers. The other maintained its own version of sample counting, counting everyone in the market once per hour.
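
As a worked example of that primary protocol’s arithmetic, the sketch below estimates daily attendance from 20-minute sample counts; the counts themselves are hypothetical.

    # Estimate attendance from the 20-minute sampling protocol:
    # count every visitor entering during the second 20-minute
    # interval of each hour, then multiply by three to estimate
    # that hour's total. Counts below are hypothetical.
    sample_counts = [42, 57, 61, 38]  # one count per market hour

    hourly_estimates = [c * 3 for c in sample_counts]
    daily_estimate = sum(hourly_estimates)

    for hour, (c, est) in enumerate(zip(sample_counts, hourly_estimates), start=1):
        print(f"Hour {hour}: {c} visitors in 20 minutes -> ~{est}/hour")
    print(f"Estimated daily attendance: ~{daily_estimate}")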

All nine markets enlisted volunteers to collect the data for this project. Some offered incentives for their volunteers, such as market t-shirts, free drinks or food, or market gift certificates.

The market volunteers interviewed by the project team were knowledgeable and enthusiastic about the project and offered useful informal feedback about the logistics of data collection. During visits, volunteers were often observed doing more data collection than required or offering to return for another round of volunteering. In one case, however, some volunteers left before their assignment was complete, leaving the rest to finish the day’s data collection on their own.

At least one market used an organization that recruits corporate volunteers. Two others drafted university students to be their data collectors.

At least two of the markets had to reschedule or delay some of their data collection because of inclement weather or volunteer cancellations.

When asked, vendors were supportive of the project and knowledgeable about the need for data collection. In at least one case, market vendors’ university-aged children served as data collectors for the market. Board members were universally supportive of the data collection project, although very few participated in the data collection work.

Vendor sales were collected using two methods in Year One: vendors were asked to submit their sales totals from the previous year at the beginning of the season, and to submit anonymous sales slips to the market weekly for the current year. Where a market already gathered weekly sales data (four of the nine did), its existing method was maintained. All of the markets that had never before asked for vendor sales reported some resistance to the collection. One market had only one vendor refuse; another found passive resistance (no outright refusals, but slips promised and not turned in regularly). Two other markets expected more resistance and delayed asking for weekly sales until later in the season.

I gathered far more information during these visits than I’ve included here, but I’ll stop there. What is crystal clear to the entire project team is what most of us already know:

1. Market leaders are endlessly curious about ways to measure their own market and to understand other markets, but they have a list of tasks that often pushes that curiosity aside, especially in the summer season.

2. Markets need job descriptions and specific training materials for the seasonal volunteer and intern positions that complete regular data collection. Recruiting and managing a data collection team is a time-consuming process; it cannot start in the busy part of the season or be built from scratch each year.

Finally, thanks to all of my market hosts who were gracious in offering their time and feedback during this first year of data collection: Jenny, Maria, Helen, Gia, Leslie, Kip, Dwayne, Tom, Christie W, Katie, Betsy, Taylor, Elizabeth, Tracy, Leanne, Christie B, Michelle, Shawn, Jean and Kit.