The Power of Story

By | Art & Social Change, Art That Counts

I was heavily involved and invested in museums for the first decade of my career — as a staff member, a fellow, an intern, a volunteer and a museum studies student. So it was a delight to attend the annual meeting of the American Alliance of Museums in Baltimore this week, greeting the people in the field whom I follow avidly via Twitter and blogs, and welcoming the icons of the museum world to a city of which I’m such a fan.

Photograph of the AAM program schedule courtesy of Michelle Gomez, via Instagram.

The theme of this year’s conference was “The Power of Story.” And while that might not seem relevant to data and evaluation at first glance, it’s data that gives our stories their power. Inside museums, evaluation and measurement take some forms that might be familiar to the casual visitor (e.g., visitor surveys, comment cards, program evaluations), but also some that might be unexpected or go unnoticed, as a profile from the Wall Street Journal illustrates:

Matt Sikora doesn’t look at the Rembrandts and Rodins at the Detroit Institute of Arts. His eyes are trained on the people looking at them. Mr. Sikora watches where visitors stop, whether they talk or read, how much time they spend. He records his observations in a handheld computer, often viewing his subjects through the display cases or tiptoeing behind them to stay out of their line of sight. “Teenage daughter was with, but did not interact, sat on bench, then left,” read his notes of one visit.

It’s not uncommon for museum evaluators to shadow visitors in the galleries, learning from their movements what areas or objects are engaging and for how long. In addition, before an exhibition opens to the general public, many elements, including label text and interactive gallery displays, are prototyped and tested. Through these evaluations, exhibit designers, curators and museum educators learn more about visitors’ reactions to exhibits: which elements are engaging, confusing or overlooked. In addition, some evaluation tools also provide information about what visitors take away from their time in the gallery — what was learned, what inspired them, what connections they made and, hopefully, what will draw them back again.

What was so empowering about this year’s conference was the chance to evaluate those evaluation tools themselves. Surprisingly, technology is not always the answer. Visitor evaluation consultants and staff members from the Brooklyn Museum and Monticello shared scenarios where attempts to survey visitors went awry: technology got in the way or skewed results, or the target audience was elusive or simply avoided their polling attempts. It just goes to show that even bad data can teach you something, even if it’s not what you set out to learn!

Even more surprising was the lesson that data doesn’t necessarily persuade, no matter how clear or comprehensive. Often, beliefs trump facts. As Stephen Bitgood, Professor Emeritus of Psychology at Jacksonville State University and Founder of the Visitor Studies Association, said, “When strong belief is pitted against reason and fact, belief triumphs over reason and fact every time.” Despite our expectation that data should persuade, prove and set people on the right course, it simply doesn’t override gut instinct, what people feel or believe to be true. Again and again, presenters told tales of data being met with questions or disbelief. Unfortunately, no solutions were presented to either circumvent or resolve this issue, but I am filing this under “knowing is half the battle” and keeping it in mind when data is presented as all-powerful or all-knowing.

Photographs of the AAM display, top to bottom, courtesy of Mariel Smith via Instagram
and Lindsay Smilow via Instagram.

So evaluation and measurement can fail or go awry. Testing our tools and techniques in small batches before rolling out the full survey or other strategy gives us an opportunity to see them in action and identify areas to fix or improve. If evaluation and measurement are treated as afterthoughts, as is so often the case, these tests are even less likely to occur and, as a result, the final data may prove useless, further cementing the idea that evaluation itself is a useless activity. It’s a difficult cycle to break, but worth identifying and tackling so that we can truly tell a more powerful story.

HIPerwall Demo: Cultural Analytics, photo by Flickr user guategringo

Big Data Meets Art

By | Art & Social Change, Art That Counts

Big data isn’t something that’s just being covered breathlessly by the likes of Forbes and Fast Company; arts and culture organizations and nonprofits are generating, collecting and sifting through their own data and collaborating to make sense of it all. Initiatives like the Cultural Data Project (CDP) and the National Arts Index have been collecting and sharing data since 2004 and 1998, respectively (check out Baltimore’s Local Arts Index).

The CDP is an online tool that allows arts and cultural organizations to report, review and analyze organizational, programmatic and financial data. Originally developed through a collaboration of Pennsylvania funders, the project expanded to other states beginning in 2007 with Maryland; it now includes 12 states and the District of Columbia. Locally, the Maryland State Arts Council is a member of the Maryland CDP Task Force and requires many grantseekers to complete a CDP organizational profile. More than 14,000 arts and cultural organizations have completed a profile, including 447 Maryland organizations (as of December 1, 2012).


This data collection process results in reliable longitudinal data that is useful to researchers and advocates, as well as to grantmakers and the participating organizations themselves. Participants can run and download reports that compare their activity from year to year, as well as reports that compare them against aggregated data from other participating organizations, grouped by organization type, geography and budget size.
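To make the two kinds of comparison concrete, here is a minimal sketch in Python. The CDP’s actual data model and report formats aren’t described in this post, so the field names, organizations and numbers below are all hypothetical; the point is only to show a year-over-year comparison and a peer-cohort aggregate side by side.

```python
# Illustrative sketch only: field names and figures are hypothetical, not CDP data.
from statistics import mean

# Hypothetical profiles: one record per organization per fiscal year.
profiles = [
    {"org": "Gallery A", "year": 2011, "budget_size": "small", "attendance": 12000},
    {"org": "Gallery A", "year": 2012, "budget_size": "small", "attendance": 14500},
    {"org": "Museum B",  "year": 2012, "budget_size": "large", "attendance": 210000},
    {"org": "Theater C", "year": 2012, "budget_size": "small", "attendance": 9800},
]

def year_over_year(org):
    """Compare one organization's attendance from year to year."""
    rows = sorted((p for p in profiles if p["org"] == org), key=lambda p: p["year"])
    return [(a["year"], b["year"], b["attendance"] - a["attendance"])
            for a, b in zip(rows, rows[1:])]

def peer_average(year, budget_size):
    """Aggregate attendance across participants in the same budget-size cohort."""
    return mean(p["attendance"] for p in profiles
                if p["year"] == year and p["budget_size"] == budget_size)

print(year_over_year("Gallery A"))  # [(2011, 2012, 2500)]
print(peer_average(2012, "small"))  # 12150
```

The same grouping idea extends to geography or organization type by swapping the cohort key.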

While it has been run and organized by The Pew Charitable Trusts for the past eight years, the project is currently in transition and will begin operating as an independent nonprofit as of April 1, 2013. In addition, it has announced a collaboration with the arts and business schools at Southern Methodist University (SMU) and other partners to create a National Center for Arts Research (NCAR) at SMU. Together, these organizations aim to become a nationwide resource on arts attendance and patronage, the impact of the arts in our communities, and the financial trends and health of arts nonprofits. The new center will build upon the comparison reports currently available via the CDP:

NCAR will maintain a website with an interactive “dashboard,” created in partnership with IBM, which will be accessible to arts organizations nationwide. Arts leaders will be able to enter information about their organizations and see how they compare to the highest performance standards for similar organizations in areas such as community engagement, earned and contributed revenue, and balance sheet health.

The current shortcomings and the future potential of the CDP are outlined in a great article by Talia Gibas and Amanda Keil. Issues such as these were much on my mind as I attended the Greater Baltimore Cultural Alliance (GBCA)’s gathering of cultural data collectors. While the original invitation and some of the presentations focused on mapping data, a broader conversation also took place about the challenges local arts nonprofits face when collecting and analyzing data. I was delighted that representatives of the Baltimore tech community (Sharon Paley of the Greater Baltimore Technology Council and Kate Bladow, coordinator of the Tech and Social Change meetup) were in attendance, and that a partnership with the GBTC has resulted.

There was some discussion of forming an ongoing group around these issues and, should that come to fruition, I look forward to the opportunity to participate further and to meet more individuals in the arts and nonprofits who are looking for data-driven answers about the impact of their work. One of my major takeaways from the session, however, was that my consideration of data shouldn’t be limited to metrics of impact; data also has the power to describe our community. I look forward to highlighting some of the work already ongoing in Baltimore in future columns.

Metrics for Joy and Life

By | Art & Social Change, Art That Counts

Many art programs and projects exist because they seem like good ideas — some because they made good use of an existing space, others because they have good intentions to draw attention to or even solve a community problem. As an example, I present the missions of two now-defunct Baltimore projects:

  1. Operation: Storefront: To match landlords of vacant spaces with tenants to fill space and create life on the street.
  2. Black Male Identity Project (BMI): [To serve] as a catalyst for a national campaign to build, celebrate, and accentuate positive, authentic images and narratives of black cultural identity.

Both of these projects had laudable goals. But there can be substantial difficulty in evaluating the successes and failures in achieving such goals. What does it mean to be a catalyst on a national level or for there to be “life” on a street? Furthermore, how should these things be measured? What can they be compared to?

Evaluations such as these are often treated as an afterthought, and usually as an angst-inducing or frustrating rite of passage on the way to receiving funding. Clayworks’ founder, Deborah Bedwell, once wrote:

…when I would see the words ‘measurable outcomes’ on a grant proposal, I would experience a wave of nausea and anxiety. I would be required, the grant stated, to prove to the prospective funder that our programs and activities had created a better life for those who touched clay and for the rest of the city — and maybe the rest of humanity.

So, just as an organization or project’s mission and goals can be far-reaching and even dramatically overstated, the bar for measurement can also seem impossibly high. In an effort to create one-size-fits-all metrics, some have focused on the things that are most obvious and simple to identify and measure, such as attendance or economic impact. For some organizations and projects, even these metrics can be challenging. For example, how should attendance at a mural or other public work of art be estimated? Some sites are using QR codes to track visits, but the requirement of a smartphone is an obvious limitation on the resulting data. Some funders have developed their own gauges, such as ArtPlace’s vibrancy indicators, in an effort to create a level playing field among grantseekers and in the hope of building a larger, more useful pool of data about the results of their activities.
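The QR-code limitation above comes down to a scaling problem: only some fraction of viewers will ever scan the code, so the raw count has to be scaled up by an assumed scan rate. A back-of-envelope sketch, with entirely hypothetical numbers and a scan rate you would need to calibrate (say, by pairing scan counts with in-person observation on a few sample days):

```python
# Back-of-envelope sketch: estimating total visits to a public artwork from
# QR code scans. The scan rate is an assumption, not a measured value.
def estimate_attendance(qr_scans, assumed_scan_rate):
    """If only a fraction of viewers scan the code, scale the count up."""
    if not 0 < assumed_scan_rate <= 1:
        raise ValueError("scan rate must be a fraction between 0 and 1")
    return round(qr_scans / assumed_scan_rate)

# Hypothetical: 85 scans in a month, assuming roughly 2% of viewers scan.
print(estimate_attendance(85, 0.02))  # 4250
```

The estimate is only as good as the assumed rate, which is exactly why the smartphone requirement skews the data: the scan rate varies with the audience.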

In the case of Clayworks, Bedwell was interested in capturing and communicating something beyond raw participation numbers for their community arts program, and saw the need to “figure out how to evaluate joy, how to measure creativity, and how to quantify that ‘I get it!’ moment that makes weeks of hard work worth the effort.” While many might give up before starting such an effort, Clayworks received assistance from the Maryland Association of Nonprofits in tackling their evaluation dilemma; they adopted a model used by The Kellogg Foundation, which Bedwell described enthusiastically in an article for the NEA’s web site. (Note: This article is no longer online, but is available as a PDF download. All of Bedwell’s quotes are originally from this article.)

So, if one can measure the joy found in creating, then it is likely also possible to measure — with adequate thought and planning — the “life” or vibrancy of a street or neighborhood, or the changes in attitudes inspired by a photograph or a lecture. It’s important for these challenging metrics to be tackled and shared, not just so funders can identify return on investment, but also so artists and communities can benefit: pointing to their successes and knowing which efforts are worth continuing and repeating.

I plan on diving into this in even greater detail in future posts, as well as continuing to highlight existing art projects and their impact. If you want to share some insights about your organization or project, I invite you to join me in the comments or to reach out to me via Twitter or email.

Noteworthy:

If you are inspired by or involved in the intersection of arts, culture and community, these upcoming events may be worth checking out:


PHOTO CREDIT. Photo of entrance to the Franklin Building in Chicago by Flickr user Terence Faircloth/Atelier Teee.