Cracking the Code: A Science Media-Research Collaboration

This article is one in a multipart series exploring the unique media practitioner-academic research collaboration of Cracking the Code: Influencing Millennial Science Engagement (CTC), a three-year Advancing Informal STEM Learning (AISL) research project funded by the National Science Foundation (NSF) and carried out by KQED, a public media company serving the San Francisco Bay Area, in partnership with Texas Tech and Yale universities. KQED has the largest science reporting unit in the West, focusing on science news and features, including the YouTube series Deep Look.

The author, Scott Burg, is a Senior Research Principal with Rockman et al.

“The goal of collaboration is not collaboration, but better results.” — WT Hansen

Finding a research partner

In any collaborative research project, it is not always easy for a practitioner to find a research partner whose work style and professional experience align with, and respect, the other individual’s or organization’s way of thinking. Finding the right set of complementary skills and knowledge in a research partner is especially important in creating practitioner-academic collaborations. As we have reported throughout this series, forming and sustaining these collaborations is no easy matter, given the numerous substantive, procedural, and methodological differences between the two domains.

To serve as research partners on Cracking the Code, KQED science staff selected renowned science communication researchers Dan Kahan (Yale) and Asheley Landrum (Texas Tech). KQED staff felt the research team would be a good match given Kahan and Landrum’s joint research on science curiosity, motivated reasoning, and science media engagement. Kahan, Landrum, and their colleagues had also created and validated the Science Curiosity Scale (SCS), a measure of an individual’s general motivation to consume information about scientific discovery for personal satisfaction. The SCS was a key data collection instrument in helping to identify and build the project’s collection of social media profiles.
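
To give a sense of how a curiosity measure like the SCS can feed into audience profiling, the sketch below aggregates hypothetical survey items into a single score. It is a deliberately simplified illustration: the published SCS is validated and scored with item response theory, and the item names and response scale here are invented.

```python
# Hypothetical, simplified sketch of scoring a science-curiosity survey.
# The real SCS is scored with item response theory; this sum-score version
# only illustrates aggregating self-report items into one curiosity measure.

RESPONSE_SCALE = {"not at all": 0, "a little": 1, "some": 2, "a lot": 3}

def curiosity_score(responses: dict[str, str]) -> float:
    """Average the coded responses so scores stay comparable
    even when a respondent skips an item."""
    coded = [RESPONSE_SCALE[answer] for answer in responses.values()]
    return sum(coded) / len(coded)

# Example: three invented items loosely resembling SCS-style questions.
respondent = {
    "reads_science_news": "a lot",
    "watches_science_videos": "some",
    "discusses_discoveries": "a little",
}
print(curiosity_score(respondent))  # 2.0
```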

While well-versed in science communication theory, the research team from Yale University, which led the first CTC studies, did not have deep digital media or journalism expertise with newer platforms like YouTube. Since Deep Look is distributed exclusively on YouTube, this lack of experience with digital platforms and media production frustrated the KQED science staff. The absence of a common experiential framework slowed the collaborative development of research questions for the first series of CTC-Deep Look studies.

I felt like a subject matter expert more than a collaborative collaborator. Some of the researchers were really interested in abstracting the study from their perspective in order to mitigate any biases. In this context, we were trying to meet the researchers halfway between practical knowledge and research. In the beginning, we weren’t successful in trying to move the focus of our research questions out of the more academic and career contexts. — KQED science media staff

A number of KQED staff felt that, when developing these research questions, it would have been helpful to consult a media practitioner from CTC’s Advisory Group. Unfortunately, the Advisory Group had very little involvement over the course of the project.

I felt like we started with advisors but then never heard from them again. It would have been really valuable to solicit their feedback because a lot of those advisors are practitioners. It would have been great to cycle back to them and say you know we’ve reached this phase and now we’ve determined our research questions. What do you think? — KQED science media staff

Landrum and her team at Texas Tech University, however, were familiar with digital media studies, and with YouTube in particular. Her team guided studies in Years 2 and 3 that strengthened the research process and addressed KQED’s questions more directly. Jocelyn Steinke (Associate Professor, Department of Communication, University of Connecticut), who joined the research team in Year 3 to conduct a study on young women’s science identity, also had a background in science media and journalism. Her experience in these areas helped facilitate the work with KQED.

Both the Deep Look and science news teams appreciated the level of media expertise that Landrum and Steinke brought to the project. Their collective media experience gave these researchers backgrounds and interests in common with the KQED science team, which helped shape and sustain a shared focus and set of commitments.

Landrum, Steinke, and their teams also brought experience applying sophisticated data collection tools and methods that enriched the project’s understanding of diverse audiences. Landrum and her team collaborated with KQED to spearhead a series of studies driven by social media data analytics, in particular the Deep Look titles study (a complete analysis of episode titles and themes) and the development of a research instrument for the science news team’s “awe” study. Each of these studies strongly influenced the work of the respective KQED science staff.

In her study, Steinke combined qualitative research methods, such as semi-structured interviews, with natural language processing (NLP) to develop a more nuanced understanding of the resulting themes and trends. These combined methods, and the knowledge of how to apply them in interpreting data tied to science media and journalism audiences, gave KQED practitioners a level of detail and understanding directly relevant to their practice.
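
As a rough illustration of how NLP can complement human coding of interviews, the sketch below applies topic modeling to a handful of invented transcript excerpts. The excerpts, topic count, and tooling choices are assumptions made for demonstration, not Steinke’s actual pipeline.

```python
# A minimal sketch of NLP-assisted theme analysis: factor interview
# transcripts into latent "themes" and surface their top terms for a
# human analyst to interpret. Transcripts below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

transcripts = [
    "I never saw women scientists on TV growing up, so I didn't picture myself in science.",
    "The host explained the experiment in a way that made me feel like I could do it too.",
    "Seeing a researcher who looked like me made the career feel possible.",
]

# Convert transcripts to TF-IDF features, dropping common English stopwords.
vectorizer = TfidfVectorizer(stop_words="english")
features = vectorizer.fit_transform(transcripts)

# Factor the document-term matrix into 2 latent themes.
model = NMF(n_components=2, random_state=0)
model.fit(features)

# Print the words that most characterize each theme.
terms = vectorizer.get_feature_names_out()
for i, weights in enumerate(model.components_):
    top = [terms[j] for j in weights.argsort()[::-1][:5]]
    print(f"Theme {i}: {', '.join(top)}")
```

In practice the model only proposes candidate themes; the researcher still reads the underlying passages to decide what each theme means, which is where the interview work remains essential.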

It’s been really valuable to learn different ways to understand our audiences specifically by applying research tools. That’s something I had limited exposure to. I was very impressed with their innovative application of tools. — KQED science staff

Having worked with practitioners, and understanding the issues and practices of science media production and dissemination, Landrum, Steinke, and their teams were able to incorporate practitioner knowledge and experience into the research process. This was particularly evident in Steinke’s ability to build consensus when working with Deep Look staff on the science identity study. During that study, both KQED and the research team repeatedly demonstrated their willingness to revise and iterate instrumentation and methods based on feedback from the other.

Vetting technology and a technology partner

A particularly innovative component of the CTC research design was the incorporation of advanced data analytic methods and tools to better understand audience engagement in a digital space. To complement the work of the research teams from Yale and Texas Tech and to facilitate this cutting-edge digital analytic work, KQED contracted with two additional firms: DeltaV, a digital strategy firm that helps organizations design, develop, and execute digital outreach and platform strategies and understand how online content performs across audiences, and Crimson Hexagon (now Brandwatch), developer of a sophisticated social media listening tool. KQED planned to apply social listening tools to build a more nuanced understanding of exactly how audiences think about its “product” (i.e., science content and news) by analyzing what they say on social media channels.
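
At its core, social listening amounts to classifying and tallying what audiences say about your content. The sketch below shows that basic idea on a few invented comments; it does not use the Brandwatch API, and the theme keywords are assumptions chosen purely for illustration.

```python
# Illustrative sketch of the social listening idea: tally how often
# audience posts touch on themes of interest. Posts and keyword lists
# here are hypothetical stand-ins for real social media data.
from collections import Counter
import re

posts = [
    "That Deep Look episode on ticks was terrifying but I learned so much",
    "love how the narrator explains the science without dumbing it down",
    "the macro photography in this series is stunning",
]

themes = {
    "learning": {"learned", "explains", "science"},
    "production": {"photography", "narrator", "macro"},
    "emotion": {"terrifying", "love", "stunning"},
}

counts = Counter()
for post in posts:
    words = set(re.findall(r"[a-z']+", post.lower()))
    for theme, keywords in themes.items():
        if words & keywords:  # post mentions at least one theme keyword
            counts[theme] += 1

for theme, n in counts.most_common():
    print(f"{theme}: mentioned in {n} of {len(posts)} posts")
```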

In hindsight, KQED could have been more thorough in vetting both companies and their capabilities. As staff later discovered, neither Brandwatch nor DeltaV had substantive experience working with public media or other broadcasting clients, and neither had a strong understanding of the kinds of audience analysis questions that matter in science media production or science communication research. This combined lack of knowledge and experience hampered communication and project management between KQED staff and the two consultant organizations.

Given the consultants’ collective lack of efficiency and support, a number of project staff wondered why it was necessary to contract with two separate outside organizations to conduct social listening activities at all. Some suggested that it might have been more productive, and a better use of resources, had Brandwatch simply trained CTC project staff to use its tools and methods, building project capacity and sustainability in the process.

Over the course of two years, KQED staff spent countless hours trying to explain to Brandwatch and DeltaV what Deep Look was about and the kind of audience information they needed. Ultimately, KQED and the research team jointly concluded that the Brandwatch tool lacked the capacity and the YouTube data to provide the audience demographic information (e.g., age, profession) of most interest to them, and that it should never have been used for Deep Look-related testing.

During Years 2 and 3, a member of KQED’s science team was trained to use the Brandwatch tool and became proficient at conducting social media audience analyses more in line with project needs. Unfortunately, the training took longer than anticipated.

It would have been preferable if we had taken more time when conceptualizing the project to think about the roles of bringing in outside consultants and the possibility of training staff to do these kinds of tasks. If training were a priority from the beginning, and the tool was more reliable, I don’t think the training itself would have taken as long. As it was we had to be creative in how we could use their (Brandwatch) product. — KQED science staff

While the tool failed to meet CTC’s specific needs, the experience of using it heightened awareness within KQED of the potential of social media platforms for audience engagement research. Increased use of social media as a research and engagement tool also strengthened the connection and collaboration between KQED and the research team.

I’ve been reaching out to members of the research team to ask about literature related to social media engagement. They’ve been very responsive. It’s good to know that social media engagement is being discussed on the academic side. — KQED staff

A big takeaway from the social listening experience, especially for KQED’s engagement team, was a deeper knowledge and understanding of tools and methods for conducting social media analytics with diverse audiences across different platforms, knowledge that applies to engagement work throughout the organization. Research team members also came away with a greater appreciation of the utility of social media analytics.

What I’ve learned from this experience is I would definitely consider a more focused approach to audience research in our everyday work. Rather than just sharing content every day, I need to focus more on what people are talking about. This is what every engagement producer should do. It will help me take a better approach to understanding what questions people have in the community. — KQED science staff
