Applications Open for NCAN’s Summer Idea Incubator Series; Apply by June 20

June 2, 2016

Last month, NCAN announced a series of “Idea Incubators” that would take place this summer at sites across the country. These Incubators will bring representatives from member programs together in person to tackle a shared concern or challenge around the use of data, evaluation, measurement, and/or the Common Measures. Thank you to all of the members who expressed interest or suggested topics for these meet-ups.

NCAN is pleased to announce four Idea Incubators this summer. Members can apply to send a representative to these Incubators. Applications are due by close of business on Monday, June 20. NCAN will cover travel, lodging, and meal costs of five applicants for each of the four sites. Pending space, members geographically proximate to the host sites may be invited to attend at their own expense.

These 1.5-day meet-ups will each tackle a specific topic (more on those below). Through identifying members’ problems, outlining potential obstacles, doing real-time research on the issues, and creating action plans or next steps, members will collaborate to find solutions to data-related problems. Applicants should be prepared to be fully engaged in this work for the entirety of the Incubator; they need not be experts in the topic, but they should bring interest, curiosity, creativity, and a willingness to work and cooperate with others in a spirit of open dialogue.

Descriptions of each of the four Incubators are below. Members with questions, concerns, or comments should contact Bill DeBaun, Program Analyst, at 202-347-4848 x202. Thank you for your interest in and enthusiasm for this exciting undertaking!

Incorporating Fit and Match Into College Advising and Program Data
Host: Degrees of Change (Tacoma, WA)
Dates: July 12-13

The concepts of “fit” and “match” have become prominent in recent years in the college access and success literature as ways to promote student success. How can programs best ensure that the students they serve are matriculating to colleges or universities that will do well by them and offer a good chance of successful outcomes? This Idea Incubator will consider how to incorporate fit and match into a program and address, according to the needs and interests of participants, some or all of the following questions:

  • What do we mean by “fit” and “match”?
  • What are some ways to measure the concepts of fit and match and how do they fit into data systems?
  • What does research say about fit and match, and what can we learn from the research about concrete practices for implementation?
  • How can advisors best steer students to institutions that are good fits or matches?
  • What are the implications for staff training and development for incorporating fit and match?
  • What are the implications for program-institution partnerships for promoting good fit and match?
  • What kind of continuous evaluation of institutional fit and match is possible for a program to implement?

Managing the Database Lifecycle and Ensuring Quality Data
Host: College Forward (Austin, TX)
Dates: July 19-20

Data and the database are clearly two important components of the data-driven organization. Unfortunately, there are pitfalls to beware of with both. For example, some programs stick for too long with systems that inadequately meet their needs, or don’t think carefully about designing or procuring a system that really does meet their needs. Additionally, programs sometimes do not have a careful eye toward processes that ensure data quality. This Idea Incubator will consider the database lifecycle, data integrity, and data management, and will work through, according to the needs and interests of participants, some or all of the following questions:

  • By what measures do programs assess how well a database is meeting their needs?
  • How can organizations effectively manage the database lifecycle from procurement to retirement?
  • How can data be structured to specifically record college outcomes for students?
  • What are the green and red flags to consider when comparing database vendors?
  • How can a program find a database that meets the needs of staff at various levels?
  • What is data integrity and how does an organization know if they have it?
  • How can an organization create, or improve, its policies and processes around data management?
  • What are some common forms of user error? How can they be mitigated?
  • How are data from databases best visualized? What are some best practices in this area?
  • How do organizations avoid duplicate data altogether and how do they handle it when they have it?
  • What are some database and programmatic controls that can be built into a system to promote higher quality data?

Leveraging Data and Analysis for Program Improvement, Inquiry, and Growth
Host: The SEED Foundation (Washington, DC)
Dates: July 26-27

Many NCAN members use data to improve program performance and/or scale program capacity, but the idea of expanding the use of data for both of these ends can be daunting. Organizations that feel they are somewhat data-driven may want to become more data-driven or use data to change and test elements of program design; they may be eyeing an expansion and have concerns about scaling up the use of data across multiple sites on an existing infrastructure. Additionally, they may want to enhance their program staff’s ability to use data to inform their work or to communicate across multiple sites nationwide. This Idea Incubator will consider taking data use to the next step and will work through, according to the needs and interests of participants, some or all of the following questions:

  • What are some approaches for organizations to take stock of their current data capacity?
  • Who are the stakeholders that need to be involved in an organization’s expansion of data collection, tracking, and use? To what degree(s) do they need to be involved?
  • What are some obstacles involved with implementing data-driven processes or a cycle of inquiry? How can they be surmounted?
  • What are the professional learning community, cycle of inquiry, and continuous improvement models, and how can they be adapted for fostering data use?
  • How can approaches like dosage analyses be incorporated into a program’s data collection, and how well do these approaches scale?
  • How can using data help to expand the number of students a program can serve?
  • How does data infrastructure need to adjust with program expansion?
  • How intentional is your current program design? How do you know?
  • Where do logic models and theories of action fit into the process of using data for improvement and expansion, if at all?

Collecting and Reporting Metrics across Organizations, Networks, and Collaboratives
Host: Iowa College Aid (Des Moines, IA)
Dates: August 9-10 (*note updated dates*)

In the collective impact environment (whether formal or informal), where multiple organizations and stakeholders rely on each other’s work to pull in a common direction, it can be difficult to get onto the same page in terms of collecting, managing, and reporting metrics. Different organizations have different missions, scopes, approaches, and personalities, and these can (and often do) affect the way data are collected, tracked, and reported. Reporting data with similar accompanying characteristics about our programs allows for better comparison and evaluation. This Idea Incubator will consider college access and success metrics from an inter- or cross-organizational lens and will work through, according to the needs and interests of participants, some or all of the following questions:

  • How can stakeholders in nascent collective impact efforts lay an early foundation of coordinating the way their metrics are defined?
  • How can stakeholders in the middle of a collective impact effort change course to begin defining their metrics in a common way?
  • How can organizations mitigate the challenges of conflicting missions and scopes?
  • When personalities are the obstacle for collecting and reporting metrics, how can that challenge be overcome?
  • How can collective impact efforts evaluate their progress formatively and summatively?
  • What are the best formats in which to report data?
  • How can organizations be kept accountable to each other, to the public, and to funders through shared metrics?
  • What are the best ways to leverage shared/collective metrics for advocacy purposes?
  • How can different organizations come to define the same metrics in the same way?
  • Are there specific formal models or approaches around coordinating common metrics for multiple organizations?
  • What are the technical processes through which organizations share data?
  • What are some best approaches or next steps for advancing common reporting of NCAN’s Common Measures?

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License