FAQs

Photo courtesy of Alamo Colleges

General Questions

Q: What is the difference between NILIE, the National Religious Diversity Study, and the PACE, IDEALS, and CRSCS surveys? A: NILIE is the organization that encompasses multiple survey efforts. The National Religious Diversity Study uses the IDEALS (longitudinal) and CRSCS (cross-sectional) surveys and is administered by a research team separate from the PACE Survey team. Please indicate on the NILIE Contact Us page which survey you would like more information about.

PACE Survey

Q. I want to launch my PACE survey soon. Can you accommodate me? A: We can most likely accommodate a quick turnaround; however, there are two stages at which clients may need more time than they expect. First, PACE staff require a completed service agreement before work can begin, and institutions may need to route that agreement through their legal department. Second, institutions often form a taskforce to develop custom questions, and drafting those questions and providing a contact list can take longer than anticipated.

Q. How many custom questions may we add to the PACE survey? A: Clients may add up to 20 custom questions that begin with “The extent to which…”. Clients may also include up to 3 custom demographic questions.

Q. How long should we leave the PACE survey link live? A. We recommend leaving the survey open for at least three weeks to gather as many responses as possible.

Q: How many reminders will you send out while the survey is live? To whom will you send those reminders? A: We generally send three reminders over the three weeks the survey is open to encourage participation. Reminders are sent only to invitees who have not yet completed the survey.

Q. How can we increase our survey response rates? A: Encourage the President or another member of your institution’s leadership team to email participants before the survey is distributed, letting them know they will receive an email from PACE staff and stressing the importance of completing the survey. The subject of our email is “PACE Survey- Institution Name” and it is sent from noreply@qemailserver.com. We also highly encourage marketing the survey to participants in as many ways as possible (e.g., announcements at meetings, flyers on bulletin boards) to get the best response rate.

Q: If the answers are confidential, how do you know I didn’t already take the survey? A: Your responses are confidential but not anonymous. PACE staff will not report whether or not you take the survey; however, each invitee has a unique link to access it. Your answers are completely confidential and will be released only as summaries in which no individual’s answers can be identified. When you submit your survey, your name is deleted from the mailing list and is never connected to your answers in any way.

Q: Will answering certain demographic questions make it obvious who I am? A: To protect respondent confidentiality, any variable with a response rate of less than 5% of the total will be excluded from the data provided to the institution. If you are part of a group that makes up less than 5% of all responses, your group will be combined with another group.

Q. Will you be able to tell who said what in the open-ended comments? A: Open-ended comments are separated from all other survey items including demographic questions and “the extent to which…” items. The institution will receive open-ended comments in the aggregate. There will be no way to connect comments to respondents. Additionally, PACE staff review the comments and edit as necessary to maintain respondent confidentiality.

Q: Is the Part-time Faculty Subscale only administered to respondents who indicate they are part-time and faculty? Are the Racial Diversity and Institutional Structure Subscales administered to all respondents? A: Yes, the Part-time Faculty Subscale is administered only to respondents who identify as part-time faculty, while the Racial Diversity and Institutional Structure Subscales are administered to all respondents.

Q: How are the items on the Institutional Structure Subscale different from the Institutional Structure climate factor questions on the PACE survey? A: Historically, institutions score lowest on the Institutional Structure climate factor of the standard PACE. As a result, we developed the Institutional Structure Subscale to delve deeper into these issues. The questions on the subscale center on mission, leadership, decision-making and influence, policies and structural organization, teams and cooperation, and communication and information sharing. Please review an Institutional Structure Subscale Sample Report.

PACE Report

Q: Since the report format has changed, can we still include a comparison to our previous administration? A: Yes, in the report you are able to select three comparison dimensions. One of these dimensions can be your previous administration as long as you have administered the survey within the last five years. If you have administered the PACE survey more than once in the last five years, you can also choose previous administrations as your second and third comparison dimension.

Q: Why can we not include a comparison to a previous administration from more than five years ago? A: The PACE Survey team does not consider a comparison between two administrations more than five years apart to be meaningful. Although the PACE survey instrument has not changed, you can request raw data from your previous administration for an additional fee.

Q: How do the comparison groups work? A: In the report, you can select three comparison dimensions from our norm base. Most institutions select their previous administration, the entire NILIE norm base, and institutions of a similar size. When you select a Carnegie, IPEDS, or Census classification, your institution is already classified in each category based on the information provided by your institutional research team. As a result, your comparison group for the Carnegie, IPEDS, or Census classification will be your institution’s own classification. For example, if you are a Medium 2-year institution, you will be compared to other Medium 2-year institutions.

Q: Why is the report no longer organized around the personnel classification variable? A: We offer a personnel classification report so that your institution can browse means by Administrator, Staff, and Faculty groups. Additionally, in our demographic report we standardized all of our demographic items so that institutions can focus on the demographic that is most important to them and still see comparisons to other institutions. For example, institutions focusing on generational issues may pay particular attention to our age demographic item, while institutions concerned with racial diversity may focus on that variable.

Q: Why do you not include means for groups with a 5% or lower response rate in our report? A: To protect respondent confidentiality, we will redact means from any response option with a 5% or lower response rate. The frequencies for these groups will still be reported.
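For readers curious how this suppression rule works in practice, here is a minimal sketch in Python. This is a hypothetical illustration of the stated policy only; the function, data, and group names are invented and do not represent NILIE’s actual processing code.

```python
# Hypothetical sketch of the suppression rule described above: means for any
# response option with a 5% or lower response rate are redacted, while
# frequencies for those groups are still reported.

def summarize(groups, total_responses, threshold=0.05):
    """Return each group's frequency, and its mean only when the group's
    share of all responses exceeds the threshold."""
    summary = {}
    for name, scores in groups.items():
        share = len(scores) / total_responses
        summary[name] = {
            "frequency": len(scores),
            # Mean is redacted (None) when the group's share is <= 5%.
            "mean": sum(scores) / len(scores) if share > threshold else None,
        }
    return summary

# Illustrative data: one large group and one group below the 5% threshold.
responses = {
    "Faculty": [4, 5, 3, 4] * 10,  # 40 responses, well above 5%
    "Adjunct": [2, 3],             # 2 of 42 responses (~4.8%) -> mean redacted
}
total = sum(len(v) for v in responses.values())
result = summarize(responses, total)
```

Running this, the small group’s frequency (2) is still reported, but its mean comes back as `None`, mirroring the redaction described in the answer above.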

IDEALS Survey

Q: What does the IDEALS measure and how is the survey constructed? A: IDEALS is designed as a longitudinal study with pre- and post-tests to assess change over the duration of students’ collegiate careers. Based on Astin’s (1993) I-E-O model, IDEALS measures a range of items to capture students’ input characteristics, environmental experiences (e.g., college experiences and engagement), and outcomes.

Q: Which campuses are eligible to participate in IDEALS? A: All four-year U.S. colleges and universities are welcome and eligible to participate. If interest exceeds the 120-institution cap, the study team reserves the right to select the institutions that best meet the study criteria.

Q: What is the commitment expected from participating campuses? A: Campuses must commit to administering the IDEALS instruments at three time points to a sample (or census) of the 2015 entering first-year student cohort. The surveys are scheduled to be administered in fall 2015, spring 2016, and spring 2019. In terms of human resources, institutions need to designate a person or department to set up, advertise, and implement each survey. Institutional dedication to successful survey implementation will help ensure strong response rates and retention of participants through the duration of the four-year process. As for financial commitment, participation is free, including administration costs and the participant incentives used to increase response rates.

Q: What kind of data about my campus will we receive if we participate? A: Each campus will receive data sets and custom reports for each administration of the survey (3 total data sets and 3 total reports). Institutions that register before February 2, 2015 will receive reports that include peer group and national comparisons for benchmarking purposes.

Q: What kind of incentives will be provided to encourage student response? A: At this time, we are investigating various incentive programs to determine which program will result in the greatest response and completion rates. Once an incentive program is determined, we will notify participating institutions. As noted above, participant incentives are provided free of charge.

Q: What kinds of support will the IDEALS research team provide to help ensure successful administration? A: Each institution will be assigned a dedicated contact person from the study team to help with administration preparation and troubleshooting. This designated contact will provide resources to help you successfully implement the study as well as assist with any administration issues or concerns you may have.

Q: How is IDEALS funded? A: The planning year for this study is funded by a non-religiously affiliated organization that supports initiatives intended to foster constructive dialogue across difference; the funder has chosen to remain anonymous.

Q: What’s the difference between IDEALS and the CRSCS? A: Although related, the IDEALS and CRSCS have slightly divergent foci. The study team utilizes both instruments to better understand student worldview commitment, perceptions of “other” worldview populations, and institutional factors that may influence interfaith attitudes and interaction. As a cross-sectional survey, the CRSCS focuses on student perceptions of campus climate around religious and philosophical worldviews. Because the CRSCS is administered to a random sample of all students, it provides a snapshot of the campus climate and the ways in which students are engaging in various worldview experiences. By contrast, IDEALS is administered only to the incoming class of fall 2015, tracking their pre-college perceptions and behaviors as well as collegiate involvement and experiences. Due to its longitudinal structure, IDEALS will help campus administrators understand how the collegiate experience impacts student outcomes related to interfaith cooperation.

Q: What is the required sample for each campus? A: Sample size varies by campus. The study team is mindful of the number of surveys already given to first-year students and is sensitive to each campus’s survey schedule and sampling preferences. However, in light of current retention-rate trends for longitudinal studies, the study team recommends a 50% or census sample when possible. Larger samples help institutions capture robust data over the duration of the project, allowing greater analytic flexibility with the final data set.

Q: Are there additional packages a campus can purchase? A: Your campus can add ten custom questions to the survey for $500. The ten custom questions must remain the same across all three iterations of the survey. Institutions that register after February 2, 2015 will not receive free benchmark reports but can purchase them for an additional cost.

Astin, A. W. (1993). What matters in college: Four critical years revisited. San Francisco: Jossey-Bass.

Photo courtesy of Harford Community College

CRSCS Survey

Q: What does the CRSCS measure and how is the survey constructed? A: The Campus Religious and Spiritual Climate Survey (CRSCS) is a theoretically based and empirically validated assessment tool designed to assist campus leaders in creating positive climates that embrace the challenges, and realize the possibilities, of supporting diverse religious and non-religious worldviews on campus. The climate scales on the survey are grounded in the framework established by Hurtado, Milem, Clayton-Pedersen, and Allen (1999), which models the interrelated elements of campus climate for racial/ethnic diversity. It is designed to help your campus answer such questions as:

  • Do students perceive campus as a safe space for diverse religious and non-religious identities, beliefs, and practices?
  • What are the most positive aspects of campus climate? What areas of campus climate present challenges or opportunities for improvement?
  • How do students respond to and interact with others representing different worldviews?
  • What are students’ attitudes toward diverse worldviews? How do they perceive their own capacities to effectively engage religious diversity?

Q: Which campuses are eligible to participate in the CRSCS? A: All four-year U.S. colleges and universities are welcome and eligible to participate, at their own cost.

Q: What is the commitment expected from participating campuses? A: The CRSCS is a cross-sectional survey designed to be administered at one point in time to a sample of students at your institution. There are two administrations per year, one in the fall and one in the spring. (Note, however, that the CRSCS will not be administered in the 2015-2016 academic year as we prepare to launch IDEALS.)

Q: When will the next administration of the CRSCS take place? A: Registration for the spring 2015 administration of the CRSCS is now closed. The CRSCS will not be administered in the 2015-2016 academic year as we launch the Interfaith Diversity and Experiences Longitudinal Survey (IDEALS). We anticipate that the CRSCS will next be administered in fall 2016, in which case we will begin enrolling interested institutions in spring 2016.

Q: What kind of data about my campus will we receive if we participate? A: Each campus will receive a data set and a custom report with national comparison data.

Q: What’s the difference between IDEALS and the CRSCS? A: Although related, IDEALS and the CRSCS have slightly divergent foci. The study team utilizes both instruments to better understand student worldview commitment, perceptions of “other” worldview populations, and institutional factors that may influence interfaith attitudes and interaction. As a cross-sectional survey, the CRSCS focuses on student perceptions of campus climate around religious and philosophical worldviews. Because the CRSCS is administered to a random sample of all students, it provides a snapshot of the campus climate and the ways in which students are engaging in various worldview experiences. By contrast, IDEALS is administered only to the incoming class of fall 2015, tracking their pre-college perceptions and behaviors as well as collegiate involvement and experiences. Due to its longitudinal structure, IDEALS will help campus administrators understand how the collegiate experience impacts student outcomes related to interfaith cooperation.

Q: What is the cost to participate in the CRSCS? A: The CRSCS will not be administered during the 2015-2016 academic year as we launch IDEALS. Fees will be posted when the CRSCS begins enrolling colleges and universities for future administrations.