

ETS Internship, Fellowship and Visiting Scholar Programs in Research

Collaborate with ETS researchers to carry out innovative and impactful R&D projects.

Learn more about available internships and how to apply.


2023 Summer Research and Measurement Sciences (RMS) Internship Program for Graduate Students

If you are a creative and innovative individual who wants to help shape the future of learning and assessment, we encourage you to apply for the 2023 Summer Research and Measurement Sciences (RMS) Internship program.

RMS conducts rigorous foundational and applied research on the most critical issues facing education and the workforce. Central to ETS’s legacy of global leadership in learning and assessment, RMS is dedicated to advancing the science and practice of measurement, driving innovation in digital assessment, learning and teaching, and expanding equity of opportunity for all learners.

As an intern in RMS, you’ll work with experts who are nationally and internationally known as thought leaders, trusted advisors and go-to collaborators for their high-impact work addressing significant educational and societal goals.

If you’re accepted into the program, you’ll collaborate with scientists on projects related to foundational and innovative topics in education and will participate in research tasks, including project planning, data analysis, and writing. Upon the completion of the program, you’ll have the opportunity to present your findings to teams across R&D.

Tasks

You’ll perform a variety of tasks associated with a research project, such as:

  • developing a research plan to address a critical issue and executing the plan
  • performing a literature review and developing new frameworks
  • assisting with data collection
  • conducting various psychometric and statistical analyses with real or simulated data
  • preparing a proposal for a conference
  • drafting a research report and documenting research instruments
  • learning specialized procedures or software if the project requires it

At the end of the internship period, you will create and give presentations about your project to teams across R&D.

Doctoral students who have completed at least 2 years of study in one of the following areas, or a related field, are encouraged to apply:

  • psychology
  • education
  • psychometrics
  • measurement
  • statistics
  • cognitive or learning sciences
  • data science

Eligibility requirements

  • Current full-time enrollment in a relevant doctoral program
  • Completion of at least 2 years of coursework toward the doctorate prior to the program start date

In a given year, you may apply to only one of the RMS, ETS® AI Labs™ or NAEP internship programs. Please apply to the internship program that best fits your qualifications and research interests.

Selection

The main selection criteria are scholarship and how well the applicant’s interests and experience match the research projects.

We value team members who bring a diversity of interests and lived experiences to RMS. We strongly encourage students from underrepresented groups and backgrounds to apply. Late or incomplete applications will not be considered.

Complete the electronic application form. On the application form:

  1. Choose up to two research areas that interest you and provide written statements describing your interest in each area and how your experience aligns with the project.
  2. Attach a copy of your curriculum vitae (preferably as a PDF).
  3. Attach a copy of your graduate transcripts (unofficial copies are acceptable).
  4. Download the recommendation form and share it with your recommenders. The link to the recommendation form is on the application. 
    • Recommendations should come from an academic advisor, a professor who is familiar with your work as it relates to the area of interest, or an individual with whom you have worked on a closely aligned project.
    • ETS will accept only two recommendation forms. Recommendations should be sent electronically to internfellowships@ets.org and must be received by February 1, 2023.

Dates and location

  • Deadline: Applications must be received by February 1, 2023.
  • Decisions: You’ll be notified of selection decisions by March 31, 2023.
  • Duration: 8 weeks (June 5–July 28, 2023)
  • Location: Hybrid. You’ll be on campus for the first 2 weeks of the program; the remaining 6 weeks of the internship will be remote.

Compensation

  • The full compensation package includes a competitive intern stipend and housing/transportation while on campus.
  • Stipend: $10,000
  • Housing: Housing is provided during the on-campus portion of the internship (first 2 weeks) if your commute is more than 50 miles.
  • Transportation: A transportation allowance is granted for the on-campus portion of the internship (first 2 weeks) if you’re relocating to and from the Princeton area.

Examples of projects interns have worked on in recent years include:

  • Sociocultural perspectives in classroom assessment. Next generation assessment practices should signal what sociocultural perspectives we value in how students learn. The project applied sociocultural learning theories to analyze, critique and refine classroom assessment prototypes in support of science learning and instruction. The project helped develop a framework to describe how to include sociocultural considerations in assessment development.
  • Joint Modeling of NAEP Process Data and Response Data. This project involved developing methods to study the relationships between a test taker's sequences of actions on interactive NAEP Science items and their performance on the assessment. The study investigated the use of different modeling strategies including Markov chain models and clustering methods to label test takers as "efficient" or "inefficient" in terms of their sequence of actions and examined how this efficiency was related to performance.
  • Methods for Evaluating Fairness of Machine Scores. Artificial intelligence (AI) and machine learning algorithms have been found to produce biased predictions in some settings. It is important to make sure scoring procedures, such as automated or machine scoring algorithms, are fair and do not produce such biased predictions. This project examined methods based on confirmatory factor analysis and structural equation modeling to investigate whether the relationships among predictors and item scores were invariant across demographic subgroups in order to ensure the scores we produce are fair.
  • Holistic Admissions in Graduate School Applications and Impact on Admissions Outcomes. In this project, we examined how graduate programs approach holistic admissions, considering a range of evaluation criteria when determining applicants' fit for their programs. We examined different approaches to holistic admissions and documented the training that staff received to apply holistic admissions. We also researched outcomes of holistic admissions in terms of diversity and academic rigor of the admitted class. This is an ongoing project.
  • Evaluating Measurement Invariance of Translated Tests Across Language Groups. The intern project investigated whether two (or more) language versions of the same test affect item-level and test-level performance, whether score equating is necessary, and how equating should be conducted. We conducted multiple group item response theory (MIRT) concurrent calibrations to select common items that are statistically invariant to serve as an internal anchor between the different language forms (e.g., an English form and a Spanish form). We found that text-heavy subjects (e.g., reading) are more affected by translation in terms of form equivalence; for math, the two language forms were very similar.

Contact

For more information, contact us via email.