How DEI Bureaucrats Control University Hiring


Internal documents reveal how administrators use “diversity checks” to influence the hiring process and engage in discrimination.


City Journal

By John D. Sailer

July 7, 2025


In early 2021, Carma Gorman, an art history professor at the University of Texas at Austin and the designated “diversity advocate” for a faculty search committee, emailed John Yancey, the College of Fine Arts’ associate dean of diversity, seeking approval to proceed with a job search.


“I wanted to make sure that the demographics of our pool pass muster,” Gorman wrote. She noted that 21 percent of applicants were from underrepresented minority groups, with another 28 percent self-identifying as Asian.


“The 21% is enough to move forward,” Yancey replied, but he cautioned that concerns could arise depending on how the applicant pool was narrowed. “If 20 of the 23 URM applicants are dropped in the early cut,” he wrote, “then things don’t look good anymore.”


The exchange, which I obtained through an open-records request, offers a window into a diversity practice adopted at many universities. Documents I’ve acquired from institutions across the country—hiring plans, grant proposals, progress reports, and internal emails—show that routine diversity checks are now embedded throughout the hiring process, often enforced with serious consequences for searches that fail to “pass muster.”


This practice not only raises significant legal questions but also highlights how such policies can concentrate power in the hands of individual administrators, granting them effective veto authority over one of a university’s most consequential decisions: the hiring of tenure-track faculty.


In 2023, Texas governor Greg Abbott signed Senate Bill 17, banning racial preferences and the employment of diversity officers. But just two years earlier, the situation at UT–Austin looked very different.


The documents tell the story. As diversity advocate, Gorman—coauthor of the annotated bibliography Decentering Whiteness in Design History—proposed a detailed diversity plan for her search committee. The plan, which I obtained via a records request, outlined a rigorous process for monitoring diversity at every stage of the search.


“Once we’ve sorted everyone into Qualified and Unqualified groups,” Gorman wrote of the first stage in the search process, the committee would ask an administrator to “check the demographic characteristics” of the initial cut. “If it is a diverse enough group to merit moving forward with the search, fantastic!” But if the pool was deemed insufficiently diverse, the committee would revisit candidates from underrepresented groups who were initially considered unqualified, expand job advertising, or simply “cancel the search entirely.” This step would be repeated for both the shortlist and the finalist slate.


The practice raises obvious legal red flags—particularly when it involves canceling searches outright, effectively denying all candidates a fair opportunity based on immutable characteristics. Yet documents I’ve obtained show that more than a dozen universities have adopted some version of this approach.


At the University of Illinois at Urbana-Champaign (UIUC), for instance, search committees routinely receive reminders about the institution’s diversity-check policy. “Every week, [the College of Liberal Arts and Sciences] will send the diversity of the pool report of your faculty search to the unit for review,” wrote Amy Lawrence Elli, a director of human resources, in an email to several departments.


These emails also included department-specific demographic goals. “For your specific search, [the college] has set a strategic goal to hire more U.S. ethnic/racial minority and female faculty in your unit,” Elli wrote in an email to a microbiology committee.


At UIUC, this scrutiny of race and sex would continue right up to the selection of finalists. Deans would review a “diversity of the pool report” for semifinalist and finalist slates. If the makeup was deemed “sufficient,” then search committees could proceed with interviews; if the pool was deemed “insufficient,” the college would “contact the executive officer and search chair to discuss options within 1-2 business days.”


The policy is not limited to universities in progressive states. In a video I previously reported on, Susan Olesik, the Ohio State University’s divisional dean of math and sciences, told a department that “diversity of the candidates has to be as high of a priority as the scholarship.”


To ensure that priority, Olesik noted that approval for finalist slates would depend on their having the right demographic balance. “If the slate of candidates that you bring forward are not diverse, I will ask you to simply keep searching,” she said.


Emails show how the policies played out in practice. As I’ve reported, one Ohio State search committee seeking a dean’s approval boasted that it was “incredibly fortunate to have found three fantastic Native women scholars/candidates who all identify as Native.” Dana Renga, the divisional dean of arts and humanities, wrote that she supported the list “based upon recruitment and diversity of finalists.”


Regarding another search, Renga’s approval, predicated at least in part on “diversity,” was even more enthusiastic: “I definitely approve! What a diverse process, pool, and finalist list.”


Heavily redacted emails from UIUC show several administrators poring over proposed finalists, at times voicing concerns. “Attached is the diversity of the semi finalist pool for the AAS search,” Elli noted in one message, alongside a comment that was redacted. She added that the college “had set a goal for URM.”


Lloyd Munjanja, the university’s associate director of graduate diversity and program climate, responded, “I will talk with the Associate Deans about this as well before the search moves forward.”


Perhaps unsurprisingly, the records show how this diversity-checking policy encouraged controversial and potentially illegal hiring practices—most obviously, disparate treatment based on race.


For searches that didn’t pass muster, Gorman’s plan proposed adding the highest-scoring minority candidates dropped from consideration back to the shortlist and finalist slate. “I suppose we could each pitch our favorites,” Gorman added parenthetically, “which might surface some folks who were underestimated by the committee as a whole—but just seeing who has the next-highest number of stars seems like a good starting point.”


By threatening to shut down or indefinitely postpone searches, diversity checks create an incentive for departments to adopt additional DEI litmus tests for hiring. At UIUC, Elli listed several strategies for getting a “diverse set of semi-finalists or finalists,” including requiring applicants to submit DEI statements and making the “ability to enhance the diversity of your department” an evaluation criterion. DEI statements, which Elli promoted repeatedly in boilerplate emails, have grown increasingly unpopular, even among progressive academics, and are seen by many as ideological litmus tests.


Diversity checks reveal something more subtle about the DEI era. These overbearing, often clever policies have not just sanctioned a legally tenuous obsession with race. They also confer power—giving administrators, many pursuing an ideological agenda, the ability to delay, halt, and redirect departments in their most important decision-making capacities.


If there’s one key lesson here, it’s that the desire for power, not ideology alone, gave rise to the social-justice university. More than likely, power will also prove its undoing.


John D. Sailer is the director of higher education policy and a senior fellow at the Manhattan Institute.


