Nearly all higher education employees are using artificial intelligence tools for work tasks, but less than half know whether their institutions have policies governing such use—a disconnect that experts warn could pose significant risks to data security and privacy.
The findings reveal that AI adoption has rapidly outpaced institutional governance efforts, with more than half of respondents (56%) reporting they use AI tools not provided by their institutions for work-related tasks—tools that may not have been vetted for data privacy, accuracy, or accessibility concerns.
"This gap could have serious consequences for data privacy and cybersecurity, data-informed decision making, digital accessibility, and more," the report states.
The policy awareness gap persists even among senior leadership. The survey found that 38% of executive leaders and 43% of managers and directors reported they are not aware of policies or guidelines meant to guide work-related use of AI. Similarly, 35% of technology professionals and 30% of cybersecurity and privacy professionals—the groups most likely to have decision-making authority for such policies—lack awareness of institutional guidelines.
These findings suggest that many institutions simply have not created comprehensive AI policies, rather than that they are failing to communicate existing ones effectively, according to the report.
Despite widespread use and the policy vacuum, only 11% of respondents said they are required to use AI tools for work, and 64% don't expect such requirements in the near future. Among those who have recently used AI tools for work, a majority do so either daily (38%) or weekly (34%).
The most common work-related uses include brainstorming (63%), drafting emails (62%), and summarizing long documents or meetings (61%), followed by proofreading or copyediting (56%) and creating presentations (47%). More than half of respondents (54%) reported using AI tools for eight or more different types of work tasks in the past six months.
Faculty showed distinct usage patterns, with 63% reporting they have used AI tools for creating learning activities or assessments, compared to just 32% of staff members.
Most institutions have developed AI strategies—92% according to the survey—with common elements including piloting AI tools (65%), evaluating opportunities and risks (60%), encouraging staff and faculty to use the technology (59%), and creating policies and guidelines (54%). Only 5% of institutions are discouraging or prohibiting AI use for work.
However, institutional leaders face significant challenges in managing AI implementation. Just 13% of respondents said their institutions are measuring return on investment for AI tools. When asked how they measure ROI, most said they are unsure, while others described using user experience surveys, focus groups, and tracking metrics such as adoption rates and task completion times.
The survey revealed both enthusiasm and caution among higher education professionals. While 81% of respondents feel either enthusiastic about AI or a mix of enthusiasm and caution, they also identified numerous urgent risks. More than two-thirds (67%) identified six or more risks as urgent, with increased misinformation (55%), use of data without consent (52%), and loss of fundamental skills requiring independent thought (51%) topping the list.
Most respondents also expressed optimism about AI opportunities, particularly for automating repetitive processes (70%), offloading administrative burdens (65%), and analyzing large datasets (60%). Similarly, 67% identified five or more opportunities as most promising.
Among institutions working to increase AI-related workforce skills, 69% are primarily upskilling existing staff and faculty rather than hiring new positions. Most commonly, institutions encourage employees to develop skills independently (80%) or offer in-house professional development (71%).
The survey included responses from managers and directors (39%), professional staff (32%), executive leaders (16%), and faculty (12%) across institutions of all sizes. Public institutions comprised 67% of respondents, and 75% of respondents have worked in higher education for 10 years or longer.
ChatGPT, Microsoft Copilot, Google Gemini, Claude, and Perplexity were among the most frequently mentioned AI tools used for work-related tasks.