From the crest of Bluebonnet Hill, Huston-Tillotson University offers a view that serves as a living metaphor for the current American moment: Below the campus, the glass-and-steel skyline of downtown Austin reflects a city transformed by the global technology industry. But on the hill sits the oldest institution of higher learning in the city, where buildings were laid brick by brick by the hands of Black students more than a century ago.
At the second annual HBCU Ai Con, held on Huston-Tillotson’s campus last month, leaders declared that historically Black institutions will not be left out of the conversations — and opportunities — presented by tech’s newest boom: artificial intelligence. For Black Tech Futures CEO Dr. Fallon Wilson, who gave the opening keynote, reclaiming that history was the only way to begin a conversation about the future of technology. “We cannot move forward until we go back and claim what is lost,” she said.
Wilson reminded the audience that HBCUs were “sanctuaries of radical dreaming” long before they were research hubs. They were created as spaces of physical and figurative freedom by Black Christians and educators who recognized the power of the mind at a time when Black bodies were fresh from the bonds of slavery.
“HBCUs are places where humanity is rubbed back into us,” Wilson said. Her words underscored a central thesis of the conference: that the artificial intelligence revolution is not a race for HBCUs to join, but a legacy for them to lead.
Shifting the deficit narrative
The interest surrounding the summit reflected the urgency of the moment. Roughly 600 people gathered for the event. Reno Dudley, a Huston-Tillotson adjunct professor and recent TEDx speaker, made attendance at the summit a requirement for students across four of his courses: Business Communications, Intro to Marketing, Information Gathering & Analysis, and Public Speaking. For Dudley, exposure to the summit was not an extracurricular opportunity; it was preparation for the future his students will inherit.
“I wanted my students to understand the frontiers we are now breaching with AI and how it will define the future,” Dudley explained. His students, sitting in buildings their ancestors might have built, were now tasked with building the digital equivalents.
Panelists intentionally sought to shift the conversation away from the idea of a “digital divide” — a phrase that often frames HBCUs through a lens of scarcity — to what one speaker described as a “concentration of genius” inherent within these institutions.
Philip Butler, associate professor at the Iliff School of Theology, noted that artificial intelligence systems are ultimately reflection engines. They reproduce the data — and the biases — of the humans who build them. “We realize many AI systems are biased, but that can only be a human error,” Butler said. “HBCUs hold the power of innovation, because they believe and weave into their community the power of humanity — a critical feature missing from AI.”
That humanity, speakers argued, is not merely philosophical. It is a technical necessity. Roy Austin, director of the Howard Law Artificial Intelligence Initiative, offered a stark reminder of the stakes. “If you’re not at the table, you’re on the menu,” Austin warned, urging institutions to protect their intellectual property and create their own systems before scaling or partnering with major technology firms. Austin and other speakers pointed to the historical precedent of HBCUs building their own campuses to ensure their survival as a framework for approaching AI with a builder’s mindset.
Corporate speakers also cautioned against a future where HBCU talent simply feeds existing technology pipelines. Chazara Clark-Smith of the IRS encouraged institutions to maintain an entrepreneurial mindset. “We should want to incentivize and encourage students to build and not always sell,” she said. The goal, according to panelists, is not simply to produce graduates who are “industry-ready,” but leaders who are capable of building entirely new industries.
Representatives from major companies, including Adobe and Amazon Web Services (AWS), shared insights on how corporations can support equitable partnerships. Margie Vela of AWS described educator enablement programs that have reached more than 30 campuses, helping expand pathways into the AI workforce. Huston-Tillotson (HT) itself highlighted its own AI certificate program, taught by Chris Hyams, former CEO of Indeed and a visiting lecturer at HT, as a model for in-house curriculum development.
The problem with a single “human in the loop”
There was also a deeper discussion of ethics and implementation, and another theme emerged around the limits of current safeguards. In Silicon Valley, many companies promote the concept of a “human-in-the-loop” — a system where a person reviews AI decisions to prevent bias or harm.
But Meme Styles, co-chair of the HBCU Ai Con and president and founder of the community data organization MEASURE, argued that a single human reviewer is not enough. “Communities need to be in the loop,” Styles insisted. “Humans in proximity to those communities need to be in the loop.” Her point was simple: the communities most affected by AI-driven surveillance, hiring tools, or predictive systems should also have the agency to question and audit those technologies.
Styles also noted that Black communities have long functioned as unrecognized data sources for technology companies. “There’s a revenue stream,” she said, adding that communities should be compensated and credited for the data and lived experiences that fuel modern AI systems.
Jose Teran, CTO of Ohel Technologies, used a census dataset to illustrate how machine-learning models can be optimized for efficiency while still producing biased outcomes.
Even when obvious factors like race or gender are removed from a dataset, advanced algorithms can still identify indirect proxies that replicate discriminatory patterns. One example involved the use of the popular machine-learning model XGBoost in recruitment systems. Teran showed how certain data patterns could lead to false positives that disproportionately disqualify candidates from marginalized communities.
A machine can be statistically accurate while still failing to meet a deeper standard of algorithmic integrity, he said. Achieving that integrity requires intentional human oversight — not as a replacement for technology, but as a partner to it. “Model + accuracy = peak algorithmic integrity,” Teran noted, demonstrating that when a human was in the loop to evaluate false positives, the accuracy of the XGBoost model reached 88 percent.
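The proxy effect Teran described can be illustrated with a small sketch. This is not his census dataset or XGBoost model — it is a hypothetical example with synthetic data and made-up field names (`group`, `zip`, `qualified`) — but it shows the same two points: a “race-blind” screen keyed to a correlated proxy still disqualifies one group disproportionately, and a human review of the screen’s wrongful rejections raises the pipeline’s accuracy.

```python
# Sketch of proxy bias and human-in-the-loop review (synthetic data,
# hypothetical field names; not the conference's actual dataset or model).
import random

random.seed(0)

# Synthetic applicants: group membership correlates with zip code (the proxy),
# but true qualification is independent of group.
applicants = []
for _ in range(1000):
    group = random.choice(["A", "B"])
    if group == "B":
        zip_code = 2 if random.random() < 0.9 else 1
    else:
        zip_code = 1 if random.random() < 0.9 else 2
    qualified = random.random() < 0.5
    applicants.append({"group": group, "zip": zip_code, "qualified": qualified})

# A "race-blind" screen that passes only zip 1. Group membership was never
# used, yet the proxy reproduces the disparity.
def screen(a):
    return a["zip"] == 1

pass_rate = {
    g: sum(screen(a) for a in applicants if a["group"] == g)
       / sum(1 for a in applicants if a["group"] == g)
    for g in ("A", "B")
}
print(pass_rate)  # group B's pass rate is far lower despite equal qualification

# Qualified applicants the screen wrongly rejects.
wrongly_rejected = [a for a in applicants if a["qualified"] and not screen(a)]

# Idealized human-in-the-loop pass: a reviewer re-examines rejections and
# restores qualified applicants (assumes a perfectly accurate reviewer).
def with_review(a):
    return screen(a) or a["qualified"]

acc_screen = sum(screen(a) == a["qualified"] for a in applicants) / len(applicants)
acc_review = sum(with_review(a) == a["qualified"] for a in applicants) / len(applicants)
print(acc_screen, acc_review)  # review strictly improves measured accuracy
```

The reviewer here is an idealization; the point, per the panel, is that a statistically efficient model can still fail an integrity standard that only oversight close to the affected community can supply.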
The ancestral conductor
The narrative arc around HBCUs and AI centered on pushing forward with an intentional mission to include and support other HBCU leaders, despite the current political climate. Wilson shared a deeply personal reflection about the loss of a $12 million grant amid the backlash against diversity, equity, and inclusion initiatives — a reminder that the freedom spaces HBCUs represent remain under constant pressure.
“It broke pieces of me,” Wilson admitted. “But we still must do the work.”
She pointed to the legacy of leaders like Rev. Jesse Jackson, who pushed Silicon Valley companies in 2014 to release diversity data and confront disparities within the tech industry. Wilson described that tradition as “Black Public Interest Tech” — the application of cultural values and historical experience to ensure technology serves liberation rather than exploitation.
By looking back to reclaim what was lost — the principle of Sankofa — historically Black institutions are working to ensure that artificial intelligence arrives in their communities not as a bull in a china shop, but as a tool for collective advancement.