Taking a Seat at Café AI

With the rise of ChatGPT and generative AI on college campuses, SCU faculty reckon with what it means for the future of education.


This past spring, Santa Clara University faculty from departments ranging from fine arts to electrical engineering filed into the business school. This time, they were the students. Queued up on the projector was a presentation titled, “AI Goes to Class Whether You Invite It or Not.” Their first lesson: signing up for ChatGPT, the supposed enemy of their classrooms. 

Imagine a vast library with endless shelves full of tomes holding the answers to questions big and small—from how to boil an egg to what happens when a star dies. ChatGPT and other forms of generative artificial intelligence are like the librarians of such a library, drawing on an unfathomably large collection of information to answer all manner of questions and prompts. Since its public release in November 2022, ChatGPT has quickly stormed schools and raised pressing questions about ethics and functionality. Since February 2023, SCU has hosted Café AI, a series of hour-long sessions to discuss the presence of ChatGPT on campus.

One of the main concerns for faculty has been maintaining academic integrity. The use of ChatGPT can be very difficult to detect in submitted work. SCU Manager of Instructional Technology Eric Haynie believes that since there’s no reliable metric for picking out what’s AI and what’s human, the discussion around AI must be directed toward how people engage with it.

“The ethical question comes down to how it is that students see their learning being augmented by their use of generative AI,” Haynie says. “Is it enhancing it? Or is it offloading something?”

ChatGPT has become widely used around the world, and its presence has continued to grow since its public launch in November 2022. Photo provided by Jernej Furman.

For Haynie, a good approach is to treat ChatGPT as a sounding board. Rather than having it do all the work, students can use it to generate new ideas or offload menial tasks like summarization or basic code. It can open the door to much deeper introspection for humans.

Plus, AI is in its infancy. We have no idea how it will continue affecting humanity going forward. “It’s right to both be hand-wringing from a sense of ‘this is a threat,’ and also to have a sense of the opportunity,” Haynie says. “It’s so new that this is really a threshold moment. I think that we’ll look back and think this has been kind of a revolutionary moment for a lot of things and what it will have been revolutionary for is yet to be determined.”

The Potential of Generative AI

As AI continues to improve and find footing in a wide variety of industries, its potential influence on the future jobs of current college students is worth examining. According to a report from Dell Technologies, a majority of jobs that will exist seven years from now haven’t been invented yet. So it makes a certain sense that preparing today’s college students for tomorrow means incorporating new tech tools into classes. 

During a library research session with her students last spring quarter, English lecturer Loring Pfeiffer saw Andrew Carlos, the head of research, outreach, and inclusion in the University Library, use ChatGPT as a tool to help students refine their topics for a writing assignment. 

In the session, Carlos guided the class in an exercise where ChatGPT produced a concept map of all the different subtopics related to an overarching topic. For example, if a student asked ChatGPT to create a concept map about self-driving vehicles, the student would see that there are many subtopics and even sub-subtopics within that broader concept. Think: ethics and autonomous vehicles, public policy and autonomous vehicles, sensors, etc.

For Pfeiffer, this session was an aha moment that highlighted the potential of ChatGPT for brainstorming and topic-selection purposes. “One of the things that’s really challenging about topic selection is that at the beginning stage of the process you don’t know what you don’t know,” Pfeiffer says. “If you have AI make you a mind map, the map shows you all of the different things that you don’t know and all of the subtopics that relate to this bigger picture topic. I can’t imagine another way of doing that kind of work that quickly.”

While ChatGPT has great potential to help, it is not without its limitations. In one assignment, Pfeiffer had her students use ChatGPT to identify themes and poetic techniques within Victorian-era texts such as the 19th century lyrical ballad The Lady of Shalott by Alfred Tennyson.

Students were supposed to engage in close reading of the works, noting rhyme scheme, repetition, or other stylistic features reflective of the texts’ themes. John Paul Kraus ’24 says that ChatGPT failed to correctly identify rhyme schemes or examples of repetition within the poem.

The program, in its current form, simply cannot capture certain aspects of human writing, Kraus says. “I’d say use it very sparingly,” he says. “It depends on the subject, but always look at it with a human perspective and have a little skepticism. Especially when it relates to emotions, which is something that humans do better than AI. Also when you’re doing the writing, it’s important not to skip the writing process for yourself for your own development as a writer.”

An artistic rendition of the 19th-century poem The Lady of Shalott by Alfred Tennyson. In the original poem, the repetition present throughout signifies the monotonous life of the Lady of Shalott; ChatGPT, however, failed to take note of this or of the poem’s rhyme scheme. Original painting by John William Waterhouse.

Concerns of Generative AI

While ChatGPT may bring about new ways of approaching education, in many ways it remains a Pandora’s box. The full extent of the ethical issues surrounding it isn’t yet understood. Consequently, some faculty are more wary of embracing generative AI in their classrooms.

Electrical and computer engineering lecturer Andrew Wolfe sees how ChatGPT could help students circumvent doing the actual work on an assignment, which in turn would trivialize their entire college education. “We have students who would never think to go to their sister or go to their friend and say, ‘Can you write my essay for me and I’ll put my name on it and turn it in?’” Wolfe says. “Yet some of those same students might do the same thing with ChatGPT, and I think we’re hoping the students will understand that they need to be able to apply their own ethical principles, more often and more generally.”

With its vast computing power, generative AI can seemingly trivialize basic tasks like writing summaries or simple code. This brings into question the very foundations of pedagogy and whether we should or even can run before we learn to walk.

Pictured above is a Google data center. One concern with ChatGPT and other generative AI is that they will require many more, and possibly larger, data centers or data farms in order to function. Photo provided by Chad Davis.

Beyond the academic sphere, generative AI presents ethical and legal concerns for society at large. According to Irina Raicu, director of the internet ethics program at the Markkula Center for Applied Ethics, what makes evaluating the ethics of generative AI even more difficult is the lack of transparency from the companies that make these programs.

“People play with these tools without realizing that every time they put a cute prompt into something like ChatGPT they’re actually using energy and having an environmental impact,” Raicu says. “It’s because those kinds of impacts aren’t explicitly shown, [and caused] in part by the fact that the tools were made at least initially free.”

According to Raicu, some AI tools may also be more environmentally costly than others. Large language models can have upwards of hundreds of billions of parameters, meaning they are being trained on ever-larger amounts of data in order to generate better responses. Building and running such tools requires rare metals for manufacturing, water to cool data centers, and energy to keep those data centers running.

Betty Li Hou ’22, a Hackworth Fellow with the Markkula Center from 2021 to 2022, agrees that a level of caution toward AI and its implementation is wise. As exciting as AI’s potential benefits can be, she says, there will always be bad actors who abuse it, and these problems are often difficult to mitigate fairly.

Fairness in AI has been a longstanding problem. What does “fair” look like in this context? Who determines what’s fair? Is there even a clear definition of the word? Identifying unfairness, then, amidst countless layers in AI systems is like finding a needle in a haystack. To make the matter more complex, the system itself might not be the issue but rather how it is integrated into society or how it is deployed and regulated. What does fair access to AI look like? Who gets to benefit and who is sidelined, or harmed, by it?

“From a technical standpoint of how models are built, we can look at how to make systems more fair, accessible, transparent, truthful, and coherent,” says Hou, who is now a computer science doctoral student at New York University. “That’s what AI researchers do and it’s definitely not straightforward at all. I think more than ever though, there’s this question of how technology is going to come together with humanity. How is it going to change how we live in society and what it means for us to be human? It’s these really big philosophical and ethical questions that we have to think about now more than ever.”

Whether or not educators choose to bring generative AI like ChatGPT into their classrooms, Hou believes that at the end of the day all educators must consider what their goals are.

“What is it that we really want students to walk away with, and how do we get them there?” Hou says. “Both in terms of knowledge but also skills, specifically life skills and critical thinking skills. Consider what will help them flourish, not just as students, but as people.”
