Is ChatGPT a friend or foe? Abilene colleges take on AI as fall term nears

“In the year 3535

Ain’t gonna need to tell the truth, tell no lie

Everything you think, do and say

Is in the pill you took today.”

– “In the Year 2525,” Zager and Evans, 1969

Almost 55 years ago this month, a song by Denny Zager and Rick Evans peered into the future and found it to be perilous. The song, recorded in one take in a field near Odessa – yes, just west of Abilene – was No. 1 for six weeks.

We’re still a long way from 2525, but were folk singers Zager and Evans musical prophets?

The emergence of AI – artificial intelligence – in the mainstream would suggest they had an inkling of what was ahead the same year Neil Armstrong walked on the moon.

Nine months ago, ChatGPT was released. It’s the product of the software firm OpenAI and is capable of creating human-like text based on conversation history and context.

It’s called a “language processing tool.”

ChatGPT’s free launch last fall has had colleges scrambling to deal with this tech tool

ChatGPT has excited many while causing others concern. Red flags were hoisted on college campuses, where administrators and faculty had to call a time-out and huddle on the sidelines to figure out a game plan on the fly. Would students simply let ChatGPT write their term papers?

Expelliarmus!

As the fall academic term approaches, we talked to representatives of Abilene’s three four-year universities about AI’s arrival. Actually, it was pointed out, AI has been around for years.

Think HAL 9000 in “2001: A Space Odyssey,” which came out in 1968.

A year before Zager and Evans’ musical warning.

“In the year 5555

Your arms hangin’ limp at your sides

Your legs got nothin’ to do

Some machine’s doin’ that for you.”

McMurry’s take: Face it, AI is here and we need to be there, too

McMurry’s Matthew Draud is both fascinated by and wary of ChatGPT.

Artificial intelligence is not new to college campuses. A “red flag,” if you will, was raised even a decade ago, he said. AI was used as a tool, he said, “to discover situations of academic dishonesty.” Programs could be used to determine if the work was original or not.

Sound familiar?
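As a rough illustration of how such originality checks can work (a minimal sketch, not any vendor’s actual product), one common approach compares a submission against known sources using TF-IDF vectors and cosine similarity:

```python
# Minimal sketch of text-similarity screening, the kind of check plagiarism
# detectors build on (illustrative only; real tools are far more sophisticated).
# Requires scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

known_sources = [
    "The Apollo 11 mission landed the first humans on the Moon in 1969.",
    "Artificial intelligence has played a growing role in everyday software.",
]
submission = "In 1969 the Apollo 11 mission landed the first humans on the Moon."

# Vectorize the known sources and the submission together so they share a vocabulary.
vectorizer = TfidfVectorizer().fit(known_sources + [submission])
source_vectors = vectorizer.transform(known_sources)
submission_vector = vectorizer.transform([submission])

# Cosine similarity near 1.0 suggests heavy overlap with a known source.
scores = cosine_similarity(submission_vector, source_vectors)[0]
for source, score in zip(known_sources, scores):
    print(f"{score:.2f}  {source[:50]}...")
```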

AI overall goes back even further, Draud said. Artificial intelligence has played an increasingly larger role in our lives for years. Did “auto fill” help with your last online purchase or application or search?

“But it certainly has hit this exponential increase in what it’s capable of doing,” said Draud, vice president for academic affairs and dean of the faculty. “Where we stand and where we stood a year ago are dramatically different.”

Admitting he’s a TikTok fan (to follow what’s going on in education, of course, he said, smiling), Draud started getting feeds “about this interesting thing called ChatGPT, which I had never heard of. When it first came out, there were just a few users who had beta access to it. They were saying, ‘Oh my gosh, this is a game-changer.’

Matt Draud, vice president for academic affairs and dean of the faculty at McMurry University.

“It was November when I started seeing streams about what ChatGPT can do. It became really evident that as a large language processor, it was going to be a complete game-changer.”

Yes, there had been applications that students could use “to basically cheat,” he said, but those cost money.

ChatGPT offered a free version.

“And it’s capable of a lot better writing,” Draud said.

Thus, the immediate knee-jerk reaction in the academic world was “students will be cheating left and right,” he said. “It will be hard to detect it because the writing will be so good. So human-like. It doesn’t have that choppiness that AI applications in the past had.”

That led to the use of new detectors of AI-generated writing, with mixed success.

Draud called that a “very superficial reaction to something that is far more profound.”

What, he wondered, was this going to do to higher education and labor markets?

“Every aspect of humanness is going to be changed by this. What universities do to prepare students for careers is going to change,” he said.

Students have been trained to write code. ChatGPT can write code, too; to it, code is just another language, like English.

Thus, instruction may shift from learning how to code to learning how to get different artificial intelligence models to produce the desired code.

“By December or January, it became very clear that prompt engineering might be the next big thing,” Draud said.
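What that might look like in practice: below is a minimal sketch of prompting a model to write code rather than writing it by hand, using OpenAI’s Python client (the openai package, version 1.x, with an OPENAI_API_KEY environment variable; the model name and prompt are only examples, not anything used at McMurry).

```python
# Minimal sketch: asking a large language model to write code from a prompt,
# rather than writing the code by hand. Assumes the openai package (v1.x)
# and an OPENAI_API_KEY environment variable; the model name is an example.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Write a Python function that takes a list of exam scores "
    "and returns the mean, median, and standard deviation."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[
        {"role": "system", "content": "You are a careful programming assistant."},
        {"role": "user", "content": prompt},
    ],
)

# The generated code still has to be reviewed and tested by the student.
print(response.choices[0].message.content)
```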

He told McMurry faculty that, yes, plagiarism is a concern, but there’s more to this.

“We can worry about that. But the most important thing is understanding how this is going to transform the world and how are we going to transform students in order to live in that new world,” he told faculty.

A history professor told students that ChatGPT is fine to use to write a paper, but he expected them to do the research and provide specific data. ChatGPT’s data, Draud said, will be far-ranging and is only up to date through 2021.

“I am going to expect that your papers are accurate, so you’re going to have to review your papers,” Draud said of the prof’s plan. “And it says what you want it to say.”

The professor had mixed results. Students who followed that plan turned in “some of the best papers he’s ever read,” Draud said. But students who didn’t merge AI and their research parameters “turned in some of the worst papers that he had ever gotten. Not the way they were written but in terms of the factual content.”

How about teaching, say, future accountants? AI will change their jobs, Draud said.

It can provide flawless auditing. Yet, could AI also be used to “cook the books?”

“I don’t think accountants are going to go out of business. They’ll make as much money as they have made in the past. The deal is, their jobs will change,” Draud said. “What will they be doing? That’s what I want my faculty to be thinking about.”

And so, Draud said, “It’s imperative that we teach our students how to use this technology and not prevent them from using it. That would be absolutely the wrong instinct.”

McMurry’s IT staff at first said it could shut down ChatGPT.

Draud balked at that suggestion, saying it would confine students. And maybe chase them off.

“We want them to be able to use that model,” he said, in a way that is educationally beneficial to students, not just conveniently beneficial.

Yet, what about creativity? Or sincerity?

Draud said Vanderbilt caught flak for issuing a ChatGPT-written condolence statement after the February shooting at Michigan State. An apology had to be issued.

Draud then considered that people send Hallmark cards, signing a prewritten message.

“We do this all the time. We allow other people to express our thoughts,” he said. Now it would be a computer. Yet, he added, people most often fall back on the most common human sentiment: You are in our thoughts and prayers.

Draud paused.

“I really am struggling with this,” he said of balancing technology with the human element. HI vs. AI. Creativity is the sum of a person’s life experiences, which a computer program can assemble but not feel. Yet AI is providing the sum of a million more experiences and information.

That could be amazing.

Bottom line for McMurry, he said, is approaching AI from three directions:

  • Preparing students for a world in which AI will be vital and prominent

  • Discouraging them from taking educational shortcuts by using AI

  • Employing AI to better teach students

“Those are the three things that have been on my radar,” he said.

HSU computer science prof: College curriculum needs reboot

Wade Ashby, assistant professor of computer science at Hardin-Simmons, developed his own program to check students’ work in his entry programming classes.

It grades the work and sends input back to them.

“I wrote that four or five years ago,” he said. “It frees up my time to spend more with the students.”

That’s an example, he said, of how AI has had a positive influence on education. And it has been around campus for a while.
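Ashby’s program wasn’t described in detail, but a minimal sketch of that kind of autograder (hypothetical; the function names and test cases below are illustrative, not his) might run a student’s function against expected outputs and report back:

```python
# Hypothetical sketch of an entry-level programming autograder:
# run a student's function against known test cases and return feedback.
# Not Ashby's actual program; names and cases are illustrative.

def grade_submission(student_func, test_cases):
    """Return (score, feedback) after running student_func on each (args, expected) pair."""
    passed = 0
    feedback = []
    for args, expected in test_cases:
        try:
            result = student_func(*args)
        except Exception as exc:
            feedback.append(f"{args}: raised {type(exc).__name__}: {exc}")
            continue
        if result == expected:
            passed += 1
        else:
            feedback.append(f"{args}: expected {expected}, got {result}")
    score = 100 * passed / len(test_cases)
    return score, feedback


# Example: grading a student's add() function.
def student_add(a, b):
    return a + b

tests = [((2, 3), 5), ((-1, 1), 0), ((0, 0), 0)]
score, notes = grade_submission(student_add, tests)
print(f"Score: {score:.0f}%")
for note in notes:
    print("  -", note)
```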

He recently has been working with Travis Seekins, VP for enrollment management, to develop ways to identify prospective students and then to keep them at HSU. Particularly at-risk students.

“How can we create an intervention plan?” Ashby said. “You can use machine learning to do that.”

In fact, he is pursuing his Ph.D. with his dissertation focusing on using machine learning to predict student outcomes.

“It’s more at a generic level, not specific to a discipline,” he said.

It’s AI at work, he said.
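A minimal sketch of what such a prediction model can look like (the features, data and threshold are entirely made up for illustration, using scikit-learn; this is not Ashby’s or HSU’s actual model):

```python
# Hypothetical sketch of using machine learning to flag at-risk students.
# Features and data are invented for illustration; not HSU's or Ashby's model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Columns: GPA, credit hours attempted, absences, LMS logins per week (hypothetical).
X = np.array([
    [3.6, 15, 1, 12],
    [2.1, 12, 9, 2],
    [3.0, 14, 3, 8],
    [1.8, 9, 12, 1],
    [3.9, 16, 0, 15],
    [2.4, 12, 7, 3],
    [3.2, 15, 2, 10],
    [1.9, 10, 10, 2],
])
y = np.array([1, 0, 1, 0, 1, 0, 1, 0])  # 1 = retained, 0 = did not return

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)
model = LogisticRegression().fit(X_train, y_train)

# Probability of retention for a new student; a low score could trigger an intervention plan.
new_student = np.array([[2.3, 12, 6, 4]])
print("Retention probability:", model.predict_proba(new_student)[0][1])
```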

AI also is affecting instruction in HSU’s Kelley School of Business.

“We are looking possibly at redoing our computer science curriculum,” said Ashby, bringing up GitHub Copilot – an artificial intelligence tool that auto-completes code “in a very intelligent sense.”

A study, he said, showed that a development team using GitHub Copilot outperformed a team not using it by 56%.

“It’s functioning into the development environment already, so getting our students ready for that, we’re looking at shifting our curriculum. I need to lay the groundwork so my students have that foundation when they graduate to walk into whatever is out there,” Ashby said.
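To illustrate the kind of completion Copilot-style tools perform (a generic, hypothetical example, not part of HSU’s curriculum): the programmer writes a signature and a comment, and the assistant proposes a body, which still has to be reviewed.

```python
# Illustration of comment-driven code completion, the workflow tools like
# GitHub Copilot support. The programmer writes the signature and docstring;
# the assistant suggests a body like the one below, which must be reviewed.

def count_word_frequencies(text: str) -> dict[str, int]:
    """Return a dictionary mapping each lowercase word in text to how often it appears."""
    # --- a suggested completion might look like this ---
    counts: dict[str, int] = {}
    for word in text.lower().split():
        word = word.strip(".,!?;:\"'")
        if word:
            counts[word] = counts.get(word, 0) + 1
    return counts


if __name__ == "__main__":
    print(count_word_frequencies("To be, or not to be: that is the question."))
```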

The large language model that ChatGPT is built on dates to 2014, he said, but it “bloomed” in 2017. So, Ashby said, it has been around but the November release of free ChatGPT sailed it into the mainstream.

He noted in an April presentation that tutoring service Chegg lost 48% of its stock price in one day after ChatGPT became available at no cost.

Ashby said HSU has not set a policy on ChatGPT use, but Dr. David Joyner, who heads the learning management system program at Georgia Tech (of which Ashby is a graduate), suggested it, or any other AI, be treated like “any other agent in the course.”

It could be viewed as a participant in a study group, “but you wouldn’t ask the study group to write the paper for you,” Ashby said.

He said he caught two students this past spring using ChatGPT in his classes.

Cheating remains unethical, he said, “but it also does you a disservice because you’re not learning what you need to know to get a career in the computer science industry. I stress that heavily, but at the same time incorporate (AI).”

In one class, he taught in collaboration with AI. Instead of starting a project from scratch, which had been the norm, students were allowed to use ChatGPT, “but they had to build something different than what I normally would’ve required.”

Students had to learn about cohesion in coding to build AI models.

Wade Ashby, assistant professor of computer science at Hardin-Simmons University – with Baymax from the movie “Big Hero 6.”

“I hope this is where Hardin-Simmons is headed. This is where I am headed. This isn’t going away,” he said.

He likens using AI now to using card catalogues in the library years ago. A calculator doesn’t make a person better at math, he said, “but it makes people better at math a whole lot faster.”

Spell check and Grammarly are examples, he said, of AI in use for years.

“We’re OK with those,” Ashby said.

What is key for an instructor to impart to a student is that work is being assessed, not for production, “but to see that you have that understanding. If AI helps you a little, that’s OK but it shouldn’t do it for you.”

ChatGPT can offer a framework for, say, a first draft of a writing project. Then, the writer takes it from there.

To prove his point, when Ashby sent an email invitation to HSU faculty and staff for a brown-bag lunch about ChatGPT, he used the program to write it.

“It was fun going back and forth with it,” he said. He forgot the date the first time, so he revised it. It was too formal, so he asked for the invite to be more light-hearted.

“I sent it out by ChatGPT, prompted by Wade Ashby,” he said, laughing. “Part of that was the shock effect – ChatGPT wrote this email.”

Ashby recalled a report from the Department of Labor to the president. It addressed the rapid advancement of technology and the possible negative effect on the U.S. worker.

The president, Ashby said, was Lyndon Johnson.

“So this isn’t a new concept,” he said. The challenge now is considering whether technology is advancing faster “than a human can reskill.”

ACU writing coach emphasizes need to embrace what’s next

What about teaching students how to write?

Dr. Cole Bennett is an English professor who also directs the ACU Writing Center. He is invested in writing, and “not afraid” of technological advances such as ChatGPT.

Before the fall term begins, ACU faculty will gather for a one-day jumpstart into classes called “Faculty Fusion.”

One development session will focus on AI, and include Bennett.

“We’ve had a lot of faculty interest from all over campus,” he said. “The comforting news is that very few people in academia are standing there saying, ‘I have all this figured out, so here we go.’”

Instead, the goal is to become individually comfortable in the classroom.

Dr. Cole Bennett, professor of English and Writing Center director at Abilene Christian University.

The goal of the writing center, which Bennett said many universities now offer, is to help students with, no surprise, their written work. But not just students who struggle – many students, he said, are A and B students who want their work to shine. A resume, or a poem.

The one-on-one aspect works, and it’s where stumbling blocks such as plagiarism can be caught before an instructor makes the arrest.

So how will tutors incorporate ChatGPT into the center’s efforts? Will there be a need for a writing coach if a computer is doing the writing?

Bennett believes there will be.

He specializes in composition and rhetoric – the writer’s intent to persuade. This background is driving him forward in embracing AI. He clearly states he is not afraid of what has arrived.

But gone are the days of book reports. ChatGPT can write a summary and analysis; many have done that before. It can gather those ideas and give a student a terrific presentation, Bennett said.

Instructors, he said, had to ask, “Where can we take that?”

For example, how the content of a book assigned to be read “intersects the things that we’ve been learning in this class. That’s a much narrower assignment that’s not going to be found on Wikipedia.”

And if AI can do that, too, faculty has to ask again, “What am I trying to get my students to learn and how can I do that so it doesn’t just encourage, it demands them to bring in their creativity? Their rhetorical creativity.”

Dr. James Prather, a computer science professor at ACU, spoke previously to faculty and said ChatGPT can write flawless program code in 10 seconds, which was exactly what students were tasked to do. Prather ditched that assignment.

“I am moving onto what’s next,” Prather told faculty. Bennett asked himself the same question, as did others.

“People who are teaching writing are teaching rhetoric,” he said. “We want students to assert claims with evidence.”

The focus becomes nuance rather than an emphasis on research, which ChatGPT can do.

“How do you use a source as evidence? How do you integrate it?” Bennett asked. “We’re now talking about evaluation of rhetoric as we create rhetoric, more than the mechanics of rhetoric.

“I think what ChatGPT is doing is bringing a tool to bear that is so accomplished at doing certain things that it pushes us to teach more nuanced rhetoric.”

Thus, Bennett sees AI as a great tool, not something to catch and punish.

“I think that is a fool’s errand,” he said.

He easily can spot Joe Lazy – not that Joe attends ACU, he said, laughing.

“As I grade students’ essays, I still am very attuned to whether they are meeting the prompt well rhetorically,” Bennett said. It doesn’t matter if Joe wrote it or asked his friend, ChatGPT.

Did Joe not read over the work to make sure it presented what he wanted to turn in?

“Sometimes,” Bennett said, “ChatGPT will miss a prompt.”

“My job is to teach both the production and reception of rhetorical savvy,” he said.

“Is this saying what I’m trying to say?”

This article originally appeared on Abilene Reporter-News: ChatGPT, friend or foe? Abilene colleges take on AI as fall term nears
