UTAH TECH UNIVERSITY'S STUDENT NEWS SOURCE | May 09, 2024

OPINION | ChatGPT is not the end of academic integrity

The most advanced AI chatbot we have ever seen. Brynlee Wade | Sun News Daily


Students relying on AI-powered tools like ChatGPT to complete homework assignments raises concerns about ethics, creativity and critical thinking skills.

Did that sentence come off as a little strange? I would hope so, because the first sentence you read in this story was the response I received from ChatGPT when I entered the following prompt: “Can you write me a lead for my opinion story about college students using you to write essays and solve math problems in 25 words or less, and also do it in AP style?”

It’s that easy. One quick prompt, and I didn’t have to write my own lead, which is something I’m always trying to get right. Even in jest, I feel pretty gross about it.

Students have been taking advantage of artificial intelligence like ChatGPT to take the easy way out when presented with their most prominent hurdle: homework.

It sounds silly, but throughout my testing with the titular chatbot, I’ve found it’s all too easy to treat ChatGPT like a lifeline on “Who Wants To Be A Millionaire.”

I asked it college-level questions across a variety of subjects, and it was able to answer before I could take a sip of water.

I asked it how to calculate the interest rate when taking out a loan; it gave me several approaches, all of them correct.
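(For the curious, the simple-interest formula is the kind of answer you might get to a question like that. The loan figures below are my own illustration, not the chatbot’s output: borrow $10,000, pay $1,000 in interest over two years, and the math works out to a 5% annual rate.)

\[
r = \frac{I}{P \times t} = \frac{\$1{,}000}{\$10{,}000 \times 2\ \text{years}} = 0.05 = 5\%\ \text{per year}
\]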

I asked where the deepest point in the ocean was, and it told me it was the Challenger Deep in the Mariana Trench.

I even asked about the primary source of energy for St. George, and it told me about hydroelectricity along with its benefits and downsides.

I also tried to stump it with video game trivia, and while I won’t go into all the details, I now know the optimal route to take when playing the original Super Mario Bros.

Is this the end of academic integrity as we know it? Not at all, but I believe we’re at the point where academic institutions need to incorporate measures for detecting AI-generated work.

While plagiarism checkers are common in learning settings, AI detection is not yet widely built into curricula.

In the spirit of this opinion, I decided to ask ChatGPT for examples of AI-detection software, and it gave me five. Such software needs to be used to preserve those “critical thinking skills” that ChatGPT so kindly mentioned.

Before you even think it, yes, I did put the first sentence of this story into an AI-detection program. It told me the sentence had a 77% chance of being fake. Take that as you will.

The tools are available for catching students in the act, so I don’t think anyone could get away with using AI for homework as long as the necessary precautions are taken.

With all that being said, I think ChatGPT is a really cool program. The fact that it gives such solid answers in such an easy-to-use package is nothing short of astounding. You don’t even have to pay for it.

I’m an avid fan of AI and the advancements made in the field. It’s nowhere near “Terminator” level, and I don’t think it will ever be that big of a problem, but it’s nonetheless fun to watch the progress.

Should there be any worries about AI replacing traditional workers? Not right now. Sure, we have robots serving food at restaurants, but that’s more automation and remote control than AI.

I mean, come on. Google recently put an AI into one of its robots, and it had a hard time identifying a sponge. I think we’re safe from our AI overlords for the time being.

We’re still in the experimentation phase with AI and its various uses. ChatGPT has been getting so much attention lately because it’s one of the first examples of an AI working as intended, almost too well.

The fact that I can easily use this program to solve math problems with complicated formulas in seconds is awesome. It’s only its use in the wrong setting that’s upsetting.

It’s akin to a skilled blacksmith forging tools. It’s not up to the blacksmith to decide what tasks the tools are used for; he’s just good at making them.

We don’t need to moderate the use of AI in academic settings. In fact, I think AI tools could be quite useful for explaining how to solve a problem rather than just giving the answer.

I hope academic institutions act quickly to address this academic integrity problem, but I am genuinely excited about the future of AI in education, as well as its possibilities elsewhere.