College professors are going back to paper exams and handwritten essays to fight students using ChatGPT: The growing number of students using the AI program ChatGPT as a shortcut in their coursework has led some college professors to reconsider their lesson plans for the upcoming fall semester.
They’re about to find out that gen Z has horrible penmanship.
Millennial here, haven’t had to seriously write out anything consistently in decades at this point. There’s no way their handwriting can be worse than mine and still be legible lol.
As a millennial with gen Z teens, theirs is worse, though somehow not illegible, lol. They just write like literal 6 year olds.
Last week of school I found out my history teacher took all my handwritten things to the language teacher and had her copy them into legibility. I felt so bad for that lady.
You’d be so surprised. From my interactions with my younger cousins and in laws, they can’t even write in cursive.
I’m in the weird in between gen z and millennial. I only use cursive to sign my name and read grandma’s Christmas card. Frankly, it’s not useful for me. I’m glad we spent the time in school taking typing classes instead of cursive.
What is crazy to me is that my youngest cousins (in their early teens) use the hunt and peck method to type. Despite that, they’re not super slow. I was absolutely shocked when I found that out. I think it was all the years of using a phone or tablet instead of an actual keyboard that created a habit.
Same, and the times I’ve had to write, my hand cramped up so quickly from those muscles not being active for years.
Actually that is a sign of dysgraphia.
You’d be surprised. My daughter (13) has better penmanship than I do (46). Although I’m sure my left-handedness doesn’t help there.
I have horrible penmanship as a millennial; I was typing from around age 6 on.
There are places where analog exams went away? I’d say Sweden has always been at the forefront of technology, but our exams were always pen-and-paper.
Same in Germany
Norway has been pushing digital exams for quite a few years, to the point where high school exams went to shit for lots of people this year because the system went down and they had no backup (who woulda thought?). In at least some universities most of or all exams have been digital for a couple years.
I think this is largely a bad idea, especially on engineering exams, or any exam where you need to draw/sketch or write equations. For purely textual exams, it’s fine. This has also led to many more multiple-choice or otherwise automatically corrected questions, which the universities explicitly state is a way of cutting costs. I think that’s terrible; nothing at university level should be reduced to a multiple-choice question. They should be forbidden.
The university I went to explicitly did in person written exams for pretty much all exams specifically for anti-cheating (even before the age of ChatGPT). Assignments would use computers and whatnot, but the only way to reliably prevent cheating is to force people to write the exams in carefully controlled settings.
Honestly, probably could have still used computers in controlled settings, but pencil and paper is just simpler and easier to secure.
One annoying thing is that this meant they also usually viewed assignments as untrusted and thus not worth much of the grade. You’d end up with assignments taking dozens of hours but only worth, say, 15% of your final grade. So practically all your grade is on a couple of big, stressful exams. A common breakdown I had was like 15% assignments, 15% midterm, and 70% final exam.
Covid forced the transition to electronic exams in many areas.
As someone with wrist and hand problems that make writing a lot by hand painful, I’m so lucky I finished college in 2019.
Copy writing about to make a real comeback.
Can we just go back to calling this shit Algorithms and stop pretending it’s actually Artificial Intelligence?
It actually is artificial intelligence. What are you even arguing against man?
Machine learning is a subset of AI and neural networks are a subset of machine learning. Saying an LLM (based on neural networks for prediction) isn’t AI because you don’t like it is like saying rock and roll isn’t music.
I am arguing against this marketing campaign, that’s what. Who decides what “AI” is and how did we come to decide what fits that title? The concept of AI has been around a long time, since the Greeks, and it has always been the concept of a man-made man. In modern times, it’s been represented as a sci-fi fantasy of sentient androids. “AI” is a term with heavy association already cooked into it. That’s why calling it “AI” is just a way to make it sound like high-tech, futuristic dreams come true. But a predictive text algorithm is hardly “intelligence”. It’s only being called that to make it sound profitable. Let’s stop calling it “AI” and start calling out their bullshit. This is just another cryptocurrency scam: a concept that could theoretically work and be useful to society, but that is not being implemented in a way that lives up to its name.
Who decides what “AI” is
Apparently you.
Who decides what “AI” is and how did we come to decide what fits that title?
Language is ever-evolving, but a good starting point would be McCarthy et al., who wrote a proposal back in the 50s. See http://www-formal.stanford.edu/jmc/history/dartmouth/dartmouth.html
Techniques have come into and gone out of fashion, and obviously technology has improved, but the principles have not fundamentally changed.
The field of computer science decided what AI is. It has a very specific set of meanings and some rando on the Internet isn’t going to upend decades of usage just because it doesn’t fit their idea of what constitutes AI or because they think it’s a marketing gimmick.
It’s not. It’s a very specific field in computer science that’s been worked on since the 1950s at least.
If AI was ‘intelligent’, it wouldn’t have written me a set of instructions when I asked it how to inflate a foldable phone. Seriously, check my first post on Lemmy…
https://lemmy.world/post/1963767
An intelligent system would have stopped to say something like “I’m sorry, that doesn’t make any sense, but here are some related topics to help you”
AI doesn’t necessitate a machine even being capable of stringing the complex English language into a series of steps towards something pointless and unattainable. That in itself is remarkable, however naive it may be in believing that a foldable phone can be inflated. You may be confusing AI with AGI, which is when the intelligence and reasoning level is at or slightly greater than a human’s.
The only real requirement for AI is that a machine take actions in an intelligent manner. Web search engines, dynamic traffic lights, and Chess bots all qualify as AI, despite none of them being able to tell you rubbish in proper English
The only real requirement for AI is that a machine take actions in an intelligent manner.
There’s the rub: defining “intelligent”.
If you’re arguing that traffic lights should be called AI, then you and I might have more in common than we thought. We both believe the same things: that ChatGPT isn’t any more “intelligent” than a traffic light. But you want to call them both intelligent and I want to call neither so.
I think you’re conflating “intelligence” with “being smart”.
Intelligence is more about taking in information and being able to make a decision based on that information. So yeah, automatic traffic lights are “intelligent” because they use a sensor to check for the presence of cars and “decide” when to switch the light.
Acting like some GPT is on the same level as a traffic light is silly though. On a base level, yes, it “reads” a text prompt (along with any messaging history) and decides what to write next. But the decision it’s making is much more complex than “stop or go”.
I don’t know if this is an ADHD thing, but when I’m talking to people, sometimes I finish their sentences in my head as they’re talking. Sometimes I nail it, sometimes I don’t. That’s essentially what chatGPT is, a sentence finisher that happened to read a huge amount of text content on the web, so it’s got context for a bunch of things. It doesn’t care if it’s right and it doesn’t look things up before it says something.
But to have a computer be able to do that at all?? That’s incredible, and it took over 50 years of AI research to hit that point (yes, it’s been a field in universities for a very long time, with most that time people saying it’s impossible), and we only hit it because our computers got powerful enough to do it at scale.
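The “sentence finisher” idea above can be sketched in a few lines. This is a toy bigram counter over a made-up mini-corpus (nothing like real training data or a real LLM, which works on tokens with a neural network), just to show the “predict the next word from what came before” framing:

```python
# Toy "sentence finisher": count which word follows each word in a tiny
# made-up corpus, then always predict the most frequent follower.
corpus = "the cat sat on the mat the cat ate the fish".split()

counts = {}
for prev, nxt in zip(corpus, corpus[1:]):
    counts.setdefault(prev, {})
    counts[prev][nxt] = counts[prev].get(nxt, 0) + 1

def finish(word):
    """Return the most common next word, or None if we've never seen it."""
    followers = counts.get(word)
    return max(followers, key=followers.get) if followers else None

print(finish("the"))  # "cat" -- it follows "the" most often in this corpus
```

Like the commenter says, nothing in this scheme cares whether the prediction is *true*; it only cares what usually comes next.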
Whether it should be called AI or not, no idea…
But let’s be clear, some intelligent humans say crazier things…
Inflating a phone is super easy though!
Overheat the battery. ;) Phone will inflate itself!
But then the investor wont throw wads of money at these fancy tech companies
Please let’s not defame Dijkstra’s and other algorithms like this. Just call them “corporate crap”, like what they are.
This thinking just feels like moving in the wrong direction. As an elementary teacher, I know that by next year all my assessments need to be practical or interview based. LLMs are here to stay and the quicker we learn to work with them the better off students will be.
And forget about having any sort of integrity or explaining to kids why it’s important for them to know how to do shit themselves instead of being wholly dependent on corporate proprietary software whose accessibility can and will be manipulated to serve the ruling class on a whim 🤦
It’s insane talking to people that don’t do math.
You ask them any mundane question and they just shrug, and if you press them they pull out their phone to check.
It’s important that we do math so that we develop a sense of numeracy. By the same token it’s important that we write because it teaches us to organize our thoughts and communicate.
These tools will destroy the quality of education for the students who need it the most if we don’t figure out how to rein in their use.
If you want to plug your quarterly data into GPT to generate a projection report I couldn’t care less. But for your 8th grade paper on black holes, write it your damn self.
Putting quarterly data into ChatGPT is dangerous for companies because that information is being fed into the AI and accessible by its creators, which means you’re just giving away proprietary information and trade secrets by doing that. But do these chucklefucks give one single shit? No. Because they’re selfish, lazy assholes that want robots to do their thinking for them.
In what ways do you envision working with LLMs as an educator of children?
I have used ChatGPT to explain to myself a number of fairly advanced technical and programming concepts; I work in Animation through my own self-study and some good luck, so I’m constantly trying to up my skills in the math that relates to it. When I come up against a math or C++ term or concept that I do not currently understand, I can generally get a pretty good conceptual understanding of it by working with ChatGPT.
So at one point I wanted to understand what Linear Algebra specifically meant, and while it didn’t all stick, I do remember asking it to expand on things it said that weren’t clear, and it was able to competently do so. By asking many questions I was able - I think - to get clearer on a number of things which I doubt I ever would have learned, unless by luck I found someone who knows the math to teach me.
It also flubbed a lot of basic arithmetic, and I had to mentally look for and correct that.
This is useful to an autodidact like myself who has learned how to learn at a University level, to be sure.
I cannot, however, think of a single beneficial way to use this to educate small children with no such years of mental discipline and ability to understand that their teacher is neither a genius nor a moron, but rather, a machine that pumps out florid expressions of data that resemble other expressions of similar data.
Please, tell me one.
has led some college professors to reconsider their lesson plans for the upcoming fall semester.
I’m sure they’ll write exams that actually require an actual understanding of the material rather than regurgitating the seminar PowerPoint presentations as accurately as possible…
No? I’m shocked!
We get in trouble if we fail everyone because we made them do a novel synthesis, instead of just repeating what we told them.
Particularly for an intro course, remembering what you were told is good enough.
That’s a shitty system from both sides.
Meh. I haven’t been in Uni in over 20 years. But it honestly seems kind of practical to me.
Your first year is usually when you haven’t even settled on a major. Intro classes are less about learning and more about finding out if you CAN learn, and if you’ll actually like the college experience or drop out after your first year.
The actual learning comes when the crowd has been whittled to those who have the discipline to be there.
I’m glad you had a better experience than mine in academia. Still wanting that time back.
I would love to have that time and money back.
One of the disadvantages of being of an age where you straddle the line between worlds without the internet and with it is that you get to watch the 20,000 dollars you spent on learning in the 90s suddenly become available for free in the present.
Seriously, there isn’t a single thing I learned in my Near Eastern Classical Archaeology degree that I couldn’t just go learn from Wikipedia today.
I wish! I got roped into doing it after the Internet was available.
Teachers half-ass pretended to teach and we half-ass pretended to learn, because we thought that piece of paper at the end would make a difference.
Turns out googling shit instead of being in debt was the way to go all along.
if you CAN learn
I always found this argument completely unsatisfactory…
Imagine someone coming up to you and saying “you must learn to juggle otherwise you can’t be a fisherman” and then after 14 years of learning to juggle, they say “you don’t actually need to know how to juggle, we just had to see if you CAN learn. Now I can teach you to fish.”
You’d be furious. But, because we all grew up with this nonsense we just accept it. Everyone can learn, there’s just tons of stuff that people find uninteresting to learn, and thus don’t unless forced; especially when the format is extremely dry, unengaging, and you’ve already realized… You’re never going to need to know how to juggle to be a fisherman… ever.
The show “Are you smarter than a fifth grader?” (IMO) accurately captures just how worthless 90% of that experience is to the average adult. I’ve forgotten so much from school, and that’s normal.
The actual learning comes when the crowd has been whittled to those who have the discipline to be there.
Also this is just ridiculous, “Everyone is a genius. But if you judge a fish by its ability to climb a tree, it will live its whole life believing that it is stupid.”
You do realize you get to choose which courses to take in undergrad right? Universities aren’t forcing you to take any of the courses, you choose ones in subjects you are interested in, and first year is to get you up to speed/introduce you to those subjects, so you can decide if you want to study them further.
once you have a major or specialist, then yeah, you have some required courses, but they do tend to be things very relevant to what you want to do.
You do realize you get to choose which courses to take in undergrad right? Universities aren’t forcing you to take any of the courses, you choose ones in subjects you are interested in, and first year is to get you up to speed/introduce you to those subjects, so you can decide if you want to study them further.
That’s not true at all, every degree has a required core curriculum at every university I’ve ever heard of (e.g., humanities, some amount of math, some amount of English, etc). It also says nothing for the K-12 years.
In my university you had breadth requirements, but it was 1 humanities course, 1 social science, and 1 science, and you could pick any course within those areas to fulfill the requirement. So you had a lot of choice within the core curriculum. Man, if other unis aren’t doing that, that sucks.
My favourite lecturer at uni actually did that really well. He also said the exam was small and could be done in about an hour or two but gave us a 3 hour timeslot because he said he wanted us to take our time and think about each problem carefully. That was a great class.
IME, a lot of professors liked to write exams that specifically didn’t cover anything from the PowerPoint presentations lol
deleted by creator
When I was in College for Computer Programming (about 6 years ago) I had to write all my exams on paper, including code. This isn’t exactly a new development.
So what you’re telling me is that written tests have, in fact, existed before?
What are you some kind of education historian?
He’s not just pointing out that handwritten tests exist; he’s saying that using handwritten tests instead of typed ones to reflect a student’s actual abilities is nothing new.
Same. All my algorithms and data structures courses in undergrad and grad school had paper exams. I have a mixed view on these but the bottom line is that I’m not convinced they’re any better.
Sure, they might reflect some of the student’s abilities better, but if you’re an evaluator interested in assessing a student’s knowledge, a more effective way is to ask directed questions.
What ends up happening a lot of the time are implementation questions that ask too much of the student at once: interpretation of the problem; knowledge of helpful data structures and algorithms; abstract reasoning; edge case analysis; syntax; time and space complexities; and a good sense of planning, since you’re supposed to answer in a few minutes without the luxury and conveniences of a text editor.
This last one is my biggest problem with it. It adds a great deal of difficulty and stress without adding any value to the evaluator.
Well, if I go back to school now I’m fucked; I can’t read my own handwriting.
Waiting for 100% oral exams to make a comeback.
deleted by creator
Am I wrong in thinking students can still generate an essay and then copy it by hand?
Not during class. Most likely a proctored exam. No laptops, no phones, teacher or proctor watching.
Only if you memorize it. In which case, why not just memorize the test material?
Sounds like effort, I’m making a font out of my handwriting and getting a 3d printer to write it
Obviously that is the next step for the technically inclined, but even the less inclined may be capable of generating then copying to save time and brain effort.
Joke’s on you, I can’t write by hand without severe pain after a short while!
For this case you can get an assistant to write for you while you narrate
Me too. And on top of that, I’m left-handed, so any time I write, I get ink or graphite all over my hand.
This isn’t exactly novel. Some professors allow a cheat sheet. But that just means that the exam will be harder.
Physics exam that allows a cheat sheet asks you to derive the law of gravity. Well, OK, you write the answer at the bottom, pulled from your cheat sheet. Now what? If you recall how it was originally derived, you probably write Newton’s three laws at the top of your paper… and then start doing some math.
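For instance, one common textbook route for that derivation (a sketch only, assuming a circular orbit for simplicity; a real exam answer would want more rigor) combines Newton’s second law with Kepler’s third law:

```latex
% Centripetal force on an orbiting mass m at radius r with period T:
F = \frac{m v^2}{r}, \qquad v = \frac{2\pi r}{T}
\quad\Rightarrow\quad F = \frac{4\pi^2 m r}{T^2}

% Kepler's third law, T^2 = k r^3, then gives the inverse square:
F = \frac{4\pi^2 m}{k\, r^2} \;\propto\; \frac{m}{r^2}
```

Newton’s third law then forces symmetry in the two masses, landing you at $F = G M m / r^2$, i.e. the line you copied from the cheat sheet. The cheat sheet gave you the destination, not the trip.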
Calculus exam that lets you use Wolfram Alpha? Just a really hard exam where you must show all of your work.
Now, with ChatGPT, it’s no longer enough to assign a take-home essay to force students to engage with the material, so you find new ways to do so. Written, in-person essays are certainly a way to do that.
You can always write down what gpt shows on the screen onto a paper
Our uni accepts 99% of things on paper (except some reports, which need to be done in LaTeX), and I am glad this is the case. I just can’t think normally while at the screen. If there’s a task, I need to do it on paper first and only then type it out (which is a bit frustrating because I can only type with two fingers).
I am the total opposite. Put a pen in my hand and my brain just fogs up.