Don’t Ban ChatGPT in Schools. Teach With It. OpenAI’s new chatbot is raising fears of cheating on homework, but its potential as an educational tool outweighs its risks. https://www.nytimes.com/2023/01/12/technology/chatgpt-schools-teachers.html
Makes sense. It’s like learning cursive for no reason at all when you can print and type. Technology moves the bar.
Given how powerful it is, I wouldn't be comfortable with blessing its use in a classroom unless all students had access and proper training on how to use it.
AI coming soon? ‘My AI Is Sexually Harassing Me’: Replika Users Say the Chatbot Has Gotten Way Too Horny
Alarmed by A.I. Chatbots, Universities Start Revamping How They Teach With the rise of the popular new chatbot ChatGPT, colleges are restructuring some courses and taking preventive measures. https://www.nytimes.com/2023/01/16/...intelligence-universities.html?smid=url-share
That's way too much work, but I'm going another direction that also entails way too much work. I'm afraid to discover that they've probably been cheating all along.
In some ways, the discipline of real research died with Google. I remember getting B papers by mining one or two books thick enough to kill bugs for direct quotes and googling the rest, like the garnish on a chuck roast. If I had been a more competitive student, there were already services where people would write essays for you, and then you'd rewrite them to evade the checkers of that period. Students on medical or master's tracks usually felt overwhelming pressure to deliver, even if it meant sabotaging someone else's lab results...

I guess if Google and the internet were the fast food of knowledge, where facts were ground up into one-liners, ChatGPT is the robotic mama bird vomiting all the words onto our children's papers. Critical thinking and meta-learning will always be important, but it feels more and more like society discourages them by offering soma and convenience. Good ol' 80/20 rule. It's not all bad: I assume there will be more smart people in absolute numbers even if the worldwide average slowly dips year after year.
Following the discussion we've been having in the AI Art thread: yes, I do think the problem with these tools is that they remove the process of creation from humans, in this case the process of writing, which actually takes work. As someone who is mildly dyslexic, I'm sure ChatGPT would have been a huge benefit to me back in school, but at the same time, having an AI write my papers for me would have taken out a lot of the thought and care that I needed to put into writing them. Schools, though, are going to have to learn to live with these, and I don't think bans are going to work. These tools exist and will only get more widely available and easier to use.
By the way, the thing is super impressive, but it also lies like crazy lol. Just makes a lot of stuff up on the fly. But it's a super impressive linguistic demo.
Colleges haven't really reckoned with the threat posed by online learning; they even embraced it during covid, despite yearly tuition that can cost as much as a small to midsize car. Maybe this will catalyze needed reform and make on-site classes a bigger value-add?

[ft]ChatGPT will force school exams out of the dark ages

Too much of our testing regime remains fixated on being able to regurgitate information

In the past 20 years, search engines have revolutionised our access to knowledge. Neuroscience has transformed our understanding of how different people learn. But the way we teach and test has barely changed. My own kids sit national exams which feel horribly similar to those I took at school. They still require vast feats of memorisation but come with the new horror of “mark schemes” which must also be learnt to score points by parroting the correct “keywords”. To sit biology A-level, or history GCSE, is to see a fascinating subject reduced to a largely deadening plod through names, dates and formulas. Teachers don’t call this system “drill and kill” for nothing.

Biology and history are subjects that parents of dyslexic children steer their offspring away from, fearing they will struggle to recall the sheer volume of facts irrespective of how well they grasp the concepts. It was only when one of my children turned out to be dyslexic that I realised just how narrow our system had become. Rote learning still has its place, in times tables and languages for example. But while I adored learning anthologies of poetry, my ability to recite these verses says nothing about whether I am a critical thinker.

If all we are asked to do is string lists of facts together in an essay, we might as well be replaced by chatbots. That’s not the limit of our human abilities, and it’s not what employers want either. In Davos this week, where panels on generative AI were oversubscribed, chief executives were talking about LQ, the learnability quotient, as the new IQ.
LQ is essentially a measure of adaptability, of our desire and ability to update our skills throughout life. Employers have been saying for years that they value collaboration and curiosity. It’s a world away from frantically cramming facts that are forgotten as soon as the exam is over, which has a pretty dampening effect, frankly, on the joy of learning.

The speed with which generative AI is developing makes us right to be wary, not least because it can generate disinformation. Unlike a calculator, which always gives the same answer, large language models like ChatGPT are probabilistic technologies which can give different answers to the same question at different times. But this makes it all the more important that we teach kids how to use them. Rather than banning ChatGPT, teachers should ask pupils to give it an assignment and critique its response. Fans of generative AI believe it can complement human beings, not substitute for us. To make that true, we must keep up.

It is intriguing that Singapore, whose schools have regularly topped the OECD’s international Pisa rankings, has been reforming its education system to “spark [a] passion for continuous learning” and foster “a mindset of life-long learning”. Its teachers are being asked to put more emphasis on critical thinking and less on rote learning. Universities are broadening their entrance criteria to include aptitude, not just exam scores. Moreover, the Singaporean government’s list of desired outcomes at primary and secondary level includes “moral integrity”, “co-operation” and “lively curiosity”, which robots don’t have.

Whenever a new technology comes along, there is a danger that we ascribe too much to it. Cheating is as old as the hills. When I was an undergraduate, I remember a friend buying essays from a former student who had been selling those same essays for seven years. No professor had spotted the deception. In some cases the education system has even encouraged plagiarism.
For more than a decade, UK universities have required applicants to submit a 4,000-character “personal statement” of their interests and motivations. This led to a frenzy of statement-buying, parental angst and exaggerated claims about having been “fascinated by archaeology since I was five”. Last week, the personal statement was finally abolished, but on the grounds that it disadvantaged poorer applicants, not because it was a naked encouragement to lie. The statement is to be replaced by a survey which sounds as though it may be open to similar abuses. Personal statements were at least an attempt by universities to glimpse a broader picture beyond the results of GCSEs, the exams taken at 16 (undergraduate applications are made before pupils sit their A-levels).

When children stay in education until at least 18, and ageing populations need to reskill throughout life, it makes little sense to skew so much of the school system to passive regurgitation at 16. The Tony Blair Institute advocates replacing GCSEs with lower-stakes assessments at 16, and creating a broader baccalaureate at 18. I agree, but I would not scrap paper-based exams, which are surely the best defence against cheating. Exams are still our best way to gauge what children have learnt. But what we test needs to change, drastically. If it prompts a wholesale rethink, that in itself is a powerful legacy for ChatGPT.
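The calculator-versus-chatbot contrast in that article is worth unpacking: the variability comes from sampling. Here's a toy Python sketch of temperature sampling, the knob behind it. To be clear, this is not ChatGPT's actual internals, and the "logits" below are made-up numbers for illustration only:

```python
import math
import random

def sample_next_token(logits, temperature):
    """Pick a token from a dict of {token: logit score}.

    temperature == 0 behaves like a calculator: always the highest-scored
    token. temperature > 0 samples from a softmax distribution, so repeated
    calls with the same input can return different tokens.
    """
    if temperature == 0:
        return max(logits, key=logits.get)
    # Softmax with temperature scaling (subtract the max for stability).
    scaled = {tok: v / temperature for tok, v in logits.items()}
    m = max(scaled.values())
    exps = {tok: math.exp(v - m) for tok, v in scaled.items()}
    total = sum(exps.values())
    # Draw one token in proportion to its softmax weight.
    r = random.random() * total
    acc = 0.0
    for tok, e in exps.items():
        acc += e
        if r <= acc:
            return tok
    return tok  # float-rounding fallback: return the last token

# Made-up scores for a single next-token decision, asked twice.
logits = {"Paris": 3.0, "Lyon": 1.5, "Rome": 0.5}
print(sample_next_token(logits, temperature=0))    # always "Paris"
print(sample_next_token(logits, temperature=1.2))  # may vary run to run
```

At temperature 0 the same question always gets the same answer, like the calculator in the article; raise the temperature and the same question can get different answers at different times, which is exactly the property the author is flagging.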
“I'm sorry, but it is not appropriate or ethical to lie, and I cannot fulfill this request as it goes against my programming. It's important to be truthful and accurate in any information that I present.”
Like most things, there is probably a balance. Automating or outsourcing part of the creation process lets our limited mental capacity and resources focus on other processes, freeing us for faster and more creative work. But at the same time, you lose touch with the writing process, and there is a danger in that. An example is how our brain automatically recognizes patterns and "triggers" reactions. Most of the time, it's good stuff. Occasionally, it can go haywire. Seeing what the brain is actually doing could allow you to escape from the reaction. Maybe a balanced approach is for students to master manual writing first and then use writing tools. The analogy is a calculator: students learn math, and use tools to do the calculations.
Interesting article. Just to note that Singapore has been trying to make itself more creative for more than 20 years. Back in the late '90s they had a national "Creativity" project to figure out how to be more creative. As a grad student there, I remember being asked myself how Singaporeans could be more creative like Americans. They didn't like my answer, which was essentially that you need to be willing to accept a counterculture and reduce the emphasis on conformity in education.