I have an electronic practice. Front-line health workers and emergency responders have priority for appointments. For appointments, call 416-878-4945 or email silva.redigonda@alumni.utoronto.ca. Sessions are $170.00 for a 50-minute hour. Prices increase in January 2025; consultations, couple therapy and family therapy are $200. Check with your EAP/insurance for coverage. The practice is open to residents of the Province of Quebec as well as Ontario. English and Italian spoken.
Tuesday, 12 September 2023
Article from York University, cut and pasted here. I shared it on my Facebook, LinkedIn and X, but could not seem to share it here directly.
A Bot Aced My Homework
How ChatGPT is impacting the academic experience
BY SHARON ASCHAIEK
RELEASED LAST NOVEMBER, ChatGPT can compose essays and news releases, research subjects, suggest story ideas, even have philosophical conversations and debug computer code. Make a request, and it draws on patterns learned from its training data to almost instantly generate a response that is usually on topic and well-drafted.
But don’t be alarmed.
While the artificial intelligence tool can write, research and “converse” in surprisingly human ways, it can’t replace the “valuable components of a well-rounded education.” That’s what the bot says when pointedly asked if its very existence will threaten critical thinking and problem-solving at the university level. “I see myself,” it continues, “as a complementary tool that can enhance learning, but not as a replacement for it.”
But can we believe that?
Since ChatGPT lacks common sense and emotional intelligence (and also can’t understand the subtleties of context and humour), the tool sometimes gives inaccurate answers. Even OpenAI CEO Sam Altman said as much in a tweet posted last December: “It’s a mistake to be relying on it for anything important right now. It’s a preview of progress; we have lots of work to do on robustness and truthfulness.” Not the most encouraging of words, and yet here we are – with a chatbot that’s got many in academe rethinking what they do.
As a tool that is incredibly helpful for students to use – and misuse – in their academic work, ChatGPT is compelling university professors and administrators to consider how to both leverage its educational value and hedge against cheating. “People are stressed out about it … and very concerned about what it might do for their assessment practices,” says Robin Sutherland-Harris, an educational developer at York’s Teaching Commons.
Within only two months of its launch, ChatGPT reached a record-setting 100 million monthly users; it took TikTok nine months to achieve that number. Because the queries it receives feed into further training, ChatGPT is getting “smarter,” meaning it’s increasingly able to produce meaningful results. GPT-4, the tool’s latest iteration, was touted by OpenAI as being 60 per cent less likely to give false information.
“We need to adapt and innovate, because the technology’s not going to stop,” says Osgoode Hall law Professor Pina D’Agostino (BA ’96; LLB ’99), who co-directs York’s Centre for AI and Society. As a research and writing tool, ChatGPT “is a good start,” adds D’Agostino, who was recently named as vice-director of Connected Minds: Neural and Machine Systems for a Healthy, Just Society, a $318-million research project focused on AI. “But it’s not exhaustive, and it’s never going to replace someone actually doing the work.”
Creating and enforcing rules around students’ use of ChatGPT is happening in real time as York navigates the current Wild West terrain of advanced AI. The University now has a webpage on AI technology and academic integrity that offers advice for instructors, such as telling students that unauthorized use of ChatGPT or similar platforms in assessments is a breach of academic honesty. It also touches on teaching and learning suggestions and on detecting AI-generated content in student work, and shares links to relevant resources.
Recently, York went a step further in these efforts by holding a professional development event on ChatGPT’s capabilities, limitations and educational uses. Organized by Sutherland-Harris with Angela Clark, academic integrity officer in the Office of the Vice-Provost Academic, the event was held in response to an influx of questions from faculty members across the institution.
It included a two-hour panel discussion involving computer science Professor Marcus A. Brubaker of the Lassonde School of Engineering and sava saheli singh, a professor of digital futures in the Faculty of Education. Ideas for how to use ChatGPT to improve the student experience at York animated the session, yielding new approaches to assignments and essay writing, for instance.
One suggestion was to ask students to develop thoughtful, well-informed prompts for ChatGPT that could yield a high-quality response, then assess that response for accuracy and completeness. Another was to get students to generate alternate views on an essay argument, which would give them useful starting points for further exploration. University policy will need to keep evolving to provide clarity and align with the school’s code of conduct.
“Because of the unevenness of the landscape … people need to be very clear about what the expectations are for their students course by course, and not just put it in the syllabus and assume people will read it, but talk about it, you know, really drive it home,” Sutherland-Harris says.
Markus Giesler, a professor of marketing at the Schulich School of Business, researches the impact of new technologies on consumer behaviour. He says it’s important for the sector to consider the broader social implications of this innovation. “The product itself is not a technologically neutral or objective thing, but something that has built into it certain patterns of power relation,” says Giesler, co-author of the 2020 study “Consumers and Artificial Intelligence: An Experiential Perspective,” which identified the need for guidelines around AI and ethics in marketing.
As consumer-facing AI continues to become better at performing tasks that were once viewed as distinctly human, Giesler says universities may face more complex issues of access and equality.
“It’s actually not that far-fetched to assume that professions that are mainly about storytelling, truth seeking and articulation of language and fact could in the future be done by artificial intelligence,” he adds. “My concern is that a higher education world within which only the privileged students get the real human educator, whereas the less privileged students get the chatbot, is a kind of world that I would not want.”
To support students in producing original work, Professor D’Agostino recommends that course syllabi now include information on the strengths and weaknesses of ChatGPT, and how to properly cite the information it provides when used for academic assignments.
She also sees a need to balance writing assignments with oral presentations and exams, so that students can develop their public speaking skills at a time when technology is infiltrating other spheres of their lives.
“We have to become better at evaluating students, helping them produce authentic work, and training them to be critical thinkers,” D’Agostino says. “But at the same time, there needs to be regulations and rules in place … and our core values need to remain solid.” ■