An OpenAI bot is passing law school exams, and it could threaten traditional teaching

US company OpenAI’s chatbot, which received a huge financial injection from Microsoft this week, uses artificial intelligence to generate text from simple prompts. The results are so good that educators warn of a risk of widespread cheating, and some say the tool could spell the end of traditional classroom teaching methods.

Jonathan Choi, a professor at the University of Minnesota Law School, gave the chatbot the same test his students face. It consisted of 95 multiple-choice questions and 12 open-ended questions to be answered in essay form. The system earned an overall grade of C+, a low but passing score.

This was enough to pass, but in most subjects the chatbot ranked among the worst students in the class, and it failed the multiple-choice questions involving mathematics.

The professors who evaluated the results wrote: “In writing essays, ChatGPT demonstrated a good knowledge of basic legal rules and consistently good organization and composition.” They added that the bot “often struggled to spot issues when given an open-ended question, which is an essential skill on law school exams.”

Education authorities in New York and elsewhere have banned the use of ChatGPT in schools, but Professor Choi suggested the chatbot could be a valuable educational tool.

“Overall, ChatGPT wasn’t a great law student when acting alone,” Choi wrote on Twitter. “However, we expect that, in collaboration with humans, language models such as ChatGPT will be very useful to law students taking exams and to practicing lawyers,” he added.

Choi downplayed the possibility of cheating: in a reply to another Twitter user, he wrote that two out of three graders had spotted the exam written by the bot. “(They had) a hunch, and their hunch was right, because ChatGPT had elaborate grammar and was fairly repetitive,” Choi wrote.


The Guardian wrote last year that the chatbot impressed with its ease of use and its ability to write and handle complex tasks. It can recognize incorrect assumptions and refuse to respond to inappropriate requests. According to the newspaper, professors, programmers and journalists could be out of work within a few years.

OpenAI was co-founded in 2015 by Elon Musk, who later stepped away from the organization because of a potential conflict of interest with his work at Tesla.
