Sir, we are writing to share our recent findings on testing ChatGPT-4o's ability to pass Part 1 of the Overseas Registration Examination (ORE), a mandatory exam for overseas-qualified dentists seeking registration with the General Dental Council (GDC) in the UK. The ORE consists of two parts: a written component (Part 1) and a clinical component (Part 2). Part 1 comprises two computer-based exam papers: Paper A, which covers clinically applied dental science and clinically applied human disease, and Paper B, which addresses aspects of clinical dentistry including law, ethics, and health and safety.1

Kurian et al. previously found that ChatGPT failed the ORE.2 Using the sample paper from the website,3 we found that ChatGPT-4o answered all questions in Paper A correctly. In Paper B, ChatGPT-4o answered 80% of the questions correctly. This is a marked improvement over the results of Kurian et al.,2 in which ChatGPT answered only 20% of the questions in Paper B correctly. The normalised pass mark is 50% for each paper in Part 1.4 Based on our results, ChatGPT-4o would be eligible to advance to Part 2.

ChatGPT-4o, the new flagship model in OpenAI's language model series, was introduced on 13 May 2024 and offers significant improvements over its predecessors. The ability of ChatGPT-4o to pass Paper B, which involves decision-making and ethical questions, demonstrates AI's improved understanding, accuracy and problem-solving in dentistry. The rapid advancement of AI could have significant implications for the future of medical and dental education, providing enhanced support for students and professionals alike.5,6 As AI continues to evolve, it could play a pivotal role in bridging knowledge gaps and enhancing clinical decision-making, ultimately contributing to better patient care and outcomes.