AI in dentistry: What are the HIPAA violation risks? 

Abyde, CDA’s Endorsed Services partner for HIPAA compliance, shares AI considerations and guidelines for dentists
August 27, 2025
QUICK SUMMARY: When implementing AI tools into your dental practice, care must be taken to secure protected health information and verify that the applications are aligned with HIPAA rules. Endorsed Services partner Abyde offers helpful insights.

Artificial intelligence is changing the landscape of dental care with streamlined scheduling, automated treatment notes and even smart imaging analysis. While this new technology has the potential to significantly enhance practice efficiency and diagnostic tools, AI use is not without risk.

Topping the risk analysis list is the potential for violations of patient privacy.

AI systems require vast amounts of patient data to learn and function effectively. This can raise significant data privacy and security issues, making sensitive patient information vulnerable to breaches. Dental practices should use AI with care — particularly generative AI — ensuring that applications are aligned with privacy laws.

The experts at Abyde share the following precautions and guidelines your practice must follow to remain HIPAA-compliant. 

Understanding privacy and security risks with AI use

To date, no legislation has been enacted specifically to define compliant AI use in health care. AI use currently falls under HIPAA legislation, with AI companies considered business associates if they have access to protected health information.

ChatGPT is one of the most accessible forms of AI. Regardless of which version is used, de-identifying data and entering only the minimum necessary information are essential. Dentists should never use the free, public version of ChatGPT with patient data, because anything entered into that platform could be used to train the AI model, posing a privacy and security risk.
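As a rough illustration of what "de-identifying data" can mean in practice, the sketch below strips a few obvious identifiers from a clinical note before it would be pasted into any AI tool. The patterns and note text are hypothetical examples, not Abyde's methodology, and pattern-matching alone is no substitute for a full review against the 18 HIPAA Safe Harbor identifiers:

```python
import re

# Hypothetical, minimal redaction patterns for illustration only.
# Real de-identification must cover all 18 HIPAA Safe Harbor identifiers.
PATTERNS = {
    "[NAME]": re.compile(r"\b(?:Mr|Mrs|Ms|Dr)\.\s+[A-Z][a-z]+\b"),
    "[PHONE]": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "[DATE]": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text: str) -> str:
    """Replace obvious identifiers with placeholder tokens
    before any text leaves the practice for an AI service."""
    for token, pattern in PATTERNS.items():
        text = pattern.sub(token, text)
    return text

note = "Dr. Smith saw the patient on 03/14/2025; call 555-123-4567 or email jane@example.com."
print(redact(note))
# → [NAME] saw the patient on [DATE]; call [PHONE] or email [EMAIL].
```

Even with redaction in place, the minimum-necessary principle still applies: share only what the tool needs to do its job.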

Future efforts will likely focus on enabling large language models like ChatGPT to manage health care data compliantly. In fact, OpenAI, the company behind ChatGPT, has offered Business Associate Agreements on a case-by-case basis to paying customers using its API.

The BAA is a written contract that holds both parties liable if a breach occurs. This agreement defines each party’s responsibilities and how each secures PHI. Any vendor a practice works with that can access PHI must sign a BAA. For example, the vendor of an AI dental imaging tool that analyzes images and suggests diagnoses would need to sign one.

Overall, it’s best practice to avoid working with any vendor that refuses to sign a BAA, and AI companies handling medical data are no exception.

Considerations if you implement AI in your dental practice

If you are considering implementing an AI solution in your dental practice, thoroughly review the safeguards the business has in place to secure PHI. A good indicator is an easily accessible HIPAA or privacy policy on its website.

You also want to analyze how AI is being used in your practice. Is it for treatment, payment or operations? Is it for research and marketing? Depending on how the practice is using it, you might need to obtain your patients’ authorization first.

Additionally, ensure a BAA is signed and in place before using AI products in your practice. AI is already woven into many practice workflows: automated charting, patient education content and even fraud detection. For example, your existing practice software vendor may already use AI for tasks such as code suggestions or predictive analytics. Be sure to confirm that your BAA with that vendor covers those functionalities.

AI is not infallible, and mistakes do happen. Remember that AI is a tool to assist — not replace — your professional judgment. A clinician should review every output to verify that AI recommendations meet the established standard of care.

While guidance from the Office for Civil Rights on AI is forthcoming and state laws on AI use are being developed, the basic rules of patient privacy haven’t changed: HIPAA is here to stay. The good news is intelligent software can streamline your HIPAA compliance program.

Much like advancements in artificial intelligence, health care compliance is constantly evolving. Implementing intelligent software can proactively identify vulnerabilities, preventing them from becoming risks to your practice. With the proper safeguards, you can continue securing patient information using AI technology.

Learn more about HIPAA and AI by scheduling a consultation with an Abyde compliance expert today.
