Last Updated on Dec 12, 2025 by Kurt Dunphy

Is Perplexity Private for Lawyers? Client Data Security Guide

Could your AI assistant be putting your client’s confidential information at risk? With the increasing reliance on AI tools like Perplexity, lawyers face a crucial question: Is the data you entrust to these tools truly secure? 

This guide explores the privacy risks of using Perplexity and why legal professionals should consider an AI tool built specifically for legal work, such as Spellbook.

Key Takeaways

  • Perplexity plans offer varying levels of data privacy, with enterprise plans providing strong controls, including not using your data for model training and admin-managed retention policies.
  • Lawyers should be cautious when using Perplexity. Legal teams should limit the entry of privileged client information or confidential case details, update engagement letters to obtain client consent for AI use, anonymize data where possible, and keep a human in the loop when relying on AI outputs to maintain compliance and protect confidentiality.
  • Spellbook offers strong confidentiality and privacy protections with legal-focused tools and regulatory compliance support.

Version Differences: Free vs. Perplexity Pro (Plus) vs. API

Choosing the right Perplexity plan depends on your data privacy and usage needs. Here’s a comparison of the Free, Pro, API, and Enterprise plans with a focus on key privacy aspects:

Perplexity Free vs. Perplexity Pro (Plus) vs. API

Free & Plus
  • Training use: Used for training by default unless the user explicitly opts out (by disabling "Improve the model for everyone").
  • Privacy & security: Standard consumer controls (opt-out toggle, Temporary Chat); data encrypted in transit and at rest.
  • Best for: Everyday use, personal learning, and research where performance and feature access are the priority.

API
  • Training use: No training on inputs/outputs (since March 1, 2023).
  • Privacy & security: Secure by design; encrypted in transit and at rest; Zero Data Retention options for maximum security (excludes some features).
  • Best for: Secure integrations, building custom applications, and scenarios requiring programmatic control and the highest baseline privacy guarantee.

Enterprise/Teams
  • Training use: Excluded from training by default; organizations must explicitly opt in to share data for training purposes.
  • Privacy & security: Private, contract-bound; enterprise compliance (SOC 2, GDPR support); admin controls (SSO, audit logs, data residency).
  • Best for: Sensitive business and legal work, corporate deployment, and large-scale team collaboration requiring governance.

Best for sensitive use: API (especially with ZDR) or Enterprise.

In the Free and Pro plans, training is enabled by default, so you must opt out, and privacy features are modest. The Sonar API retains no data and does not use it for training, making it the most private option for developers. Enterprise plans combine no training use with stronger privacy controls and admin-managed retention and access.

What is Perplexity’s Privacy Model?

Perplexity collects personal information, such as your name, email address, and payment details, when you sign up or make a purchase. It also gathers data from your prompts, inputs, and the AI’s responses, along with usage data, such as device information, location, and browsing behavior, to enhance the user experience. 

Perplexity manages privacy by limiting data collection to what’s necessary to provide its services. The platform retains data for as long as needed to support service functionality, such as keeping your account active. This data is used to generate responses, improve AI models, and enhance service quality.

When processing information, Perplexity protects data privacy through encrypted connections and secure storage. For example, if Free and standard Pro users who have not opted out of AI data usage ask "What are the latest data privacy laws?", Perplexity processes and temporarily stores the question to generate a response and improve its AI model. For Enterprise users, the data is processed for the response but is not used to improve the AI model.

In compliance with GDPR, users can access, delete, or correct their data and manage privacy settings, including opting out of personalized ads and limiting data use for training. Enterprise users can establish strict data retention policies. Free/standard versions allow data to be used for model updates, but users can opt out of training in their Account Settings.

Privilege Risk: Does Using Perplexity Waive Attorney-Client Privilege?

With the integration of AI tools like Perplexity into legal workflows, lawyers must be mindful of the risks to the attorney-client privilege. ABA Rule 1.6 requires lawyers to prevent the inadvertent or unauthorized disclosure of client information. Without clear data retention and training opt-out options, AI tools like Perplexity may introduce risks that legal-specific platforms like Spellbook are built to mitigate from the ground up.

AI tools like Perplexity may store or use data beyond the lawyer's control. Lawyers should treat Perplexity as a public cloud platform unless a confidentiality agreement and audit controls are in place. Even with Perplexity’s enterprise-level protections, entering privileged client information or confidential case details into a third-party, cloud-based system poses a risk of violating the duty of confidentiality and potentially waiving the attorney-client privilege.

Lawyers should protect personal information by ensuring AI tools do not retain conversations beyond the session and that client data isn't stored unnecessarily. ABA Rule 1.4 mandates informing clients about AI usage and its potential risks to confidentiality.

Data Training: Does Perplexity Train on Your Inputs?

Perplexity's API/enterprise plans do not use customer content for model training, offering greater data control. Standard accounts, however, may use user content for improvement, though users can opt out in Account Settings. Data used for training poses a risk of inadvertent exposure of confidential or privileged information, potentially compromising confidentiality.

To manage privacy, limit what you enter to only what’s necessary and avoid submitting sensitive data unless absolutely required. Even if you opt out of future model training, previously submitted data may still be retained, posing ongoing risks to privacy and confidentiality. Always treat sensitive input as if it could be retained, even when deletion options are offered.

If you're considering Perplexity AI for legal document drafting, learn how it securely processes data in this guide.

Confidentiality: Who Can Access Your Chat History?

Access to your chat history with Perplexity depends on your subscription plan and the type of data involved. Perplexity employees, support personnel, and automated systems may access chat histories for service improvement or troubleshooting, especially on standard plans where chat histories may be retained unless opted out in Account Settings.

Under enterprise plans, files in Threads are deleted after 7 days, while Threads may persist until manually deleted. 

Perplexity uses encrypted connections and has SOC 2 Type II compliance to protect data from external threats during processing. Perplexity does not share your data with third parties without your consent, ensuring information is handled securely and privately.

For law firms, chat history should not be considered strictly private unless Perplexity has guaranteed confidentiality in a contract. Without a formal confidentiality agreement (e.g., a BAA), your chat history could be accessible to Perplexity employees or third parties.

Legal Best Practices When Using Perplexity

To use Perplexity effectively while ensuring compliance and maintaining client trust, legal teams should:

  • Limit Privileged Content: Only enter privileged information if the appropriate data protections are in place.
  • Update Engagement Letters: Disclose the use of AI tools like Perplexity in engagement letters and address privacy concerns with clients to ensure transparency and trust.
  • Minimize Data Scope: Input only the necessary information and anonymize details when possible. Avoid identifiers to reduce risk.
  • Retain Human Oversight: Verify AI outputs for accuracy and to preserve attorney-client privilege. 

These risk-mitigation measures help you protect client confidentiality and comply with ethical duties.

Perplexity vs. Spellbook: Which is Safer for Legal Professionals?

While Perplexity is a powerful AI tool, it is not designed with legal confidentiality in mind. In contrast, Spellbook is built specifically for legal workflows. Here’s how Spellbook covers ground where Perplexity may fall short:

  • Built for legal workflows: It helps law firms and in-house legal teams work faster by automating tasks like drafting, redlining, and reviewing legal documents.
  • Includes clause libraries for contract-ready content: Provides pre-built legal clauses for consistency and industry alignment.
  • Provides compliance benchmarking (GDPR, CCPA, PIPEDA): Automatically checks documents against key regulations to ensure compliance.
  • Customizable compliance rules: Tailors compliance checks to specific legal requirements.
  • Offers strong encryption to safeguard confidential data: Protects data in transit and at rest from unauthorized access.
  • Zero data retention policy: No data is stored after the session ends, ensuring privacy.
  • Maintains attorney-client context, protecting privileged information: Safeguards sensitive communications and preserves legal privilege.
  • Audit trails for accountability and transparency: Tracks all actions and changes for compliance and security.
  • Role-based access control: Limits data access to authorized personnel only.

Spellbook is built to meet the unique needs of legal teams, ensuring sensitive client information is handled with the highest level of privacy and protection, far beyond what general-purpose tools can offer.

To learn more about how Perplexity compares to other AI tools like ChatGPT in terms of privacy, read Perplexity vs ChatGPT: Privacy Considerations. Additionally, discover how Perplexity AI can improve legal research and enhance your workflows.

Start using Spellbook today and experience an AI solution designed specifically for legal professionals, with the privacy, security, and compliance features necessary to protect client confidentiality and streamline legal workflows.

Frequently Asked Questions

Can I Delete My Perplexity History?

Yes, you can delete your Perplexity history. In enterprise plans, files uploaded to Threads are deleted automatically after 7 days, while Threads themselves persist until deleted; in all plans, Threads can be deleted manually. Once deleted, Threads cannot be recovered. Opting out of AI training does not remove data that has already been used.

Can Perplexity Access My Legal Queries?

Perplexity may access your legal queries for troubleshooting or service improvement, especially on standard plans. Perplexity employees, support personnel, and automated systems may have access. For enterprise plans, data retention is more controlled, and data is not used for model training.

Does Perplexity Share My Data with the Public?

No, Perplexity does not share your data publicly. Threads remain private unless you choose to share them. Perplexity may share data with trusted service providers, or if required by law, but enterprise plans offer stronger data privacy controls. Perplexity also processes user inputs securely without storing identifiable information to ensure your data remains protected.
