Could your AI assistant be putting your client’s confidential information at risk? With the increasing reliance on AI tools like Perplexity, lawyers face a crucial question: Is the data you entrust to these tools truly secure?
This guide explores the privacy risks of using Perplexity and why legal professionals should instead consider an AI tool built specifically for legal work, such as Spellbook.
Choosing the right Perplexity plan depends on your data privacy and usage needs. Here’s a comparison of the Free, Pro, and API plans with a focus on key privacy aspects:

| Plan | Data used for training? | Privacy controls |
| --- | --- | --- |
| Free | Yes, by default (opt out in Account Settings) | Modest |
| Pro | Yes, by default (opt out in Account Settings) | Modest |
| Sonar API | No; no data is retained | Most private option for developers |
| Enterprise | No | Stronger privacy, with admin-managed retention and access |
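For developers weighing the Sonar API, the sketch below shows the shape of a request to Perplexity's OpenAI-compatible chat-completions endpoint. The endpoint URL and `sonar` model name are assumptions that should be verified against Perplexity's current API reference; to keep the example self-contained, it only constructs the JSON payload rather than sending it (a real call would POST it with an API key in the `Authorization` header).

```python
import json

# Assumed endpoint for Perplexity's OpenAI-compatible chat-completions API;
# confirm against the current API documentation before use.
API_URL = "https://api.perplexity.ai/chat/completions"

def build_request(question: str, model: str = "sonar") -> dict:
    """Build the JSON payload for a single-turn query."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": question}],
    }

payload = build_request("What are the latest data privacy laws?")
print(json.dumps(payload, indent=2))
```

Because the API keeps no data and does not train on inputs, this path avoids the opt-out step that Free and Pro users must take in Account Settings.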
Perplexity collects personal information, such as your name, email address, and payment details, when you sign up or make a purchase. It also gathers data from your prompts, inputs, and the AI’s responses, along with usage data, such as device information, location, and browsing behavior, to enhance the user experience.
Perplexity manages privacy by limiting data collection to what is necessary to provide its services. The platform retains data for as long as needed to support service functionality, such as keeping your account active, and uses it to generate responses, improve AI models, and enhance service quality.
When processing information, Perplexity protects data privacy through encrypted connections and secure storage. For example, if Free and standard Pro users who have not opted out of AI data usage ask "What are the latest data privacy laws?", Perplexity processes and temporarily stores the question to generate a response and improve its AI model. For Enterprise users, the data is processed for the response but is not used to improve the AI model.
In compliance with GDPR, users can access, delete, or correct their data and manage privacy settings, including opting out of personalized ads and limiting data use for training. Enterprise users can establish strict data retention policies. Free/standard versions allow data to be used for model updates, but users can opt out of training in their Account Settings.
With the integration of AI tools like Perplexity into legal workflows, lawyers must be mindful of the risks to the attorney-client privilege. ABA Rule 1.6 requires lawyers to prevent the inadvertent or unauthorized disclosure of client information. Without clear data retention and training opt-out options, AI tools like Perplexity may introduce risks that legal-specific platforms like Spellbook are built to mitigate from the ground up.
AI tools like Perplexity may store or use data beyond the lawyer's control. Lawyers should treat Perplexity as a public cloud platform unless a confidentiality agreement and audit controls are in place. Even with Perplexity’s enterprise-level protections, entering privileged client information or confidential case details into a third-party, cloud-based system poses a risk of violating the duty of confidentiality and potentially waiving the attorney-client privilege.
Lawyers should protect personal information by ensuring AI tools do not retain conversations beyond the session and that client data is not stored unnecessarily. ABA Rule 1.4 requires lawyers to inform clients about AI usage and its potential risks to confidentiality.
Perplexity's API/enterprise plans do not use customer content for model training, offering greater data control. Standard accounts, however, may use user content for improvement, though users can opt out in Account Settings. Data used for training poses a risk of inadvertent exposure of confidential or privileged information, potentially compromising confidentiality.
To manage privacy, enter only the data a task requires and keep sensitive information out of prompts wherever possible. Even if you opt out of future use of your data for model training, previously submitted data may still be retained, posing ongoing risks to privacy and confidentiality. Always treat sensitive input as if it could be retained, even when deletion options are offered.
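One practical safeguard is to redact obvious identifiers before any text reaches a third-party AI tool. The sketch below is a minimal, illustrative example (the patterns cover only email addresses and US-style SSNs; real matter data needs a firm-wide redaction policy, not two regexes):

```python
import re

# Illustrative patterns only; extend to phone numbers, matter IDs,
# client names, etc., per your firm's confidentiality policy.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace each matched identifier with a bracketed placeholder."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = "Client reachable at jane.roe@example.com, SSN 123-45-6789."
print(redact(prompt))
# → "Client reachable at [EMAIL], SSN [SSN]."
```

Redacting before submission limits exposure even if the provider retains the prompt longer than expected.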
If you're considering Perplexity AI for legal document drafting, learn how it securely processes data in this guide.
Access to your chat history with Perplexity depends on your subscription plan and the type of data involved. Perplexity employees, support personnel, and automated systems may access chat histories for service improvement or troubleshooting, especially on standard plans where chat histories may be retained unless opted out in Account Settings.
Under enterprise plans, files in Threads are deleted after 7 days, while Threads may persist until manually deleted.
Perplexity uses encrypted connections and maintains SOC 2 Type II compliance to protect data from external threats during processing, and states that it does not share your data with third parties without your consent.
For law firms, chat history should not be considered strictly private unless Perplexity has guaranteed confidentiality in a contract. Without a formal confidentiality agreement (e.g., a BAA), your chat history could be accessible to Perplexity employees or third parties.
To use Perplexity effectively while ensuring compliance and maintaining client trust, legal teams should:

- Opt out of AI training in Account Settings, or use an enterprise plan where training is disabled by default.
- Avoid entering privileged client information or confidential case details into prompts.
- Delete Threads containing sensitive material and set strict retention policies where available.
- Inform clients about AI usage and its potential confidentiality risks (ABA Rule 1.4).
- Put a confidentiality agreement and audit controls in place before treating chat history as private.

These risk-mitigation measures help you protect client confidentiality and comply with ethical duties.
While Perplexity is a powerful AI tool, it is not designed with legal confidentiality in mind. Spellbook, by contrast, is built specifically for legal workflows. Here’s how Spellbook covers ground that Perplexity might miss:
Spellbook is built to meet the unique needs of legal teams, ensuring sensitive client information is handled with the highest level of privacy and protection, far beyond what general-purpose tools can offer.
To learn more about how Perplexity compares to other AI tools like ChatGPT in terms of privacy, read Perplexity vs ChatGPT: Privacy Considerations. Additionally, discover how Perplexity AI can improve legal research and enhance your workflows.
Start using Spellbook today and experience an AI solution designed specifically for legal professionals, with the privacy, security, and compliance features necessary to protect client confidentiality and streamline legal workflows.
Yes, you can delete your Perplexity history. Threads are deleted automatically after 7 days in enterprise plans, but can be manually deleted in all plans. Once deleted, Threads cannot be recovered. Opting out of AI training does not remove previously used data.
Perplexity may access your legal queries for troubleshooting or service improvement, especially on standard plans. Perplexity employees, support personnel, and automated systems may have access. For enterprise plans, data retention is more controlled, and data is not used for model training.
No, Perplexity does not share your data publicly. Threads remain private unless you choose to share them. Perplexity may share data with trusted service providers, or if required by law, but enterprise plans offer stronger data privacy controls. Perplexity also processes user inputs securely without storing identifiable information to ensure your data remains protected.