This guide outlines our approach to ensuring data privacy while leveraging AI capabilities through the "Anonymous In, Personalized Out" process.
While data security measures protect against threats and malicious actions, our primary focus in this guide is on safeguarding data ownership and preventing your data from being used in external AI algorithms.
Personally Identifiable Information (PII) refers to data elements that can uniquely identify an individual, such as names, addresses, and phone numbers. In data privacy discussions, PII is the reference point for balancing personalized experiences with confidential data handling.
Any data that qualifies as PII is stripped from database records before processing, so only non-PII data is passed to the AI. Each record's PII is replaced with a randomized hash, which keeps records distinct from one another without exposing identity. The anonymized donor data is then fed into the AI models, and once outputs are finalized, the hash is swapped back for the original identifiable data points.
No PII is ever passed into any AI models.
The workflow of the "Anonymous In, Personalized Out" process follows three steps (a minimal code sketch follows the list):

1. Anonymize: PII is stripped from each record and replaced with a randomized hash.
2. Process: only the anonymized, non-PII data is passed to the AI models.
3. Personalize: once outputs are finalized, the hash is mapped back to the original identifiable data points.
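To make the flow concrete, here is a minimal sketch in Python. The field names, the `run_ai_model` stub, and the token format are illustrative assumptions rather than the actual implementation; the point is that the model only ever sees the anonymized record plus a randomized token, and the token is swapped back for the original PII after the output is finalized.

```python
import secrets

# Hypothetical illustration only: field names, the stub model call, and the
# token format are assumptions, not the product's actual pipeline.

PII_FIELDS = {"name", "email", "phone", "address"}  # fields treated as PII in this sketch


def anonymize(record: dict) -> tuple[dict, dict]:
    """Strip PII fields and substitute a randomized token for the record."""
    token = secrets.token_hex(16)  # randomized hash standing in for the identity
    anonymized = {k: v for k, v in record.items() if k not in PII_FIELDS}
    anonymized["record_token"] = token  # keeps records distinct without exposing PII
    mapping = {token: {k: v for k, v in record.items() if k in PII_FIELDS}}
    return anonymized, mapping


def run_ai_model(anonymized_record: dict) -> dict:
    """Stand-in for the AI call; it only ever receives non-PII fields and the token."""
    return {
        "record_token": anonymized_record["record_token"],
        "suggested_ask": round(anonymized_record["lifetime_giving"] * 0.1, 2),
    }


def repersonalize(output: dict, mapping: dict) -> dict:
    """Swap the token back for the original identifiable data points."""
    pii = mapping[output.pop("record_token")]
    return {**pii, **output}


# End-to-end: PII never reaches the model, yet the final output is personalized.
donor = {"name": "Jane Doe", "email": "jane@example.org",
         "lifetime_giving": 5200.0, "last_gift_year": 2023}
anon, key_map = anonymize(donor)
result = repersonalize(run_ai_model(anon), key_map)
print(result)  # {'name': 'Jane Doe', 'email': 'jane@example.org', 'suggested_ask': 520.0}
```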
In conclusion, our "Anonymous In, Personalized Out" approach is designed to preserve user and organizational data integrity and to ensure that no PII is ever used in an artificial intelligence algorithm or model.
We encourage you to read all of our documentation on security and privacy. If you have any specific issues or requests, please contact your customer success manager.