On this Global Data Protection Day, created to raise awareness and promote data protection best practices, I find myself reflecting on the maturation of our data ecosystems.
I recently visited the newly reopened MIT Museum in Cambridge, Massachusetts. Drawn to the history of computing section, I found it hard not to marvel at the strange and wonderful objects behind plexiglass charting the history of computational technology: mechanical calculating machines, then vacuum tubes, then punch cards, and finally on-screen displays, as the handling of data grew steadily less motorized.
Continuing down the light-filled hall, I found the exhibits taking on the form of encounters: a robot learning how to set a table, an AI agent (a Generative Pre-trained Transformer, or “GPT”) collaborating with you on a poem that moves off the screen and up the wall to float alongside others, shared there for visitors to muse on and ponder.
Ponder, I did. At what point did information, having evolved from units of computation into elements of data, come to encode the personal data each of us streams into the world every day? Lay a privacy layer atop the recent milestones of this data journey and you will see that privacy has matured into a discipline every bit as interesting as the data itself.
Privacy, once simply a feature of a product or a service, has become a dimension of data that the best companies in the world now recognize as a core competency to be developed. Its first real U.S. milestone was the now-prescient Health Insurance Portability and Accountability Act (HIPAA), enacted in 1996 to safeguard personal healthcare data, known as Protected Health Information (PHI). HIPAA set a baseline of 18 identifying data points that providers were required to protect.
That protection has since coalesced around an even more robust protocol known commonly in the U.S. as “de-identification.” This procedure, another data privacy milestone, is now embraced by product developers, service providers, and others in the healthcare space as a best practice. Thought leaders in business, recognizing HIPAA as a privacy harbinger, proactively began to integrate de-identification into their service offerings and embed it in their corporate ethos.
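For readers curious what this can look like in practice, here is a minimal, purely illustrative sketch, not Nuance's actual pipeline: it drops a handful of the HIPAA Safe Harbor identifier categories from a toy patient record and pseudonymizes the record ID with a salted one-way hash so linked analysis remains possible. The field names, record layout, and date handling are assumptions made for the example.

```python
# Illustrative sketch only: a toy HIPAA Safe Harbor-style de-identification pass.
# Field names and the record structure are hypothetical, not any vendor's real schema.
import hashlib
from copy import deepcopy

# A few of the 18 HIPAA Safe Harbor identifier categories, expressed as the field
# names this toy record happens to use (the mapping to real schemas varies).
DIRECT_IDENTIFIERS = {"name", "email", "phone", "street_address", "ssn"}


def pseudonymize(value: str, salt: str) -> str:
    """Replace an identifier with a salted one-way hash so records can still be linked."""
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()[:16]


def deidentify(record: dict, salt: str) -> dict:
    """Drop direct identifiers and pseudonymize the patient ID; leave clinical fields intact."""
    cleaned = deepcopy(record)
    for field in DIRECT_IDENTIFIERS:
        cleaned.pop(field, None)  # remove direct identifiers outright
    cleaned["patient_id"] = pseudonymize(record["patient_id"], salt)
    # Safe Harbor also generalizes dates and small geographic units; here we keep only the year.
    if "birth_date" in cleaned:
        cleaned["birth_year"] = cleaned.pop("birth_date")[:4]
    return cleaned


if __name__ == "__main__":
    record = {
        "patient_id": "A-10042",
        "name": "Jane Doe",
        "email": "jane@example.com",
        "birth_date": "1980-06-15",
        "diagnosis": "hypertension",
    }
    print(deidentify(record, salt="demo-salt"))
```

Real de-identification programs go much further, weighing re-identification risk across the full data set rather than field by field, but the sketch captures the basic trade: remove or transform what identifies a person while preserving what makes the data useful.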
Nuance was an early adopter of the protocol, initiating a program in 2020 to formally integrate de-identification safeguards into the fabric of our product development and service delivery processes. Using HIPAA criteria as a guide, we have evolved this privacy commitment into a key differentiator for our service offerings. It is an innovation welcomed by our customers, and their users, for whom privacy is ascendant as a personal and societal value.
At Nuance, our privacy team has taken additional measures to strengthen these protections. The team works with several third parties to assess and confirm the effectiveness of our security and privacy methods and procedures. In 2021, we selected Privacy Analytics to give us an external perspective on our de-identification and pseudonymization practices, as well as to provide additional privacy-related expertise. They found the design of our practices to be effective and made recommendations for strengthening our processes and privacy framework even further. Renewing the relationship in 2022, we acted on those recommendations, developing guidelines for the de-identification and pseudonymization of data, standardizing practices, and improving our processes for assessing de-identification projects.
So, on this Global Data Protection Day, created to raise privacy awareness and promote data protection best practices, I reflect on the maturation of our data ecosystems and how historians may one day chronicle it. And I am heartened by the AI exhibit that serves as the coda to the MIT Museum. It proffers a series of questions about the varied AI trajectories, challenges, and opportunities before us. As we enter another year, we look forward to fresh developments in the world of privacy and to continuing to implement innovative ways to protect the sensitive data with which we are entrusted.