
“Set a Digital Identity Strategy”, With Jason Remillard and Max Kirby of Publicis Sapient



Set a Digital Identity Strategy: If data is oil, customer data is the light sweet crude. The most critical step in protecting customer privacy and digital identity is setting normalization rules that define the different parameters of a single file representing a person. Digital Identity is something that you need to have a sense of, not just a system for. How you handle different tolerances for resolution will impact your privacy efforts down the line. For example, how will you resolve alternate user names? How do you decide the preferred device consumers might use to interact with your first-party systems? How will you handle normalization rules when there are contextual conflicts, missing parameters, or defects? All of these technical questions have implications for your nontechnical roles and their strategies for collecting and using data.


As a part of our series about “5 Things You Need To Know To Tighten Up Your Company’s Approach to Data Privacy and Cybersecurity”, I had the pleasure of interviewing Max Kirby, who leads the Customer Data Platforming Practice at Publicis Sapient.

Max Kirby is an expert in Digital Identity, Data Privacy, and the increasingly critical intersection between the private and public sectors when it comes to who we are, online. He leads the Customer Data Platforming Practice at Publicis Sapient, a digital transformation partner.


Thank you so much for joining us in this interview series! Before we dig in, our readers would like to get to know you. Can you tell us a bit about how you grew up?

I grew up outside of Boston and studied Philosophy at the University of Virginia. Unlike most of my colleagues, I saw the issues of philosophy through the lens of technology. I lobbied the Philosophy faculty to allow me to take a series in nanotechnology at the School of Engineering and convinced the engineering professors to take me without the prerequisites. That was the same year I became a Mock Trial All-American. My first real job was in Shanghai as a paralegal. I went back to study Business Valuation in London before returning to the States to start a career in technology.

Is there a particular story that inspired you to pursue a career in data privacy? We’d love to hear it.

Privacy is a combination of those two disciplines, policy and technology, and therefore it’s a difficult subject to categorize. Not many philosophers are technologists and vice versa. I think that will change. I went into the business transformation field after I saw the paradox of technological China: the cities were five years ahead of the US, and the provinces were twenty years behind. The contrast in quality of life was remarkable. I started in tech right as the internet began to converge around concepts like cloud infrastructure. Not being an engineer myself, I made it a point to understand the engineers from the outside and translate the technical for the non-technical to persuade our clients to transform.

Can you share the most interesting story that happened to you since you began this fascinating career?

When I was pitching a Cloud Practice idea in Silicon Valley and interviewing potential partners, we ended up going to an interesting dinner. At dinner, the conversation revolved around connecting different datasets in ways that had never been done before. Some of the ideas were out there. One partner’s head of solutions architecture asked me, furtively, “How serious are you about these?” We spent the next few weeks in workshops combining all the pieces of Google’s platform to make a solution that changed how our clients looked at the world. We did not call it a Customer Data Platform at the time, but it was one of the first in what became an emergent category. The first time we used the architecture was for a car manufacturer. By the time we were done, they could predict what kinds of cars needed to be built and sent to which lots based on demand, right down to the car color. Later, we did the same thing for a pharmaceutical company and created a system that predicted the incidence of the flu better than the CDC’s models, using YouTube searches. There are other stories like that, each a function of adding customer data to something you might not immediately think to add it to.

None of us are able to achieve success without some help along the way. Is there a particular person to whom you are grateful who helped get you to where you are? Can you share a story about that?

At a previous company, I once sent a farewell email to 100 colleagues, thanking them for their mentorship and leadership. Christopher Davey, one of the early founding members of Sapient, was a standout mentor because he gave me the opportunity to redefine myself. If you ever feel misunderstood for having an idea, I recommend trying to find someone like Chris, who has a strong vision of the future and understands how to continually update and adjust that vision. Chris used to call foresight “the curse” for its double edge: change almost always upsets people who are winners in the present, and most often, those are the same people who are influential enough to block you. Chris taught me that anyone who wants to wield that curse needs to remember that making a point is a lot less important than making a difference. You can’t do both simultaneously. That means the more you try to make a point, the less of a difference you can make. Giving up the former for the latter was a hard thing to do, but Chris taught me the method.

Are you working on any exciting new projects now? How do you think that will help people?

We’re very excited about where Digital Identity is headed. Most interesting for me are the projects we are doing to question what matters when it comes to which customers are “best.” In one recent project, we were able to beat the credit score as a determiner of fiscal responsibility, using transactional data. One of the behaviors that correlates most strongly with fiscal responsibility in an individual is making regular transactions at a grocery store. The world of data can feel cold, as if everyone is a number, but we’re still in the nascent phase of the field, and I see bright lights ahead. The more you do these types of projects and think about the humans behind the numbers, the more you start to see that it’s almost impossible to judge where someone is in life and, at the same time, understand their trajectory. It’s like a Heisenberg uncertainty principle applied to people: the more precisely you know where they are, the less you can accurately judge their direction and vice versa, and that has been evident in all of our projects on data and behavior.

What advice would you give to your colleagues to help them thrive and not “burn out”?

Just as failure is often the cost of success, burning out is often how you learn how not to burn out. Instead of trying to avoid burning out, submit to doing so in order to understand it, then control the environment around you so you can notice your reactions and how your behavior changes. I used an audio journal: I wanted to journal but never seemed to make time for it, so I started using speech-to-text and removed the behavioral blocker. This advice shouldn’t be taken as a license to keep burning out, but when you try to manage your time or keep some focus on your self-care, having the experience of knowing how you burn out helps. Burnout doesn’t affect everyone the same way, so your approach needs to be personalized. The only other advice I can give is that 15-minute breaks sprinkled throughout a day can sometimes be better than a day off, and you always need someone you can talk to who makes you feel understood in your professional life.

Ok super. Thank you for all that. Let’s now shift to the main focus of our interview. The Cybersecurity industry, as it is today, is such an exciting arena. What are the 3 things that most excite you about the Cybersecurity industry? Can you explain?

Confidential Computing, Decentralized Identity, and deepfakes. Confidential Computing promises a world where we can finally clear the types of endemic security concerns that have blocked cloud adoption by companies that, on principle, cannot move specific workloads to the cloud. As edge computing makes progress, I think we’re going to need it. Decentralized Identity is a topic we find more applications for as the developer community starts to break down the growing stranglehold that the platforms have on what was supposed to be a free and open web. The faster this kind of identity can take root, the faster the web can once again make progress toward democratization. We are studying the implications of this a lot more now, and the results are painting an optimistic picture of where we might go next. You might be surprised by the last one on my list, but we are finding that deepfakes might be the cure to the problem we all have with pictures and videos being spread all around the internet. Once the technology gets good enough that anyone can deepfake you into any picture, the authenticity of any picture of you has to be called into question. I think deepfakes are going to cause some more issues, of course, but they also undermine the belief most people have that everything they see on the internet is authentic. As with other times in history, technology creates a problem for us and then provides a solution.

Looking ahead to the near future, are there critical threats on the horizon that you think companies need to start preparing for?

A topic that’s top of mind is how governments will interact with the private sector when it comes to data. Data privacy legislation has just gotten its toe in the water, and as I said, combining the technological concept of identity with the legal concepts of privity and consent is tricky. Our research is showing that very few people know what they are consenting to during data collection. We need to fix that, and soon. Companies need to start preparing to educate their customers on data privacy, and governments (or at least representative governments) need to finesse security and open-source their technology. I think we’re going to see an informed-consent standard emerge in data privacy legislation, where if your customer is not given a chance to understand what you can do with the data you collect, their consent would be void. That’s a significant threat to many companies today trying to become independent of the big platforms and build their own customer identity systems.

Do you have a story from your experience about a cybersecurity breach that you helped fix or stop? What were the main takeaways from that story?

We are often asked to test our technology partners’ alpha programs, including those related to differential privacy methods. Differential privacy refers to techniques that obfuscate an identity that could otherwise be revealed by comparing two separate datasets. It’s designed to counter re-identification, where someone takes dataset A and dataset B, neither of which contains a clear identifier, and compares or combines them to expose one. I cannot speak too much to the specifics of this story, but it began when we were working late one night to try out a new service a major platform company was thinking of releasing. I asked two data scientists I trusted with alpha programs to go through the new system and identify any exploits. This kind of whitehat exercise is supposed to reveal potential abuses within a specific, reasonable boundary, so we were coming up with queries and models to try to break the protections by comparing two datasets. We found one that worked a bit too well. We could write a few unorthodox queries to generate two datasets to compare and expose the identifiers. With a couple of lines of code, we could identify the amount of media that had been spent on each individual among competitors bidding on the same segments. The fact that we could even touch this information was beyond the reasonable boundary of what we were supposed to test. At worst, the folks who gave us access were looking at a legal issue; at best, there was career risk for them individually. We locked up our machines, and I called my contact and told her to meet me right away. The next day the columns were removed from our instance, and I later found out that they had removed access for everyone who was testing the new system across the globe. As is often the case with these types of trusted relationships, doing the right thing led to more trust.
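To make that definition concrete, here is a minimal sketch of one of the simplest differential-privacy mechanisms: adding Laplace noise, calibrated to a query’s sensitivity, to an aggregate count before it leaves the system. The dataset, column name, and epsilon value below are illustrative assumptions, not details of the alpha program described above.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_count(records, predicate, epsilon: float = 0.5) -> float:
    # A counting query has sensitivity 1: adding or removing one person
    # changes the true count by at most 1, so Laplace noise with
    # scale = 1 / epsilon gives epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: how many people in dataset A also appear in dataset B?
# Only the noisy aggregate is published, never the joined rows, which limits
# what comparing the two datasets can reveal about any single individual.
dataset_a = [{"id_hash": f"user{i}"} for i in range(1000)]
dataset_b_ids = {f"user{i}" for i in range(500, 1200)}
print(private_count(dataset_a, lambda r: r["id_hash"] in dataset_b_ids))
```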

What are the main cybersecurity tools that you use on a frequent basis? For the benefit of our readers, can you briefly explain what they do?

In the world of privacy, we speak more about methodologies than specific systems, but we are using every kind of privacy sandbox out there. Those are playgrounds where new privacy proposals are testable. There are some projects, like MIT’s Solid platform, that are interesting. Tim Berners-Lee, who wrote the original proposal for the World Wide Web, is running that project along with MIT and the backing of Glasswing Ventures in Boston, one of the hottest new VC firms backing machine learning projects. We’re also using every major privacy clean-room system and exploring what can and cannot be done with things like federated learning. “Clean room” is a term that comes out of virology, referencing those rooms where you have to decontaminate when you enter and exit. In data, it’s a database that allows matching inside the environment, but the only data you can take with you are findings. It’s useful for now, when we need to graft privacy principles onto an internet that is ultimately based on public identifiers. I’m not convinced that we as an industry have cracked the privacy nut with any of these first attempts, but it’s early.
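As a rough illustration of that clean-room idea (not any specific vendor’s product), here is a toy sketch using SQLite: two parties’ tables can be matched inside the environment, but only aggregate findings above a minimum cohort size are allowed out. The table names, columns, and threshold are all hypothetical.

```python
import sqlite3

# A toy stand-in for a data clean room: row-level matching happens inside,
# only sufficiently large aggregates leave.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE brand_customers (id_hash TEXT, segment TEXT);
    CREATE TABLE publisher_exposures (id_hash TEXT, campaign TEXT);
""")
conn.executemany("INSERT INTO brand_customers VALUES (?, ?)",
                 [(f"h{i}", "loyal" if i % 2 else "new") for i in range(200)])
conn.executemany("INSERT INTO publisher_exposures VALUES (?, ?)",
                 [(f"h{i}", "spring_launch") for i in range(150)])

MIN_COHORT = 50  # findings smaller than this never leave the clean room

def exposed_counts_by_segment():
    rows = conn.execute("""
        SELECT b.segment, COUNT(DISTINCT b.id_hash) AS matched
        FROM brand_customers b
        JOIN publisher_exposures p ON p.id_hash = b.id_hash
        GROUP BY b.segment
    """).fetchall()
    # The joined rows stay inside the environment; only the findings exit.
    return [(segment, n) for segment, n in rows if n >= MIN_COHORT]

print(exposed_counts_by_segment())
```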

How does someone who doesn’t have a large team deal with this? How would you articulate when a company can suffice with “over the counter” software and when they need to move to a contract with a cybersecurity agency or hire their own Chief Information Security Officer?

Like cybersecurity folks, privacy professionals are constantly trying to justify their own existence, and the more successful they are, the harder that gets. Preventing issues only becomes a burning priority right after there has been a breach or a leak. At each of those stages, you should think about your privacy investments as something that makes your brand more attractive as it grows. Transparency makes your customers more likely to share their data. This is the opposite of what most of the industry believed until recently. You should fear missing out on key datasets you would never have collected without that transparency more than things like user deletion. It’s also becoming easier and easier to obfuscate an identity these days, so holding back on doing the right thing just to keep your data from being deleted is a losing strategy. Early on, you should stick to basics and rely on the platforms you interface with to handle most of your privacy woes on their end. Once you start collecting lots of first-party data about your customers, that’s when you need to start renting expertise. Businesses crossing that threshold should begin exposing the information they have and start thinking about Digital Doubling so that they can use data without worrying about losing its source customer. This is what all the large platforms are doing these days. After you reach 10,000 customer records, your focus will be less on protecting identities and more on preserving them. A Chief Privacy Officer only makes sense if your core business relies on customer data (which, eventually, it should) and should be put in place once you decide to make it part of your core. For some, that’s from the beginning. For others, it’s toward the end of their first steps toward digital transformation.
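The interview doesn’t spell out how Digital Doubling works under the hood, but one common building block for using data “without losing its source customer” is pseudonymization: replacing direct identifiers with keyed hashes so analysts work with a double of the record rather than the person’s real identity. Here is a minimal sketch under that assumption; the field names and the HMAC approach are illustrative, not Publicis Sapient’s method.

```python
import hashlib
import hmac
import os

# Secret key held by the identity team only; analysts never see it.
PEPPER = os.environ.get("IDENTITY_PEPPER", "example-only-secret").encode()

def pseudonymize(identifier: str) -> str:
    # Keyed hashing (HMAC) rather than a bare hash, so the mapping cannot
    # be rebuilt by anyone who lacks the key.
    return hmac.new(PEPPER, identifier.lower().strip().encode(),
                    hashlib.sha256).hexdigest()

def digital_double(customer: dict) -> dict:
    # Keep behavioral attributes, replace direct identifiers with pseudonyms.
    return {
        "customer_key": pseudonymize(customer["email"]),
        "segment": customer.get("segment"),
        "lifetime_value": customer.get("lifetime_value"),
    }

record = {"email": "Jane.Doe@example.com", "segment": "loyal", "lifetime_value": 412.0}
print(digital_double(record))
```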

As you know, breaches or hacks can occur even for those who are best prepared, and no one will be aware of it for a while. Are there 3 or 4 signs that a lay person can see or look for that might indicate that something might be “amiss”?

From a privacy perspective, breaches reflect problems latent in the technology we use to manage identities today. You can use services like haveibeenpwned and password managers, but you cannot prevent attacks even with a vigilant stance. I have had my own identity stolen once, and as a privacy advocate, I encourage those interested in understanding the subject not to try to secure the ‘unsecurable.’ If you lock everything down, you start failing to see the tensions that we need to resolve as an industry. The two signs of a compromised identity are being suddenly logged out of your accounts and finding yourself subscribed to things you never signed up for. Many systems today limit the maximum number of logged-in devices, and when a lot of new ones are added, they log out the others. Attackers will also try to see whether you are watching your inbox. They specifically target email accounts that seem abandoned, so anything like unexpected forwarding on your email justifies changing passwords. These are the privacy equivalents of strange spikes in network traffic or sudden increases in your cloud computing bills.

It’s not for everyone, but the best way to approach this on both the business and the individual side is to adopt a containment mindset. It’s improbable that you will wipe everything out online, but you can stop putting new things out there. Make sure that you have a plan to be able to prove your own identity. Never digitize things like your analog ID card. Take the time to write down the number you need on a piece of paper and take a picture of that instead, if you absolutely must send it. In my case, when a low-level password was compromised, someone used my account to order delivery food: 40 orders of cookies from McDonald’s. They must have had a sweet tooth.

After a company is made aware of a data or security breach, what are the most important things they should do to protect themselves further, as well as protect their customers?

Shut off any data-sharing deals you have, at least temporarily. The risk of re-identification has never been higher for customers than it is today. You’ll find that many datasets are useless unless they are combined with others. That puts your partners at risk, because whatever data you are sharing with them is likely stitchable to your own data if exposed. The breach on your end makes their data more valuable, since the other pieces of the puzzle needed to unlock the information have been leaked. One good thing you can do to prevent or be ready for these types of things is to create a consortium with your data-sharing partners to review potential attacks as a group. You can often defuse that issue by working together, which means that when and if a leak or a re-identification exploit emerges, you will have open channels to work as a team. I speculate that we’ll see a federal agency emerge that functions this way across industries.

How have recent privacy measures like the California Consumer Privacy Act (CCPA), the CPRA, the GDPR, and other related laws affected your business? How do you think they might affect business in general?

People thought we were crazy when we were talking about the death of the cookie in 2015. Just like then, I would encourage everyone to see these changes as an opportunity. The most significant effect of these laws in my industry was what they did to the big platforms that were almost entirely valued on their ability to use data for advertising. Clean rooms were quick to emerge after that, and we were among their earliest users. Right now, we are working on getting all of our clients to adopt them across a multi-cloud framework, and we have gone so far as to publish code that helps you quickly get into these clean rooms with basic functionalities that many miss when they are first exploring. The platforms cannot tell you exactly what you should do with these things because, in many cases, they do not necessarily know themselves and are learning along with you. People need to recognize that clean rooms are a good thing and that we need to be people before we are businesspeople. It’s not worth resisting them and trying to find a way around them just to preserve soon-to-be-obsolete adtech and martech. Better to adopt cloud and APIs and get building. These laws are the beginning of something good for the industry, and we’re going to see more of them, so modernizing your stack early and often has become a bit of a categorical imperative.

What are the most common data security and cybersecurity mistakes you have seen companies make?

Identity-wise, it’s allowing data silos to persist far longer than they should. The new laws make it so that if you create two silos instead of one, you are doubling your liability for every silo you tolerate. Start using a method like privacy sidecars and put a Customer Data Platform in place soon enough that you know where you stand. If you cannot implement one right away, start working on a common data model and get a sense of how you will structure your own Identity Strategy. The only other watch-out I can give is to realize that your technical folks, or maybe you yourself, will have trouble making business cases for what you want to do. That’s okay. You are not supposed to be a finance expert and a technologist at the same time (usually). However, you should get someone to start doing a data valuation exercise against what you have and get down to what each identity is worth to your company, either in qualitative terms or quantitatively based on certain assumptions. That will help you translate your projects into business value. Just as you cannot expect your customers to meet your business on your terms, you cannot expect your business stakeholders to either.

Since the COVID-19 pandemic began and companies have become more dispersed, have you seen an uptick in cybersecurity or privacy errors? Can you explain?

Majorly. Collection especially, and it’s our fault, because we feel that the exigent circumstances of the pandemic are justification enough to collect data without checks and balances. Health passports built by private companies are noble, but their implications are terrifying. Governments moving too quickly (or not quickly enough to put restraints in place) is the kind of mistake you find when you study history. Transformation rarely happens better when it happens faster. We can all understand the rush to help people, but political rewards like being the first to introduce contact tracing are not necessarily the kind we want to build precedent on for the next century. I’m not saying we should stop, and there will undoubtedly be some very good projects that come out of this moment, some that might even force the kind of change we want to see. But we all try to be mindful of something in philosophy called the “state of nature” argument: if your principles don’t work in low-technology eras, they shouldn’t be trusted in high-technology eras. Good intentions can be just as dangerous as bad ones, and security, privacy, and liberty are three legs of a table we need to keep level. That means each leg needs to grow in proportion, and the right way to move quickly in this pandemic is to move all three, not just one.

Ok, thank you. Here is the main question of our interview. What are the “5 Things Every Company Needs To Know To Tighten Up Its Approach to Data Privacy and Cybersecurity,” and why? (Please share a story or example for each.)

Most companies are striking a balance between modernization-first and privacy-first, but whether you lean one way or another at any time, a few strategies make security and privacy incrementally more effective and easier to manage as you modernize. Here are five things every company can do to make managing their privacy easier:

  1. Educate Your Customer for Their Understanding, Not Your Compliance: Data privacy legislation is still on trial. When laws are in their infancy, regulation and enforcement will focus on the principles behind the law. The principle behind almost every data privacy law is consent, but to hold to this principle, you have to take into account the fact that most people (61%) know almost nothing about what companies do with their data. They can only consent to your data collection if they understand what they are consenting to. We needed laws to rein in the surveillance economy because it’s probably against your company’s short-term fiscal interest to expose what data you have collected. This is where you should be long-term greedy. The Publicis Sapient Data Collection & Consent Survey shows that users share more data with brands that are transparent about their data collection and usage. That means the fear of losing data should be outweighed by the fear of missing out on data you would never have collected in the first place without a policy of transparency and trust. There’s a reward out there for companies that do their best to educate their customers on the data they collect and use: more (and better) signal. Especially as more services emerge to make hiding an identity on the web easy, the companies that win the most information in the information age will be those that give their customers no reason to hide in the first place.
  2. Establish a Common Data Model: Any data professional will tell you that the relationships between your datasets are often more important than the data itself. A top-down model agreed upon by all the departments that need to use data will help ensure that collection, translation, and activation happen with enough standardization for you to create privacy policies that work together. If that part is done right, it allows each party the customization it needs while precluding many situations where prevention strategies might fall short. As a bonus, a common data model can also be used to identify potentially risky data silos.
  3. Set a Digital Identity Strategy: If data is oil, customer data is the light sweet crude. The most critical step in protecting customer privacy and digital identity is setting normalization rules that define the different parameters of a single file representing a person. Digital Identity is something that you need to have a sense of, not just a system for. How you handle different tolerances for resolution will impact your privacy efforts down the line. For example, how will you resolve alternate user names? How do you decide the preferred device consumers might use to interact with your first-party systems? How will you handle normalization rules when there are contextual conflicts, missing parameters, or defects? All of these technical questions have implications for your nontechnical roles and their strategies for collecting and using data. (A rough sketch of what such normalization rules can look like follows this list.)
  4. Build a Poly-cloud Customer Data Platform: CDPs were recently named one of the most important Digital Business Transformation trends, and 2021 could be the year they peak. There’s no reason why you cannot have a packaged SaaS CDP in place to take advantage of well-tested functionalities, but you should always back it up with a cloud-based data warehouse. Also, state laws like the CCPA, and the federal laws expected to follow, will not treat the different data silos in your business very kindly. If you do not start busting silos and resolving the identities present throughout your systems into a centralized system, it becomes more challenging to protect data in the first place. Distributed data can be more accessible and allow teams to pursue their own strategies, but you should have one privacy strategy, not several. Do your legal team a favor and use a Customer Data Platform in tandem with a Common Data Model and a Consent Management System to satisfy the operationally necessary conditions of compliance with the new laws.
  5. Embrace APIs: Microservices and APIs offer several benefits to those trying to handle privacy at scale. An API management system will make it easier to manage access keys, and some of them let you make API creation a self-service workflow. Metadata is also much easier to create when API access logs are available, and it can save you time during audits or breaches. You do not need an API to understand who accessed what data and when, but it makes doing so considerably less labor-intensive, and the labor saved scales with usage. (A sketch of capturing that access metadata also follows this list.)
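To make item 3 concrete, here is a minimal sketch of what identity-normalization and resolution rules might look like: canonicalizing a couple of parameters (email and phone) before deciding whether two records represent the same person. The fields, rules, and matching logic are illustrative assumptions, not a prescribed strategy.

```python
import re

def normalize_email(raw):
    # Missing-parameter handling: empty or malformed values stay None rather
    # than silently matching other empty values.
    if not raw or "@" not in raw:
        return None
    local, _, domain = raw.strip().lower().partition("@")
    local = local.split("+", 1)[0]          # drop sub-addressing like +promo
    return f"{local}@{domain}"

def normalize_phone(raw):
    digits = re.sub(r"\D", "", raw or "")
    return digits[-10:] if len(digits) >= 10 else None   # naive: keep last 10 digits

def same_person(a, b):
    # A simple resolution rule: any normalized identifier in common is a match.
    keys_a = {normalize_email(a.get("email")), normalize_phone(a.get("phone"))} - {None}
    keys_b = {normalize_email(b.get("email")), normalize_phone(b.get("phone"))} - {None}
    return bool(keys_a & keys_b)

crm_record = {"email": "Jane.Doe+promo@Example.com", "phone": None}
web_record = {"email": "jane.doe@example.com", "phone": "(555) 010-2000"}
print(same_person(crm_record, web_record))   # True under these rules
```

And for item 5, here is a small sketch of the kind of access metadata an API layer can capture automatically, the sort of log that makes audits and breach investigations less labor-intensive. The decorator pattern, field names, and in-memory log are placeholders for whatever gateway or log sink you actually use.

```python
import functools
import json
from datetime import datetime, timezone

ACCESS_LOG = []   # stand-in for a real log sink or SIEM

def logged_access(dataset):
    """Decorator that records who touched which dataset, and when."""
    def wrap(handler):
        @functools.wraps(handler)
        def inner(api_key, *args, **kwargs):
            ACCESS_LOG.append({
                "ts": datetime.now(timezone.utc).isoformat(),
                "api_key": api_key,
                "dataset": dataset,
                "endpoint": handler.__name__,
            })
            return handler(api_key, *args, **kwargs)
        return inner
    return wrap

@logged_access(dataset="customer_profiles")
def get_profile(api_key, customer_key):
    # Hypothetical endpoint; a real handler would also enforce the key's scope.
    return {"customer_key": customer_key, "segment": "loyal"}

get_profile("team-analytics-key", "c81f")
print(json.dumps(ACCESS_LOG, indent=2))
```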

You are a person of enormous influence. If you could inspire a movement that would bring the most good to the most people, what would it be? You never know what your idea can trigger. 🙂 (Think simple, fast, effective, and something everyone can do!)

Every company should publish an explainer video that outlines exactly what it uses its customers’ data for and why, and that opens the door for feedback.

How can our readers further follow your work online?
