Leif-Nissen Lundbæk of Xayn: “Keep your eyes on the goal”


The Thrive Global Community welcomes voices from many spheres on our open platform. We publish pieces as written by outside contributors with a wide range of opinions, which don’t necessarily reflect our own. Community stories are not commissioned by our editorial team and must meet our guidelines prior to being published.

“Keep your eyes on the goal.” Artificial intelligence is a powerful tool that scares many people, mainly because of how Big Tech has used it in the past. However, when you design AI applications with privacy in mind, you see a whole new potential that this technology holds. It is innovation in its ideal form, not guided by profit or any ulterior motive.


As a part of our series about business leaders who are shaking things up in their industry, I had the pleasure of interviewing Leif-Nissen Lundbæk.

Leif-Nissen Lundbæk (Ph.D.) is Co-Founder and CEO of Xayn and specializes in privacy-preserving AI. He studied Mathematics and Software Engineering in Berlin, Heidelberg, and Oxford. He received his Ph.D. in Computing at Imperial College London.


Thank you so much for doing this with us! Before we dig in, our readers would like to get to know you a bit more. Can you tell us a bit about your “backstory?” What led you to this particular career path?

I am an academically trained mathematician, and I have a deep interest in artificial intelligence. During my Ph.D. studies at Imperial College London, I worked on developing an energy-efficient and trustworthy framework for human-machine interactions in the IoT space. My doctoral thesis was my first significant contribution to increasing users’ control over AI-powered machines and processes. I also applied my knowledge of algorithms and automation in the smart mobility sector during roles at IBM and Daimler. By the time I co-founded Xayn, I had gathered first-hand experience with AI applications in various use cases — IoT, smart mobility, and search. In each case, privacy stood out as a major issue to me.

AI algorithms are virtually all around us. They are essential elements in nearly every smartphone app and every website’s analytics and tracking functionalities; they ‘feed’ on our personal data and shape our digital experience in every possible way, and yet they are completely out of the end user’s control. In a series of research and commercial projects, I worked on developing several award-winning enterprise solutions based on privacy-protecting AI that does not depend on the centralized collection and processing of personal data. On this fundamental principle, I co-founded Xayn in 2017 together with Professor Michael Huth, who now serves as our Chief Research Officer, and Felix Hahmann, now COO. We took the privacy-protecting AI model we had originally created for businesses and developers and brought it to the most widespread service digital users rely on: online search.

Can you tell our readers what it is about the work you’re doing that’s disruptive?

To put it in one word: Choice. We created Xayn with the motivation to give users a real choice regarding ownership of their personal information. People’s digital lives are marked by an endless stream of illusory choices: “Do you want to accept cookies?”, “Do you consent to our privacy policy?”, or placing a checkmark next to “I acknowledge receipt of the Terms & Conditions.” These are all empty gestures. Not consenting means not being able to browse, shop, or enjoy the content that you want. So, the choice is a sham. We founded Xayn in order to restore users’ control over their digital privacy, while also providing the best personalized experience.

Unlike asking users for (un-)informed consent, Xayn leads with privacy and ethics by design. We do not have to explain how we manage and secure the processing of users’ private information because that processing stays directly on their own devices. That way, they never have to hand over any personal data again. Big Tech forces users to throw their personal information into a gigantic, opaque data meat grinder. The meat grinder breaks the information down, takes it apart, and does things with it that nobody can truly fathom. On the other end of the data meat grinder, users get the online experience they want — such as relevant search results — but they also have to swallow a big side dish of intrusive advertising, privacy violations, and a general lack of online security.

Thanks to Xayn, users can now choose to bypass the Big Tech data meat grinder entirely and take ownership and control of their information processing. Personal devices are powerful enough to perform the same advanced AI-driven operations that enrich our online lives, without users ever having to share their personal data with a big corporation.
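As a rough illustration of the on-device pattern described above, here is a minimal, hypothetical Python sketch: a tiny re-ranker that keeps an interest profile and interaction history only on the user’s device and personalizes generic search results locally. All names and data structures are invented for the example; this is not Xayn’s actual code.

```python
# Hypothetical on-device personalization sketch: the interest profile and the
# user's interaction history live only on the device; nothing is sent to a server.
from collections import Counter


class LocalReranker:
    def __init__(self):
        # Interest profile: keyword -> weight, stored locally on the device.
        self.profile = Counter()

    def record_click(self, result_keywords):
        # Learn from the user's own interactions, on the device.
        self.profile.update(result_keywords)

    def score(self, result_keywords):
        # Score a result by its overlap with the local interest profile.
        return sum(self.profile[k] for k in result_keywords)

    def rerank(self, results):
        # `results` is a list of (title, keywords) pairs from any generic search API.
        return sorted(results, key=lambda r: self.score(r[1]), reverse=True)


# The raw results can come from the network; the personalization never leaves the device.
reranker = LocalReranker()
reranker.record_click({"privacy", "encryption"})
results = [
    ("Cookie recipes", {"baking", "cookies"}),
    ("Private search engines compared", {"privacy", "search"}),
]
print(reranker.rerank(results))  # the privacy-related result is ranked first
```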

Can you share a story about the funniest mistake you made when you were first starting? Can you tell us what lesson you learned from that?

I have a funny story about the importance of getting to know your team outside of work. Michael, Felix, and I (Xayn’s co-founders) are all native German speakers, but we did not always know that. When we first started building the company, we were so focused on the project that every conversation revolved around it, and our working language was English, so… It took a few months of daily interaction until we realized that all three of us share a native language.

Apart from having a good laugh over it, we took this as a lesson on the benefits of getting to know the people we work with on a daily basis. Nowadays, whenever we grow and bring new colleagues in, we make sure to have a proper conversation with them and get to know them also on a personal level. It has done wonders for our culture and teamwork.

We all need a little help along the journey. Who have been some of your mentors? Can you share a story about how they made an impact?

My most notable mentor on my journey towards developing and popularizing privacy-preserving AI solutions has been Professor Michael Huth, my Ph.D. supervisor and Xayn’s Co-Founder and Chief Research Officer. Ever since the years we spent working together at Oxford University and Imperial College, we have been set on creating viable solutions to the privacy dilemma that torments millions of users in their everyday digital lives. Michael helped me cultivate my scientific interest in ethical computing. The way he applies the scientific method to real-life problems and the academic rigor with which he approaches solutions have left a deep impression on me. Our academic collaboration was the natural precursor to Xayn. When it comes to technical know-how and scientific merit, Michael is always my go-to person, and he never fails to provide insight and guidance.

In today’s parlance, being disruptive is usually a positive adjective. But is disrupting always good? When do we say the converse, that a system or structure has ‘withstood the test of time?’ Can you articulate to our readers when disrupting an industry is positive, and when disrupting an industry is ‘not so positive?’ Can you share some examples of what you mean?

Disruption is closely tied to innovation in my mind. I, like many key figures on Xayn’s team, come from an academic background. I know that innovation and scientific, economic, and societal progress do not inherently carry moral or ethical costs. The Internet and search engines were developed decades ago by scientists to share and find data easily. Their underlying protocols were open-source, free to all, and decentralized. None of that tremendous progress came at the expense of privacy violations, mass online surveillance, or any other morally dubious decisions. Then platforms slowly started taking over, preying on the open-source protocols and putting their add-ons and digital fences around them. They disrupted the Internet’s inherent principles of universal access and freedom and laid the foundations for today’s ‘pay with your data’ model. To me, this is negative disruption of the worst kind.

This cycle of negative disruption has also affected the way we think of and develop AI. Algorithms were originally created to perform mathematical operations, solve problems, and create predictions that make human decision-making easier. The work of pioneers in machine learning and AI such as Alan Turing or Marvin Minsky thus positively disrupted the computing industry by pushing the boundaries of what computers could do. As we entered the age of ‘big data,’ our algorithms required ever-bigger datasets for training and self-improvement. However, these datasets never had to be collected and stored in a centralized way, and they certainly did not need to be monetized. This invasion of our privacy was a conscious decision that Big Tech made for financial gain, and Big Tech continues to push the false narrative that you can’t have scientific progress without paying for it with privacy. Xayn’s existence and success demonstrate that ‘AI vs. privacy’ is a false tradeoff.
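One well-known way to train models without pooling everyone’s data is federated averaging: each device trains on its own data and shares only model parameters, which a coordinator averages. The sketch below is a simplified, hypothetical NumPy illustration of that general idea, not a description of Xayn’s production system; the helper names and the toy linear model are invented for the example.

```python
# Simplified federated averaging: the raw data never leaves the simulated devices;
# only model parameters (weights of a toy linear model) are shared and averaged.
import numpy as np


def local_update(weights, X, y, lr=0.1, epochs=5):
    # One device runs a few steps of gradient descent on its own private data.
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w


def federated_average(global_weights, device_datasets):
    # Each device computes an update locally; the coordinator only ever sees weights.
    updates = [local_update(global_weights, X, y) for X, y in device_datasets]
    return np.mean(updates, axis=0)


def make_device_data(rng, true_w, n=20):
    # Private data that stays on one simulated device.
    X = rng.normal(size=(n, len(true_w)))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    return X, y


rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
devices = [make_device_data(rng, true_w) for _ in range(3)]

w = np.zeros(2)
for _ in range(20):  # communication rounds
    w = federated_average(w, devices)
print("learned weights:", w)  # approaches [2, -1] without centralizing the raw data
```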

Therefore, we can identify a constant cycle of positive and negative disruption, and we can see how one fuels the other. Without Google’s data-hungry algorithms, there would be no private search with Xayn. A yin needs its yang.

Can you share 3 of the best words of advice you’ve gotten along your journey? Please give a story or example for each.

  • “Keep your eyes on the goal.” Artificial intelligence is a powerful tool that scares many people, mainly because of how Big Tech has used it in the past. However, when you design AI applications with privacy in mind, you see a whole new potential that this technology holds. It is innovation in its ideal form, not guided by profit or any ulterior motive.
  • “Lead with your values.” In order to get to that point, you need a measure of social intelligence, which is how I would characterize ethical behavior. In a world plagued by inequalities and moral failures, if you do not innovate and operate based on a strict moral code, you are on the wrong side of history. Sooner or later, society will reject your proposition, and your product will be useless. The sooner you embrace the ethical way, the better.
  • “Build it and they will come.” Xayn’s success has taught me that, when you apply AI in a socially intelligent manner, human intelligence will prevail, meaning that people will make the right choice. The adoption speed of our private search solution has filled me with confidence and hope about the path we have chosen. We take product design very seriously, and we go to great lengths to accommodate users’ needs and wishes. Privacy is on nearly everyone’s mind right now, and we have heard the call and given people a choice. I am thrilled to see that it is catching on.

To shake up the AI industry, you need artificial, social, and human intelligence to work together towards a higher moral goal. This is what we are doing with Xayn.

We are sure you aren’t done. How are you going to shake things up next?

With Xayn’s private search solution, we have made a powerful statement. The app’s success and growing popularity demonstrate users’ hunger for privacy-protecting AI. We will expand this model to other areas where algorithms are at work in our professional and private lives, and we will continue innovating according to our privacy-first ethos.

Discoveries and innovations are essentially value-free; they are neither ‘good’ nor ‘bad’ by their nature. Advancements in AI, machine learning, and automation allow us to complete more tasks faster and better than before. It is up to our moral and ethical compass to choose the tasks to which we apply that computing power. Big Tech innovates with profit in mind and perpetuates the model of ‘paying’ for search, email, and social networking with your privacy. We, on the other hand, develop algorithms that are in no way inferior to Big Tech’s, but we never lose sight of our guiding principle: user privacy.

If there is an easy solution that compromises privacy, we do not even consider it. As scientists, we know that there is a way to get where we want to go without moral dilemmas and ethical compromises, and we always find it. The last decade has spawned a growing movement in computer science that focuses on AI ethics and explainable algorithms. We, too, wholeheartedly believe that this is the way to go. There is currently a hunger for explanations regarding AI because so much of it is used for nefarious purposes. If we design AI with users’ interests in mind, such as privacy and data protection, the need for explanation becomes secondary.

Do you have a book, podcast, or talk that’s had a deep impact on your thinking? Can you share a story with us? Can you explain why it was so resonant with you?

During an extended interview with CNBC back in 2018, Edward Snowden said about the EU’s General Data Protection Regulation (GDPR), “The problem is not data protection. It’s data collection.” We were still early in Xayn’s history, and this quote meant everything to me. The world’s most famous whistleblower and privacy advocate had just confirmed the basic principle on which we were developing Xayn. He was right, of course. Even though the GDPR is one of the world’s most developed pieces of privacy-protecting legislation, at its heart it is a monument to the status quo. It does nothing to limit the collection and processing of personal data. You can ask a company to tell you what information it has about you, but the regulation gives you no real control over what the company does with it.

Since a “General Data Collection Regulation” was not likely to appear, I became even more convinced that the path forward leads to AI that is private by design. Legal mandates and institutional oversight will not save us from privacy intrusion because tech is always several steps ahead of the laws that try to regulate it. Think of Mark Zuckerberg’s testimony before the US Senate. At one point, he famously had to explain Facebook’s business model to Sen. Orrin Hatch with the headline-catching statement, “Senator, we run ads.” If lawmakers fail to understand the first thing about a tech giant like Facebook, how can we trust them to regulate it effectively?

For the reasons above, I have taken a privacy-by-design approach to AI, and it has become the guiding principle behind Xayn.

Can you please give us your favorite “Life Lesson Quote”? Can you share how that was relevant to you in your life?

It is hard to pick just one… I will go with: “The world is what you think of it, so think of it differently and your life will change” by Paul Arden. It inspires me to take control of my own thought processes, to take ownership of my life, and to make a positive impact on the world around me. This idea applies in equal measure to my personal and my professional life. We created Xayn because we had a different thought, a different vision of the digital world, and we are still experiencing its life-changing effects. It all started with us refusing to believe that the online population was passive and resigned to a life of constant surveillance and data breaches. Instead, we imagined a world that values privacy and ethical computing. Today, over 100,000 daily searches on Xayn and over 100,000 downloads of the app tell me that many people share that different vision. So, hats off to Mr. Arden for putting the innovator’s path so elegantly into words.

You are a person of great influence. If you could inspire a movement that would bring the most amount of good to the most amount of people, what would that be? You never know what your idea can trigger. 🙂

I am a self-proclaimed privacist, and I would love to see privacism grow into a global movement. I believe that privacy is a fundamental human right, and we should push governments and businesses to treat it this way. This invariably means making conscious consumption choices that move away from privacy-invading products. This is ‘bottom-up privacism.’ However, I also want to advocate for ‘top-down privacism,’ and in doing so, I want to push back against the idea that most users give up their privacy out of mere convenience or ignorance. In fact, today’s users are quite clever. The system is built in such a way that the only rational decision is to not read any privacy policies or cookie notices and to consent to being tracked blindly.

How many pop-ups do you click away each day? How many transactions do you conclude online, bound by Terms & Conditions that spread over a dozen screens in microprint? I guarantee you that informing yourself is untenable, and the vast majority of consumers are intelligent enough to know it, so they don’t even try. Big Tech chooses to propagate this status quo and to capitalize on consumers’ rational decision to trade privacy for convenience — not out of resignation or ignorance but with economically sound judgment. Xayn is breaking up that old model. We provide a viable alternative that is generating a massive following as we speak. If you give people a real choice, they will act in their best interest. This is how we change the status quo.

How can our readers follow you online?

You can find me on Twitter and LinkedIn where I post quite regularly. Also be sure to follow Xayn’s profiles on Twitter and LinkedIn and to visit our blog for commentary and product updates. Come join the privacy revolution with Xayn!

This was very inspiring. Thank you so much for joining us!
