The AI Authenticity Paradox—And the Students Who Missed It

Professor Koen Pauwels, 2026 Robert D. Klein University Lecturer. Photo courtesy of Northeastern University.

The 62nd Robert D. Klein Lecture made the case that AI is making us more human. The students who most needed to hear it weren't there.

On Tuesday afternoon, Northeastern's Cabral Center, inside the John D. O'Bryant African-American Institute, filled with faculty, administrators, and staff for the 62nd Annual Robert D. Klein University Lecture. Professor Koen Pauwels, Associate Dean of Research and Distinguished Professor of Marketing at D'Amore-McKim, took the stage to deliver a talk titled "The AI Authenticity Paradox: Why Machines Are Making Us More Human."

As I looked around the room, the only young people I could see besides myself were Professor Pauwels' children. This is a major problem, and it is why this publication exists.

Professor Pauwels’ Presentation

Professor Pauwels, nicknamed "The Best Marketing Academic on the Planet," opened the lecture with two images. The first was Anthropic's recent Super Bowl ad, in which an AI fitness assistant mid-conversation starts pitching height-increasing insoles to a man who simply asked for a workout plan. The second was a photograph of a group of high schoolers at the starting line of a charity marathon. Each student was looking somewhere different, one in a dinosaur costume, another on their phone: chaotic and imperfect, but unmistakably human.

The point was not that the ad was bad technology. It was that the audience reacted to the photograph with warmth and to the ad with discomfort. As AI gets better at mimicking human communication, that contrast only sharpens. Professor Pauwels therefore argued that we are heading toward a premium on human-made content, a world in which people no longer pay for information but for strictly human competencies.

The lecture moved across three areas of his research. First, how framing AI disclosure changes consumer perception. In short, consumers respond better when a company specifies why it used AI, particularly when the reason is benefit-oriented. Second, how active engagement with AI, what he called "self-investment," actually increases people's sense of ownership over the output and reduces their privacy concerns. Professor Pauwels used building IKEA furniture as a metaphor for this concept: build the furniture yourself and it becomes more valuable to you. Third, he presented the idea that by learning from past technology cycles, organizations can significantly improve their AI adoption strategies.

The Q&A

During the 35-minute Q&A, the audience asked a few interesting questions. Professor W. Paul Chiou, a finance professor at D'Amore-McKim, voiced a concern shared by 86% of college faculty nationwide: what value do we add when AI can do so much? Pauwels answered that AI has actually made him significantly more efficient at the parts of his job he finds most difficult. For example, he has leveraged Northeastern's partnership with Anthropic to use Claude to anticipate the objections of skeptical reviewers.

Polina Starobinets, a project manager from the College of Engineering, rightly asked whether, if human interaction becomes more valuable, it will also become more expensive. Pauwels' answer was one word: "yes." The audience laughed, but human expertise becoming a luxury good available only to those who can afford it is anything but funny. He went on to note that this concern has accompanied every major past technology. Prediction doesn't mean inevitability, but the risk is there.

President Aoun asked an unexpected question about China championing the integration of AI compared to the more skeptical Western approach. Professor Pauwels argued that pushback forces more thoughtful integration. He pointed to the co-op experience of a game design student who pushed back against a boss forcing virtual reality into a product where it made no sense. This type of critical judgment, he stated, is the essence of today's moment.

While much of the business school audience asked how far we can push the limits of AI, I asked about its environmental cost: the gap between the immediate, quantifiable cost savings AI provides and the large-scale, harder-to-measure externalities. In addressing these concerns, Professor Pauwels argued that there must be more than self-regulation. Industry ethics efforts matter, but, as he put it, it is much better to have a "stick behind the door."

Who Was and Wasn’t in the Room

The lecture was excellent and the conversation was substantive. The problem was who wasn't in the audience.

Northeastern University is home to over 19,000 undergraduate students, and the Klein Lecture, established in 1964, is one of its oldest and most distinguished academic traditions. Tuesday's topic, what it means to be human in a technological moment, is one of the most important questions of today. And yet, the auditorium was filled almost entirely with people who already have careers, tenure, and high institutional standing.

Academic events like this one too often circulate within existing networks of faculty and administrators.

This is part of why NU Nexus exists. The conversations happening in rooms like the Cabral Center on a Tuesday afternoon matter. As an undergraduate student myself, I see the worth in absorbing knowledge in spaces like these. As students try to figure out what skills to build, what fields to enter, and how to think critically about the technology reshaping practically every career path in front of them, access to information is the bare minimum, especially at an institution like Northeastern.

Pauwels closed his talk with two questions directed at students: What expertise are you building? How will you make your human judgment both visible and valuable?

Those are the right questions. More students should have heard them asked.
