
BIO · JEAN-JÉRÔME DANTON

I illuminate what artificial intelligence changes, from individuals to systems, and I offer new thinking on adoption. From the field.

[Photo: Jean-Jérôme DANTON speaking at an event]

Everyone calls me JJ. It's an almost childlike nickname, the one everyone uses, and it often throws people off at first: they expect an orator, they meet a teacher. 1.93 m, 110 kg, a voice that fills the room, and a presence that listens before it asserts. This paradox is not an accident. It's how I enter difficult subjects: the stature that lets me say uncomfortable things, the gentleness that makes them heard.

I am a consultant in artificial intelligence, a trainer, and a researcher. I run KiXiT, a firm specialising in AI applied to tourism and territorial development. I am an associate lecturer at the University of Clermont Auvergne and at Clermont School of Business. And I am preparing a doctorate on the dynamics of AI adoption in organisations.

What I do

I train. This is the heart of my work, and it's where everything was built. 60 sessions delivered in 2025, in tourist offices, ski resorts, federations, booking platforms, business schools. At every session, between 5 and 30 people walk in with their fears, their certainties, their resistance, and walk out a few hours later with their first concrete wins. These rooms are my laboratory. That's where I watch what AI does to individuals before turning any of it into theory.

I advise. When a destination, a federation or an organisation decides to stop reacting and start understanding, I step in upstream of the tools: which processes to rethink, which skills to build, which political trade-offs to own.

I don't sell AI. I help people decide, lucidly, what they want to do with it.

I build. Conversational assistants, analysis bots, data extraction systems. Always with the same standard: if the technology isn't ready, we say so and we stop. I shut down Anatole.ai, a tourism assistant that was technically reliable 99% of the time, because a false piece of information about the opening of a mountain pass can kill. Reliability before speed. That's not a slogan, it's the reason that product no longer exists.

I write. Here, on this site, I publish Les Audaces, long essays that hold together my field observations and the theoretical frames I draw from them. On LinkedIn, I take positions faster, sometimes more frontally, on what I see happening.

I speak. Ten talks a year, at federations, professional gatherings, institutional events. The short format of a talk is a test: what can't be said in 45 minutes to a mixed audience is probably not clear enough.

Why I do it

Because I believe what is happening with artificial intelligence is a civilisational moment, and civilisational moments always play out in the same place: in the gap between those who have understood and those who endure.

I have watched that gap widen inside my training rooms. The executive who walks in saying "AI is not for me" and who, two hours later, has drafted her first call for tenders with a language model. The difference between her company in three years and her competitor's, whose executive never got those two hours, may well be the survival of one and the disappearance of the other. Multiply those hours by thousands, and you get the difference between a territory that adapts and a territory that falls behind.

And then there is a more personal reason. I come from a background where education was a permission, not a birthright. I learned to code at 39, at night, with AI as my private tutor. What I lived as emancipation, I want as many people as possible to live too. AI does not liberate on its own. It can just as easily imprison as emancipate. What we do with it, and what we let others do with it, is what decides.

What I think about AI

I am neither a technophile nor a technophobe. Both postures are forms of abdication. Technophilia gives up on seeing what breaks. Technophobia gives up on seeing what is at stake. My stance lies elsewhere: adaptive and critical. I watch what arrives, I measure what it changes, and I search for the frames that let us make use of it without losing what matters.

Artificial intelligence does not replace professions. It redistributes power. Economic, cognitive, relational.

I believe AI acts on two planes at once, and you cannot understand one without the other. At the level of the individual, it transforms the relationship to skill, to effort, to autonomy. I have formulated this as a simple reversal.

We were drilled with: when you want, you can.

When you can, you want.

Exposure to the possible triggers desire before motivation. All my pedagogy is built on this mechanism, and a good part of my research tries to theorise it properly. I have devoted a full essay to it: When you can, you want.

At the level of systems, AI redistributes value. It is not neutral. When French tourism bookings concentrate on American platforms, when conversational models decide which destinations a traveller is offered, when productivity tools organise the work rhythms of millions of employees, political and economic choices are at work. Who pays, who captures, who decides. These questions do not go away because technology advances. They shift, and we have to follow them.

I believe AI adoption in organisations unfolds across five levels, from initial relief to full systematisation, and that most companies stall well before they have done the interesting work. I believe BYOA, the individual appropriation of AI that runs far ahead of corporate decisions, is already here and that we measure it poorly. I believe the skills that matter in the era of language models are judgement, discernment, and the ability to prompt with rigour, and that these skills can be learned, but not through conventional training formats.

Finally, I believe critical independence is a value that is not up for debate. When a tool does not hold up, I say so. When an actor weaponises an adoption narrative to sell hot air, I say so. When the technology is not ready, I stop. That independence has already cost me contracts, and it will cost me more. It is the price of the trust people place in me, and I intend to keep that trust.

Les Audaces live here. The firm, the training, the interventions, live at kixit.ai. For press, research or institutional requests, write to me directly.

Jean-Jérôme DANTON