Mia Zelu, a virtual model created with artificial intelligence, became an online sensation after posting photos from the Wimbledon tournament, deceiving more than 40,000 people as well as Israeli sports websites that ran stories about her as if she were a real person. Her Instagram account already has more than 165,000 followers, many of whom believed she was a real woman enjoying the tennis.

The photos, which show Mia tanned, dressed in white summer dresses or pastel-green outfits, and holding a drink typical of the tournament, looked completely realistic. Tennis courts were visible in the background, and the responses poured in accordingly.

One of those who fell for it was none other than the famous 27-year-old Indian cricketer Rishabh Pant, who was himself at Wimbledon. When he realized the account that had liked his posts was fake, he quickly deleted all interactions with it.

Mia Zelu conveyed “the essence of Wimbledon” and communicated like any real influencer. In one of her posts, she asked: “What was your favorite match at Wimbledon?”, using typical Instagram language that seemed to have been written by a human.

The creators of Mia—whose identities remain undisclosed—defined her profile as an “AI-based influencer and digital storyteller.” According to them, she also has a “sister” named Anna Zelu, who has 266,000 followers.

Even though the profile clearly states that this is not a real person, users continue to send her messages, including marriage proposals and invitations to go out.

Mia is the product of a sophisticated algorithm that generates exceptionally realistic visual content. Each post is accompanied by cleverly written, personal text, in a tone that closely mimics how real people write.

The digital influencer lives a “life” reminiscent of celebrities: attending concerts, staying on yachts, and enjoying beach vacations, all of it, of course, entirely virtual.

Psychology and technology experts warn of a growing phenomenon: people forming emotional connections with characters created by artificial intelligence, or with virtual “friends” such as chatbots. These entities, they say, may be instantly available and responsive, but they are not capable of providing real emotional support or tangible solutions.

“Such dependency may cause long-term harm,” the experts warn. “These characters are nonjudgmental, which makes them pleasant, but ultimately they only simulate emotional support and cannot offer a real relationship.”