Airbnb has apologized to a London-based academic after a host falsely accused her of causing over £12,000 in damage to a New York apartment. The host submitted photos of a cracked coffee table and other alleged damage, which the woman believes were digitally altered or AI-generated.
The woman had rented the Manhattan apartment for two and a half months but left early due to safety concerns. Shortly after her departure, the host claimed she had damaged furniture and appliances, including staining a mattress and breaking a vacuum cleaner. She denied all accusations, stating she left the apartment in good condition and had only two visitors during her stay.
Upon reviewing the host’s photos, she noticed inconsistencies in the images of the coffee table, suggesting possible manipulation. Despite this, Airbnb initially ruled in the host’s favor, demanding she pay £5,314. She appealed, providing witness testimony and pointing out the discrepancies in the evidence.
After the Guardian intervened, Airbnb reversed its decision, initially refunding her £500, then offering £854, a fraction of her booking cost. When she refused and threatened to stop using Airbnb, the company fully refunded her £4,269 and removed a negative review the host had posted.
The woman expressed concern that others might fall victim to similar fraudulent claims, especially with the rise of easily manipulated AI-generated images. Airbnb has since warned the host for violating its policies and stated he could be removed if further complaints arise. The company admitted it could not verify the authenticity of the submitted images.
The host, listed as a “superhost” on Airbnb, did not respond to requests for comment.

Airbnb apologized and stated it would review how her case was handled. “We take damage claims seriously—our specialist team examines all available evidence to reach fair outcomes for both parties. To ensure fairness, decisions can be appealed,” the company said.
Serpil Hall, director of economic crime at the consulting firm Baringa, noted that altering images and videos is now “easier than ever.” The software for this is cheap, widely accessible, and requires minimal skill to use.
In one recent example, an insurance company reported a rise in false claims for vehicle and home repairs backed by manipulated photos.
“Many companies now recognize that images can’t be trusted at face value during disputes,” Hall explained. “There’s a growing need for forensic tools and fraud detection models to verify their authenticity.”
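The kind of forensic check Hall describes can be illustrated with a deliberately simple sketch. Many editing tools embed their own name in an image file's metadata segments, and scanning for those strings is one of the crudest first-pass heuristics a fraud-detection pipeline might run. This is purely illustrative and not a description of any tool mentioned in this article; the signature list and function name are hypothetical, and such traces are trivial for a determined fraudster to strip.

```python
# Crude first-pass heuristic: look for editor name strings that
# image-editing software often leaves in a file's metadata segments.
# Hypothetical signature list for illustration only.
EDITOR_SIGNATURES = [b"Adobe Photoshop", b"Photoshop", b"GIMP", b"Picasa"]

def find_editor_traces(image_bytes: bytes) -> list[str]:
    """Return the known editor signatures present in the raw file bytes."""
    return [sig.decode() for sig in EDITOR_SIGNATURES if sig in image_bytes]

# Fabricated byte blob standing in for a JPEG file's raw contents.
sample = b"\xff\xd8\xff\xe1 ...Adobe Photoshop 2024... \xff\xd9"
print(find_editor_traces(sample))  # ['Adobe Photoshop', 'Photoshop']
```

A real forensic pipeline would go far beyond string matching, for example parsing EXIF fields properly, running error-level analysis, or applying trained detection models, but even this toy check shows why investigators treat file metadata as a starting point rather than trusting a submitted photo at face value.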