An Australia-based tour operator has removed online content after an AI-generated blog post directed travelers to a set of hot springs in northeast Tasmania that do not exist. The post, published on the Tasmania Tours website, recommended the so-called “Weldborough Hot Springs” as a nature-focused getaway and described forest pools and mineral-rich waters, according to screenshots cited by multiple outlets.
Weldborough is a small rural community roughly 110 kilometers (68 miles) from Launceston, a detail highlighted in reporting that underscored how far visitors were driving for the attraction. The incident has become a widely shared example of how confident, polished-sounding AI-generated travel content can lead to real-world confusion, especially when readers assume a professional travel site has verified what it publishes.
How The Error Reached The Public
The website in question is operated by Australian Tours and Cruises, a New South Wales-based business that runs multiple tour-booking sites. Owner Scott Hennessy said the company had outsourced marketing work to a third party that used AI to produce content, and that the problematic post went live while he was overseas and unable to review it as he normally would.
Hennessy characterized the mistake bluntly, saying, “Our AI has messed up completely.” In subsequent reporting, the company said it had removed AI-generated blog posts and was reviewing content more carefully, while also pushing back against allegations of fraud. The episode drew a wave of online criticism; the business told CNN the backlash had been intense and that the damage to its reputation was deeply distressing.
Local Businesses Dealt With The Fallout
The practical impact was felt most immediately in Weldborough itself, particularly at the Weldborough Hotel, a notable local stop in a remote area. The hotel’s owner, Kristy Probert, said she began fielding repeated inquiries from travelers expecting directions to the hot springs, describing a pattern of frequent calls and in-person visitors asking where to find pools that were portrayed online as nearby.
Probert told outlets that the Weld River is “freezing” and is not a hot spring destination, and that the area is better known for fossicking (recreational searching for sapphires and other minerals) than for thermal bathing. Because the online description did not match conditions on the ground, some travelers detoured into a sparsely serviced region with few landmarks beyond the pub and surrounding forest roads, turning what was meant to be a wellness-style outing into a dead-end excursion.
Wider Concerns About AI “Hallucinations” In Travel
Tourism researchers say the Weldborough incident highlights a broader challenge as AI tools become more common in trip planning and travel marketing. Anne Hardy, an adjunct professor in tourism at Southern Cross University, said AI has become widely used across the sector—from promotional copy to itinerary building and costing—and that a significant share of travelers now rely on AI outputs when planning trips.
Hardy cited figures indicating that about 37% of tourists use AI for travel advice or itineraries, and pointed to research suggesting that 90% of AI-generated itineraries contain at least one mistake. In destinations like Tasmania, where remote walks can involve long distances, limited services, and patchy mobile coverage, small factual errors about distances, access conditions, or the existence of facilities can carry higher stakes than they would in major cities. The case has renewed calls for stronger quality control by businesses publishing AI-assisted content, and for travelers to cross-check location details against multiple reliable sources before setting out.
