A diner on Byres Road in Glasgow pulls up ChatGPT and asks for the nearest place doing a decent Sunday roast. Until last week, ChatGPT would have offered a reasonable guess based on text it had seen during training. From 26 March, it can answer using your precise GPS coordinates, and the gap between those two experiences matters considerably for the restaurants on that street.
The feature is opt-in, but it will be on
OpenAI added location sharing to ChatGPT on 26 March 2026. Users choose between approximate location (a broad geographic area) and precise location (address-level GPS, deleted from OpenAI's servers once the response is generated). The rollout started on iOS and the web, with Android listed as coming shortly.
The opt-in framing is genuine. ChatGPT will not access your location without permission. But 700 million people use ChatGPT each week, and consumer AI features that add useful context — this one improves answers for dining, navigation, weather and local events in one toggle — tend to accumulate users over time. Within a year, location-aware ChatGPT will not be a niche configuration; it will be the default experience for a substantial share of its users.
What ChatGPT uses to build the answer
ChatGPT Search runs on Bing's index. When you ask "where should I eat near me," the model fans out across Bing's data, your location signal, and whatever structured information Bing has indexed about local businesses.
That means the answer you get is only as good as what Bing can understand about the restaurants nearby. A restaurant whose menu and details exist only as a JPEG attached to a Facebook post is, at this stage of the query, effectively invisible. One with a proper website, a Google Business Profile, and Schema.org markup on its menu pages gives the model structured objects to reason about: cuisine, price range, allergen information, opening hours.
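As a rough illustration, here is the kind of JSON-LD a menu page might carry. Every detail below, from the restaurant name to the dish to the price, is invented for the example, and since Schema.org's MenuItem type has no dedicated allergen field, dietary suitability via suitableForDiet is the usual stand-in:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Kitchen",
  "servesCuisine": "British",
  "priceRange": "££",
  "openingHoursSpecification": [
    {
      "@type": "OpeningHoursSpecification",
      "dayOfWeek": "Sunday",
      "opens": "12:00",
      "closes": "21:00"
    }
  ],
  "hasMenu": {
    "@type": "Menu",
    "hasMenuSection": {
      "@type": "MenuSection",
      "name": "Sunday roast",
      "hasMenuItem": {
        "@type": "MenuItem",
        "name": "Nut roast",
        "description": "Roast potatoes, seasonal greens, vegetarian gravy",
        "suitableForDiet": "https://schema.org/VegetarianDiet",
        "offers": {
          "@type": "Offer",
          "price": "16.50",
          "priceCurrency": "GBP"
        }
      }
    }
  }
}
</script>
```

The individual fields matter less than the shape: a crawler gets typed values it can match against a query like "Sunday roast near me" instead of pixels in a photograph of a menu.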
The early accuracy data is underwhelming — in testing, SEO consultant Glenn Gabe found results landing on restaurants 45 minutes away. ChatGPT's local layer is not yet reliable. But the infrastructure is being built, and the restaurants with machine-readable data are the ones that will be cited when the accuracy improves.
The interface is changing shape
The local results coming back from ChatGPT are no longer just a list of names in prose. The interface now returns entity-style business cards: name, address, hours, images, links. Panels. Map views. The search experience inside ChatGPT is converging on what Google Local has looked like for years, but from a standing start.
A restaurant with no structured data is invisible to this interface at the exact moment someone is deciding where to eat.
For a restaurant that is properly indexed — a website with structured data, consistent contact and address information across directories, a Google Business Profile that matches the on-site schema — the move into ChatGPT search is additive. Another surface, another way to turn up. For one that is not, it is another discovery channel on which it does not appear.
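The on-page half of that consistency is the identity block of the same Schema.org object. A minimal sketch, again with invented details, of the fields that should read exactly as they do on the Google Business Profile and in directory listings:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Kitchen",
  "url": "https://www.example.com",
  "telephone": "+44 141 000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "12 Byres Road",
    "addressLocality": "Glasgow",
    "postalCode": "G12 8AA",
    "addressCountry": "GB"
  },
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": 55.872,
    "longitude": -4.293
  },
  "sameAs": [
    "https://www.facebook.com/examplekitchen",
    "https://www.instagram.com/examplekitchen"
  ]
}
</script>
```

If the street address or phone number here disagrees with the Business Profile, a model has to guess which source is current; when everything matches, the entity card can be assembled from either one.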
Why now is the moment to get this right
The frustrating thing about structured data is that the work required is roughly the same whether you do it before or after a platform shift. The restaurant that marked up its menu in 2024 gets ChatGPT location search for free. The restaurant that marks it up in 2026, after realising it has been absent from every new discovery surface, is chasing a gap that keeps widening.
Menu software that outputs Schema.org markup — GMMO is one option — handles this as a side-effect of keeping the menu current. You describe the dish; the machine-readable version lands wherever a crawl asks for it.
ChatGPT does not know where your restaurant is yet. Not accurately, not reliably. But it is learning, and the restaurants that teach it now will be the ones it answers with confidence when the accuracy catches up.
