Integrating Woosmap APIs with OpenAI GPT: A Step-by-Step Tutorial

Imagine having a powerful AI assistant at your disposal, capable of helping users find your nearby stores, or any other location-based information with just a natural language query. In this article, we’re going to dive into customising GPTs for location intelligence using Woosmap.


GPTs are personalised adaptations of ChatGPT. When integrated with your knowledge base, their ability to understand and produce human-like text becomes highly valuable for enhancing the customer journey.

With Woosmap, you can leverage GPTs to create custom AI assistants that retrieve location-based data and assist users in a natural, conversational manner with location context.

Sample Use Case

To illustrate building custom GPTs for location intelligence, let’s explore a use case with a demo project that includes data on Starbucks coffee shops worldwide. Your custom GPT is pre-trained with an extract of Woosmap’s OpenAPI specification.

Upon receiving a user query or instruction, the GPT uses its understanding of natural language to process the request, formulate the appropriate API call, and extract the relevant data from Woosmap’s API responses to help users discover nearby stores (coffee shops in this case).

Key Woosmap features integrated include:

  • Address, Postal Code, or Locality Search: Call the Localities API to find geographic coordinates related to addresses, postal codes, or localities.
  • Store Retrieval: Retrieve information about stores, by implementing the Woosmap Search API.
  • Get Distances: Calculate distances from user to stores, by implementing the Woosmap Distance Matrix API.
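The three features above chain together: geocode the user’s input with Localities, search for stores around the resulting coordinates, then rank the results with Distance Matrix. As a rough sketch of how the first two request URLs can be assembled (the base URL here is an assumption, and the parameter names `input`, `lat`, `lng`, and `stores_by_page` mirror the examples shown later in this article — check the Woosmap documentation for the authoritative formats):

```javascript
// Sketch of the first two Woosmap request URLs used in this tutorial.
// The base URL is an assumption; parameter names mirror the examples
// that appear later in this article.
const BASE = "https://api.woosmap.com";

// Localities Autocomplete: resolve a free-text address/locality input.
function localitiesAutocompleteUrl(input) {
  const params = new URLSearchParams({ input, types: "locality|postal_code|address" });
  return `${BASE}/localities/autocomplete?${params}`;
}

// Stores Search: find stores around a lat/lng, limited to a few results.
function storeSearchUrl(lat, lng, storesByPage = 3) {
  const params = new URLSearchParams({ lat, lng, stores_by_page: storesByPage });
  return `${BASE}/stores/search?${params}`;
}
```

The Distance Matrix call comes last, once store coordinates are known, and is covered by the instructional prompt further down.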

Example Chat Session

👤 User Input:

I have an appointment in Manhattan tomorrow morning. Where could I go for a Coffee?

🤖 GPT Answer:

I found several localities named “Manhattan” in the United States. Could you please specify which one you are referring to? Here are the options:

  1. Manhattan, New York
  2. Manhattan, Montana
  3. Manhattan, Kansas
  4. Manhattan, Illinois
  5. Manhattan Beach, California

Please let me know the correct one for your appointment.

👤 User Input:

New York

🤖 GPT Answer:

I found three coffee shops near Manhattan, New York, listed in order of walking distance from your location:

  1. 96th & Madison
  • Address: 1378 Madison Avenue, New York, NY 10128
  • Distance: 707 meters
  • Phone: 212-534-7225
  • Website
  • Opening Hours: 06:00 - 18:00
  2. 96th & Lexington
  • Address: 1491 Lexington Avenue, New York, NY 10029
  • Distance: 950 meters
  • Phone: 212-369-0313
  • Website
  • Opening Hours: 06:00 - 18:30
  3. 100th & Columbus
  • Address: 805 Columbus Avenue, New York, NY 10025
  • Distance: 1.1 km
  • Phone: 212-222-6582
  • Website
  • Opening Hours: 07:00 - 18:00 (Next Opening: 30th January, 07:00 - 18:00)

These options should be convenient for a coffee before your appointment in Manhattan. Let me know if you need more information!

GPTs Overview

GPTs can be adapted for specific tasks by combining instructions, knowledge, and capabilities. Here’s an overview of key components:

  • GPTs can include specific instructions, also known as “system messages”.

  • GPTs can possess extended knowledge, such as proprietary information or data the model was not previously trained on.

  • GPTs can have additional capabilities, allowing them to connect with specific OpenAI tools (like browsing, image generation, and Advanced Data Analysis) and third-party systems to send/receive information.

Structure of a GPT

The core components of a GPT available to developers include:

  • Model: This is the “general intelligence” of the GPT, defaulted to OpenAI’s GPT-4 Turbo.
  • Instructions: These ensure that the model knows how to accomplish the task it’s built for. You can think of this as the “system prompt” for your GPT.
  • Knowledge: This provides the model with specific information it did not have access to during training.
  • Capabilities: These are the abilities you can assign to the model, developed and maintained by OpenAI. We will use the Code Interpreter capability to integrate with Woosmap.
  • Actions: These allow the GPT to connect/integrate with third-party systems, enabling the model to interact with the real world on behalf of the user.

In our scenario, the primary steps involve constructing the action from an OpenAPI file and refining the instructions to ensure that the GPT can accurately generate its requests to Woosmap.

Preparing a Woosmap-Compatible OpenAPI Specification

To enable GPT requests to Woosmap, we need to extract specific sections of the Woosmap OpenAPI Specification and apply custom transformations. Due to OpenAI’s current lack of full support for the OpenAPI specification, we must implement straightforward workarounds, as outlined here.

We’ll leverage Redocly CLI, an open-source command-line tool explicitly designed for adapting and bundling OpenAPI descriptions. Using built-in decorators, we can precisely adjust the specification file and meet our requirements. Updates during bundling will be useful for the following use cases:

  • Selectively retaining targeted endpoints.
  • Adjusting description fields as required.
  • Modifying additional elements as necessary.

Extracting Targeted Endpoints

We are primarily interested in Localities, Distance Matrix, and Stores Search. First, download the OpenAPI full spec and install the CLI locally using npm or yarn to make the redocly command available on your system.

npm i -g @redocly/cli@latest

The Redocly configuration file (redocly.yaml) defines all of the config options available to you, including the location of your spec, and filtering rules (for transformation against the OpenAPI Specification). Let’s create one and specify the filter-in decorator to preserve only wanted endpoints.

apis:
  woosmap:
    root: ./woosmap-openapi3.json
    decorators:
      filter-in:
        property: operationId
        value:
          - localitiesAutocomplete
          - localitiesDetails
          - localitiesGeocode
          - storeSearch
          - getDistanceMatrix

Execute the following command to filter the Spec:

redocly bundle --output extract-woosmap-openapi --ext yml


The next mandatory transformation is to remove the $ref keywords from the specification file using the bundle --dereferenced option, as OpenAI GPTs do not currently understand the $ref syntax employed in this file.

redocly bundle --dereferenced --output extract-woosmap-openapi --ext yml
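To sanity-check the dereferenced output, you can walk the parsed document and confirm that no $ref keys survive. The helper below is a generic sketch (not part of Redocly); parse the bundled YAML with a library of your choice first:

```javascript
// Recursively check whether a parsed OpenAPI document still contains
// $ref keys. A fully dereferenced bundle should return false.
function hasRef(node) {
  if (Array.isArray(node)) return node.some(hasRef);
  if (node !== null && typeof node === "object") {
    return Object.keys(node).some((key) => key === "$ref" || hasRef(node[key]));
  }
  return false;
}
```

For example, `hasRef({ schema: { $ref: "#/components/schemas/Store" } })` returns `true`, while the same structure with the reference expanded inline returns `false`.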

Trimming Lengthy Descriptions

Custom GPTs have certain limitations on the number of characters in descriptions. To ensure compatibility with these models, we need to trim descriptions that exceed these limits.

We can achieve this by using decorator plugins again. Below is a basic example of a decorator designed to trim descriptions:

const id = "plugin";

/** @type {import('@redocly/cli').OasDecorator} */
function TrimDescription({ maxParameterDescriptionLength, maxOperationDescriptionLength }) {
    return {
        any: {
            enter: (node) => {
                if (node.description && typeof node.description === "string") {
                    if (node.description.length > maxParameterDescriptionLength) {
                        node.description = `${node.description.substring(0, maxParameterDescriptionLength)}...`;
                    }
                    if (node.operationId && node.description.length > maxOperationDescriptionLength) {
                        node.description = `${node.description.substring(0, maxOperationDescriptionLength)}...`;
                    }
                }
            },
        },
    };
}

/** @type {import('@redocly/cli').DecoratorsConfig} */
const decorators = {
    oas3: {
        "trim-description": TrimDescription,
    },
};

module.exports = {
    id,
    decorators,
};

Adopting Single Header API Key Authentication

OpenAI does not support multiple authentication types or authentication through URL parameters. Woosmap, however, supports both authentication types: using Public and Private keys in URL parameters, and the Private key with a custom X-API-Key header. To ensure compatibility with OpenAI, we need to eliminate the unsupported authentication types.

Here’s a Redocly decorator function that retains only the specified security scheme:

function OneSecurityScheme({ securitySchemeName }) {
  return {
    Components: {
      leave(Components) {
        if (Components.securitySchemes && Components.securitySchemes[securitySchemeName]) {
          Components.securitySchemes = {
            [securitySchemeName]: Components.securitySchemes[securitySchemeName],
          };
        }
      },
    },
  };
}

The final redocly.yaml configuration file should look like this:

apis:
  woosmap:
    root: ./woosmap-openapi3.json
    decorators:
      filter-in:
        property: operationId
        value:
          - localitiesAutocomplete
          - localitiesDetails
          - localitiesGeocode
          - storeSearch
          - getDistanceMatrix
      plugin/trim-description:
        maxParameterDescriptionLength: 690
        maxOperationDescriptionLength: 290
      plugin/one-security-scheme:
        securitySchemeName: PrivateApiKeyHeaderAuth
plugins:
  - ./plugins/decorators.js

Run this command again to generate the Woosmap OpenAI-compatible specification file:

redocly bundle --dereferenced --output extract-woosmap-openapi --ext yml

Final Cleaning

After including the desired operations in the specification, use the --remove-unused-components option to reduce the file size by retaining only the necessary components. The complete work for this tutorial is documented in the Pull Request on our public OpenAPI repository, accessible above.

With these modifications, the extract-woosmap-openapi.yml file is now ready for error-free upload to a custom GPT Action.

Creating the Woosmap GPT Action

GPT Actions allow ChatGPT to communicate through a developer-provided interface that gives it the ability to make requests over a network.

To create the Woosmap action, navigate to the Configure tab and click the “Create new action” button at the bottom.
Then, copy and paste the content of the previously extracted OpenAPI specification into the “Schema” field.

After uploading the OpenAPI specification, you must configure authentication by setting your private key in an X-Api-Key custom header. Please note that authentication cannot be supplied by the user at prompt time; it has to be set upstream.
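Outside the GPT builder, you can reproduce what the Action does with a plain HTTP request to confirm that the header-based authentication works. The helper below is a hypothetical sketch (the base URL is an assumption; the path and parameters mirror the debug trace that follows):

```javascript
// Build the request the GPT Action effectively sends: the private key
// travels in an X-Api-Key header, never in the URL or the user's prompt.
// The base URL is an assumption; replace YOUR_PRIVATE_KEY with your own.
function woosmapRequest(path, params, apiKey) {
  const query = new URLSearchParams(params).toString();
  return {
    url: `https://api.woosmap.com${path}?${query}`,
    options: { method: "GET", headers: { "X-Api-Key": apiKey } },
  };
}

// Example mirroring the geocode call from the debug session below:
const req = woosmapRequest(
  "/localities/geocode",
  { address: "22 rue Cail, Paris" },
  "YOUR_PRIVATE_KEY"
);
// fetch(req.url, req.options) would then perform the call.
```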

And there you have it! The Action should now be fully functional. You can test and debug it in the right panel by asking a question. You’ll notice detailed parameters on how it is called and how the response is retrieved.

👤 User Input:

Give me detailed information on 22 rue Cail, Paris

🤖 GPT Answer:

[debug] Calling HTTP endpoint

{
  "domain": "",
  "method": "get",
  "path": "/localities/geocode",
  "operation": "localitiesGeocode",
  "params": {
    "address": "22 rue Cail, Paris"
  }
}

[debug] Response received …

The address “22 Rue Cail, 75010 Paris, France” is located at the following coordinates:

  • Latitude: 48.88285
  • Longitude: 2.35984

This location is in the 10th arrondissement of Paris, within the Île-de-France region. The Rue Cail is a specific street within this arrondissement, and the address falls within the postal code 75010. The area is known for its vibrant neighborhood with a mix of residential and commercial establishments.

Now, it’s time to construct comprehensive instructions that will guide the model in requesting the Woosmap APIs to generate coherent and contextually relevant responses.

Creating Instructions for the GPT to Interact with Woosmap APIs

These instructions will guide GPT on how to formulate requests accurately and efficiently.

Best Practices

Here are some general best practices for providing clear and concise guidance.

  • Understand the Data Model: Ensure the model has a fundamental understanding of the Woosmap data structure.

    You are programmed to query Woosmap APIs. Use your knowledge of Woosmap’s platform to act accordingly.

  • Optimize Requests: Guide the model to translate user queries into specific Woosmap APIs Requests.

    Formulate queries to send to your action using URL Parameters. For example,{USER_INPUT}&stores_by_page=5
    Before presenting data to user, reorder stores based on distance by first calling the Distance Matrix Endpoint. Add all retrieved stores to destinations separated by a “|” character. Use the below URL:{LATITUDE_INPUT},{LONGITUDE_INPUT}&destinations={STORE.GEOMETRY.COORDINATES[1]},{STORE.GEOMETRY.COORDINATES[0]}

  • Employ Specific Keywords: Encourage the use of Woosmap-specific terminology in queries to enhance accuracy.

    Include stores_by_page in your queries to limit the number of retrieved suggestions to 3, and prefer reducing the distance afterward instead of passing the radius parameter.

  • Handle Errors and Alternatives: Instruct the model on how to answer when data can’t be accessed or is unavailable.

    If you’re unsure about which autocomplete response to select, ask the user to choose from all the suggestions provided.

  • Present Data Clearly: Guide the model to present data in an easily understandable format.

    Explain Woosmap data simply and clearly, suitable for the user’s expertise level.

Check out this help article from OpenAI, a very useful resource for delving into the prompting techniques used with language models like ChatGPT.

Instructional Prompt example

Based on these best practices, our instructional prompt for the GPT model ends up being:

Leverage the Woosmap API to assist users in locating nearby stores (coffee shops) based on an address, a city, or a postal code.

If the user shares an address, postal code, or locality, initiate by utilizing the Localities Autocomplete endpoint to get the location using the provided URL format:{USER_INPUT}&types=locality|postal_code|address|admin_level|airport|train_station|metro_station|shopping|country&data=advanced&language={USER_BROWSER_LANGUAGE_ALPHA2}&origin=chatGPT&no_deprecated_fields=true

If the first suggested locality seems appropriate, go with it. Otherwise, if you’re unsure about the autocomplete response to select, ask the user to choose between all suggestions.

When executing the stores search, avoid specifying a radius or query. Incorporate the parameter stores_by_page set to 3 for the stores search using the subsequent URL format:{LATITUDE_INPUT}&lng={LONGITUDE_INPUT}&stores_by_page=3

Ensure that the response generates a list of proximate stores without specifying any radius or query. Additionally, restrict stores_by_page to 3 by default, or to the number requested by the user.

Before presenting data to user, reorder stores based on distance by first calling the Distance Matrix Endpoint. Add all retrieved stores to destinations separated by a “|” character. Use the below URL:{LATITUDE_INPUT},{LONGITUDE_INPUT}&destinations={STORE.GEOMETRY.COORDINATES[1]},{STORE.GEOMETRY.COORDINATES[0]}

Note that Geometry in the Stores Search response is geometry.coordinates: [longitude, latitude]. So when you pass it to the Distance Matrix endpoint, build the query as destinations={STORE.GEOMETRY.COORDINATES[1]},{STORE.GEOMETRY.COORDINATES[0]} so it matches destinations={STORE_LATITUDE},{STORE_LONGITUDE}

Present data clearly ordered by distance calculated by the Distance Matrix endpoint.
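The coordinate-order rule in the prompt above is the easiest thing to get wrong. As a quick illustration (a hypothetical helper; store objects follow the GeoJSON shape described above), this is the conversion the GPT is being asked to perform:

```javascript
// Stores Search returns GeoJSON order: geometry.coordinates = [longitude, latitude].
// The Distance Matrix destinations parameter wants "latitude,longitude"
// pairs joined with "|", so the two values must be swapped.
function buildDestinations(stores) {
  return stores
    .map((s) => `${s.geometry.coordinates[1]},${s.geometry.coordinates[0]}`)
    .join("|");
}
```

For the Paris coordinates shown earlier, `buildDestinations([{ geometry: { coordinates: [2.35984, 48.88285] } }])` yields `"48.88285,2.35984"`.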

CoffeeShopLocator GPT

To ensure your custom GPT works seamlessly with Woosmap, conduct testing and refinement of your prompt based on real-world queries. This iterative process will fine-tune the GPT’s capability to deliver accurate and useful responses tailored to your business needs.


As demonstrated, by following these steps, you can successfully integrate your GPT with Woosmap APIs, creating your very own “Locator” GPT. This AI assistant stands as a dependable and efficient tool for addressing customer queries.

Moreover, you can extend this approach to create integrations with other structured knowledge sources, enabling you to consolidate multiple actions within the same custom GPT. For instance, you could create a “Product Locator” by integrating in-stock APIs with Woosmap.

You can access this Custom GPT at the following link, but please note that it requires a Plus or Enterprise paid account to try:

If you have any questions or valuable insights to share, please don’t hesitate to engage with us in the comments section below. Your input is highly appreciated!