Research IBM Notes Application code with AI
What’s this Notes app even doing?
The original developer is long gone, nobody has a clue, and now you’re tasked with replacing it. But before you touch a single line, you need to understand how it works.
The problem? Looking at the design drives you mad — code is scattered everywhere. That’s the curse of many Notes apps: no clear start, no obvious end. So… where do you even begin?
We’ve built a system that makes sense of a Notes application’s design by letting you ask plain-language questions like:
- What’s the main form of this app?
- What does this form actually do?
- Which subforms are being called?
- What are the buttons programmed to do?
Exploring the Notes App Through Chat
Let’s first look at how it works, then I’ll walk you through how we built it — starting with the chat-based user interface.
First, choose the Notes app you want to research.
The screen has three parts, and you will use all of them to do your research.
Step 1 – Find Notes Design Elements to Analyze
The problem with Notes design elements is sheer volume. A typical app can have hundreds — forms, views, agents, subforms, script libraries, and more. Some are critical; others haven’t been touched in years.
This first step helps you identify the important elements so you can focus your research where it matters. You can ask questions like “What is the most important form?” — spelling mistakes are fine, and you can ask in any language. The tool will understand and reply.
Example: “List all subforms used in this Notes database.”
It might return around 10 results — you can even set your own maximum.
This is far easier to digest than scrolling endlessly in the Notes Designer client and trying to piece things together from raw code.
The designer can now copy and paste any information that seems important into a Word document — for now, just to build an overview of the app, what’s important, and what’s not.
Step 2 – Retrieve the DXL Code
Once you spot an element worth deeper inspection, copy its ID (for example: a677d5f5d8dfac1100258c530054ae77) and paste it into the DXL field in part two of the app.
Here, the tool displays the raw DXL code for that design element. While nearly impossible for humans to read, this format is perfect for AI to analyze.
DXL stands for Domino XML Language.
Basically, DXL is the “blueprint” of a Notes element in XML form — perfect for machine processing, painful for humans.
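To make this more concrete, here is a minimal Python sketch that parses a small, made-up DXL fragment and lists the fields it declares. The fragment is purely illustrative (real DXL exports are far larger and use the Domino DXL namespace); only the general shape matters here.

```python
import xml.etree.ElementTree as ET

# A simplified, hypothetical DXL fragment for a form with two fields.
# Real exports are much larger and carry the Domino DXL namespace.
dxl = """
<form name="Invoice">
  <body>
    <richtext>
      <field name="CustomerName" type="text"/>
      <field name="InvoiceTotal" type="number"/>
    </richtext>
  </body>
</form>
"""

root = ET.fromstring(dxl)
print("Form:", root.get("name"))
for field in root.iter("field"):
    print(f"  field {field.get('name')} ({field.get('type')})")
```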
Step 3 – Ask ChatGPT About the Code
In the final pane, you can ask any question about the retrieved Notes design element DXL — for example:
- Summarize this agent
- What fields are on this form?
- Does the form run any code? If so, where?
- Which other design elements does this form call?
The AI responds with a clear explanation, highlighting logic, dependencies, and references to other design elements.
Conclusion – Why This Approach Works
For anyone who isn’t a hardcore Notes developer, diving straight into Domino Designer is overwhelming. A single Notes app can contain an ocean of code, spread over hundreds of elements, with no clear starting point. You can’t just export everything into Word and expect clarity — there’s too much, and it’s too fragmented.
Our approach solves that by letting you explore the app interactively, in plain language. All you need is a few hours of basic familiarity with Notes apps. The tool gives you structured, easy-to-understand explanations that you can compile into a design document — the essential blueprint for any replacement project.
Since all code is available via DXL exports, you can, in theory, fully understand the app’s logic, business rules, and interdependencies. And because the AI does the heavy lifting, you avoid the usual frustration and can focus on designing the future system.
In short: this is the first practical, scalable way for non-Notes experts to truly understand a Notes application before replacing it.
How It's Been Built
IBM Lotus Notes Design Export (DXL)
The AI system needs the Lotus Notes database design as input. We have extended our Domino application scanner (NDDM) with a function to export all HCL / IBM Notes application design elements as text; text is the only input format the AI system can work with. You can see some examples of Notes app design exports in the images below.
Each Notes application design element is exported to a single Notes document. In this example, the Notes application has 3 agents, 8 forms, and so on. NDDM creates a highly detailed export of all the Lotus Notes functions on a Notes form. For example, it exports the Notes fields on the form, types of fields, all buttons, and all formulas and LotusScript code used in various places on the form. At the time of writing this blog, I must admit that the design exports we create can be improved, but for now, they are sufficient.
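To give a feel for the data involved, here is a hypothetical sketch of what one exported design element could look like once flattened to text. All field names here are illustrative assumptions, apart from DesignID and DesignCodeDXL, which reappear in the MongoDB scenario described later.

```python
# Hypothetical shape of one exported design element, for illustration only.
exported_element = {
    "DesignID": "a677d5f5d8dfac1100258c530054ae77",    # unique ID of the design element
    "DesignType": "Form",                              # Form, View, Agent, Subform, ...
    "DesignName": "Invoice",                           # name as shown in Domino Designer
    "NSFTitle": "CSW Dev",                             # the Notes application it belongs to
    "DesignCodeDXL": "<form name='Invoice'>...</form>" # full DXL export as text
}
```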
I used Make.com to create the automation scenarios that make up the AI system, which processes the IBM Notes application design data (text) in order to answer my questions.
Workflow Scenario 1: From Notes Design Data to Pinecone and Mongo
The goal of this scenario is to take all Notes application design data (forms, views, agents, script libraries, and so on) from NDDM and store it in a cloud database for further processing in scenarios 2 and 3. Pinecone will be explained later on.
The complete make.com scenario is displayed in the image below.
This flow takes each Notes design element from a selected NSF and runs it through three layers of AI analysis:
Summarization – It asks GPT to read the raw DXL code and write a short, plain-English explanation of what the element is, what it does, and how it’s used.
Functional classification – GPT decides which role the element plays in the application (e.g., data entry, reporting, workflow routing) and explains its reasoning.
Importance scoring – GPT estimates how important the element is to the app’s overall function and justifies the score.
The text is then cleaned to remove formatting noise and turned into an embedding vector so it can be semantically searched later.
Finally, both the vectors (in Pinecone) and the raw + analyzed data (in MongoDB) are stored, so you can quickly find and understand the most relevant design elements when reverse-engineering or replacing the app.
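The production workflow runs entirely in Make.com, but the same pipeline can be sketched in Python. The sketch below is a simplified approximation: the index name, collection names, prompts, and embedding model are assumptions, not the exact values used in the real scenario.

```python
from openai import OpenAI
from pinecone import Pinecone
from pymongo import MongoClient

openai_client = OpenAI()                                       # needs OPENAI_API_KEY
index = Pinecone(api_key="...").Index("notes-designs")         # hypothetical index name
collection = MongoClient("mongodb://...")["notes"]["designs"]  # hypothetical names

def ask(prompt: str) -> str:
    """One GPT-4o call, reused for summarization, classification, and scoring."""
    resp = openai_client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def process_element(element: dict) -> None:
    dxl = element["DesignCodeDXL"]

    # Three layers of AI analysis on the raw DXL.
    summary = ask(f"Summarize in plain English what this Notes design element does:\n{dxl}")
    role = ask(f"Classify the functional role of this element (data entry, reporting, "
               f"workflow routing, ...) and explain your reasoning:\n{dxl}")
    score = ask(f"On a scale of 1-10, how important is this element to the application? "
                f"Justify the score:\n{dxl}")

    # Clean the summary and turn it into an embedding vector for semantic search.
    text = " ".join(summary.split())
    vector = openai_client.embeddings.create(
        model="text-embedding-3-small", input=text
    ).data[0].embedding

    # Store the vector in Pinecone and the raw + analyzed data in MongoDB.
    index.upsert(vectors=[(element["DesignID"], vector, {"summary": text})])
    collection.insert_one({**element, "Summary": summary, "Role": role, "Importance": score})
```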
Please refer to this video about vectors: https://www.youtube.com/watch?v=ZGa8cQDel80
Why Vectors?
OpenAI’s GPT-4o model currently has a context window of roughly 128,000 tokens per prompt (about 200–300 pages of text). The CSW Dev Notes application contains far more code than that, so we need to break it into smaller chunks. Keeping prompts smaller not only fits the limit but also improves accuracy.
We also want to store the designs of many different Notes applications in a database. A vector database like Pinecone makes this possible — and powerful — by allowing us to search and retrieve the most relevant pieces of information instantly. How this works will become clear when we look at the upcoming Make.com scenarios.
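As a rough illustration of the chunking idea, here is a small Python sketch using the tiktoken library to split a long DXL export into token-limited pieces. The chunk size is an arbitrary assumption; the real scenarios may chunk differently.

```python
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # tokenizer used by recent OpenAI models

def split_into_chunks(text: str, max_tokens: int = 4000) -> list[str]:
    """Split text into chunks of at most max_tokens tokens each."""
    tokens = enc.encode(text)
    return [
        enc.decode(tokens[i : i + max_tokens])
        for i in range(0, len(tokens), max_tokens)
    ]

# Each chunk can then be summarized or embedded separately and stored in Pinecone.
```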
UI
The UI itself is built in Retool, but Retool’s role here is mainly presentation and user interaction — all the complex logic, AI processing, and data handling happens in Make.com. Each section of the Retool screen is wired to a specific Make.com scenario, and we’ll walk through each of those in the next sections.
Step 1 – Find Notes Design Elements to Analyze
In plain words, here’s what’s happening:
Retool sends the user’s question (and optional max results) to Make.com via a webhook.
Make.com cleans the input (removing newlines, etc.).
OpenAI Embedding API converts the question into a vector.
Pinecone search finds matching Notes design elements based on that vector.
OpenAI GPT-4o checks each match and says “YES” or “NO” with a confidence score, based on whether it’s relevant to the question.
Text aggregator collects all YES matches and formats them into a readable output (type, name, NSF title, ID, summary, etc.).
Webhook response sends this formatted text back to Retool for display in the UI.
So in the Retool app, this scenario powers the “Find relevant design elements” part of the screen.
Retool itself just shows the results — all the searching, AI scoring, and formatting happens inside Make.com.
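Expressed as Python rather than Make.com modules, the scenario behaves roughly like the sketch below. The index name, embedding model, and relevance prompt are assumptions for illustration.

```python
from openai import OpenAI
from pinecone import Pinecone

openai_client = OpenAI()
index = Pinecone(api_key="...").Index("notes-designs")  # hypothetical index name

def find_design_elements(question: str, max_results: int = 10) -> str:
    # 1. Clean the input and turn the question into a vector.
    question = " ".join(question.split())
    vector = openai_client.embeddings.create(
        model="text-embedding-3-small", input=question
    ).data[0].embedding

    # 2. Pinecone returns the closest design elements.
    result = index.query(vector=vector, top_k=max_results, include_metadata=True)

    # 3. GPT-4o keeps only the matches that are actually relevant to the question.
    lines = []
    for match in result.matches:
        verdict = openai_client.chat.completions.create(
            model="gpt-4o",
            messages=[{
                "role": "user",
                "content": f"Question: {question}\n"
                           f"Design element summary: {match.metadata['summary']}\n"
                           "Answer YES or NO (with a confidence score): is this element "
                           "relevant to the question?",
            }],
        ).choices[0].message.content
        if verdict.strip().upper().startswith("YES"):
            lines.append(f"{match.id}: {match.metadata['summary']}")

    # 4. Aggregate the YES matches into one readable block for Retool.
    return "\n".join(lines)
```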
Step 2 – Retrieve the DXL Code
This Make.com scenario is what Retool calls when you select a specific Notes design element and want its raw DXL code.
Retool sends the design element’s ID to Make.com via a webhook.
The ID is cleaned (stray newlines and spaces are removed).
Make.com queries MongoDB for the matching record in the correct Notes database collection, filtering on that DesignID.
It retrieves the DesignCodeDXL field — the raw XML representation of the Notes design element.
The scenario returns the DXL back to Retool via the webhook response.
This is the “Retrieve DXL” part of your Retool UI. Retool just shows the result — all the actual fetching and data handling is done in Make.com.
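A minimal Python equivalent of this scenario might look like the sketch below, assuming the pymongo driver; the connection string and collection name are placeholders.

```python
from pymongo import MongoClient

collection = MongoClient("mongodb://...")["notes"]["csw_dev"]  # placeholder names

def get_dxl(design_id: str) -> str | None:
    """Return the raw DXL for one design element, or None if it is not found."""
    design_id = design_id.strip()                       # remove stray newlines/spaces
    record = collection.find_one({"DesignID": design_id})
    return record["DesignCodeDXL"] if record else None

# Example: the ID copied from step 1.
print(get_dxl("a677d5f5d8dfac1100258c530054ae77"))
```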
Step 3 – Ask ChatGPT About the Code
This Make.com scenario powers the DXL Chat part of the Retool app — where you can ask ChatGPT questions about the raw DXL code of a Notes design element.
Retool sends two things to Make.com via a webhook:
- Your question.
- The DXL code for the selected design element.
Make.com calls OpenAI GPT-4o with a special system prompt telling it to act as a Lotus Notes design expert. It sends along your question and the full DXL code.
GPT-4o processes the request and generates an answer based on the code.
The answer is sent straight back to Retool, where it’s shown in the chat pane.
The key here is that Retool only handles the interface — all the “thinking” (analysis of DXL) is done in Make.com via OpenAI.
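In Python terms, the call looks roughly like this sketch (the real request is made from Make.com, and the exact system prompt wording is an assumption):

```python
from openai import OpenAI

client = OpenAI()

def ask_about_dxl(question: str, dxl_code: str) -> str:
    """Ask GPT-4o a question about one Notes design element's DXL code."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "You are a Lotus Notes design expert. Answer questions about "
                        "the DXL code of a Notes design element."},
            {"role": "user",
             "content": f"{question}\n\nDXL code:\n{dxl_code}"},
        ],
    )
    return response.choices[0].message.content
```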
Conclusion – How It’s Been Built and What’s Next
The architecture is actually quite straightforward. NDDM is the critical piece — it exports every design element of a Notes database to DXL, giving us the clean text format the AI needs. Once the export exists, the Make.com scenarios take over: they analyze, classify, score, store, and make each element searchable in Pinecone and MongoDB.
Because NDDM can process any number of NSF files, we can easily prepare tens or even hundreds of Notes applications for AI analysis and make them available in the chat interface. This gives client-side designers a fast, intelligent way to research and understand complex Notes apps without wading through the Designer client.
And this is just the starting point — AI capabilities evolve rapidly, so we expect to improve both the accuracy and depth of analysis significantly in upcoming projects. The result will be an even smarter, faster, and more intuitive research tool for Notes-to-modern-platform migrations.