
Lotus Notes app design analyses with AI

  • Post published: January 11, 2025
  • Post category: AI

Research IBM Notes Application code with AI

What’s this Notes app even doing?

The original developer is long gone, nobody has a clue, and now you’re tasked with replacing it. But before you touch a single line, you need to understand how it works.

The problem? Looking at the design drives you mad — code is scattered everywhere. That’s the curse of many Notes apps: no clear start, no obvious end. So… where do you even begin?

We’ve built a system that makes sense of a Notes application’s design by letting you ask plain-language questions like:

  1. What’s the main form of this app?
  2. What does this form actually do?
  3. Which subforms are being called?
  4. What are the buttons programmed to do?

Exploring the Notes App Through Chat

Let’s first look at how it works, then I’ll walk you through how we built it — starting with the chat-based user interface.

First, choose the Notes app you want to research.

The tool offers three options — one for each step of the research, described below.

Step 1 – Find Notes Design Elements to Analyze

The problem with Notes design elements is sheer volume. A typical app can have hundreds — forms, views, agents, subforms, script libraries, and more. Some are critical; others haven’t been touched in years.

This first step helps you identify the important elements so you can focus your research where it matters. You can ask questions like “What is the most important form?” — spelling mistakes are fine, and you can ask in any language. The tool will understand and reply.

Example: “List all subforms used in this Notes database.”
It might return around 10 results — you can even set your own maximum.



See below for an example output.

#2 – Subform – sf_DisbursementSummary_View – Test Disbursement Request for Archiving Fulle – a677d5f5d8dfac1100258c530054ae77
** Summary: This Notes design element is a Subform named “sf_DisbursementSummary_View” used within the “Test Disbursement Request for Archiving Fulle” database. Its purpose is to display a summary of key information related to disbursement requests. The subform includes computed-for-display fields tied to metadata, extracting and showing values such as Request Number, Date Requested, Organization, Location, Vendor details, Invoice, Email, Payment Type and Method, and conditional fields like tax routing or management exception categories. Key logic involves dynamic field resolution via formulas parsing a metadata string based on field names. Conditional hiding is used to tailor the displayed fields depending on organization or payment type. This subform is likely embedded in forms or views to present a consistent, read-only summary of a disbursement entry, offering a user-friendly way to view processed or submitted data across different request scenarios.
** Category: Reporting or export
Reasoning:
1. Design Type: Subform.
2. Trigger: The subform is embedded into other forms to display computed summary data.
3. Business Logic: The subform uses numerous @Functions, primarily @If, @Subset, @Member, @Word, and string manipulation to retrieve and display metadata-driven computed values. The content includes computed-for-display fields only, with no editable fields, and conditional logic is used to selectively show specific data based on payment type, vendor, and organization fields. No data entry or routing logic is performed directly.
4. Usage: It is used to present a formatted, tabular summary of disbursement-related metadata, likely as part of a read-only view or print/export format. This includes values such as Request Number, Vendor Info, Invoice Number, Payment Type, and others, compiled from metadata for display purposes.
5. Conclusion: The subform’s key function is to present structured, computed display-only data in a clean format for clarity, referencing, reporting, or print purposes. Given its structured presentation and computed fields, it aligns best with Reporting or export.
** Importance: 90%
Reason: This subform is a critical component for displaying comprehensive summaries of disbursement requests. It includes computed-for-display fields that dynamically render key data points such as request number, organization, vendor details, payment type, and more, many of which rely on formulas referencing shared metadata (fld_metadata). The use of conditional logic (e.g., hide-when formulas) to show or hide sections based on request context indicates complex business rules tied to organizational and payment characteristics. Additionally, it outputs data visualization for users and likely feeds downstream business processes and reports. Its design tightly couples display logic with business data interpretation, making it highly relevant for both user experience and system functionality; thus, migration must preserve its logic accurately.

This information is far easier to understand than digging through the Notes Designer client and reading raw code — agree?

The designer can now copy and paste any information that seems important into a Word document — for now, just to build an overview of the app, what’s important, and what’s not.

Step 2 – Retrieve the DXL Code

Once you spot an element worth deeper inspection, copy its ID (for example: a677d5f5d8dfac1100258c530054ae77) and paste it into the DXL field in part two of the app.

Here, the tool displays the raw DXL code for that design element. While nearly impossible for humans to read, this format is perfect for AI to analyze.

DXL stands for Domino XML Language.
Basically, DXL is the “blueprint” of a Notes element in XML form — perfect for machine processing, painful for humans.
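Because DXL is plain XML, even standard tooling can inspect it. The minimal sketch below pulls field names out of a DXL-like fragment in Python — note that the fragment is illustrative and far simpler than real DXL, which lives in the `http://www.lotus.com/dxl` namespace and carries much richer markup.

```python
import xml.etree.ElementTree as ET

# Illustrative, simplified DXL-like fragment (NOT the real DXL schema);
# field names are borrowed from the subform discussed later in this post.
sample = """<form name='DisbursementRequest'>
  <field name='cs_Organization' type='text'/>
  <field name='cs_BU' type='text'/>
</form>"""

root = ET.fromstring(sample)
fields = [f.get("name") for f in root.iter("field")]
# fields now lists every field element found in the fragment
```

This kind of mechanical extraction is exactly what the AI does at scale, except it also interprets the formulas and script attached to each element.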

Step 3 – Ask ChatGPT About the Code

In the final pane, you can ask any question about the retrieved Notes design element DXL — for example:

  1. Summarize this agent
  2. What fields are on this form?
  3. Does the form run any code? If so, where?
  4. Which other design elements does this form call?

The AI responds with a clear explanation, highlighting logic, dependencies, and references to other design elements.

A full example response is shown below.

From the DXL you’ve provided for the subform named sf_DisbursementSummary_View, we can determine that the design references other design elements within the Lotus Notes application. Specifically, several subforms, image resources, and possibly script libraries or shared fields are invoked.

Here are the other design elements that are explicitly or dynamically referenced within this subform:

### 1. **Referenced Subforms**
These are other subform design elements dynamically or statically embedded within sf_DisbursementSummary_View:

#### A. Conditionally Included Subform via Formula:
```formula
@If(@Contains(NextTitle;"AP Processor" : "Tymetrix Processor");
"sf_AP_Comment";
"sf_AP_Comment_View")
```
✅ This implies that either:
– sf_AP_Comment
– sf_AP_Comment_View
…will be included based on the formula logic. So, two other subforms are referenced conditionally.

### 2. **Image Resources**

Image resources are included in the form layout for visual icons:

#### A. Red Ball Image:
✅ This references an image resource named:
redball

### 3. **Shared Fields**
From patterns like this:
```formula
"If banking details are on vendor account, no check will be issued!" + @NewLine +
"Payment will be included on the next regular payment run."
```

✅ While not explicitly defined as shared fields, the naming (e.g., IO_PIPEDBKD4W_1) might indicate use of standardized computed fields persisted across various forms/subforms. These may internally call shared script libraries or constants.

### 4. **LotusScript Code – Dynamic Behavior**

The following LotusScript declaration block is embedded in $cs_BU item:

```lotusscript
Declare Sub Onfocus(Source As Field)

Sub Onfocus(Source As Field)
    Dim ui As New NotesUIWorkspace
    Dim uidoc As NotesUIDocument
    Set uidoc = ui.CurrentDocument
    If uidoc.FieldGetText("cs_Organization") = "" Then
        Messagebox "You must select an organization before choosing a business unit"
        Call uidoc.FieldSetText("cs_BU", "")
        Call uidoc.Refresh
    End If
End Sub
```

This implies that the field cs_BU is wired to the Onfocus event, possibly through event binding. This custom validation logic represents behavioral design indirectly calling:

– **Script context**: for runtime logic
– **Field dependencies**: cs_Organization, cs_BU

### 5. **Field Dependency Referencing**
Many computed fields reference a field named:
fld_metadata

Example formula pattern (used in dozens of fields):
```formula
myName := @ThisName;
fieldName := @ReplaceSubstring(myName; "l_"; "");
fldNames := @Word(fld_metadata; "^"; 1);
myIdx := @Member(fieldName; fldNames);
myData := @Subset(@Subset(fld_metadata; myIdx); -1);
@Word(myData; "^"; 3)
```

✅ This suggests reliance on:
– A structured metadata field fld_metadata (could be a rich text, text list, or hidden field)
– Possibly a **shared field** if reused in other forms as well
– Complex computed logic using @functions
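To make that formula pattern concrete, here is a rough Python equivalent of the lookup. It assumes fld_metadata is a text list whose entries are "^"-delimited records, with the field name in the first position and the display value in the third — that layout is inferred from the formula, not confirmed by the source, and the sample data is invented.

```python
def resolve_field(field_name: str, fld_metadata: list[str]) -> str:
    """Sketch of the @formula lookup: find the metadata entry whose
    first '^'-word matches field_name, then return its third '^'-word."""
    # fldNames := @Word(fld_metadata; "^"; 1) -> first word of each entry
    fld_names = [entry.split("^")[0] for entry in fld_metadata]
    # myIdx := @Member(fieldName; fldNames) -> 1-based index, 0 if absent
    if field_name not in fld_names:
        return ""
    idx = fld_names.index(field_name) + 1
    # myData := @Subset(@Subset(fld_metadata; myIdx); -1) -> idx-th entry
    my_data = fld_metadata[idx - 1]
    # @Word(myData; "^"; 3) -> third '^'-separated word
    parts = my_data.split("^")
    return parts[2] if len(parts) >= 3 else ""

# Invented sample data in the assumed "name^label^value" layout:
meta = ["RequestNumber^lbl^REQ-001", "Vendor^lbl^Acme Corp"]
```

In the real subform, each display field also derives its own lookup key by stripping an "l_" prefix from its field name (the @ReplaceSubstring step), so dozens of fields can share this one pattern.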

### Summary of Referenced Design Elements:

| Type | Design Element Name(s) | Notes |
|------|------------------------|-------|
| Subform | sf_AP_Comment, sf_AP_Comment_View | Selected via @If formula |
| Image Resource | redball | Used for field validation markers |
| Computed Field | IO_PIPEDBKD4W_1 and many others | Reusable text/numeric display fields |
| Script Code | From $cs_BU item LotusScript | Event-driven validation logic |
| Field dependency | fld_metadata, cs_Organization, cs_BU | Dynamically resolves display fields |

In conclusion, the subform sf_DisbursementSummary_View dynamically includes at least two other subforms conditionally, uses a graphical resource (redball), depends heavily on one or more metadata-driven fields (fld_metadata), and contains embedded LotusScript for behavior, indicating it strongly interconnects with other design components across the Lotus Notes application. 

Conclusion – Why This Approach Works

For anyone who isn’t a hardcore Notes developer, diving straight into Domino Designer is overwhelming. A single Notes app can contain an ocean of code, spread over hundreds of elements, with no clear starting point. You can’t just export everything into Word and expect clarity — there’s too much, and it’s too fragmented.

Our approach solves that by letting you explore the app interactively, in plain language. All you need is a few hours of basic familiarity with Notes apps. The tool gives you structured, easy-to-understand explanations that you can compile into a design document — the essential blueprint for any replacement project.

Since all code is available via DXL exports, you can, in theory, fully understand the app’s logic, business rules, and interdependencies. And because the AI does the heavy lifting, you avoid the usual frustration and can focus on designing the future system.

In short: this is the first practical, scalable way for non-Notes experts to truly understand a Notes application before replacing it.

How It Has Been Built

IBM Lotus Notes Design Export (DXL)

The AI system needs the Lotus Notes database design as input. We have extended our Domino application scanner (NDDM) with a function to export all HCL / IBM Notes application design elements as text — the AI system can only take text as input. Some example Notes app design exports are shown in the images below.

Each Notes application design element is exported to a single Notes document. In this example, the Notes application has 3 agents, 8 forms, and so on. NDDM creates a highly detailed export of all the Lotus Notes functions on a Notes form. For example, it exports the Notes fields on the form, types of fields, all buttons, and all formulas and LotusScript code used in various places on the form. At the time of writing this blog, I must admit that the design exports we create can be improved, but for now, they are sufficient.

I have used make.com to create automation scenarios that build the AI system to process the IBM Notes application design data (text) in order to answer my questions.

Workflow Scenario 1: From Notes Design Data to Pinecone and Mongo

The goal of this scenario is to take all Notes application design data (forms, views, agents, script libraries, and so on) from NDDM and store it in a cloud database for further processing in scenarios 2 and 3. Pinecone will be explained later on.

The complete make.com scenario is displayed in the image below.

This flow takes each Notes design element from a selected NSF and runs it through three layers of AI analysis:

Summarization – It asks GPT to read the raw DXL code and write a short, plain-English explanation of what the element is, what it does, and how it’s used.

Functional classification – GPT decides which role the element plays in the application (e.g., data entry, reporting, workflow routing) and explains its reasoning.

Importance scoring – GPT estimates how important the element is to the app’s overall function and justifies the score.

The text is then cleaned to remove formatting noise and turned into an embedding vector so it can be semantically searched later.
Finally, both the vectors (in Pinecone) and the raw + analyzed data (in MongoDB) are stored, so you can quickly find and understand the most relevant design elements when reverse-engineering or replacing the app.
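The cleaning-and-embedding step can be sketched in a few lines of Python. This is an offline stand-in: the real scenario calls OpenAI's embedding API, while `fake_embed` below is a deterministic placeholder so the sketch runs without network access, and the record field names are illustrative, not the actual MongoDB schema.

```python
import hashlib

def clean_text(raw: str) -> str:
    # Remove newlines and collapse runs of whitespace before embedding
    return " ".join(raw.split())

def fake_embed(text: str, dim: int = 8) -> list[float]:
    # Deterministic stand-in for a real embedding API call:
    # hashes the text and scales the first `dim` bytes into [0, 1]
    digest = hashlib.sha256(text.encode("utf-8")).digest()
    return [b / 255 for b in digest[:dim]]

record = {
    "design_id": "a677d5f5d8dfac1100258c530054ae77",
    "summary": clean_text("Subform that displays a\ndisbursement summary."),
}
record["vector"] = fake_embed(record["summary"])  # vector -> Pinecone
# the full record (raw DXL + analysis text) -> MongoDB
```

Real embeddings are of course much higher-dimensional and capture meaning rather than hashes, but the data flow — clean, vectorize, store both vector and record — is the same.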

Please refer to this video about vectors >> https://www.youtube.com/watch?v=ZGa8cQDel80

Why Vectors?

OpenAI’s GPT-4 model currently has a limit of about 120,000 tokens per prompt (roughly 200–300 pages of text). The CSW Dev Notes application contains far more code than that, so we need to break it into smaller chunks. Keeping prompts smaller not only fits the limit but also improves accuracy.

We also want to store the designs of many different Notes applications in a database. A vector database like Pinecone makes this possible — and powerful — by allowing us to search and retrieve the most relevant pieces of information instantly. How this works will become clear when we look at the upcoming Make.com scenarios.
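Chunking itself can be as simple as splitting on an approximate character budget. The sketch below uses the common rule of thumb of roughly four characters per token for English text — a production pipeline would use a real tokenizer, and the numbers here are illustrative.

```python
def chunk_text(text: str, max_tokens: int = 2000,
               chars_per_token: int = 4) -> list[str]:
    """Naive chunker: split text into pieces that fit an approximate
    token budget (~4 chars/token heuristic, not a real tokenizer)."""
    budget = max_tokens * chars_per_token
    return [text[i:i + budget] for i in range(0, len(text), budget)]

# 20,000 characters at a 1,000-token (~4,000-char) budget -> 5 chunks
chunks = chunk_text("x" * 20000, max_tokens=1000)
```

Each chunk is then embedded and stored separately, so a question only ever pulls back the handful of chunks it actually needs.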

UI

The UI itself is built in Retool, but Retool’s role here is mainly presentation and user interaction — all the complex logic, AI processing, and data handling happens in Make.com. Each section of the Retool screen is wired to a specific Make.com scenario, and we’ll walk through each of those in the next sections.

Step 1 – Find Notes Design Elements to Analyze

In plain words, here’s what’s happening:

Retool sends the user’s question (and optional max results) to Make.com via a webhook.

Make.com cleans the input (removing newlines, etc.).

OpenAI Embedding API converts the question into a vector.

Pinecone search finds matching Notes design elements based on that vector.

OpenAI GPT-4o checks each match and says “YES” or “NO” with a confidence score, based on whether it’s relevant to the question.

Text aggregator collects all YES matches and formats them into a readable output (type, name, NSF title, ID, summary, etc.).

Webhook response sends this formatted text back to Retool for display in the UI.

So in the Retool app, this scenario powers the “Find relevant design elements” part of the screen.
Retool itself just shows the results — all the searching, AI scoring, and formatting happens inside Make.com.
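The core of the Pinecone step is vector similarity ranking. Here is a small Python sketch of that idea — Pinecone performs this server-side at scale, and the element names and record shape below are invented for illustration.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    # Cosine similarity between two vectors of equal length
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def top_matches(query_vec, elements, k=10):
    # Rank stored design elements by similarity to the question vector
    # (this is what the Pinecone search module does in the scenario)
    ranked = sorted(elements,
                    key=lambda e: cosine(query_vec, e["vector"]),
                    reverse=True)
    return ranked[:k]

# Toy 2-D vectors; real embeddings have hundreds of dimensions
elements = [
    {"name": "MainForm", "vector": [1.0, 0.0]},
    {"name": "OldAgent", "vector": [0.0, 1.0]},
]
best = top_matches([0.9, 0.1], elements, k=1)
```

The GPT-4o "YES/NO" pass then sits on top of this ranking, filtering out matches that are geometrically close but not actually relevant to the question.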

Step 2 – Retrieve the DXL Code

This Make.com scenario is what Retool calls when you select a specific Notes design element and want its raw DXL code.

Retool sends the design element’s ID to Make.com via a webhook.

The ID is cleaned (removing newlines/spaces issues).

Make.com queries MongoDB for the matching record in the correct Notes database collection, filtering on that DesignID.

It retrieves the DesignCodeDXL field — the raw XML representation of the Notes design element.

The scenario returns the DXL back to Retool via the webhook response.

This is the “Retrieve DXL” part of your Retool UI. Retool just shows the result — all the actual fetching and data handling is done in Make.com.
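The lookup itself is a simple filtered fetch. Below is an in-memory Python sketch of what the MongoDB module does; with pymongo this would be roughly `collection.find_one({"DesignID": design_id})`. The field names `DesignID` and `DesignCodeDXL` come from the article, but the record list is invented.

```python
def fetch_dxl(records: list[dict], design_id: str):
    """Sketch of the MongoDB step: clean the ID, filter on DesignID,
    return the raw DXL stored in DesignCodeDXL (None if not found)."""
    # Clean stray newlines/spaces, as the scenario does
    design_id = design_id.strip()
    for rec in records:
        if rec.get("DesignID") == design_id:
            return rec.get("DesignCodeDXL")
    return None

# Invented sample record for illustration
records = [{"DesignID": "a677d5f5d8dfac1100258c530054ae77",
            "DesignCodeDXL": "<subform name='sf_DisbursementSummary_View'/>"}]
```

In the real scenario the filter also selects the correct per-NSF collection before querying, so IDs from different Notes databases never collide.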

Step 3 – Ask ChatGPT About the Code

This Make.com scenario powers the DXL Chat part of the Retool app — where you can ask ChatGPT questions about the raw DXL code of a Notes design element.

Retool sends two things to Make.com via a webhook:

  1. Your question.
  2. The DXL code for the selected design element.

Make.com calls OpenAI GPT-4o with a special system prompt telling it to act as a Lotus Notes design expert. It sends along your question and the full DXL code.

GPT-4o processes the request and generates an answer based on the code.

The answer is sent straight back to Retool, where it’s shown in the chat pane.

The key here is that Retool only handles the interface — all the “thinking” (analysis of DXL) is done in Make.com via OpenAI.
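The request Make.com assembles for GPT-4o is essentially a two-message chat payload. The sketch below shows the shape; the system prompt wording is a guess at the "Lotus Notes design expert" instruction described above, not the actual prompt used.

```python
def build_messages(question: str, dxl: str) -> list[dict]:
    """Assemble a chat-completion payload: expert system prompt plus
    the user's question and the full DXL of the selected element."""
    # Hypothetical wording — the real system prompt is not published
    system = ("You are a Lotus Notes design expert. "
              "Answer questions about the DXL code the user provides.")
    return [
        {"role": "system", "content": system},
        {"role": "user",
         "content": f"Question: {question}\n\nDXL code:\n{dxl}"},
    ]

messages = build_messages("What fields are on this form?",
                          "<form name='DisbursementRequest'>...</form>")
```

Because the whole DXL travels with every question, each chat turn is stateless — which keeps the Make.com scenario simple at the cost of resending the code each time.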

Conclusion – How It’s Been Built and What’s Next

The architecture is actually quite straightforward. NDDM is the critical piece — it exports every design element of a Notes database to DXL, giving us the clean text format the AI needs. Once the export exists, the Make.com scenarios take over: they analyze, classify, score, store, and make each element searchable in Pinecone and MongoDB.

Because NDDM can process any number of NSF files, we can easily prepare tens or even hundreds of Notes applications for AI analysis and make them available in the chat interface. This gives client-side designers a fast, intelligent way to research and understand complex Notes apps without wading through the Designer client.

And this is just the starting point — AI capabilities evolve rapidly, so we expect to improve both the accuracy and depth of analysis significantly in upcoming projects. The result will be an even smarter, faster, and more intuitive research tool for Notes-to-modern-platform migrations.