AI Comes to FileMaker: Powerful New LLM Features for Smarter Apps

Posted by Kyo Logic on July 15, 2025

The FileMaker 2025 release marks a major leap forward for Claris developers, introducing a suite of AI and large language model (LLM) capabilities that bring intelligence directly into your custom applications. With these new features, businesses can build smarter, faster, and more intuitive systems—using AI not just for novelty, but as a core part of their workflows.

From generating dynamic content to enabling natural language search and semantic analysis, FileMaker’s AI toolkit unlocks possibilities that were impractical in earlier versions. Here’s a look at what’s new and how it could transform your apps.

Generate AI Responses with Prompt-Based Scripting

Developers can now integrate AI-generated content directly into their apps using prompt-based scripting. With the Generate Response from Model feature, FileMaker can send user-defined prompts to an AI provider like OpenAI or Cohere and return context-aware results in real time.

Use case: Automatically generate email drafts, marketing content, or client-facing summaries directly within your FileMaker system—tailored to your specific business needs.
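
Conceptually, the feature boils down to a simple round trip: merge record data into a prompt, send it to the provider, and write the reply back into a field or variable. Here is a minimal Python sketch of that round trip outside FileMaker, using OpenAI’s public chat endpoint; the model name, record values, and prompt wording are placeholder assumptions rather than FileMaker syntax.

```python
# Conceptual sketch of what prompt-based generation does: merge record data
# into a prompt, send it to the provider, and capture the reply.
# Model name and record values are illustrative assumptions, not FileMaker syntax.
import os
import requests

record = {"client": "Acme Corp", "order_total": 5120, "due_date": "2025-08-01"}

prompt = (
    "Draft a short, friendly payment-reminder email for {client}. "
    "The outstanding balance is ${order_total}, due on {due_date}."
).format(**record)

response = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-4o-mini",  # assumed model; use whatever your provider account offers
        "messages": [{"role": "user", "content": prompt}],
    },
    timeout=30,
)
draft = response.json()["choices"][0]["message"]["content"]
print(draft)  # in FileMaker, the result would land in a field or variable instead
```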

Perform SQL Queries and Finds Using Natural Language

With Perform SQL Query / Find by Natural Language, users no longer need to know SQL or FileMaker’s find syntax. FileMaker interprets a plain-English request, converts it into a SQL statement or a FileMaker find, and executes it against your data.

Use case: A team member types “Show all orders over $5,000 from last quarter” into a search field. FileMaker translates it into a precise query and displays the results instantly.
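
Under the hood this is a two-step pipeline: a model translates the plain-English request into a query against your schema, and that query is then executed like any other find or SQL statement. The Python sketch below reproduces the pipeline against a throwaway SQLite table; the schema, prompt wording, and model name are illustrative assumptions, and FileMaker’s own step performs both the translation and the execution for you.

```python
# Conceptual sketch of natural-language querying:
# 1) ask an LLM to turn a plain-English request into SQL for a known schema,
# 2) execute the generated SQL. Schema, prompt, and model are illustrative.
import os
import sqlite3
import requests

SCHEMA = "CREATE TABLE orders (id INTEGER, customer TEXT, total REAL, order_date TEXT)"
question = "Show all orders over $5,000 from last quarter"

resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-4o-mini",  # assumed model name
        "messages": [
            {"role": "system",
             "content": "Translate the user's request into one SQLite SELECT statement "
                        f"for this schema and return only the SQL:\n{SCHEMA}"},
            {"role": "user", "content": question},
        ],
    },
    timeout=30,
)
sql = resp.json()["choices"][0]["message"]["content"]
sql = sql.strip().strip("`").removeprefix("sql").strip()  # drop any code-fence wrapper

# Step 2: run the generated query against a sample database.
conn = sqlite3.connect(":memory:")
conn.execute(SCHEMA)
conn.execute("INSERT INTO orders VALUES (1, 'Acme Corp', 7500.0, '2025-04-12')")
print(sql)
print(conn.execute(sql).fetchall())
```

Constraining the model to return a single SELECT statement for a known schema is what keeps the translation step predictable enough to execute automatically.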

Reusable Prompt Templates and Regression Models

The addition of Prompt Templates and ML Regression Models allows developers to create reusable AI logic within apps. This makes it easy to standardize AI-powered processes and apply predictive analytics where needed.

Use case: Build a predictive model to forecast inventory demand based on historical sales and integrate it into multiple layouts or workflows.
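
Neither idea is exotic: a prompt template is parameterized prompt text you fill in per record, and a regression model is a curve fitted to historical data that you evaluate on new inputs. The Python sketch below shows both in miniature with a string template and a NumPy least-squares fit; the field names and sales figures are invented for illustration, and FileMaker packages the same ideas as reusable objects inside your solution.

```python
# Two miniature examples: a reusable prompt template (parameterized text)
# and a simple regression model used to forecast demand.
# Field names and sales numbers are illustrative assumptions.
import numpy as np

# --- Reusable prompt template: fill the same skeleton with per-record values.
SUMMARY_TEMPLATE = (
    "Summarize the account status for {customer} in two sentences. "
    "Open invoices: {open_invoices}. Last contact: {last_contact}."
)
prompt = SUMMARY_TEMPLATE.format(
    customer="Acme Corp", open_invoices=3, last_contact="2025-06-30"
)
print(prompt)

# --- Regression model: fit monthly unit sales, then forecast the next month.
months = np.arange(1, 13)                      # Jan..Dec as 1..12
units = np.array([120, 125, 131, 140, 138, 150,
                  158, 160, 171, 175, 182, 190])
slope, intercept = np.polyfit(months, units, deg=1)   # least-squares line
forecast_next = slope * 13 + intercept
print(f"Forecast for month 13: {forecast_next:.0f} units")
```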

RAG Actions for Knowledge Base Integration

Retrieval-augmented generation (RAG) enables FileMaker to pull information from PDFs and documents in your system and use it as a context source for AI responses.

Use case: Create an AI-powered help assistant that answers employee questions using your company’s internal manuals and policies stored in FileMaker.
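
RAG itself is a short loop: embed the document chunks, embed the question, retrieve the chunks closest to the question, and prepend them to the prompt so the model answers from your material instead of its general training data. The Python sketch below spells out that loop, with OpenAI’s public embeddings and chat endpoints standing in for whichever provider you configure; the policy snippets and model names are invented placeholders, and FileMaker’s RAG actions handle the chunking, storage, and retrieval for you.

```python
# Minimal retrieval-augmented generation loop: embed document chunks,
# retrieve the one closest to the question, and use it as context.
# The policy text is a made-up placeholder; endpoints are OpenAI's public API.
import os
import numpy as np
import requests

HEADERS = {"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"}

def embed(text: str) -> np.ndarray:
    r = requests.post(
        "https://api.openai.com/v1/embeddings",
        headers=HEADERS,
        json={"model": "text-embedding-3-small", "input": text},  # assumed model
        timeout=30,
    )
    return np.array(r.json()["data"][0]["embedding"])

# 1) Index: embed each chunk of the internal manual (normally done once, ahead of time).
chunks = [
    "Employees accrue 1.5 vacation days per month of service.",
    "Expense reports must be submitted within 30 days of purchase.",
    "Remote work requires manager approval and a signed equipment agreement.",
]
chunk_vecs = np.array([embed(c) for c in chunks])

# 2) Retrieve: embed the question and rank chunks by cosine similarity.
question = "How fast do I need to turn in an expense report?"
q = embed(question)
scores = chunk_vecs @ q / (np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(q))
context = chunks[int(np.argmax(scores))]

# 3) Generate: answer the question using only the retrieved context.
answer = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers=HEADERS,
    json={
        "model": "gpt-4o-mini",  # assumed model name
        "messages": [{"role": "user",
                      "content": f"Answer using only this policy text:\n{context}\n\nQuestion: {question}"}],
    },
    timeout=30,
).json()["choices"][0]["message"]["content"]
print(answer)
```

Because the answer is grounded in retrieved text, it reflects your current manuals and policies rather than whatever the model happened to learn during training.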

Vector Functions for AI Semantic Search

With new vector math functions, FileMaker supports semantic search capabilities. This allows applications to understand user intent and deliver more relevant results—even when exact keywords aren’t used.

  • Normalize, add, and subtract embeddings for advanced data analysis.

  • Enable semantic search on text and images for a more intuitive user experience.

Use case: A user searches “eco-friendly packaging” and FileMaker retrieves related product records, even if those exact words don’t appear in descriptions.
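
These functions come down to ordinary vector arithmetic on embeddings: normalize them to unit length, add or subtract them to combine concepts, and compare them with cosine similarity to rank results by meaning. The NumPy sketch below shows those operations on tiny made-up vectors; real embeddings have hundreds or thousands of dimensions, and FileMaker exposes the equivalent math as built-in calculation functions.

```python
# The vector operations behind semantic search, shown with NumPy.
# The 3-dimensional vectors are made-up stand-ins; real embeddings are far larger.
import numpy as np

def normalize(v: np.ndarray) -> np.ndarray:
    """Scale a vector to unit length so dot products become cosine similarities."""
    return v / np.linalg.norm(v)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """1.0 means same direction (similar meaning); values near 0 mean unrelated."""
    return float(np.dot(normalize(a), normalize(b)))

query     = np.array([0.9, 0.1, 0.2])   # e.g. embedding of 'eco-friendly packaging'
product_a = np.array([0.8, 0.2, 0.1])   # 'compostable mailer' (no shared keywords with the query)
product_b = np.array([0.1, 0.9, 0.3])   # 'stainless steel bolts'

# Rank products by meaning, not by keyword overlap.
for name, vec in [("product_a", product_a), ("product_b", product_b)]:
    print(name, round(cosine_similarity(query, vec), 3))

# Adding and subtracting embeddings composes concepts,
# which is useful for refining a search vector.
refined = normalize(query + product_a - product_b)
print("refined query vector:", np.round(refined, 3))
```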

Why These Features Matter

The new AI/LLM features in FileMaker allow businesses to:

  • Work Smarter – Automate content creation, search, and data analysis tasks that were previously manual.

  • Deliver Better UX – Let users interact with apps in natural language, improving accessibility and adoption.

  • Stay Competitive – Build intelligent systems without relying on external platforms or extensive custom code.

  • Maintain Data Security – Keep AI workflows within the trusted FileMaker ecosystem.

FileMaker’s new AI/LLM capabilities are designed to help developers and businesses build the next generation of custom apps—smarter, faster, and more intuitive. From natural language queries to predictive analytics and semantic search, these tools make it possible to integrate AI at every level of your solution.

Interested in learning more about how Claris FileMaker’s AI features can transform your workflows? Reach out to Kyo Logic here.