When scripting in FileMaker I pay particular attention to the following:
- Code clarity: make sure your code is human-readable, clean, and documented;
- Reuse elements: declare a variable once, use it whenever you need it;
- Keep your code compact: if you don’t need it, don’t put it in or leave it in;
- Generate a sub-script if you have to use the same set of lines in multiple places;
- Speed: make it as fast as possible; and
- Error handling: make sure errors are caught and appropriately handled.
You don’t have to adhere to all of the above requirements, but your process will work better if you do. Recently, the speed of a process I was working on was not sufficient, so I had to rethink the approach. The process involved populating a SQL table with data generated within a FileMaker app.
If you work in environments with multiple software systems, you cannot escape having to exchange data with other databases, such as Oracle, Microsoft SQL Server, or MySQL. FileMaker offers several features for interacting with data in other sources. External SQL Sources (ESS) uses the ODBC connection set up on your server (or desktop) to access supported external SQL sources as if they were a FileMaker table. The same ODBC data source can be used to execute commands.
There are two “Execute SQL” features in FileMaker: one is a calculation function and the other a script step.
ExecuteSQL Calculation Function
If you haven’t yet, I’d recommend getting familiar with the ExecuteSQL calculation function, simply because it makes life a little simpler in FileMaker. I like to use it to get the ID of a particular record or to get a record count for a set of criteria.
ExecuteSQL has become one of my favorite calculation functions. It’s fast and gets results, but it can only query the data source (SELECT). There are plug-ins out there that can perform other operations, such as INSERT. It can get sluggish if you use it on a dataset larger than about 40,000 records.
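To show the kind of SELECT-only query ExecuteSQL runs (for example, counting unprocessed records), here is a minimal sketch using Python’s sqlite3 standing in for the FileMaker data source; the table and field names are made up for the example:

```python
import sqlite3

# In-memory database standing in for the FileMaker table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Scan (id INTEGER PRIMARY KEY, Processed INTEGER)")
conn.executemany("INSERT INTO Scan (Processed) VALUES (?)", [(0,), (0,), (1,)])

# Rough equivalent of:
#   ExecuteSQL ( "SELECT COUNT(*) FROM Scan WHERE Processed = ?" ; "" ; "" ; 0 )
count = conn.execute(
    "SELECT COUNT(*) FROM Scan WHERE Processed = ?", (0,)
).fetchone()[0]
print(count)  # → 2
```

As in FileMaker, the `?` placeholder keeps the criteria out of the query string itself.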
Execute SQL Script Step
My user scenario was the following: we scan data at a facility into a database hosted on FileMaker Server. There are multiple users scanning data at the same time. The data needs to be posted to a SQL Server database in near real-time, because they need to run reports on it in another system.
Since we had issues on the client’s SQL Server (which we suspect had something to do with the ESS tables being accessed all the time), we decided to try a different method. I set up a script to push the records into SQL Server via the Execute SQL script step, using the INSERT command.
This was my initial approach to the new process:
- Search for the unprocessed records;
- Generate a field list variable;
- Generate a data list variable;
- Insert the current record’s data into SQL; and
- Loop to repeat steps 2 – 4 for each record in the found set.
Throughout the process I learned that I needed to issue a command before inserting a record:
SET IDENTITY_INSERT [ database_name . [ schema_name ] . ] table { ON | OFF }
This command can be combined with the INSERT lines, which makes it simpler.
This script got the job done, but it took five minutes to process fewer than 1,500 records. Since the client wanted to run the script every five minutes, the script would bite its own tail. I started investigating to see if there was a way to cut the time down. It turns out you can INSERT multiple rows into SQL Server with one command, up to 1,000 rows at a time. I refactored my script once more to post records in batches. Along the way, I also learned that FileMaker Server does not suppress error 401 (“No records match the request”), so I modified my process accordingly.
This is the new process:
- Check for unprocessed records with an ExecuteSQL calc;
- If they exist, search for the unprocessed records;
- Generate a field list variable;
- Loop through up to 1,000 records to generate the data list variable;
- Push data to SQL; and
- Continue looping until all the records in the found set have been processed.
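The batching logic in the steps above can be sketched outside FileMaker like this; the chunk size matches SQL Server’s 1,000-row limit per VALUES clause, and the table, field, and helper names are illustrative:

```python
def chunks(rows, size=1000):
    """Yield successive batches of at most `size` rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def build_insert(table, fields, batch):
    """Build one multi-row INSERT statement for a batch of rows."""
    values = ", ".join(
        "(" + ", ".join(str(v) for v in row) + ")" for row in batch
    )
    return f"INSERT INTO {table} ({', '.join(fields)}) VALUES {values}"

rows = [(i, i * 10) for i in range(1, 2501)]   # 2,500 fake records
statements = [build_insert("SCN__Scan", ["id", "qty"], b) for b in chunks(rows)]
print(len(statements))  # → 3 (two full batches of 1,000, one of 500)
```

Each statement posts a whole batch in one round trip, which is where the speedup comes from: far fewer calls to the server than one INSERT per record.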
In terms of steps this doesn’t look like a huge change, but the script’s run time dropped from about five minutes to four seconds.
Below is the set of SQL commands I used to insert up to 1,000 rows into SQL Server at a time.
The Command
"SET IDENTITY_INSERT [dbo].[SCN__Scan] ON " & ¶ &
"INSERT INTO SCN__Scan (" & $_fieldset_horizontal & ") VALUES " & $_data & ¶ &
"SET IDENTITY_INSERT [dbo].[SCN__Scan] OFF"
The Field Set
The $_fieldset_horizontal variable is just a comma-separated list of the names of the fields on a layout, gathered with the function FieldNames ( Get ( FileName ) ; Get ( LayoutName ) ). Once you’ve gathered your fields into $_fieldset, you need to replace the carriage returns with commas: Substitute ( $_fieldset ; ¶ ; ", " ).
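The same transformation, sketched in Python for illustration (the field names are made up; FieldNames returns one name per line, which the Substitute call flattens):

```python
# What FieldNames ( Get ( FileName ) ; Get ( LayoutName ) ) might return,
# as a return-delimited list (one field name per line).
fields = ["id", "scan_code", "scanned_at"]
fieldset = "\n".join(fields)

# Equivalent of Substitute ( $_fieldset ; ¶ ; ", " )
fieldset_horizontal = fieldset.replace("\n", ", ")
print(fieldset_horizontal)  # → id, scan_code, scanned_at
```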
The Data
The $_data variable is also a comma-separated list of values, with parentheses surrounding each record; it essentially looks like this:
(value1, value2, value3), (value1, value2, value3), (value1, value2, value3), etc.
Error Checking
You cannot program without debugging. You can get error reporting from the Get ( LastExternalErrorDetail ) function after each execution of the Execute SQL script step. It’s very detailed (much more so than the error reporting of the ExecuteSQL calculation function), so you can correct errors as they occur.
Testing
When you are testing you generally need test data. I assembled a quick script that first truncates the table and then generates 1,500 records via the Perform Script on Server (PSoS) script step. Since I have access to the client’s SQL data source, I can also truncate the SQL table. Wash, rinse, repeat.
Running It from Server
It’s always another can of worms when you run a PSoS or scheduled server script. Two things come to the rescue: logging any and all errors, and logging script execution times. I set up an ErrorLog and a ScriptLog table in my solution and log the results there.
Custom FileMaker Development with ExecuteSQL
If you are seeking ways to customize your solution further with ExecuteSQL and other functions, please contact my team today. We’ve helped thousands of organizations improve and strengthen their applications, and we’d love to partner with you too.
Nice job. Minor point of order: the calculation function is ExecuteSQL (one word). The script step is Execute SQL (two words).
Thank you, Tom. Good to hear from you. Will make the correction.
Wow, good stuff. Thank you for the elaborate explanation; nice to have yet another way to avoid loops.
Hi,
Is it possible to have an example solution?
Thank you
Théo
Hi, do you have a demo file?
Kind regards.
Antoine
Excellent example and use case!
You take a batch size of 1,000 records per query. I wonder, is there a limit on the SQL or FileMaker side to the maximum number of INSERT operations in a single Execute SQL script step? Or any limitation in terms of bytes sent to the SQL server?
Regards,
Osman Cabi