This has been an interesting week for my development process. I usually work predominantly in a single language at a time, but this past week I’ve delved into Java, FileMaker, shell scripting, PHP and AppleScript, all for a single solution.
It started when a non-profit organization hired me to integrate FileMaker with Amazon S3, allowing them to upload an image directly from a FileMaker container field to an S3 bucket. Their original idea was to use 360Works’ WebAssistant plugin to fake sending form data via an HTTP POST request, and initially it seemed this would work, but after some contact with the wonderful people at 360Works’ tech support, it didn’t look so promising.
The next plan was to use ScriptMaster, also from 360Works. I was actually already using ScriptMaster for part of the WebAssistant solution, as submitting data via web forms requires encoding some pieces of information with Base64 and hashing others with SHA-1, both of which I accomplished with ScriptMaster. But now the plan was to use ScriptMaster exclusively, together with the Java API available for Amazon Web Services.
But I ran into no end of trouble with that route. I’m sure much of the trouble was due to my inexperience with Java. Pretty much the only exposure I’ve had to productively using Java is with the ScriptMaster plugin (which, strictly speaking, uses Groovy, but from what I understand, Groovy’s language is a superset of Java’s). Here the problem centered on making the AWS Java API’s jar file reliably accessible to the Java code I was writing. Amazon’s instructions for getting started with the Java API never worked for me. I spent far too much time trying to get them to work before I finally decided to try the PHP API instead.
Amazon’s analogous instructions for getting started with the PHP API went completely smoothly, and once I had a working PHP script that would take a specific file and upload it to S3, generalizing that to work with FileMaker was pretty easy, if not exactly straightforward.
This was possible because the client is running FileMaker under OS X, which allows me to execute AppleScript from FileMaker, shell scripts from AppleScript, and PHP scripts from shell scripts. Yes, it’s convoluted, but I did get it to work, and the rest of this article will demonstrate how, along with a few other techniques that may be of interest.
Follow along with the article or download my version of it.
The goal of the software was to allow a single authorized user to run a script in FileMaker that would take the contents of a container field and upload it to a particular directory in a particular bucket on Amazon’s S3. As mentioned above, I accomplished this with a combination of FileMaker scripts and calculations, AppleScript, shell commands and PHP. Leaving aside for the moment the security and error checking, the procedure is outlined by the flowchart below, with each step showing which technology handles that particular portion.
That’s how the computer executes the procedure, but building the solution will take us from top to bottom and back again through it. FileMaker needs to execute the AppleScript, but the AppleScript needs specific fields to be present to work. The PHP script itself is standalone, but within the solution it needs the library put in place by FileMaker and extracted by AppleScript. Given all of the interdependencies, we’ll go through this very slowly.
Given the number of disparate files that will be needed for this solution, I recommend you create a project folder for yourself to keep everything organized. Mine’s found in ~/Projects/chiv/chiv.aws_integration/. Launch FileMaker (I’ll be using version 13, but this solution should work with any version back to at least 9, perhaps earlier) and create a new file. Mine’s called S3Upload.fmp12. You will need FileMaker Pro Advanced in order to integrate the custom functions that come later.

Rename the default table from S3Upload to Files and create three fields:
| Field Name | Field Type | Options |
| --- | --- | --- |
| id | Number | Auto-enter serial number, prohibits modification during data entry |
| file | Container | |
| url | Text | |
These are the fields that it’s assumed your solution already has, although the file field is the only one actually required if you don’t need the URL of the uploaded file.
Now create the supporting fields:
| Field Name | Field Type | Options | Notes |
| --- | --- | --- | --- |
| filename | Calculation with Text result | GetAsText( file ) | |
| g_bucket | Text | Global | Stores the name of the bucket to upload the file to |
| g_directory | Text | Global | Stores the directory path to upload the file to |
| g_key | Text | Global | Temporarily stores the user access key to identify the AWS account to Amazon’s API |
| g_secret | Text | Global | Temporarily stores the secret access key to authenticate the AWS account to Amazon |
| g_php_script | Container | Global | Stores the PHP script that uses the AWS API |
| g_php_script_filename | Calculation with Text result | GetAsText( g_php_script ) | |
| g_aws_php_api_archive | Container | Global | Stores the AWS PHP API zipped archive |
| g_aws_php_api_archive_filename | Calculation with Text result | GetAsText( g_aws_php_api_archive ) | |
| g_temp_dir_path | Text | Global | The path to the temporary directory where the PHP script, the API archive and the user’s file will be saved |
When you’ve finished, the Manage Database dialog’s Fields tab should look something like this:
The divider fields (such as _____ Table Fields __________________________________________________) are just for organizational purposes. Feel free to exclude them.

Save your field definitions by clicking the OK button and organize the fields on the layout to be a bit easier to work with as you develop the solution.
Remember that the project defined this upload procedure to be executable by only a single user, so we’ll set up a couple of accounts, one authorized and the other unauthorized, to use for our testing.
Open the Security dialog from File»Manage»Security. Rename the Admin account to Developer and give it a password of “developer”. Create an “Authorized” and an “Unauthorized” account, assigning each to the “[Data Entry Only]” privilege set and giving them passwords of “authorized” and “unauthorized” respectively. Click OK to save your changes and enter the changed developer credentials to confirm that you retain full access to the system.
So that we can easily change the name of the authorized account, as well as forgo having to remember it, open File»Manage»Custom Functions and create two custom functions:
appl.AuthorizedAccount = "Authorized"
appl.FullAccessPrivilegeSet = "[Full Access]"
While we’re here, we’ll create three more custom functions. The first two are for additional security purposes. The AWS access key and secret key are pretty important to keep secure, and placing them within custom functions gives us an additional layer of security.
appl.S3AccessKeyID = "your_access_key_id"
appl.S3SecretAccessKey = "your_secret_access_key"
The final custom function is a simple one to convert a filename into a path to that file in the temporary directory:
appl.TempDirFilepath( _filename ) = "filemac:" & Get( TemporaryPath ) & _filename
In case you’re unfamiliar with the Get( TemporaryPath ) function in FileMaker, it’s quite useful. For this solution we need a place to store some files, but only for a little while, and once we’re done with them we don’t care what happens to them. They aren’t going to be too large, so we don’t need to worry about deleting them ourselves, but we do want them deleted eventually. In short, we need a location we can read from and write to, but we don’t care where it is or what happens to the files afterwards, so long as they’re deleted at some point. Get( TemporaryPath ) exactly satisfies this need. As an example of the custom function above, appl.TempDirFilepath( "file.gif" ) currently returns filemac:/Aslan/private/var/folders/02/mzl8bjw973713k0yn76z70n40000gn/T/S10/file.gif on my system.
When FileMaker launches it automatically creates a temporary folder. Perhaps FileMaker uses it for its own purposes, but the key fact is that we can use it for ours. The exact location of this folder will likely differ between FileMaker launches. While I’m writing this, FileMaker 13 is using /private/var/folders/02/mzl8bjw973713k0yn76z70n40000gn/T/S10/ and FileMaker 12 is using /private/var/folders/02/mzl8bjw973713k0yn76z70n40000gn/T/S10.1/. The next time it’ll be somewhere in /private/var/folders/, and the final folder will be named S followed by some number, but beyond that it’ll probably be different. When FileMaker quits, it automatically deletes the temporary folder, along with all of its contents.
Note that the function returns a path that begins with the startup drive’s name, which will cause a bit of trouble when passing the path from FileMaker to AppleScript. For example, Get( TemporaryPath ) is currently returning /Aslan/private/var/folders/02/mzl8bjw973713k0yn76z70n40000gn/T/S10/. Before AppleScript can use this, FileMaker needs to remove the /Aslan portion on my system, and whatever the startup disk’s name is on whichever system the script is running on. When we build the FileMaker script you’ll see this taken care of with the following calculation:
$temp_dir = "/" & Substitute( Get( TemporaryPath ); Get( SystemDrive ); "" )
Get( SystemDrive ), on my system, returns /Aslan/. Note the slash on both sides. We use the Substitute() function to remove that entire string from the temporary path and then prepend a slash at the beginning, leaving /private/var/folders/02/mzl8bjw973713k0yn76z70n40000gn/T/S10/.
Create a couple of records and put a file in each and you’ll be ready to test the solution when everything else is in place.
Before working with the AWS API for PHP, you’ll of course need an AWS account and a bucket in S3, but you’ll also need to download the API library. If you follow the instructions provided by Amazon, which involve the use of git and curl on the command line, it should go smoothly. If you’re uninterested in executing the sample.php file that they provide, you can stop following the instructions after those for installing the SDK and dependencies. In that case, you can execute the following four commands within Terminal while in your project directory:
git clone https://github.com/awslabs/aws-php-sample.git
cd aws-php-sample/
curl -sS https://getcomposer.org/installer | php
php composer.phar install
Those four commands do quite a lot of work, cloning Amazon’s git repository for the sample PHP project, changing the working directory to the downloaded sample PHP project’s, executing the PHP script downloaded from getcomposer.org (a dependency manager for PHP), and executing the composer to get the actual API files.
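Before moving on, it can be worth confirming that the SDK is actually loadable. Here’s a quick sanity check of my own (not part of Amazon’s instructions; the filename is just a suggestion): if the autoloader can resolve the SDK’s S3 client class, the dependencies are in place.

<?php
// check_sdk.php (hypothetical filename) — run from within the
// aws-php-sample directory. If composer did its job, the autoloader
// will be able to find the SDK's S3 client class.
require 'vendor/autoload.php';
var_dump( class_exists( 'Aws\S3\S3Client' ) );  // expect bool(true)
?>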
It’s those API files that we’re interested in, which you’ll find in a vendor directory within the aws-php-sample directory. We need to zip that folder up so that it can be stored in a FileMaker container field. We’ll use the command line for this because compressing with the contextual menu in the Finder is resource-fork aware, and will add unneeded files (in a __MACOSX directory; see the question on SuperUser for sufficient details) to the archive, files that would be extracted and visible when the unzip command we’ll use later extracts the archive. It won’t break anything to use the Finder’s contextual menu to compress the files if you prefer, but as long as we’re already in the command line and in the appropriate directory, execute the following from the shell:
zip -r vendor vendor
This command instructs the shell to create a new zip archive called vendor.zip (the first vendor argument above) from the directory vendor (the second vendor argument), recursively including all the folders found within it (the -r flag). When it’s done, you’ll have a vendor.zip file in the aws-php-sample folder. Drag this to the g_aws_php_api_archive field in FileMaker.
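Incidentally, if you’d rather stay in PHP for this step as well, here’s a sketch of my own (not part of the original workflow, and assuming PHP’s zip extension is available) that builds an equivalent archive with the ZipArchive class:

<?php
// make_vendor_zip.php (hypothetical) — a PHP alternative to `zip -r vendor vendor`.
// Run from within the aws-php-sample directory.
$zip = new ZipArchive();
if ( $zip->open( 'vendor.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE ) !== true ) {
    exit( "Could not create vendor.zip\n" );
}

$files = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator( 'vendor', FilesystemIterator::SKIP_DOTS )
);
foreach ( $files as $file ) {
    // Store each file under its path relative to the working directory, so
    // the archive extracts to a vendor/ folder just as `zip -r` would make it.
    $zip->addFile( $file->getPathname(), $file->getPathname() );
}
$zip->close();
?>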
Our next step will be to write the PHP script that accepts some arguments and uses them and the AWS API to upload a file to S3.
<?php

// Exit status returned to the caller when a required argument is absent.
define("ERR_MISSING_ARGUMENTS", 1);

// Load the AWS SDK for PHP via composer's autoloader.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// The long options this script requires on the command line.
$req_args = array( "bucket", "key", "secret", "file", "directory" );

// getopt() expects long options that take a value to end with a colon.
$options = getopt( '', array_map( 'append_colon', $req_args ) );

// Confirm that every required argument was actually supplied.
$missing_args = array();
foreach ( $req_args as $req_arg ) {
    if ( !array_key_exists( $req_arg, $options ) ) {
        $missing_args[] = $req_arg;
    }
}

if ( count( $missing_args ) != 0 ) {
    echo "\nAt least one required argument is missing." .
        " Required arguments include 'bucket', 'key', 'secret', 'file' and 'directory'\n";
    exit( ERR_MISSING_ARGUMENTS );
}

// Convert the options into the variables $bucket, $key, $secret, $file and $directory.
extract( $options );

$filename = basename( $file );

// Authenticate to AWS, then upload the file and make it publicly readable.
$client = S3Client::factory( array(
    'key'    => $key,
    'secret' => $secret,
) );

$result = $client->putObject( array(
    'Bucket'     => $bucket,
    'Key'        => $directory . '/' . $filename,
    'SourceFile' => $file,
    'ACL'        => 'public-read'
) );

exit( 0 );

function append_colon( $str ) {
    return $str . ':';
}
?>
Save this file as upload_file_to_s3.php and try the following command in your favorite terminal emulator, substituting the appropriate bucket name, credentials, file path and destination directory:
php upload_file_to_s3.php --bucket="bucketname" --key="your_key" --secret="your_secret_key" --file="/Users/user/path/to/a/file.gif" --directory="destination_directory"
The script is very short and most of the code is just making sure the required arguments received by the script are present.
A few aspects of the script are (at least to me) interesting. $req_args is an array of strings, which are just the names of the arguments. This is used in two places: to convert the arguments to variables and to confirm that all the required arguments are present. The confirmation of their existence needs the strings to appear just as they are in the array, but the conversion to variables involves the getopt() function, which expects required arguments to have a colon appended. So that the single array can be used for both purposes, the array_map() function is used to create an array of colon-appended strings from the plain strings using the append_colon() function.
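Here’s a minimal standalone sketch of that mechanism (my own, not part of the solution); try running it with and without the options:

<?php
// getopt_demo.php (hypothetical) — try: php getopt_demo.php --bucket=b --key=k
$req_args = array( "bucket", "key", "secret" );

// array_map() appends the colon that tells getopt() the option takes
// a required value.
$longopts = array_map( function ( $str ) { return $str . ':'; }, $req_args );
print_r( $longopts );               // array( "bucket:", "key:", "secret:" )

// getopt() simply omits any option that wasn't supplied; it won't complain,
// which is why the upload script performs its own check.
print_r( getopt( '', $longopts ) );
?>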
Note that although the getopt() documentation specifies that required options be followed by a colon, I haven’t yet found how getopt() treats required options differently from optional ones, which is why the script above does its own work to ensure that the required options are present.
Also note the ERR_MISSING_ARGUMENTS constant, which is used as the exit value when there are missing arguments. This indicates to the caller (be it a manually entered command in the Terminal or, as will happen later, an AppleScript using the do shell script command) that there was a failure. Any command line tool that exits with a value other than zero is interpreted to have failed in some way, which is why at the end of the script, when we’ve succeeded in uploading the file, we exit with a return value of 0.
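If you’d like to see that status from another script, here’s a small sketch of my own using PHP’s exec(), whose third argument receives the called command’s exit status:

<?php
// exit_status_demo.php (hypothetical) — run it in the same directory as
// upload_file_to_s3.php. Calling the upload script with no arguments should
// produce an exit status of 1 (ERR_MISSING_ARGUMENTS).
exec( 'php upload_file_to_s3.php', $output, $status );
echo "exit status: $status\n";
?>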
Add the upload_file_to_s3.php file to the g_php_script container field in FileMaker.
FileMaker can’t execute PHP directly, so we use AppleScript as the glue between our FileMaker system and the PHP code. The AppleScript code will also be responsible for extracting the zipped PHP API files.
Create a new AppleScript with the following contents. Mine’s named execute_php_file.scpt and was created in Script Debugger, but using Apple’s included AppleScript Editor, or for that matter a text editor, will work fine. In fact, you don’t need this to exist in a separate file, as it’ll be stored within the Perform AppleScript script step in FileMaker, and you can wait to enter it directly there if you wish.
tell current record
set upload_file_name to cell "filename"
set php_script_filename to cell "g_php_script_filename"
set aws_php_api_archive_filename to cell "g_aws_php_api_archive_filename"
set bucket to cell "g_bucket"
set directory to cell "g_directory"
set the_key to cell "g_key"
set secret to cell "g_secret"
set temp_dir to cell "g_temp_dir_path"
end tell
set cd_cmd to "cd " & temp_dir & "; "
set cmd to cd_cmd & "unzip -o " & temp_dir & aws_php_api_archive_filename
do shell script cmd
set cmd to cd_cmd & "php " & temp_dir & php_script_filename & ¬
" --bucket='" & bucket & "'" & ¬
" --key='" & the_key & "'" & ¬
" --secret='" & secret & "'" & ¬
" --file='" & temp_dir & upload_file_name & "'" & ¬
" --directory='" & directory & "'"
do shell script cmd
Soon we’ll take the contents of this script and use it within a Perform AppleScript script step within FileMaker. Like the PHP script, this is very short. All it’s doing is getting information from FileMaker (which our FileMaker script will soon ensure is available) and performing two command line executions, one to unzip the PHP API archive and the other to run the PHP script.

Note that the AppleScript encloses each argument value within single quotes. This ensures that the values are correctly passed to the PHP script even if (for example) the file path has spaces anywhere in it.
Also note that each do shell script command begins with a cd command to ensure that the command that follows is executed within the correct working directory, in this case the temporary directory, which will be received from FileMaker.
Finally, in case you’re familiar with AppleScript but not with integrating it with FileMaker, you may have noticed that the above script is missing a tell application "FileMaker Pro" block. If you’re editing the above within AppleScript Editor, you may want to enclose the tell current record block within one for debugging purposes. When the script is within a Perform AppleScript script step, the default target for commands will be the FileMaker application running the script. In Script Debugger we have the option to set the default target, but other AppleScript editors may require that outer tell block.
Our PHP and AppleScript scripts are ready, so let’s return to FileMaker and put the final piece into place, the FileMaker script.
Create a new script in FileMaker called Upload File from Current Record with the following steps:
Allow User Abort[Off]
If[
( Get( AccountName ) ≠ appl.AuthorizedAccount )
and
( Get( AccountPrivilegeSetName ) ≠ appl.FullAccessPrivilegeSet )
]
Show Custom Dialog["Unauthorized Account"; "Unauthorized Account"; Default Button: Cancel]
Exit Script[Result: False]
End If
Set Variable[$?; Value:
Let(
[
$php_script_fm_filepath = appl.TempDirFilepath( Files::g_php_script_filename );
$aws_php_api_archive_fm_filepath = appl.TempDirFilepath( Files::g_aws_php_api_archive_filename );
$upload_fm_filepath = appl.TempDirFilepath( Files::filename );
$temp_dir = "/" & Substitute( Get( TemporaryPath ); Get( SystemDrive ); "" )
];
""
)
]
Set Field[Files::g_temp_dir_path; $temp_dir]
Set Field[Files::g_key; appl.S3AccessKeyID]
Set Field[Files::g_secret; appl.S3SecretAccessKey]
Export Field Contents[Files::g_php_script; $php_script_fm_filepath]
Export Field Contents[Files::g_aws_php_api_archive; $aws_php_api_archive_fm_filepath]
Export Field Contents[Files::file; $upload_fm_filepath]
Perform AppleScript[
tell current record
set upload_file_name to cell "filename"
set php_script_filename to cell "g_php_script_filename"
set aws_php_api_archive_filename to cell "g_aws_php_api_archive_filename"
set bucket to cell "g_bucket"
set directory to cell "g_directory"
set the_key to cell "g_key"
set secret to cell "g_secret"
set temp_dir to cell "g_temp_dir_path"
end tell
set cd_cmd to "cd " & temp_dir & "; "
set cmd to cd_cmd & "unzip -o " & temp_dir & aws_php_api_archive_filename
do shell script cmd
set cmd to cd_cmd & "php " & temp_dir & php_script_filename & ¬
" --bucket='" & bucket & "'" & ¬
" --key='" & the_key & "'" & ¬
" --secret='" & secret & "'" & ¬
" --file='" & temp_dir & upload_file_name & "'" & ¬
" --directory='" & directory & "'"
do shell script cmd
]
Set Field[Files::url;
"https://s3.amazonaws.com/" & Files::g_bucket & "/" & Files::g_directory & "/" &
Substitute(
Files::filename;
" "; "+"
)
]
Set Field[Files::g_key; ""]
Set Field[Files::g_secret; ""]
Exit Script[Result: True]
Like the AppleScript and PHP scripts, this one is very straightforward. After ensuring that the current user has the ability to continue using the procedure, it calculates a number of variables, all dealing with file and directory paths needed by the PHP script (and therefore by the AppleScript). After storing this information into the global fields, it exports the three files to the temporary directory and executes the AppleScript. Finally, it sets the URL field to what it should be given the file name and upload directory, and then clears out the sensitive information before exiting.
Note the necessity of disallowing user aborts and of clearing out the Amazon login information, in order to protect, as far as possible, the Amazon credentials.
Once you have duplicated my efforts as described above, filled in a bucket and upload directory, created a file record in FileMaker and placed a file in the file container field, and supplied your AWS credentials in the custom functions, you can run the FileMaker script and it should successfully upload the file. But I don’t consider this a perfect solution by any means.
What’s provided does conform to my client’s requirements, and they were able to use the technique to integrate the solution into their existing system, but I find quite a few deficiencies to the solution as it currently exists.
- It would be better to get the URL of the uploaded file from the result of the putObject() method in PHP and use that instead of simply calculating it (a sketch of this appears below).
- I keep a custom function elsewhere that takes the result of Get( TemporaryPath ) and converts it to a string formatted for FileMaker’s Export Field Contents script step. But I didn’t use it here because it’s much more complicated and makes use of other functions. The complexity isn’t required here because the solution already requires OS X to work (because we’re using AppleScript). But the single custom function I used here in place of that, appl.TempDirFilepath, won’t take advantage of later improvements to the code I keep elsewhere unless it’s using that code to begin with.

I plan to address these issues in a future article, but in the meantime I hope you’ve found the solution thus far helpful.
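As a postscript, here’s a sketch of how the first item might be addressed. My understanding is that version 2 of the AWS SDK for PHP populates an ObjectURL entry on the model returned by putObject(), so a variant of the upload script (hypothetical, with placeholder values taken from the test command above) could echo the reported URL for the caller to capture:

<?php
// upload_and_report_url.php (hypothetical sketch): the same upload as
// upload_file_to_s3.php, but echoing the URL that S3 reports for the
// uploaded object, so the AppleScript's "do shell script" result could
// capture it instead of FileMaker recalculating it.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$client = S3Client::factory( array(
    'key'    => 'your_access_key_id',      // placeholder credentials
    'secret' => 'your_secret_access_key',
) );

$result = $client->putObject( array(
    'Bucket'     => 'bucketname',                      // placeholder bucket
    'Key'        => 'destination_directory/file.gif',  // placeholder key
    'SourceFile' => '/Users/user/path/to/a/file.gif',  // placeholder file
    'ACL'        => 'public-read'
) );

echo $result['ObjectURL'] . "\n";
exit( 0 );
?>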