# Preflight And Postflight Scripts
pinpoint, as of v0.0.5, supports running `preflight` and `postflight` scripts. The `preflight` script runs before pinpoint does the majority of its lookup processing, although scenarios exist where your `preflight` script will never run. The `postflight` script runs after a lookup has completed. Both scripts are completely optional.
Both `preflight` and `postflight` must live in the root of your `CacheDir` directory (default: `/Library/Application Support/pinpoint`).
A `preflight` script must be named `preflight`, and a `postflight` script must be named `postflight`, in both cases with no file extension. Each script must be marked executable and owned by `root`.
- If either script returns an exit code other than 0, this is logged to pinpoint's standard logging.
- If either script takes longer than 10 seconds to complete, pinpoint will kill the thread and continue the run.
- If either script fails for any reason, pinpoint will continue the run.
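The naming and permission requirements above can be sketched as follows. A temporary directory stands in for pinpoint's CacheDir so the commands can run anywhere; in production the target is `/Library/Application Support/pinpoint` and the script must additionally be owned by `root`:

```shell
# Stand-in for pinpoint's CacheDir, for demonstration purposes only.
CACHE_DIR="$(mktemp -d)"

# The script must be named exactly "postflight" with no extension.
# Real scripts must also finish within 10 seconds or pinpoint kills them.
printf '#!/bin/sh\nexit 0\n' > "$CACHE_DIR/postflight"

chmod 755 "$CACHE_DIR/postflight"   # must be marked executable
# In production, additionally: sudo chown root "$CACHE_DIR/postflight"

"$CACHE_DIR/postflight" && echo "postflight ran cleanly"
```

A non-zero exit status here would be logged by pinpoint but, per the rules above, would not stop the run.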
The most common usage for pinpoint's `preflight` and `postflight` scripts is to upload client data to a reporting system. A sample script that uploads to an Amazon S3 bucket is provided below:
NOTE: the script below uses external Python modules, and these modules must be installed on your client machines. Running the `pip install` command on a fleet of computers is not a recommended method for mass deployment; instead, the following Makefile can assist in creating a deployable package: python-tinys3, while a pre-built package can be downloaded here.
```python
#!/usr/bin/python
# encoding: utf-8
# This script requires the tinys3 module which has a dependency on requests.
# You can install both with:
#     sudo pip install tinys3
#
# Author: Clayton Burlison <https://clburlison.com>
# Created: April 25th, 2016
import subprocess

import tinys3

S3_ACCESS_KEY = ''
S3_SECRET_KEY = ''
ENDPOINT = 's3.amazonaws.com'  # this is the US East standard endpoint
DEFAULT_BUCKET = 'my_super_awesome_bucket'
LOG_FILE = '/Library/Application Support/pinpoint/location.plist'


def get_serialnumber():
    '''Returns the serial number of the Mac'''
    cmd = "/usr/sbin/ioreg -c \"IOPlatformExpertDevice\" | awk -F '\"' \
'/IOPlatformSerialNumber/ {print $4}'"
    proc = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE,
                            stderr=subprocess.PIPE)
    (out, err) = proc.communicate()
    return out.strip()


conn = tinys3.Connection(S3_ACCESS_KEY,
                         S3_SECRET_KEY,
                         tls=True,
                         endpoint=ENDPOINT,
                         default_bucket=DEFAULT_BUCKET)

f = open(LOG_FILE, 'rb')
try:
    SERIAL = get_serialnumber()
    conn.upload(SERIAL + '.plist', f,
                headers={
                    'x-amz-storage-class': 'REDUCED_REDUNDANCY'
                },
                public=False)
except Exception, e:
    print('We ran into an error. %s' % e)
```
Packaging the `preflight` or `postflight` script can be done with the Makefile located in the examples folder. The process would look like:

1. Clone this project:

   ```
   git clone https://github.com/clburlison/pinpoint.git pinpoint && cd $_/examples/pinpoint-script
   ```

2. Create a working directory file structure:

   ```
   make setup
   ```

3. Modify the `preflight` and/or `postflight` script(s) inside of the pkgroot folder. If you don't plan on using one of the scripts, delete it.

4. After all changes are made, test your script locally.

5. Create the package:

   ```
   make pkg
   ```
pinpoint is licensed by Clayton Burlison under The MIT License (MIT)