Lethbridge, AB, Canada

Job Description

In my first professional job out of university, I joined Granduke Geomatics in Lethbridge just as it was being acquired by Farmers Edge. On the Cloud team, we were responsible for building several backend services, the most prominent of which was our data ingestion service: a Django REST API that received binary telemetry data from farm vehicle equipment. Equipment would POST the data to our API, which stored it in Postgres and cloud storage; Celery workers, using RabbitMQ as their message broker, then parsed the binary data into JSON and inserted it into Elasticsearch.
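A minimal sketch of what that flow looks like, assuming a configured Django project; every name here (TelemetryUpload, parse_telemetry, decode_binary, the broker and Elasticsearch hosts) is illustrative rather than the actual Farmers Edge code:

```python
# Sketch of the ingestion flow: a DRF view accepts a binary payload,
# persists it, and defers parsing to a Celery worker over RabbitMQ.
from celery import Celery
from django.core.files.base import ContentFile
from django.db import models
from elasticsearch import Elasticsearch, helpers
from rest_framework.parsers import FileUploadParser
from rest_framework.response import Response
from rest_framework.views import APIView

app = Celery("ingest", broker="amqp://guest@rabbitmq//")  # RabbitMQ as message broker
es = Elasticsearch("http://elasticsearch:9200")

class TelemetryUpload(models.Model):
    device_id = models.CharField(max_length=64)
    raw_file = models.FileField(upload_to="telemetry/")  # backed by cloud storage
    received_at = models.DateTimeField(auto_now_add=True)

def decode_binary(blob: bytes) -> list[dict]:
    """Stand-in for the format-specific decoder that turns raw bytes into JSON records."""
    raise NotImplementedError

class TelemetryUploadView(APIView):
    parser_classes = [FileUploadParser]

    def post(self, request, device_id, format=None):
        # Store the raw bytes first, so nothing is lost if parsing fails later
        upload = TelemetryUpload.objects.create(device_id=device_id)
        upload.raw_file.save(f"{upload.pk}.bin", ContentFile(request.data["file"].read()))
        parse_telemetry.delay(upload.pk)  # parsing happens off the request path
        return Response(status=202)

@app.task
def parse_telemetry(upload_id):
    # Worker side: decode the stored payload and bulk-index the records
    upload = TelemetryUpload.objects.get(pk=upload_id)
    records = decode_binary(upload.raw_file.read())
    helpers.bulk(es, ({"_index": "telemetry", "_source": r} for r in records))
```

Deferring the parse to a worker keeps the API responsive no matter how slow or bursty the binary decoding is, which is the main reason for splitting the pipeline this way.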

During my time on the team, I was also the primary developer of the backend for an Alerts system. We used the tools we already knew (Django REST, Celery, RabbitMQ) to build a system that drank from the firehose of our parsed data and evaluated it against user-provided rules, generating alerts on a match. The rules were sort of a cross between email rules and iOS Reminders: they could match on any combination of geofence boundaries, date/time, device, or any attribute value parsed out of the raw data. A match generated an alert, which was then pushed to the user. The system also had a configurable way to map input and output streams, which made integrations easy.
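A toy sketch of the matching core, with made-up names and a crude bounding-box geofence; the real system ran this kind of logic inside Celery tasks consuming the parsed-data stream, with proper polygon geofences:

```python
from dataclasses import dataclass, field
from datetime import datetime, time
from typing import Callable, Optional

Reading = dict  # one parsed telemetry record, e.g. {"device": ..., "lat": ..., "speed_kph": ...}

@dataclass
class Rule:
    """A rule matches on any combination of device, time window, geofence, and attributes."""
    name: str
    device: Optional[str] = None
    time_window: Optional[tuple[time, time]] = None
    geofence: Optional[Callable[[float, float], bool]] = None
    attribute_checks: dict[str, Callable[[object], bool]] = field(default_factory=dict)

    def matches(self, reading: Reading, now: datetime) -> bool:
        # Unset criteria are skipped; set criteria must all pass
        if self.device and reading.get("device") != self.device:
            return False
        if self.time_window and not (self.time_window[0] <= now.time() <= self.time_window[1]):
            return False
        if self.geofence and not self.geofence(reading["lat"], reading["lon"]):
            return False
        return all(check(reading.get(attr)) for attr, check in self.attribute_checks.items())

def evaluate(reading: Reading, rules: list[Rule], now: datetime) -> list[str]:
    # Each reading from the stream is checked against every rule;
    # every match becomes an alert pushed to the rule's owner.
    return [rule.name for rule in rules if rule.matches(reading, now)]

# Example: alert when a device exceeds 30 km/h inside a field boundary
speed_rule = Rule(
    name="Speeding in Field 7",
    geofence=lambda lat, lon: 49.6 < lat < 49.7 and -112.9 < lon < -112.8,
    attribute_checks={"speed_kph": lambda v: v is not None and v > 30},
)
reading = {"device": "tractor-12", "lat": 49.65, "lon": -112.85, "speed_kph": 34}
print(evaluate(reading, [speed_rule], datetime.now()))  # -> ['Speeding in Field 7']
```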

Another smaller project in a similar vein was a service that replayed the data of selected devices in a loop, for demonstration and testing purposes. For a long time it reused the same technologies, but it was eventually migrated to a cloud function.
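The core of a replay service like this is small, which is why it fit so naturally into a cloud function later. A hedged sketch, assuming captured payloads are available as raw bytes and the ingestion endpoint URL shown is hypothetical:

```python
import time
import requests

INGEST_URL = "https://api.example.com/telemetry/{device_id}"  # hypothetical ingestion endpoint

def replay_forever(device_id: str, payloads: list[bytes], interval_s: float = 5.0) -> None:
    # Re-POST captured binary payloads in a loop, so demos and tests
    # see a steady, realistic stream of device data.
    while True:
        for blob in payloads:
            requests.post(
                INGEST_URL.format(device_id=device_id),
                data=blob,
                headers={"Content-Type": "application/octet-stream"},
            )
            time.sleep(interval_s)
```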