
I am a motivated learner with a strong interest in computer science and engineering. I thrive on learning new technologies, keeping up to date, and applying them to solve problems at work. I focus on developing applications that are scalable, distributed, resilient, and aligned with the principles of the Reactive Manifesto, and I believe combining Typed Functional Programming with the reactive platform is the right way to build applications that meet those requirements. Great products are built not just with great technical tools but with emotional intelligence and great teams. I am a strong advocate of Typed Functional Programming, as it is essential for building applications that are easy to reason about, maintainable, and highly scalable. I currently work as a principal solutions architect and developer.
I provide technical expertise on:
I help companies get software into production in a sustainable manner. I work remotely and on-site in the Toronto area. I currently serve as a principal solutions architect and team lead for a large US-based derivatives and insurance firm.
Building streaming systems and microservices with Functional Scala (Cats, Cats Effect, FS2 Streams, http4s, Doobie) and Kafka (Kafka Streams, Kafka Connect, etc.), and deploying services to AWS ECS and Fargate using Terraform. I also advise clients on technical architecture and train teams in purely Typed Functional Programming and Scala. I work fully remotely.
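As a minimal sketch of the kind of service described above, the following wires an http4s route into a Cats Effect application using the Ember server. The port, route path, and object names are illustrative assumptions, not taken from any real project:

```scala
import cats.effect.{IO, IOApp}
import com.comcast.ip4s._
import org.http4s.HttpRoutes
import org.http4s.dsl.io._
import org.http4s.implicits._
import org.http4s.ember.server.EmberServerBuilder

// A tiny Cats Effect + http4s service: one health-check endpoint,
// run as a resource so the server shuts down cleanly.
object HealthService extends IOApp.Simple {

  // Hypothetical route; real services would compose many of these.
  val routes: HttpRoutes[IO] = HttpRoutes.of[IO] {
    case GET -> Root / "health" => Ok("ok")
  }

  val run: IO[Unit] =
    EmberServerBuilder
      .default[IO]
      .withHost(ipv4"0.0.0.0")
      .withPort(port"8080")          // port chosen for illustration
      .withHttpApp(routes.orNotFound)
      .build
      .useForever
}
```

The `Resource`-based builder is what makes this style composable: database pools (Doobie) and Kafka consumers (FS2 Kafka) acquire and release the same way, so the whole service is one composed resource.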
Responsible for data pipelines, automation, engineering, architecture, databases and cloud infrastructure
Built out the next iteration of the big data ETL pipeline using Spark and Scala, applying functional data engineering techniques such as resilient partitioned tables, idempotent jobs, and dimensional snapshots, and using data lineage and data-quality metrics to measure data quality, quickly pinpoint data problems, and lay the foundation for machine learning
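The idempotent-job technique mentioned above can be sketched as follows: a daily run that, when re-executed for the same date, overwrites only that date's partition rather than appending duplicates. This is a hedged illustration, not the actual pipeline; the paths, column name, and object name are hypothetical, and it assumes Spark's dynamic partition-overwrite mode:

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

// Idempotent daily ETL sketch: re-running for the same runDate replaces
// exactly one partition of the resilient partitioned output table.
object DailyEtlJob {
  def run(spark: SparkSession, runDate: String): Unit = {
    // Only partitions present in the written data are replaced.
    spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")

    spark.read
      .parquet("s3://example-bucket/raw/events")      // illustrative input path
      .where(s"event_date = '$runDate'")              // one day's slice
      .write
      .mode(SaveMode.Overwrite)                       // overwrites only the matching partition
      .partitionBy("event_date")
      .parquet("s3://example-bucket/warehouse/events") // illustrative output path
  }
}
```

Because a re-run replaces the same partition with the same derived data, the job is safe to retry or backfill, which is the property that makes downstream data-quality checks and lineage tracking trustworthy.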
Created, maintained, and optimized core algorithms in the geospatial domain that process terabytes of data each day and deliver key insights to downstream clients
Technologies: Apache Spark, Functional Scala (Scalaz, Cats, Monix, FS2 Streams, Akka, Akka Streams, Akka Cluster), Python (for Airflow), SQL, JavaScript (Node.js), Databricks, AWS, Jenkins, Kong, PostgreSQL, MySQL, DynamoDB, Kinesis