okuye

Mentor
Rising Codementor
US$15.00 per 15 minutes
ABOUT ME

As a seasoned software engineer with a strong background in multiple programming languages and development frameworks, I have a proven track record of delivering high-quality software solutions. My experience includes developing REST APIs with Scala, the Play Framework, and Akka, working with libraries such as Cats, Tapir, and FS2, and writing test cases with ScalaTest while coordinating end-to-end testing with other teams.

My career aspirations include continuing to contribute to the development and enhancement of products, working in cross-functional teams to write and test production-quality code, and ensuring that new and updated digital services are thoroughly tested for accessibility and can be maintained and improved over the long term. I am excited about the prospect of working with a talented team of software developers and collaborating with stakeholders to gather, analyse, and validate requirements for changes to business processes, policies, and information systems.
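
For illustration, here is a minimal sketch of the kind of declarative REST endpoint description that Tapir enables; the route, payload, and names are hypothetical, invented for this example rather than taken from any project mentioned above.

import sttp.tapir._
import sttp.tapir.generic.auto._
import sttp.tapir.json.circe._
import io.circe.generic.auto._

// Hypothetical response payload, defined only for this sketch.
case class Greeting(message: String)

object HelloApi {
  // GET /hello/{name}, returning JSON on success and a plain-text error otherwise.
  val hello: PublicEndpoint[String, String, Greeting, Any] =
    endpoint.get
      .in("hello" / path[String]("name"))
      .errorOut(stringBody)
      .out(jsonBody[Greeting])
}

Because the description is plain data, the same value can then be interpreted as a server route (for Play or Akka HTTP, among others), an sttp client, or OpenAPI documentation.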

London (+00:00)
Joined November 2018
EXPERTISE
Play Framework and Akka: 8 years experience

REVIEWS FROM CLIENTS

okuye's profile has been carefully vetted and approved as a Codementor. Connect with okuye now, and leave a review for them once you're done!
SOCIAL PRESENCE
GitHub
pre-interview: A checkout system for a shop which only sells apples and oranges (HTML; sketched below)
AkkaProject: Test Repo
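
The pre-interview repository names the classic apples-and-oranges checkout kata. As a sketch only (not the repository's actual code), a typical Scala solution with the offers this kata usually carries, buy-one-get-one-free apples and three-for-two oranges, might look like this; the prices and offers are assumptions:

object Checkout {
  // Unit prices in pence; the values are assumed for the sketch.
  private val prices: Map[String, Int] = Map("apple" -> 60, "orange" -> 25)

  // Total cost in pence for the scanned items, applying the usual kata offers:
  // apples are buy-one-get-one-free, oranges are three-for-the-price-of-two.
  def total(items: List[String]): Int = {
    val counts  = items.groupBy(identity).view.mapValues(_.size).toMap
    val apples  = counts.getOrElse("apple", 0)
    val oranges = counts.getOrElse("orange", 0)
    (apples / 2 + apples % 2) * prices("apple") +
      (oranges - oranges / 3) * prices("orange")
  }
}

// Example: Checkout.total(List("apple", "apple", "orange", "orange", "orange"))
// yields 60 + 50 = 110 pence.
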
EMPLOYMENTS
Senior Data Engineer
WEJO
July 2018 – Present

• Engineering the company's data platforms for scale, performance, reliability, and security.
• Working with other members of the Data Engineering team to design and build big data streaming capabilities using Hadoop, Spark, and Kafka (see the sketch after this list).
• Working with product owners and business analysts to analyse business requirements, design and implement data processing pipelines and the associated data and database structures, and fine-tune performance to meet those requirements.
• Reviewing new external data sets and open data sources to understand their potential usage.
• Working with the Infrastructure and DevOps teams to release and maintain live products.
• Designing, implementing, and testing all data processing systems.
• Participating in establishing processes and best practices around development standards, version control, quality control, deployment, maintenance, and change management.
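
A minimal sketch of the kind of Spark Structured Streaming job the second bullet describes, reading from Kafka; the topic name, broker address, and console sink are hypothetical placeholders, not details of the actual platform.

import org.apache.spark.sql.SparkSession

object VehicleEventsStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("vehicle-events-stream")
      .getOrCreate()

    // Read a stream of records from a Kafka topic (names are placeholders).
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker-1:9092")
      .option("subscribe", "vehicle-events")
      .load()

    // Kafka values arrive as bytes; cast to string before parsing downstream.
    val values = raw.selectExpr("CAST(value AS STRING) AS json")

    // A console sink is used here purely for illustration.
    values.writeStream
      .format("console")
      .option("checkpointLocation", "/tmp/checkpoints/vehicle-events")
      .start()
      .awaitTermination()
  }
}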

Java
Scala
MongoDB
Apache Spark
Apache Kafka
Scala Developer
Shop Direct Group
January 2018 – June 2018

• Spark/Scala coding, unit testing, and system testing (see the test sketch after this list)
• Agile backlog grooming
• Defining acceptance criteria for QA and business analysts
• Writing technical design documentation (high- and low-level) as required
• Liaising with the QA team to ensure that the documentation is fit for purpose
• Working with the system team to perform load, performance, and destructive testing
• Developing a CI/CD pipeline for production and pre-production environments
• Working primarily with AWS and leveraging Hadoop, Kafka, and Cassandra
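
As a sketch of the unit-testing style the first bullet mentions, here is a ScalaTest spec for a pure transformation function; the function under test and its behaviour are invented for the example, kept pure so it can be exercised without a SparkSession.

import org.scalatest.flatspec.AnyFlatSpec
import org.scalatest.matchers.should.Matchers

// Hypothetical transformation, defined only for this sketch.
object OrderTransforms {
  def normaliseSku(sku: String): String = sku.trim.toUpperCase
}

class OrderTransformsSpec extends AnyFlatSpec with Matchers {
  "normaliseSku" should "trim whitespace and upper-case the SKU" in {
    OrderTransforms.normaliseSku("  ab-123 ") shouldBe "AB-123"
  }

  it should "leave an already-normalised SKU unchanged" in {
    OrderTransforms.normaliseSku("AB-123") shouldBe "AB-123"
  }
}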

Scala
Oracle
Cassandra
Akka
Apache Spark
Play Framework
Architect/Analyst Developer
Cap Gemini HMRC EDH Project
December 2014 – May 2017

Designed and developed a web application using Django which parsed and executed Solr queries based on parameters chosen at runtime.

Used Sqoop and Flume to move data through the various landing stages of a Hadoop ecosystem once it had been through a shell-script cleansing process, placing Hive tables over the data where required.

Developed MapReduce programs in Java to parse raw data, and used Morphlines to perform ETL operations on data before indexing it for Solr (a hedged Spark-based sketch of this style of ETL follows below).

Used Pentaho PDI and Informatica to transform data.

Used Spark and Scala, following TDD methods, for program development and data analysis.

Used AWS as an environment for proof-of-concept development and deployments.
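
A minimal sketch, in Spark/Scala rather than MapReduce, of the parse-cleanse-land style of ETL described above; the input path, column names, and target table are assumptions for illustration only.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object RawRecordEtl {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("raw-record-etl")
      .enableHiveSupport()
      .getOrCreate()

    // Read raw delimited records from HDFS (path and layout are assumed).
    val raw = spark.read
      .option("header", "true")
      .csv("hdfs:///landing/raw/records")

    // Cleanse: trim the id column and drop rows missing a record id.
    val cleansed = raw
      .withColumn("record_id", trim(col("record_id")))
      .filter(col("record_id").isNotNull && col("record_id") =!= "")

    // Land the cleansed data as a Hive table for downstream use.
    cleansed.write.mode("overwrite").saveAsTable("staging.cleansed_records")
  }
}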

Scala
MySQL
MongoDB
Oracle
IBM DB2
Apache Spark
Microsoft SQL Server
Pentaho
Apache Hadoop