Job description
- 6 month initial contract
- Perth based; hybrid - 3 days onsite
- Multiple roles (x3)
- AWS Toolset
Tasks to be performed on a daily basis:
The Principal Data Engineer shall be responsible for the building, testing, and operation of an enterprise data platform.
The engineer shall ensure that the platform is developed and deployed with a secure-by-design approach, delivering robust, thoroughly tested, and maintainable solutions. Moreover, the engineer shall provide support, security reviews, and technical reviews for delivery data engineers working on the platform.
A data engineer with expertise in the AWS toolset advises on, develops, and maintains data engineering solutions in the AWS Cloud ecosystem. They design, build, and operate batch and real-time data pipelines using AWS services such as Amazon EMR, AWS Glue, the Glue Data Catalog, and Amazon Kinesis. Additionally, they create data layers on Amazon Redshift, Aurora, and DynamoDB.
The data engineer also migrates data using AWS DMS and is proficient with various AWS data platform components, including S3, Redshift, Redshift Spectrum, AWS Glue with Spark and Python, Lambda functions with Python, the AWS Glue Data Catalog, and AWS Glue DataBrew. They are experienced in developing batch and real-time data pipelines for data warehouses and data lakes using Amazon Kinesis and Amazon Managed Streaming for Apache Kafka (MSK). They are also proficient with open-source technologies such as Apache Airflow, dbt, and Spark with Python or Scala on the AWS platform.
The data engineer schedules and manages data services on the AWS Platform, ensuring seamless integration and operation of data engineering solutions.
Mandatory Requirements:
- Development experience with Go, SQL, C#, .NET, JavaScript, shell scripting, and container platforms such as Docker.
- The engineer shall have experience integrating with time-series source systems such as the Honeywell Plant Historian Database and OSIsoft PI.
- The engineer shall have extensive experience with authentication mechanisms, including but not limited to OAuth 2.0, OIDC, Microsoft Entra ID, key-pair authentication, certificate-based authentication, and SAML-based SSO.
- Use of AWS services for cloud-based platform tooling and infrastructure, including but not limited to: Lambda, ECS, MSK, RDS, EC2, Secrets Manager, ALB, CloudWatch, and EventBridge.
Required Professional and Technical Expertise:
- Development of standard patterns for data pipelines and workflows using StreamSets, Kestra, dbt, and Git.
- APIs / Integration for data access and manipulation.
- Development of APIs and web applications to support the management of the platform (security, sandbox creation, etc.).
- Experience in Terraform for AWS and Azure deployments.
- Python scripting for common data processing and automation tasks.
- CLI development to automate common platform tasks.
- Design of data storage and processing solutions using Snowflake.
- Quality assurance and testing.
Desired Attributes and Skills:
- Provide technical support to delivery data engineers working on the platform
- Conduct security reviews for data engineering projects to ensure compliance with standards.
- Offer technical reviews and feedback to enhance the quality of deliverables.
- Testing and Quality Assurance experience
To be considered for the role, click the 'apply' button. For more information about this and other opportunities, please contact Gizaille Shaheen on 03 86804305 or email [email protected], quoting the above job reference number.
