- Global Brand
- Strategic Thinking Organisation
- Sydney location
- WFH flexibility
Our client is part of a global Group and is the largest provider of Television Audience Measurement services in the world.
Key experience required:
- Superior data processing, modelling and mining experience
- Experience with the pandas Python library for data analysis
- AWS experience
Key responsibilities:
- Developing an AWS S3 big data solution for combined television and on-demand video measurement across the Australian population.
- Building the ETL process that combines television and on-demand video viewing data into the Combined Item Master File (CIMF).
- Delivering a big data analytics solution using AWS Athena.
- Developing and managing the Apache Airflow DAG for daily CIMF data processing.
- Managing crawler services that create data catalogs across different production accounts.
- Developing event-driven services on AWS Lambda to connect the microservices architecture.
- Developing resilient, fault-tolerant application workflows for VOZ using AWS SQS and SNS.
- Working in an agile team on VOZ services, using project management tools such as Jira, Confluence and Bitbucket.
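The core of the ETL responsibility above — combining television and on-demand viewing data into a single combined file — can be sketched in miniature with pandas. All column names and values below are illustrative assumptions, not the client's actual CIMF schema:

```python
import pandas as pd

# Illustrative only: toy stand-ins for TV and on-demand viewing feeds.
tv = pd.DataFrame({
    "panelist_id": [1, 2, 3],
    "program": ["News", "Drama", "Sport"],
    "minutes_viewed": [30, 45, 60],
})
vod = pd.DataFrame({
    "panelist_id": [2, 3, 4],
    "program": ["Drama", "Sport", "Film"],
    "minutes_viewed": [20, 15, 90],
})

# Tag each row with its origin, then stack the two sources
# into one combined table (the spirit of a combined item file).
tv["source"] = "tv"
vod["source"] = "vod"
cimf = pd.concat([tv, vod], ignore_index=True)

# Total viewing per panelist across both sources.
totals = cimf.groupby("panelist_id")["minutes_viewed"].sum()
```

In production this kind of merge would run as a daily Airflow DAG task reading from and writing to S3, rather than in-memory frames.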
To submit your interest for this role, click on the Apply button quoting reference #5074
Please note suitable applicants will be contacted within 2 business days.
Due to the high volume of applications, we will do our best to contact all applicants and apologise for any delay in this process. Please contact [email protected] with any application enquiry.