Analyze and interpret complex data on target systems, resolve data issues, and coordinate with data analysts to validate requirements; conduct interviews with users and developers
Test and validate data flows, prepare ETL processes according to business requirements, and incorporate those requirements into design specifications
Develop data warehouse models, prepare reports for metadata integration into systems, draft ETL scripts, and prepare the required reports for end users
Collaborate with developers and business users to gather required data, execute ETL programs and scripts on systems, implement data warehouse activities, and report on them
Develop and test ETL code for system data; analyze data and design data mapping techniques for system data models
Document test procedures for systems and processes; coordinate with business analysts and users to resolve requirement issues and maintain quality throughout
Bachelor's degree from a reputable university in Computer Science, Information Technology, Science & Technology, Business Studies/Management, or an equivalent field
Proven work experience as a Data Engineer/ETL Developer
Understanding of ETL concepts, automation scheduling, and data pipelines
At least 3 years of working experience as a Data Engineer/ETL Developer, with relevant experience in Data Mart design or in IBM InfoSphere Change Data Capture and Kafka implementation
Experienced with data engineering tools such as Spark, Hadoop, Talend, NoSQL, Pentaho, Informatica, Airflow, Presto, Hive, Kafka, IBM DataStage, and/or Microsoft SSIS
Good understanding of data modelling and big data stacks
Good understanding of programming concepts, especially Python and SQL
Familiarity with cloud big data environments such as AWS (EMR, Glue, Athena, Redshift), GCP (BigQuery, Dataflow, Dataproc), and Azure (Azure Synapse Analytics, Databricks, ADLS Gen 2)
Familiar with management tools (Jira, Git, Trello)
Proficiency in software engineering best practices such as clean/maintainable code, Git, code review, unit/integration testing, Infrastructure as Code, and Continuous Integration/Continuous Delivery.
DevOps/DataOps skills are a plus.
Excellent communication and interpersonal skills, with a good problem-solving attitude
Good active command of both English and Bahasa Indonesia