
Job Description:
Develop technical direction, task priorities, and guidelines for data pipeline development.
Ensure data quality and integrity through data cleansing and validation processes.
Collaborate with the Data Science and Analyst teams to provide ready-to-use data.
Optimize data storage and processing systems to enhance performance.
Requirements:
Must Have:
Minimum Diploma (D3) or Bachelor’s Degree (S1) in Computer Science, Information Systems, or a related field.
Minimum of 3–5 years of experience as a Data Engineer, building and managing data pipelines and optimizing ETL processes.
Experienced in building reliable data pipelines for data extraction, transformation, and loading using tools such as Apache Airflow, Talend, dbt, Airbyte, or Spark.
Strong understanding of SQL (MySQL, PostgreSQL, ClickHouse) and NoSQL (MongoDB) databases for data storage and management.
Proficient in programming languages such as Python, Scala, or Java for data manipulation, pipeline development, and data processing.
Ability to optimize pipelines to ensure efficient and reliable data processing.
Nice to Have:
Knowledge of Kafka and Flink for real-time data processing.
Familiarity with data visualization tools such as Tableau, Power BI, or Looker.
Ability to write automation scripts to support data processing workflows.
Proficiency with cloud platforms (AWS, Azure, GCP) and data services such as S3, BigQuery, Snowflake, or Redshift for data management.
Understanding of encryption, access control, and security policies to protect sensitive data.
Ability to create comprehensive documentation of pipelines, data structures, and tool usage to support team collaboration.