Data Engineer
Client Background:
A BPO wing offering a spectrum of services, from data analysis, customer service, and risk
management to administration, IT support, human resources, marketing, and beyond.
Location: KL
Tenure: Permanent role
Salary: Basic Salary
Responsibilities:
Data Pipeline Development:
• Design, develop, and maintain efficient ETL pipelines, ensuring the extraction, transformation, and loading of data from various sources into our data warehouse.
• Implement data integration solutions for diverse business units, enhancing data accessibility for analytical purposes.
• Optimize and monitor data pipeline performance to ensure accurate and timely data processing.
Data Architecture:
• Design and implement a resilient data architecture aligning with the organization's data strategy and overarching goals.
• Evaluate and select appropriate technologies for data storage, processing, and retrieval, considering performance, scalability, and cost.
Data Quality and Governance:
• Implement data quality checks and validation processes to maintain the accuracy and consistency of data throughout the pipeline.
• Collaborate on establishing and enforcing data governance policies, including data lineage, metadata management, and access controls.
Database Management:
• Administer and maintain data storage solutions, ensuring optimal performance, security, and availability.
• Monitor and troubleshoot data-related issues, aiming for quick resolution and minimal downtime.
Collaboration and Documentation:
• Collaborate with cross-functional teams, including data analysts, software engineers, and business stakeholders, to comprehend data requirements and deliver effective solutions.
• Document data processes, workflows, and data dictionaries to facilitate knowledge sharing and team member onboarding.
Continuous Improvement:
• Stay abreast of emerging data engineering technologies, best practices, and industry trends, integrating them into the existing data infrastructure where applicable.
• Identify opportunities to optimize and enhance existing data pipelines and processes, fostering efficiency and scalability.
Requirements:
• Bachelor's or higher degree in Computer Science, Engineering, or a related field.
• Proven experience in data engineering roles, with a focus on designing and building ETL pipelines and cloud data architectures.
• Proficiency in programming languages such as Python, SQL, Java, or Scala.
• Strong experience with data integration tools, ETL frameworks, and workflow management tools (e.g., Apache Spark, Apache Airflow, Talend).
• Solid understanding of relational databases, data modeling, and query optimization.
• Experience with cloud-based data platforms (e.g., AWS, Azure, Google Cloud) and related services (e.g., S3, Redshift, BigQuery) is a plus.
• Familiarity with version control systems and agile development methodologies.
• Excellent problem-solving skills and attention to detail.
• Effective communication skills to collaborate with technical and non-technical stakeholders.
• Proficiency in Mandarin is mandatory.
Sub Specialization: Information Technology; Others
Type of Employment: Permanent
Minimum Experience: 2 Years
Work Location: Kuala Lumpur
EPS Malaysia - Recruitment & Outsourcing Agency