Date Posted:  5 May 2025

Senior Executive - Data & Analytics

Job Category:  Information Technology
Regular/Temporary:  Permanent
Work Company:  Gamuda Land (T12) Sdn Bhd
Location:  Petaling Jaya, Selangor, MY, 47820

Job Summary

Design, build, and maintain scalable data pipelines that support the business’s reporting, analytics, and operational needs. This role focuses on ingesting raw data from diverse sources, transforming it into clean, structured datasets, and making it accessible to stakeholders across the company. Work extensively with cloud technologies on Google Cloud Platform (GCP), specifically BigQuery for data warehousing, Cloud Composer (Apache Airflow) for workflow orchestration, and Cloud Storage for data lake storage.

Key Responsibilities

  • Gain a thorough understanding of company operations and business terminology, with a specific focus on real estate-related activities.
  • Accurately understand business requirements and translate them into clear, actionable technical specifications for data processes.
  • Design, develop, and implement robust ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes to facilitate efficient data movement and transformation.
  • Optimize SQL queries for performance and cost efficiency within BigQuery, ensuring fast and reliable data retrieval.
  • Build and manage Apache Airflow Directed Acyclic Graphs (DAGs) for task scheduling and data pipeline automation (an illustrative sketch follows this list).
  • Work closely with data analysts and business users to understand their data needs and deliver effective data solutions.
  • Ensure the quality, integrity, and security of data across the ecosystem, adhering to best practices in data management.
  • Regularly improve data models and pipeline architectures to align with evolving business goals and analytics requirements.
  • Comply with all Quality, Safety & Health and Environment requirements related to the job scope and workplace, as required by the Company.
  • Perform any other duties assigned from time to time by Management.
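
For illustration only, the following is a minimal sketch of the kind of Cloud Composer (Apache Airflow) DAG this role builds and manages: an ELT pipeline that loads a raw CSV extract from Cloud Storage into a BigQuery staging table and then runs a SQL transformation. All bucket, dataset, table, and pipeline names are hypothetical placeholders, not references to existing Gamuda Land systems.

    # Illustrative sketch only: an ELT pipeline of the kind described above.
    # Bucket, dataset, table, and DAG names are hypothetical placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

    with DAG(
        dag_id="sales_elt_daily",
        start_date=datetime(2025, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        # Extract/Load: copy the raw CSV extract from the data lake into a staging table.
        load_raw = GCSToBigQueryOperator(
            task_id="load_raw_sales",
            bucket="example-data-lake",
            source_objects=["raw/sales/{{ ds }}/*.csv"],
            destination_project_dataset_table="analytics.staging_sales",
            source_format="CSV",
            skip_leading_rows=1,
            autodetect=True,
            write_disposition="WRITE_TRUNCATE",
        )

        # Transform: build a clean, structured table for reporting using SQL in BigQuery.
        transform = BigQueryInsertJobOperator(
            task_id="transform_sales",
            configuration={
                "query": {
                    "query": """
                        CREATE OR REPLACE TABLE analytics.fct_sales AS
                        SELECT project_code,
                               DATE(sale_date) AS sale_day,
                               SUM(amount) AS total_amount
                        FROM analytics.staging_sales
                        GROUP BY project_code, sale_day
                    """,
                    "useLegacySql": False,
                },
            },
        )

        load_raw >> transform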

Qualifications

  • Degree in Business/Commerce, Management Information Services, Information Science, Information Technology, Software Engineering, Computer Science, Engineering or equivalent experience.
  • Familiarity with analytics tools such as Tableau and Power BI to create simple reports and visualizations from data.
  • Understanding of data warehousing and ETL processes to assist in organizing and preparing data for analysis.
  • Introductory experience with Google Cloud services, especially BigQuery and Cloud SQL, to support data retrieval and management.
  • Basic understanding of Generative AI concepts and their potential applications in data analysis and reporting.
  • Ability to use Power BI for creating reports, with knowledge of features like the Power BI Data Gateway and Power Automate.
  • Willingness to work collaboratively with team members, supporting data projects and contributing to team goals.
  • Ability to utilize various technologies (see the illustrative sketch after this list), including:
    ◦ SQL for data manipulation.
    ◦ Python for scripting and orchestration of data workflows.
    ◦ Docker for packaging applications into container images that run in Kubernetes Pods.
    ◦ Google Cloud Platform services, including BigQuery for data warehousing, Cloud Composer for workflow orchestration, and Cloud Storage for data lake management.
    ◦ CI/CD pipelines using Git for version control and continuous integration/continuous deployment.
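
Purely as an illustration of how SQL, Python, and BigQuery fit together in this kind of work, a minimal sketch using the google-cloud-bigquery client library (dataset and table names are hypothetical placeholders):

    # Illustrative sketch only; dataset and table names are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()  # authenticates via application-default credentials

    # SQL for data manipulation, executed from a Python script.
    query = """
        SELECT project_code, COUNT(*) AS units_sold
        FROM `analytics.fct_sales`
        WHERE sale_day >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
        GROUP BY project_code
        ORDER BY units_sold DESC
    """

    for row in client.query(query).result():
        print(row.project_code, row.units_sold)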

Skills & Abilities

  • Familiarity with data management platforms, data modeling techniques, and data visualization tools, combined with a foundational understanding of change management principles.
  • Proven ability to identify critical issues, collect and analyze relevant data for in-depth investigation, and develop actionable recommendations.
  • Proficient in recognizing potential risks in data processes and making timely recommendations to mitigate these risks effectively.