Senior Data Engineer (Innovation Insurer)
Responsibilities

  • Build data pipelines: Architecting, creating, maintaining and optimizing data pipelines is the primary responsibility of the data engineer.
  • Drive automation through effective metadata management: automate the most common, repeatable and tedious data preparation and integration tasks to minimize manual effort and errors and improve productivity. The data engineer also helps modernize the data management infrastructure to drive automation in data integration and management.
  • Collaborate across departments: work closely with varied stakeholders (notably data analysts and data scientists) to refine their data consumption requirements.
  • Educate and train: stay current on data topics, including using data and domain understanding to address new data requirements, proposing innovative approaches to data ingestion, preparation, integration and operationalization, and training stakeholders in data pipelining and preparation.
  • Participate in ensuring compliant data use: ensure that data users and consumers use the data provisioned to them responsibly. Work with data governance teams, and participate in vetting and promoting content to the curated data catalog for governed reuse.
  • Become a data and analytics evangelist: The data engineer is a blend of “analytics evangelist”, “data guru” and “fixer.” This role will promote the available data and analytics capabilities and expertise to business leaders to help them leverage these capabilities in achieving business goals.
Requirements

  • 6+ years of work experience in data management including data integration, modeling, optimization and data quality, of which 3+ years supporting data and analytics initiatives for cross-functional teams
  • Foundational knowledge of data management practices, with strong experience in:
    • Various data management architectures like data warehouse, data lake and data hub, and supporting processes like data integration, governance and metadata management
    • Designing, building and managing data pipelines for data structures encompassing data transformation, data models, schemas, metadata and workload management
    • Working with large, heterogeneous datasets in building and optimizing data pipelines, pipeline architectures and integrated datasets using data integration technologies
  • Experience in data governance, notably in moving data pipelines into production, with exposure to:
    • Data preparation tools (Trifacta, Alteryx…)
    • Database programming (SQL and PL/SQL for relational databases; query interfaces for nonrelational databases such as MongoDB or Cassandra)
    • SQL-on-Hadoop tools and technologies (Hive, Impala, Presto, Hortonworks DataFlow, Dremio, Informatica, Talend…)
    • Advanced analytics tools for object-oriented/functional scripting (R, Python, Java, C++, Scala…)
    • Message queuing technologies (Kafka, JMS, Azure Service Bus, Amazon SQS…)
    • Stream data integration (Apache NiFi, Apache Beam, Kafka Streams, Amazon Kinesis…) and stream analytics technologies (KSQL, Apache Spark Streaming, Apache Samza…)
    • Continuous integration tools (e.g., Jenkins)
  • Ability to automate pipeline development
    • Experience with DevOps practices such as version control, automated builds, testing and release management, using tools like Git, Jenkins, Puppet and Ansible
    • Adept in Agile and able to apply DevOps and DataOps principles to data pipelines
  • Exposure to JavaScript technologies (jQuery, Angular, React and Node.js)
  • Exposure to hybrid deployments (cloud and on-premises), with an ability to work across multiple environments and operating systems through containerization techniques (Docker, Kubernetes, AWS ECS, etc.)
  • Strong experience with popular data discovery, analytics and BI tools (Power BI, Tableau, Qlik…)
  • Bachelor’s degree in STEM or a related technical field, or equivalent work experience

Job Details

  • Negotiable
  • Hong Kong
  • Permanent

