Job Description
Posted 16 days ago
Job Overview
KPMG China provides multidisciplinary services from audit and tax to advisory, with a strong focus on serving our clients' needs and their industries.
We are committed to being an equal opportunity employer, with zero tolerance for any form of discrimination against any persons. It is important for us to create an inclusive, diverse and agile workplace for our people to develop and thrive at both a personal and professional level.
About the Role
We are seeking a highly skilled Data Engineer to join our team in Hong Kong. As a Data Engineer, you will be responsible for building enterprise-grade data solutions that deliver on our clients' vision.
You will work closely with our technical architects and application designers to understand data conversion requirements fully and design conversion procedures and applications.
Key Responsibilities:
• Build enterprise-grade data solutions that deliver on clients' vision
• Understand clients' current-state processes and guide them toward future-state improvements and solutions
• Develop, construct and test architectures such as databases and large-scale processing solutions
• Create ETL/ELT pipelines to handle data from a variety of sources and formats
• Advise on the selection of data solutions and development tools
• Develop APIs using a variety of languages and tools
• Build transformation and validation code that applies complex data aggregations and calculations, in different programming languages depending on project scope and requirements
• Document and write technical specifications for the solution's requirements
• Plan, design and lead the implementation of a large-scale data platform
Requirements
To be successful in this role, you will need:
• Bachelor's degree (or higher) in mathematics, statistics, computer science, engineering or related field
• At least 1 year of working experience with data pipelines, e.g. Azure Data Factory / Airflow / Informatica / Databricks
• Hands-on programming skills, e.g. SQL / Java / Python / C++ / Scala / SAS / Kafka
• Practical experience implementing large-scale enterprise data solutions, e.g. data lake / data warehouse / data mesh
• Deep understanding of cloud computing and data technologies, business drivers, emerging computing trends, and deployment options, e.g. Azure / AWS / Google Cloud / Alibaba Cloud / Tencent Cloud
• Experience with Agile & DevOps methodologies
Salary and Benefits
The estimated salary for this role is approximately $120,000 - $180,000 per annum, depending on experience.
We offer a comprehensive benefits package, including health insurance, retirement savings plan, and generous paid time off.