Job Description:
Data Engineer - GCP, BigQuery, Airflow
We are seeking a GCP Data Engineer with 3-5 years' experience who is confident and proficient in Python programming, including 1-2 years of experience specifically in data processing tasks. The successful candidate will play a pivotal role in designing, building, and optimizing our data pipelines on GCP, ensuring that our data infrastructure is robust, efficient, and scalable.
Key Responsibilities:
*Develop, maintain, and optimize data pipelines on Google Cloud Platform (GCP).
*Build and orchestrate data workflows using BigQuery and Airflow.
*Utilize Python for data processing tasks, ensuring high performance and reliability.
*Collaborate with data scientists, analysts, and other engineers to gather requirements and deliver high-quality data solutions.
*Implement best practices for data engineering, including data governance, security, and compliance.
*Troubleshoot and resolve issues related to data processing and pipeline performance.
Qualifications:
*Primary Skill: Confident Python programming.
*Experience: 1-2 years of hands-on experience using Python specifically for data processing tasks.
*Proven experience with Google Cloud Platform (GCP) and its data services (e.g., BigQuery, Dataflow, Pub/Sub).
*Strong understanding of data engineering concepts, including ETL processes, data warehousing, and data modeling.
*Familiarity with SQL and experience in writing complex queries.
*Excellent problem-solving skills and attention to detail.
*Ability to work independently and as part of a team in a fast-paced, dynamic environment.
Preferred Qualifications:
*Experience with additional programming languages such as Java or Scala.
*Knowledge of containerization and orchestration tools like Docker and Kubernetes.
*Familiarity with CI/CD pipelines and version control systems (e.g., Git).
Please click here to find out more about our Key Information Documents. Please note that the documents provided contain generic information. If we are successful in finding you an assignment, you will receive a Key Information Document which will be specific to the vendor set-up you have chosen and your placement.
To find out more about Huxley, please visit www.huxley.com
Huxley, a trading division of SThree Partnership LLP is acting as an Employment Business in relation to this vacancy | Registered office | 8 Bishopsgate, London, EC2N 4BQ, United Kingdom | Partnership Number | OC387148 England and Wales
Job number 1698754
Company Details:
Huxley
Huxley is part of SThree PLC, the leading global specialist STEM (Science, Technology, Engineering and Mathematics) recruitment company, with offices ...