GCP Data Engineer

Job Details


3-5 years

Job Description:

Tech stack: BigQuery, any ETL tool (Informatica, Talend, DataStage), Dataflow, Dataproc
• 3-5 years of experience in data warehouse and data lake implementation.
• 1-2 years of experience in Google Cloud Platform (especially BigQuery).
• 1-2 years of experience converting ETL jobs (in Informatica/Talend/DataStage) into Dataflow or Dataproc and migrating them to a CI/CD pipeline.
• Design, develop, and deliver data integration and data extraction solutions using IBM DataStage or other ETL tools and data warehouse platforms such as Teradata and BigQuery.
• Proficiency in Linux/Unix shell scripting and SQL.
• Knowledge of data modelling, database design, and the data warehousing ecosystem.
• Ability to troubleshoot and solve complex technical problems.
• Excellent analytical and problem-solving skills.
• Knowledge of working in Agile environments.

Apply Now