Email: recruitinghead@padmastech.com
Job Description:
Design and build scalable, secure, and high-performance data pipelines on GCP.
Develop and optimize ETL/ELT workflows using Cloud Composer, Dataflow, Dataproc, and BigQuery.
Implement data ingestion frameworks for batch and streaming data (Pub/Sub, Kafka, Dataflow).
Model, partition, and optimize datasets in BigQuery for analytics use cases.
Collaborate with data scientists, architects, and business teams to deliver end-to-end data solutions.
Ensure data quality, reliability, and robustness through monitoring, validation, and automation.
Implement CI/CD pipelines for data workflows using Cloud Build, Git, and Terraform.
Optimize cost, performance, and scalability across GCP data services.
Ensure security best practices, IAM policies, and compliance with organizational standards.
Skills: Big Data and Hadoop Ecosystems; Google Data Engineering
Experience Required: 8-10 years