GCP + Kafka Developer – Build Real-Time Data Pipelines That Power Global Insights
Req ID: 10344567 | Job ID: 9499 | Locations: Chennai / Bangalore | Experience: 6–8 Years
Core Skills: Google Cloud | BigQuery | Dataproc | Airflow (Cloud Composer) | Kafka
If you love building data systems that hum — fast, scalable, and automated — this is where your skills pay off.
We’re hiring a Cloud Data Developer to engineer real-time, event-driven pipelines across Google Cloud Platform (GCP) and Kafka, turning raw data into instant, actionable insights.
What You’ll Do (And Own):
- Design, build, and optimize data pipelines using Kafka and Google Cloud tools (BigQuery, Dataproc, Airflow).
- Create and manage DAGs in Cloud Composer to orchestrate automated workflows (see the sketch after this list).
- Handle large-scale data transformations and real-time event processing.
- Collaborate with data architects, analysts, and product teams to deliver high-availability solutions.
- Optimize queries, storage, and architecture for cost and performance.
- Drive automation and efficiency throughout the data lifecycle.
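To make the Cloud Composer work above concrete, here is a minimal sketch of an Airflow DAG that appends hourly Kafka event dumps (already landed on GCS by a connector) into BigQuery. The DAG id, bucket, dataset, and table names are hypothetical placeholders, not details from this posting.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

default_args = {
    "owner": "data-eng",                 # hypothetical owner
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="kafka_events_to_bigquery",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
    default_args=default_args,
) as dag:
    # Append each hour's newline-delimited JSON events into a BigQuery table.
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-events-bucket",                          # hypothetical bucket
        source_objects=["events/{{ ds }}/*.json"],               # date-partitioned dumps
        destination_project_dataset_table="example_ds.events",   # hypothetical table
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_APPEND",
    )
```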
You’ll Fit If You…
- Have 6–8 years of experience in data engineering and cloud development.
- Know Kafka inside and out, from stream design to scaling and monitoring (a minimal consumer sketch follows this list).
- Have hands-on expertise with BigQuery, Dataproc, and Airflow (Cloud Composer).
- Can work independently and own end-to-end delivery.
- Think automation-first and hate manual repetition.
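On the Kafka side, a sketch of the baseline skill in play: a Python consumer (using the confluent-kafka client) that reads a topic and hands each event to downstream processing. The broker address, group id, and topic name are hypothetical.

```python
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker:9092",  # hypothetical broker address
    "group.id": "events-reader",         # hypothetical consumer group
    "auto.offset.reset": "earliest",     # start from the beginning on first run
})
consumer.subscribe(["orders"])           # hypothetical topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)  # wait up to 1s for the next record
        if msg is None:
            continue
        if msg.error():
            print(f"Kafka error: {msg.error()}")
            continue
        # In a real pipeline, transformation and routing would happen here.
        print(msg.key(), msg.value())
finally:
    consumer.close()  # commit offsets and leave the group cleanly
```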
Why This Role?
✅ High-impact projects with enterprise-level data flow.
✅ Exposure to the latest GCP and Kafka ecosystems.
✅ Collaborative engineering culture, minimal bureaucracy.
✅ Rapid interviews and onboarding — start fast, deliver faster.
Build Pipelines That Move at Business Speed.
If you know Kafka and GCP, your next big move starts here.
Apply Now: https://forms.gle/5mn2Kyd2ysXk2LBG7
WhatsApp CV: https://wa.link/5pv88e
Confidential Discussion: https://wa.link/lptg0z