Wednesday, February 25, 2026

Data & AI Ops Engineer [Job ID: 9941]


DATA & AI OPS ENGINEER — INSURANCE — KL — STOP “MOVING DATA”… START RUNNING THE DATA FACTORY

If you’re the kind of engineer who can build pipelines and keep them alive in production — incident queues, SLAs, data quality, performance — this role is for you.

This is not a “one-off ETL dev” job. It’s end-to-end ownership of the data flow that powers:

  • dashboards and reporting

  • customer-facing apps

  • backend systems

  • AI-driven use cases


Position: Data & AI Ops Engineer

Job ID: 9941
📍 Location: Kuala Lumpur
🧩 Working Mode: Onsite
🏦 Domain: Insurance


The Purpose (What You’ll Really Be Paid For)

Design, build, and maintain enterprise-grade data pipelines, models, and integration frameworks — ensuring accuracy, consistency, and availability across platforms — while handling incident and service request (IR/SR) tickets within SLA and keeping the data ecosystem reliable.


What You’ll Own (Key Responsibilities)

  • Design, develop, and maintain scalable data pipelines + integration workflows (batch + near real-time)

  • Enforce data quality with robust validation, cleansing, transformation

  • Work with developers/analysts/system owners to define data requirements and deliver aligned solutions

  • Optimize storage and retrieval performance across databases, data lakes, cloud platforms

  • Deploy and enhance data models, APIs, ETL/ELT frameworks aligned to architecture and governance

  • Monitor and resolve Incident Requests (IR) / Service Requests (SR) within SLA expectations

  • Support data-centric IT projects: planning, coordination, risk mitigation, stakeholder engagement

  • Maintain strong documentation: data flows, schemas, integration logic (audit-ready)

  • Ensure seamless integration with core business systems and external platforms

  • Stay current on modern data/AI tools and best practices to improve efficiency


What You Must Bring (Requirements)

Knowledge

  • Strong understanding of data architecture: data modeling, pipeline design, ETL/ELT, distributed processing

  • Strong RDBMS foundation; plus points for Azure Data Factory (ADF), Databricks, and Power BI

  • API-based integration exposure: REST / SOAP / MQ

  • Cloud data services awareness: Azure / AWS / GCP

  • Governance & compliance: lineage, access control, audit readiness

  • AI/ML fundamentals (chatbots, predictive analytics, model deployment context)

  • BI/reporting basics: visualization, role-based access, decision support

Experience / Education

  • Degree + 6+ years of hands-on experience (data engineering / integration / analytics platforms), including 3+ years building enterprise pipelines, OR

  • Diploma + 8+ years of practical experience, including 4+ years in a senior/lead capacity

Skills

  • Strong hands-on experience building and optimizing scalable pipelines

  • End-to-end ownership: ingestion → transformation → validation → orchestration (cloud/on-prem)

  • Production mindset: availability, performance, operational resilience under pressure

  • Strong communication with technical + non-technical stakeholders

  • Excellent documentation (English): specs, data process docs, stakeholder reports

  • Strong cross-functional collaboration across dev/BA/business teams


Apply Now (Fastest Route)

Use Job ID: 9941 so we can route you immediately.

Google Forms: https://forms.gle/5mn2Kyd2ysXk2LBG7
WhatsApp CV: https://wa.link/5pv88e
Confidential Discussion: https://wa.link/lptg0z


#Hiring #DataEngineer #DataOps #AIOps #DataPipeline #ETL #ELT #DataIntegration #APIs #RDBMS #Azure #Databricks #ADF #PowerBI #DataGovernance #InsuranceTech #KualaLumpurJobs #MalaysiaJobs #OnsiteJobs #ApplyNow
