Location
Bengaluru, Karnataka, India
Salary
Not specified
Type
Full-time
Posted
Today
Job Description
Title: Senior Data Platform Engineer [SDE 2]
Location: Bangalore
The Role:
We are seeking a Senior Data Platform Engineer to join our dynamic and growing team,
responsible for building, scaling, and optimising our cloud-native data platform. You will design
and implement scalable data pipelines that power business-critical analytics, real-time insights,
and AI/ML solutions. As an experienced data architect and builder, you will have the opportunity
to redefine our data infrastructure, enabling the next generation of products and driving
innovative data initiatives.
Roles and Responsibilities:
- Design, develop, and deploy a scalable data platform with real-time and batch processing solutions.
- Write highly scalable code by following best practices in concurrency, resource optimisation, and fault tolerance.
- Build and maintain the infrastructure following IaC (Infrastructure as Code) best practices.
- Identify and implement internal process improvements, including automating manual tasks, optimising data flows, and redesigning infrastructure for greater scalability and resilience.
- Develop automated monitoring, alerting, and logging systems for pipelines to ensure reliability, uptime, and data quality.
- Build and maintain data tools that enable analytics, BI, and data science teams to
experiment and optimise their workflows.
- Ensure end-to-end data governance by implementing policies for data quality, security, and compliance.
- Optimise data models and architectures to power machine learning, reporting, and real-time analytics.
- Collaborate closely with data scientists and product engineers to integrate ML models into production systems and improve system functionality.
- Work with stakeholders across the organisation to ensure alignment between data infrastructure and business needs.
Mandatory Qualifications:
- 4+ years of hands-on experience building and managing large-scale data platforms.
- Experience with Version control, CI/CD pipelines, Kubernetes, Docker, and IaC tools.
- Strong experience building and maintaining data-intensive APIs and distributed data processing systems.
- Strong programming skills in Scala and Java, plus expertise in optimising SQL queries.
- Strong expertise in Spark, Flink, Iceberg, Delta, Trino, and Kafka.
- Experience with both relational (SQL) and NoSQL databases.
- Strong understanding of REST APIs, Microservices, event-driven architectures, and Stream processing frameworks.
- Experience working in cross-functional, agile environments.
- Open-source contributions or active participation in developer communities.
- Prior experience in a product-based company is highly preferred.