About the Role
We are seeking a skilled MDE Data Engineer to join our dynamic GMDIT team in Bangalore, India. In this role, you will design, develop, and maintain solutions that collect, process, and store large volumes of data, and you will build the pipelines required to manage and analyze that data. A strong understanding of data modeling, data architecture, development practices, and data security is essential to ensure that the systems you create are scalable, reliable, secure, and efficient.
Responsibilities
- Develop, build, test, and maintain data pipelines and ETL processes to ensure data is transferred accurately, efficiently, and securely from source to target in support of analytics use cases.
- Integrate data from various sources (e.g., databases, APIs, files) to create a unified source of data.
- Create and maintain data models and schemas that support business requirements and data analysis.
- Monitor and optimize the performance of data processing and queries, proactively addressing slow queries and bottlenecks.
- Ensure data security and compliance with applicable regulations by implementing appropriate access controls, encryption, and auditing measures.
- Document solution design, pipelines, and processes to ensure clarity and maintainability.
- Implement and enforce existing standards (e.g., data validation, de-duplication, naming), and propose new standards for review and approval.
- Collaborate with other teams to diagnose and resolve issues related to data processing, integration, and storage, providing support and solutions as needed.
- Identify opportunities for continuous improvement, including automating repetitive tasks and streamlining data processing workflows.
- Participate in governance processes including intake, prioritization, and root cause analysis.
Required Qualifications
- Bachelor’s degree in Computer Science or a related field (or equivalent experience).
- 3+ years of experience working with data structures, data modeling, and analytics.
- Proficiency with Azure, Databricks, GitHub, the SDLC process, and data modeling.
- Experience building and maintaining data pipelines for data transformation and movement.
- Strong skills in SQL and Python.
- Proficiency in Scala or Java.
- Effective communication skills, with the ability to collaborate with global team members and stakeholders and to explain technical concepts clearly.
- Demonstrated problem-solving and analytical thinking.
- Willingness to travel approximately 10% of the time.
Desired Qualifications
- Experience with Azure DevOps.
- Understanding of Bicep (Azure’s infrastructure-as-code language).