Job Location: Hyderabad Office, India (PSC PGH)

Job Description: Data Engineer

Your expertise will be essential in driving impactful analytics that are leveraged across the globe to facilitate data-driven decision-making. In this role, you will be responsible for building and enhancing enterprise data pipelines that incorporate various Azure Databricks components and tools. This is your chance to work on exciting global projects and to collaborate with multi-functional and multinational teams within and outside of P&G.
Position Responsibilities:
- Implementing data modeling and driving the optimization of data and analytics solutions at scale
- Designing and developing applications that source data from various systems, using Python and SQL
- Working cross-functionally with business users, data management teams, cloud integration architects, information security teams, and platform teams
- Delivering the right solution architecture, automation, and technology choices, starting from the experimentation and proof-of-concept phases of new analytical models that generate insights and answer business questions
- Suggesting and implementing architecture improvements

The Ideal Candidate:
- Experience with BI, analytics, data modeling, data provisioning (acquisition from various sources, transformation, and sharing), and coding
- Demonstrated knowledge of Databricks
- Demonstrated knowledge of SQL, Python, PySpark, and coding standards
- Experience with cloud platforms (Azure preferred) and an understanding of ETL concepts
- Experience working in an Agile environment
- Strong written and verbal English communication skills to influence others
- Ability to work collaboratively across functions and organize work
- Understanding of DevOps and CI/CD concepts

Job Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field; a Master's degree is a plus
- Databricks platform experience: creation and scheduling of jobs, and integration with other data tools and services
- ETL (Extract, Transform, Load) skills: proficiency in data modeling techniques and experience with ETL processes
- Programming skills: 3+ years in languages such as Python, PySpark, and SQL
- CI/CD experience, including GitHub
- Experience optimizing Spark jobs for performance
How to Apply:
Apply online through the official P&G careers portal.