7.0 - 11.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Governance Technology Lead at Deutsche Bank Group, you will play a crucial role in spearheading the adoption and operation of modern data governance tooling across the entire organization. Your responsibilities will include collaborating closely with the Head of Data Platforms, Data Architecture, and the Chief Data Office to ensure high-quality, secure, and compliant data flows. This leadership position combines technical expertise, strategic vision, hands-on engineering, and team management.

Your key responsibilities will involve supporting the continuous refinement of the data governance vision, leading the engineering team responsible for designing and operating the data governance and observability stack, driving the implementation of Collibra, and working with various stakeholders to ensure a consistent approach to data governance and quality management. Additionally, you will define and monitor KPIs to ensure enterprise-wide adoption of data governance tooling and core data standards.

To excel in this role, you should have at least 7 years of experience with data governance tooling and related concepts, a strong software engineering background with proficiency in programming languages like Python or Java, a deep understanding of SDLC concepts, automation, and efficient tech delivery, as well as hands-on experience with tools like the Collibra Suite and modern data platforms such as Snowflake or GCP BigQuery. Your leadership skills will be essential in managing technical teams and cross-functional projects.

At Deutsche Bank Group, we offer a culture of continuous learning, training, and development to support your career progression. You will receive coaching and guidance from experts in your team and have access to a range of flexible benefits that can be tailored to suit your needs. Join us at Deutsche Bank Group to be part of a positive, fair, and inclusive work environment where we empower each other to excel together every day. Visit our company website for more information: [https://www.db.com/company/company.htm](https://www.db.com/company/company.htm). Together, we are Deutsche Bank Group.
Posted 1 week ago
5.0 - 7.0 years
5 - 14 Lacs
Pune, Gurugram, Bengaluru
Work from Office
• Hands-on experience in object-oriented programming using Python, PySpark, APIs, SQL, BigQuery, GCP
• Building data pipelines for huge volumes of data
• Dataflow, Dataproc, and BigQuery
• Deep understanding of ETL concepts
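For illustration, here is a minimal PySpark sketch of the kind of pipeline this posting describes: read a table from BigQuery, apply an aggregation, and write the result back. The project, dataset, table, bucket names, and the aggregation itself are assumptions for the example, not details from the posting.

```python
# Minimal sketch: PySpark batch job reading from and writing to BigQuery.
# Assumes the spark-bigquery connector is available; names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-daily-aggregate")
    .getOrCreate()
)

# Read the raw table from BigQuery.
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.raw.orders")   # placeholder table
    .load()
)

# Simple transformation: daily revenue per region.
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Write the aggregate back to a curated dataset; the connector stages
# intermediate files in the given GCS bucket.
(
    daily.write.format("bigquery")
    .option("table", "my-project.curated.daily_revenue")
    .option("temporaryGcsBucket", "my-staging-bucket")  # placeholder bucket
    .mode("overwrite")
    .save()
)
```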
Posted 1 week ago
5.0 - 7.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Job Title: Senior Data Engineer - Big Data, ETL & Java
Experience Level: 5+ Years
Employment Type: Full-time

About the Role
EXL is seeking a Senior Software Engineer with a strong foundation in Java, along with expertise in Big Data technologies and ETL development. In this role, you'll design and implement scalable, high-performance data and backend systems for clients in retail, media, and other data-driven industries. You'll work across cloud platforms such as AWS and GCP to build end-to-end data and application pipelines.

Key Responsibilities
• Design, develop, and maintain scalable data pipelines and ETL workflows using Apache Spark, Apache Airflow, and cloud platforms (AWS/GCP). A minimal orchestration sketch follows this posting.
• Build and support Java-based backend components, services, or APIs as part of end-to-end data solutions.
• Work with large-scale datasets to support transformation, integration, and real-time analytics.
• Optimize Spark, SQL, and Java processes for performance, scalability, and reliability.
• Collaborate with cross-functional teams to understand business requirements and deliver robust solutions.
• Follow engineering best practices in coding, testing, version control, and deployment.

Required Qualifications
• 5+ years of hands-on experience in software or data engineering.
• Proven experience in developing ETL pipelines using Java and Spark.
• Strong programming experience in Java (preferably with frameworks such as Spring or Spring Boot).
• Experience in Big Data tools including Apache Spark, Apache Airflow, and cloud services such as AWS EMR, Glue, S3, Lambda or GCP BigQuery, Dataflow, Cloud Functions.
• Proficiency in SQL and experience with performance tuning for large datasets.
• Familiarity with data modeling, warehousing, and distributed systems.
• Experience working in Agile development environments.
• Strong problem-solving skills and attention to detail.
• Excellent communication skills.

Preferred Qualifications
• Experience building and integrating RESTful APIs or microservices using Java.
• Exposure to data platforms like Snowflake, Databricks, or Kafka.
• Background in retail, merchandising, or media domains is a plus.
• Familiarity with CI/CD pipelines, DevOps tools, and cloud-based development workflows.
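As a rough illustration of the Airflow-based orchestration the responsibilities mention, here is a minimal DAG sketch. The DAG id, task names, and callables are placeholders; a real pipeline would typically use provider operators (for example Dataproc or BigQuery operators) rather than plain PythonOperator steps.

```python
# Minimal sketch: an Airflow DAG with a daily extract -> transform -> load flow.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw files from a source system into cloud storage.
    print("extracting raw data")


def transform(**context):
    # Placeholder: submit a Spark job or run SQL transformations.
    print("transforming data")


def load(**context):
    # Placeholder: load curated data into the warehouse (e.g. BigQuery).
    print("loading curated data")


with DAG(
    dag_id="daily_etl_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```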
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
We have an exciting opportunity to join the Macquarie team as a Data Engineer and implement the group's data strategy, leveraging cutting-edge technology and cloud services. If you are keen to work in the private markets space for one of Macquarie's most successful global divisions, then this role could be for you.

At Macquarie, our advantage is bringing together diverse people and empowering them to shape all kinds of possibilities. We are a global financial services group operating in 31 markets and with 56 years of unbroken profitability. You'll be part of a friendly and supportive team where everyone - no matter what role - contributes ideas and drives outcomes.

In this role, you will be involved in designing and managing data pipelines using Python, SQL, and tools like Airflow and DBT Cloud, while collaborating with business teams to develop prototypes and maintain data products. You will gain hands-on experience with technologies such as Google Cloud Platform (GCP) services, including BigQuery, to deliver scalable and robust solutions. As a key team member, your strong communication skills and self-motivation will support engagement with stakeholders at all levels.

You should have strong proficiency in data technology platforms, including DBT Cloud, GCP BigQuery, and Airflow. Domain knowledge of the asset management and private markets industry, including relevant technologies and business processes, is essential. Familiarity with cloud platforms such as AWS, GCP, and Azure, along with related services, is preferred. Excellent verbal and written communication skills are required to effectively engage stakeholders and simplify complex technical concepts for non-technical audiences. Solid experience in SQL and Python, with a good understanding of APIs, is advantageous.

We love hearing from anyone inspired to build a better future with us; if you're excited about the role or working at Macquarie, we encourage you to apply.

Technology enables every aspect of Macquarie, for our people, our customers, and our communities. We're a global team that is passionate about accelerating the digital enterprise, connecting people and data, building platforms and applications, and designing tomorrow's technology solutions.

Our aim is to provide reasonable adjustments to individuals who may need support during the recruitment process and through working arrangements. If you require additional assistance, please let us know in the application process.
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a Data Analyst at our organization, you will play a vital role in analyzing our ecommerce clickstream and user behavior data to uncover actionable insights that drive business growth and inform our product roadmap. Leveraging your expertise in tools such as Adobe Analytics, GCP BigQuery, SQL, and Python, you will extract, analyze, and visualize complex datasets. Working closely with our Senior Data Analyst/Architect, you will contribute to the strategic direction of our product development efforts.

Your responsibilities will include analyzing ecommerce clickstream data to understand user journeys and optimize website performance, investigating user behavior patterns to enhance customer interactions, and utilizing SQL and Python to extract and manipulate large datasets from our data warehouse. Additionally, you will create insightful dashboards and reports using data visualization tools to effectively communicate findings to stakeholders across different teams.

Collaborating with the Senior Data Analyst/Architect, you will translate data-driven insights into actionable recommendations that contribute to the development and prioritization of our product roadmap. You will define and track key performance indicators (KPIs), conduct A/B testing analysis, and collaborate with cross-functional teams to provide relevant insights. Ensuring data quality and integrity, staying updated on industry trends, and continuously learning best practices in ecommerce analytics will also be key aspects of your role.

To qualify for this position, you should have a Bachelor's degree in a quantitative field and at least 2 years of experience as a Data Analyst, preferably with a focus on ecommerce analytics. Hands-on experience with Adobe Analytics, strong proficiency in SQL, and solid programming skills in Python are essential. Experience with cloud-based data warehouses, data visualization tools, and strong analytical and problem-solving skills are also required. Excellent communication and presentation skills, the ability to work independently and collaboratively in a fast-paced environment, and a passion for data science and ecommerce are important attributes for success in this role.

Key Skills: Adobe Analytics, GCP BigQuery, SQL, Python (Pandas, NumPy, Matplotlib, Seaborn), Ecommerce Analytics, User Behavior Analysis, Clickstream Data Analysis, Data Visualization, Data Reporting, A/B Testing Analysis, Product Roadmap Contribution, Statistical Analysis, Data Mining, Communication (Written, Verbal), Problem-Solving, Teamwork

Mandatory Skills: Data Science, E-commerce
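To give a flavor of the clickstream analysis this role involves, here is a minimal sketch that pulls a simple session-level funnel from BigQuery into pandas. The project, dataset, table, and event names are placeholder assumptions for the example.

```python
# Minimal sketch: query clickstream events in BigQuery and summarise a funnel in pandas.
# Requires google-cloud-bigquery with pandas support; table and event names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses default credentials/project

query = """
SELECT
  event_name,
  COUNT(DISTINCT session_id) AS sessions
FROM `my-project.analytics.clickstream_events`
WHERE event_date BETWEEN '2024-01-01' AND '2024-01-31'
  AND event_name IN ('page_view', 'add_to_cart', 'checkout', 'purchase')
GROUP BY event_name
"""

funnel = client.query(query).to_dataframe()

# Conversion rate of each step relative to page views.
page_views = funnel.loc[funnel["event_name"] == "page_view", "sessions"].iloc[0]
funnel["conversion_vs_page_view"] = funnel["sessions"] / page_views
print(funnel.sort_values("sessions", ascending=False))
```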
Posted 2 weeks ago
5.0 - 10.0 years
13 - 23 Lacs
Bhubaneswar, Hyderabad
Work from Office
Greetings from Finix! Experience required in Data Engineering or related roles. Strong expertise in Databricks and Apache Spark. Experience in EDA, SQL, and GCP (BigQuery). Strong knowledge required in Python. We are looking for immediate joiners.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
NTT DATA is looking to hire exceptional, innovative, and passionate individuals who aspire to grow with the organization. If you envision yourself as a part of an inclusive, adaptable, and forward-thinking team, then this opportunity is for you. We are currently in search of a talented MSBI + GCP Developer to become a valuable member of our team based in Chennai, Tamil Nadu, India.

As a candidate, you should possess expertise in the following technical skills:
- Advanced knowledge of SQL
- Advanced proficiency in SSIS
- Medium proficiency in GCP BigQuery

In this role, your responsibilities will include:
- Performing detailed design of complex applications and architecture components
- Leading a small group of developers in configuring, programming, and testing
- Resolving medium to complex defects and performance issues
- Being accountable for service commitments at the individual request level for in-scope applications
- Monitoring, tracking, and participating in ticket resolution for assigned tickets
- Managing code reviews and providing mentorship to other developers

About NTT DATA:
NTT DATA is a trusted global innovator of business and technology services, with a revenue of $30 billion. We cater to 75% of the Fortune Global 100 companies and are dedicated to assisting clients in innovating, optimizing, and transforming for long-term success. Recognized as a Global Top Employer, we have a diverse team of experts in over 50 countries and a robust partner ecosystem comprising established and start-up companies. Our services span business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is a prominent provider of digital and AI infrastructure globally and is a part of the NTT Group, which invests over $3.6 billion annually in R&D to facilitate the confident and sustainable transition of organizations and society into the digital future. Learn more about us at us.nttdata.com.
Posted 2 weeks ago
5.0 - 10.0 years
13 - 20 Lacs
Bhubaneswar, Kolkata, Hyderabad
Work from Office
Design and optimize data pipelines using Databricks, GCP BigQuery, SQL, and Python. Perform EDA, support ML, ensure data quality/security, and collaborate on data models. Strong in Spark, data architecture, profiling, and business insights.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
You are a Database Performance & Data Modeling Specialist with a primary focus on optimizing schema structures, tuning SQL queries, and ensuring that data models are well-prepared for high-volume, real-time systems. Your responsibilities include designing data models that balance performance, flexibility, and scalability; conducting performance benchmarking to identify bottlenecks and propose improvements; analyzing slow queries to recommend indexing, denormalization, or schema revisions; monitoring query plans, memory usage, and caching strategies for cloud databases; and collaborating with developers and analysts to optimize application-to-database workflows.

You must possess strong experience in database performance tuning, especially on GCP platforms like BigQuery, CloudSQL, and AlloyDB. Proficiency in schema refactoring, partitioning, clustering, and sharding techniques is essential. Familiarity with profiling tools, slow query logs, and GCP monitoring solutions is required, along with SQL optimization skills including query rewriting and execution plan analysis.

Preferred skills include a background in mutual fund or high-frequency financial data modeling, and hands-on experience with relational databases like PostgreSQL and MySQL, distributed caching, materialized views, and hybrid model structures.

Soft skills that are crucial for this role include being precision-driven with an analytical mindset, being a clear communicator with attention to detail, and possessing strong problem-solving and troubleshooting abilities.

By joining this role, you will have the opportunity to shape high-performance data systems from the ground up, play a critical role in system scalability and responsiveness, and work with high-volume data in a cloud-native enterprise setting.
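As a small illustration of the partitioning and clustering techniques this role calls for, here is a sketch that creates a date-partitioned, clustered BigQuery table via the Python client and dry-runs a filtered query to check bytes scanned. The dataset, table, and column names are assumptions for the example.

```python
# Minimal sketch: create a partitioned + clustered BigQuery table and inspect
# bytes scanned by a filtered query (a common performance-tuning check).
# Project, dataset, table, and column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
CREATE TABLE IF NOT EXISTS `my-project.curated.nav_history`
(
  fund_id STRING,
  nav NUMERIC,
  as_of_date DATE
)
PARTITION BY as_of_date   -- prune partitions on date filters
CLUSTER BY fund_id        -- co-locate rows for selective fund lookups
"""
client.query(ddl).result()

# Dry run to see how much data a typical query would scan after pruning.
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(
    """
    SELECT fund_id, nav
    FROM `my-project.curated.nav_history`
    WHERE as_of_date = '2024-06-28' AND fund_id = 'FUND123'
    """,
    job_config=job_config,
)
print(f"Estimated bytes processed: {job.total_bytes_processed}")
```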
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
As a Data Engineer at Macquarie, you have an exciting opportunity to implement the group's data strategy by leveraging cutting-edge technology and cloud services. If you are enthusiastic about working in the private markets space within one of Macquarie's most successful global divisions, then this role could be the perfect fit for you.

At Macquarie, we believe in the power of diversity and empowerment, bringing together a team of people from various backgrounds and enabling them to explore endless possibilities. With our global presence in 31 markets and 56 years of continuous profitability, you will join a supportive team where everyone's ideas are valued, and collective efforts drive impactful outcomes.

In this role, your responsibilities will include designing and managing data pipelines using Python, SQL, and tools like Airflow and DBT Cloud. You will collaborate closely with business teams to develop prototypes and maintain data products. Additionally, you will have the opportunity to work with cutting-edge technologies such as Google Cloud Platform (GCP) services, specifically BigQuery, to deliver scalable and robust solutions. Your effective communication skills and self-motivation will be crucial in engaging stakeholders at all levels.

To excel in this role, you should have strong proficiency in data technology platforms, including DBT Cloud, GCP BigQuery, and Airflow. Domain knowledge of asset management and the private markets industry, including relevant technologies and business processes, is highly desirable. Familiarity with cloud platforms such as AWS, GCP, and Azure, along with related services, will be beneficial. Excellent verbal and written communication skills are essential for effectively engaging stakeholders and simplifying complex technical concepts for non-technical audiences. Solid experience in SQL and Python, with a good understanding of APIs, will be advantageous.

If you are inspired to contribute to building a better future with us and are excited about the role or working at Macquarie, we encourage you to apply and join our team.

About Technology: Technology plays a vital role in every aspect of Macquarie, empowering our people, customers, and communities. We are a global team passionate about accelerating the digital enterprise, connecting people and data, building platforms and applications, and designing tomorrow's technology solutions.

Our Commitment to Diversity, Equity, and Inclusion: We are committed to providing reasonable adjustments to individuals who may need support during the recruitment process and through working arrangements. If you require additional assistance, please let us know during the application process.
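Since the role pairs Airflow with DBT Cloud, here is a rough sketch of triggering a dbt Cloud job over its REST API from Python. The account ID, job ID, token, and host are placeholder assumptions to verify against your own dbt Cloud account and its documentation; in an Airflow DAG this would more commonly be done through the dbt Cloud provider operator.

```python
# Minimal sketch: trigger a dbt Cloud job run via the v2 API.
# Account ID, job ID, token, and host are placeholders.
import requests

DBT_CLOUD_HOST = "https://cloud.getdbt.com"   # assumption: multi-tenant host
ACCOUNT_ID = 12345                            # placeholder
JOB_ID = 67890                                # placeholder
HEADERS = {"Authorization": "Token <DBT_CLOUD_API_TOKEN>"}  # placeholder token

resp = requests.post(
    f"{DBT_CLOUD_HOST}/api/v2/accounts/{ACCOUNT_ID}/jobs/{JOB_ID}/run/",
    headers=HEADERS,
    json={"cause": "Triggered from orchestration sketch"},
    timeout=30,
)
resp.raise_for_status()

run = resp.json()["data"]
print("queued dbt Cloud run:", run["id"])
# A real pipeline would then poll the runs endpoint (or use the provider's
# sensor/operator) until the run reaches a terminal state.
```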
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
chennai, tamil nadu
On-site
NTT DATA is looking for a talented and passionate MSBI + GCP Developer to join their team in Chennai, Tamil Nadu, India. If you are someone who is enthusiastic about growth and innovation and wishes to be part of a forward-thinking organization, then this is the perfect opportunity for you.

As an MSBI + GCP Developer at NTT DATA, you will be responsible for performing detailed design of complex applications and architecture components. You may also lead a small group of developers in configuring, programming, and testing. Your role will involve fixing medium to complex defects, resolving performance issues, and being accountable for service commitments at the individual request level for in-scope applications. Additionally, you will monitor, track, and participate in ticket resolution for assigned tickets, as well as manage code reviews and mentor other developers.

NTT DATA is a trusted global innovator of business and technology services, with a commitment to helping clients innovate, optimize, and transform for long-term success. As a Global Top Employer, NTT DATA has diverse experts in more than 50 countries and a robust partner ecosystem. Their services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is recognized as one of the leading providers of digital and AI infrastructure worldwide. Being a part of the NTT Group, they invest significantly in research and development to support organizations and society in confidently transitioning into the digital future.

If you are ready to be part of an inclusive and adaptable organization that values innovation and growth, apply now and join NTT DATA on their mission to drive technological advancement and transformation. Visit us at us.nttdata.com to learn more about our global initiatives and career opportunities.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
The Looker Admin/SME position entails management of the Looker platform and provision of expert guidance to Looker users on maximizing the platform for their business requirements. The ideal candidate should possess a background in Looker administration and development, coupled with a solid grasp of data analytics and visualization principles.

Key responsibilities associated with this role include overseeing the Looker platform and managing user access, data connections, and security settings. Additionally, the role involves the development and upkeep of Looker models, Explores, and dashboards. The Looker Admin/SME is expected to offer technical guidance to clients regarding Looker functionality and best practices, collaborating with diverse teams to deliver top-notch analytics solutions. Furthermore, troubleshooting and resolving platform-related issues constitute a crucial aspect of the job.

The qualifications required for this role include a minimum of 3 years of experience in Looker administration and development, a robust understanding of data analytics and visualization concepts, proficiency in SQL, and familiarity with Google Cloud-based data warehousing solutions like GCP BigQuery. Excellent communication and interpersonal skills are also necessary for effective interaction with clients and team members.

About the Company: Purview is a prominent Digital Cloud & Data Engineering company with its headquarters in Edinburgh, United Kingdom, and a global presence across 14 countries. The company offers services to Captive Clients and collaborates with leading IT organizations to provide workforce and solutions. Purview has established a strong foothold in the UK, Europe, and APEC regions, serving a diverse clientele with fully managed and co-managed capacity models.

For any further information, please refer to the company contact details below:
India Office: 3rd Floor, Sonthalia Mind Space, Near Westin Hotel, Gafoor Nagar, Hitechcity, Hyderabad. Phone: +91 40 48549120 / +91 8790177967
UK Office: Gyleview House, 3 Redheughs Rigg, South Gyle, Edinburgh, EH12 9DQ. Phone: +44 7590230910
Email: careers@purviewservices.com
Posted 3 weeks ago
5.0 - 10.0 years
0 - 0 Lacs
Hyderabad
Remote
Data Engineering / Big Data - part time, Work from Home (anywhere in the world)

Warm Greetings from Excel Online Classes,

We are a team of industry professionals running an institute that provides comprehensive online IT training, technical support, and development services. We are currently seeking Data Engineering / Big Data experts who are passionate about technology and can collaborate with us in their free time. If you're enthusiastic, committed, and ready to share your expertise, we would love to work with you!

We're hiring for the following services:
• Online Training
• Online Development
• Online Technical Support
• Conducting Online Interviews
• Corporate Training
• Proof of Concept (POC) Projects
• Research & Development (R&D)

We are looking for immediate joiners who can contribute in any of the above areas. If you're interested, please fill out the form using the link below:
https://docs.google.com/forms/d/e/1FAIpQLSdvut0tujgMbBIQSc6M7qldtcjv8oL1ob5lBc2AlJNRAgD3Cw/viewform

We also welcome referrals! If you know someone (friends, colleagues, or connections) who might be interested in:
• Teaching, developing, or providing tech support online
• Sharing domain knowledge (e.g., Banking, Insurance, etc.)
• Teaching foreign languages (e.g., Spanish, German, etc.)
• Learning or brushing up on technologies to clear interviews quickly
• Upskilling in new tools or frameworks for career growth
please feel free to forward this opportunity to them.

For any queries, feel free to contact us at: excel.onlineclasses@gmail.com

Thank you & Best Regards,
Team Excel Online Classes
excel.onlineclasses@gmail.com
Posted 3 weeks ago
5.0 - 10.0 years
5 - 8 Lacs
Bengaluru, Karnataka, India
On-site
Key Responsibilities:
- Analyze business and technical requirements for data movement and transformation processes.
- Create and execute test plans, test cases, and test scripts for ETL pipelines.
- Perform source-to-target mapping (S2T) validation and data reconciliation.
- Validate ETL transformation logic, data loads, aggregations, and data quality rules.
- Conduct black-box, white-box, and integration testing on data flows.
- Write complex SQL queries to validate data accuracy across databases (Oracle, SQL Server, PostgreSQL, etc.).
- Work with ETL developers to understand mappings and raise defects using tools like JIRA, ALM, or Bugzilla.
- Participate in regression testing during code changes or production deployments.
- Automate database testing scripts and contribute to continuous integration testing pipelines (if applicable).
- Document test results and create reports for QA sign-off and audits.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3-6+ years of experience in ETL testing, data warehouse testing, or database testing.
- Proficient in SQL for data validation and test script development.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, SSIS, DataStage).
- Experience with databases like Oracle, SQL Server, MySQL, or PostgreSQL.
- Familiarity with data warehousing concepts (Star/Snowflake schemas, OLAP, OLTP).
- Experience with defect tracking and test management tools.

Preferred Qualifications:
- Experience testing in cloud data platforms (AWS Redshift, Snowflake, Azure Synapse, GCP BigQuery).
- Knowledge of BI/reporting tools (Tableau, Power BI, Qlik) for end-to-end data validation.
- Basic scripting or automation skills in Python, Shell, or PowerShell.
- Exposure to Agile/Scrum environments and CI/CD tools (Jenkins, Git).
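As a simple illustration of the source-to-target reconciliation described above, here is a Python sketch comparing a row count and a column checksum between a source and a target table. The connection strings, table names, and columns are placeholder assumptions.

```python
# Minimal sketch: row-count and checksum reconciliation between source and target.
# Uses SQLAlchemy; connection URLs, table names, and columns are placeholders.
from sqlalchemy import create_engine, text

source = create_engine("postgresql+psycopg2://user:pass@source-host/sales")  # placeholder
target = create_engine("postgresql+psycopg2://user:pass@target-host/dwh")    # placeholder

checks = {
    "row_count": "SELECT COUNT(*) FROM {table}",
    "amount_sum": "SELECT COALESCE(SUM(amount), 0) FROM {table}",
}

def run_check(engine, sql):
    with engine.connect() as conn:
        return conn.execute(text(sql)).scalar()

mismatches = []
for name, template in checks.items():
    src_val = run_check(source, template.format(table="orders"))
    tgt_val = run_check(target, template.format(table="fact_orders"))
    status = "OK" if src_val == tgt_val else "MISMATCH"
    if status == "MISMATCH":
        mismatches.append(name)
    print(f"{name}: source={src_val} target={tgt_val} -> {status}")

# In a real test suite this assertion would be surfaced through the test framework.
assert not mismatches, f"Reconciliation failed for: {mismatches}"
```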
Posted 1 month ago
5.0 - 12.0 years
5 - 12 Lacs
Bengaluru, Karnataka, India
On-site
Experience in ETL and Data Warehousing
Excellent leadership and communication skills
Strong hands-on experience with Data Lakehouse architecture
Proficient in GCP BigQuery, Cloud Storage, Airflow, Dataflow, Cloud Functions, Pub/Sub, Cloud Run
Built solution automations using various ETL tools
Delivered at least 2 GCP Cloud Data Warehousing projects
Worked on at least 2 Agile/SAFe methodology-based projects
Experience with PySpark and Teradata
Skilled in using DevOps tools like GitHub, Jenkins, Cloud Native tools
Experienced in handling semi-structured data formats like JSON, Parquet, XML
Written complex SQL queries for data analysis and extraction
Deep understanding of Data Warehousing, Data Analysis, Data Profiling, Data Quality, and Data Mapping
Global delivery model experience (15+ team members)
Collaborated with product/project managers, developers, DBAs, and data governance teams for requirements, design, and deployment

Responsibilities:
Design and implement data pipelines using GCP services
Manage deployments and ensure efficient orchestration of services
Implement CI/CD pipelines using Jenkins or native tools
Guide a team of data engineers in building scalable data pipelines
Develop ETL/ELT pipelines using Python, Beam, and SQL (a minimal Beam sketch follows this posting)
Continuously monitor and optimize data workflows
Integrate data from various sources using GCP services and orchestrate with Cloud Composer (Airflow)
Set up monitoring and alerting using Cloud Monitoring, Datadog, etc.
Mentor junior developers and data engineers
Collaborate with developers, architects, and stakeholders on robust data solutions
Lead data migration from legacy systems (Oracle, Teradata, SQL Server) to GCP
Facilitate Agile ceremonies (sprint planning, scrums, backlog grooming)
Interact with clients on analytics programs and ensure governance and communication with program leadership
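Following up on the ETL/ELT responsibility above, here is a minimal Apache Beam sketch of a batch pipeline reading CSV lines from Cloud Storage and writing parsed rows to BigQuery. The bucket, table, and schema are placeholder assumptions, and running on Dataflow would additionally require pipeline options such as project, region, and a temp location.

```python
# Minimal sketch: Apache Beam batch pipeline from GCS CSV to BigQuery.
# Bucket, table, and schema are placeholders; use the DataflowRunner for managed execution.
import apache_beam as beam


def parse_line(line: str):
    # Assumed CSV layout: order_id,customer_id,amount
    order_id, customer_id, amount = line.split(",")
    return {
        "order_id": order_id,
        "customer_id": customer_id,
        "amount": float(amount),
    }


def run():
    with beam.Pipeline() as pipeline:
        (
            pipeline
            | "ReadCsv" >> beam.io.ReadFromText("gs://my-bucket/orders/*.csv", skip_header_lines=1)
            | "Parse" >> beam.Map(parse_line)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:curated.orders",
                schema="order_id:STRING,customer_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```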
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Remote
Job Title: GCP Developer (Google BigQuery) - Senior Developer
Location: Offshore (Remote)
Experience: 5+ Years
Type: Full-Time

Job Summary:
We are looking for experienced Senior GCP Developers with strong expertise in Google BigQuery, ETL development, and SQL to work on complex application design and development activities. Candidates must be capable of independently resolving technical issues, managing tickets, and mentoring junior team members. Experience with Linux, SSIS, and Power BI is an added advantage, depending on role requirements.

Key Responsibilities:
- Design and develop complex architecture components and BigQuery-based solutions
- Lead a small team of developers in programming, configuring, and testing activities
- Troubleshoot and resolve medium to complex issues and performance bottlenecks
- Monitor and resolve assigned support tickets and service requests
- Conduct code reviews and mentor junior developers
- Ensure timely delivery and quality of deliverables
- Collaborate with cross-functional teams for end-to-end development

Required Skills & Experience:
- Strong hands-on experience in Google BigQuery development
- Proficiency in ETL processes, SQL, and data modeling
- Experience working in Linux environments (preferred)
- SSIS package development and troubleshooting, or Power BI dashboard/report development
- Good communication and problem-solving skills
- Experience managing development tasks independently in an offshore model
Posted 1 month ago
4.0 - 8.0 years
0 - 1 Lacs
Bengaluru
Remote
Offshore Senior Developer:
o Performs detailed design of complex applications and complex architecture components
o May lead a small group of developers in configuring, programming, and testing
o Fixes medium to complex defects and resolves performance problems
o Accountable for service commitments at the individual request level for in-scope applications
o Monitors, tracks, and participates in ticket resolution for assigned tickets
o Manages code reviews and mentors other developers

Skill/Experience/Education
Mandatory Skills: Google BigQuery development; ETL; SQL; Linux (preferable); SSIS package building & troubleshooting; advanced data modeling
Posted 1 month ago
3.0 - 7.0 years
3 - 7 Lacs
Mumbai, Maharashtra, India
On-site
KEY ACCOUNTABILITIES
Learn Supply Chain business processes; deep dive into Sourcing processes
Develop deep knowledge of data-driven Sourcing decision optimization, Spend Analysis, and Should Cost Modelling for General Mills, and drive sourcing value through projects
Serve as the Technical SME on technology for GMI Supply Chain teams, uncovering key business questions and providing analysis, insights, and solutions to answer them
Partner in developing new capabilities around specific Sourcing areas like Spend Analysis and Intelligent Should Cost Modelling that leverage Cloud / SAAS platforms / AI
Proactive learning mindset
Gather project requirements from internal business clients/users; identify and eliminate gaps via partnership with DT architects and AI teams; translate requirements into technical documents; and communicate throughout the entire development process
Use external perspective and internal relationships to improve how we work and what we deliver; keep abreast of what is happening within the digital supply chain space

MINIMUM QUALIFICATIONS
Education: Full-time graduation from an accredited university. A full-time Bachelor's/Master's degree in Computer Science/Electronics, or any equivalent relevant discipline, is preferred. (Mandatory - Note: This is the minimum education criteria which cannot be altered)
7+ years of strong technical experience with web-based/cloud database technologies
4+ years as a technical analyst in requirements gathering, implementing, and supporting IT/software solutions
Strong knowledge of Sourcing and Procurement processes
Experience in delivering value through Spend Analysis / Should Cost Modelling techniques and data/AI-driven automation within Supply Chain
Strong working knowledge of various Sourcing/Procurement related tools and platforms, preferably SAP, Coupa, Palantir, GCP
Strong verbal and written communication skills
Solution-oriented mindset with strong problem-solving and analytical skills
Experience of leading and driving technical projects or teams

PREFERRED QUALIFICATIONS
Recent hands-on experience on projects involving GCP Analytics
Experience with real-time system/data integrations, ETL and reporting technologies
Team player, self-driven individual
Strong knowledge/experience of SDLC with Agile/SCRUM delivery experience
Expert: Problem Solving, Analytical/Data Skills, SQL, GCP BigQuery
Intermediate: Data Lake, Data Warehousing, GCP
Basic: Palantir, BI Tools, Tableau / Google Data Studio, DevOps / CI CD
Posted 1 month ago
10.0 - 15.0 years
10 - 15 Lacs
Mumbai, Maharashtra, India
On-site
KEY ACCOUNTABILITIES
Learn Supply Chain business processes; partner with business SMEs and D&T peers to learn the Supply Chain data needed to drive Data & Analytics for SC
Understand the DnA Cloud technology and how it connects current capabilities and shapes future capabilities
Serve as the technical SME on technology for GMI SC teams, uncovering key business questions and providing analysis, insights, and solutions to answer them
Gather project requirements from internal business clients; identify and eliminate gaps via partnership with D&T architects; translate requirements into technical documents; and communicate throughout the entire development process
Partner with our SC business teams in development of new capabilities that leverage the Cloud and AI; examples include Digital Twin, Control Tower, Predictive Capabilities, and Real-time Alerting
Superior capabilities in data analysis with a passion for data science, AI, and ML
Proactive learning mindset with a passion to increase your skill on DnA capabilities
Use external perspective and internal relationships to improve how we work and what we deliver; keep abreast of what is happening within the digital supply chain space

MINIMUM QUALIFICATIONS
10+ years of overall experience with 7+ years of relevant experience in a data or business analyst position
Bachelor's/Master's degree in Computer Science/Electronics/Software Engineering, or equivalent relevant discipline
Advanced knowledge and strong/recent experience with writing SQL
Effective verbal and written communication and influencing skills at the tactical level
Strong problem-solving abilities and attention to detail
Can-do, positive attitude and commitment to a team delivery approach
Broad understanding of enterprise data warehousing and analytics

PREFERRED QUALIFICATIONS
2+ years of experience with GCP BigQuery and Cloud Storage desired
Familiarity with ADO and Agile-Scrum
Comfortable presenting to senior leaders
Experience in the FMCG/Supply Chain domain
Experience in SAP Supply Chain
Posted 1 month ago
10.0 - 15.0 years
9 - 10 Lacs
Mumbai, Maharashtra, India
On-site
KEY ACCOUNTABILITIES
Drive projects along Supply Chain business processes; deep knowledge and working experience in Sourcing and External Supply Chain Management
Serve as the technical SME on technology for GMI Supply Chain teams, uncovering key business questions and providing analysis, insights, and solutions to answer them
Work as an SME to define project requirements in collaboration with internal business clients/users; establish best-in-industry practices for sourcing tools and eliminate gaps via partnership with D&T architects; communicate throughout the entire development process
Partner in developing new capabilities that leverage Cloud / SAAS platforms
Work with a leadership mindset to own Sourcing / External Supply Chain Management processes
Use external perspective and internal relationships to improve how we work and what we deliver; keep abreast of what is happening within the digital sourcing space

MINIMUM QUALIFICATIONS
Education: Full-time graduation from an accredited university. A full-time Bachelor's/Master's degree in Computer Science/Electronics, or any equivalent relevant discipline, is preferred (Mandatory - Note: This is the minimum education criteria which cannot be altered)
10+ years of strong technical experience with web-based/cloud and database technologies
5+ years as a functional and technical analyst in requirements gathering, implementing, and supporting IT/software solutions
Strong functional knowledge of Sourcing / Supply Chain processes, especially in the global CPG/FMCG industry
Strong working knowledge of various Sourcing/Procurement related tools and platforms, preferably SAP, Coupa, GCP
Working knowledge of Generative AI / Intelligent Automation in Sourcing / Contract insights
Strong verbal and written communication skills
Solution-oriented mindset with strong problem-solving and analytical skills
Experience of leading and driving technical projects or teams

PREFERRED QUALIFICATIONS
Recent hands-on experience with Supply Chain / External Supply Chain (ESC), Procurement, and Finance transformation tools, platforms, and processes
Experience with real-time system/data integrations, ETL and reporting technologies
Excellent stakeholder management skills, including leadership and vendors
Team player, self-driven individual; ability to deal with ambiguous asks and situations
Strong knowledge/experience of SDLC with Agile/SCRUM delivery experience
Expert: SQL, ETL tech, Sourcing / External Supply Chain Processes, Problem Solving
Intermediate: Data Lake / Warehousing, Analytical / Data Skills, Generative AI / Intelligent Automation, GCP BigQuery
Basic: Cloud, Tableau / Google Data Studio, SAP MM, DevOps CI/CD
Posted 1 month ago
6.0 - 7.0 years
0 - 0 Lacs
Hyderabad
Remote
Senior Application Developer with GCP experience in BigQuery, SQL, and Cloud Run, to re-design and re-platform a legacy Revenue Allocation system.
Mandatory: GCP BigQuery, SQL, Cloud Run, Linux Shell Scripting, Kafka, MQ Series, Oracle PL/SQL
Posted 1 month ago
15.0 - 20.0 years
100 - 200 Lacs
Bengaluru
Hybrid
What You'll Do:
Play a key role in developing and driving a multi-year technology strategy for a complex platform
Directly and indirectly manage several senior software engineers (architects) and managers by providing coaching, guidance, and mentorship to grow the team as well as individuals
Lead multiple software development teams - architecting solutions at scale to empower the business, and owning all aspects of the SDLC: design, build, deliver, and maintain
Inspire, coach, mentor, and support your team members in their day-to-day work and their long-term professional growth
Attract, onboard, develop, and retain diverse top talent, while fostering an inclusive and collaborative team and culture (our latest DEI Report)
Lead your team and peers by example. As a senior member of the team, your methodologies, technical and operational excellence practices, and system designs will help to continuously improve our domain
Identify, propose, and drive initiatives to advance the technical skills, standards, practices, architecture, and documentation of our engineering teams
Facilitate technical debate and decision making with an appreciation for trade-offs
Continuously rethink and push the status quo, even when it challenges your/our established ideas

Preferred candidate profile:
Results-oriented, collaborative, pragmatic, and continuous improvement mindset
Hands-on experience driving software transformations within high-growth environments (think complex, cross-continentally owned products)
15+ years of experience in engineering, out of which at least 10 years spent in leading highly performant teams and their managers (please note that a minimum of 5 years in leading fully fledged managers is required)
Experience making architectural and design-related decisions for large-scale platforms, understanding the trade-offs between time-to-market vs. flexibility
Significant experience and vocation in managing and enabling people's growth and performance
Experience designing and building high-scale generalizable products with outstanding user experience
Practical experience in hiring and developing engineering teams and culture, and leading interdisciplinary teams in a fast-paced agile environment
Capability to communicate and collaborate across the wider organization, influencing decisions with and without direct authority, and always with inclusive, adaptable, and persuasive communication
Analytical and decision-making skills that integrate technical and business requirements
Posted 1 month ago
3.0 - 5.0 years
6 - 8 Lacs
Chandigarh
Work from Office
Role Overview
We are seeking a talented ETL Engineer to design, implement, and maintain end-to-end data ingestion and transformation pipelines in Google Cloud Platform (GCP). This role will collaborate closely with data architects, analysts, and BI developers to ensure high-quality, performant data delivery into BigQuery and downstream Power BI reporting layers.

Key Responsibilities
Data Ingestion & Landing: Architect and implement landing zones in Cloud Storage for raw data. Manage buckets/objects and handle diverse file formats (Parquet, Avro, CSV, JSON, ORC).
ETL Pipeline Development: Build and orchestrate extraction, transformation, and loading workflows using Cloud Data Fusion. Leverage Data Fusion Wrangler for data cleansing, filtering, imputation, normalization, type conversion, splitting, joining, sorting, union, pivot/unpivot, and format adjustments.
Data Modeling: Design and maintain fact and dimension tables using Star and Snowflake schemas. Collaborate on semantic layer definitions to support downstream reporting.
Load & Orchestration: Load curated datasets into BigQuery across different zones (raw, staging, curated). Develop SQL-based orchestration and transformation within BigQuery (scheduled queries, scripting). A minimal sketch of such a transformation follows this posting.
Performance & Quality: Optimize ETL jobs for throughput, cost, and reliability. Implement monitoring, error handling, and data quality checks.
Collaboration & Documentation: Work with data analysts and BI developers to understand requirements and ensure data readiness for Power BI. Maintain clear documentation of pipeline designs, data lineage, and operational runbooks.

Required Skills & Experience
Bachelor's degree in Computer Science, Engineering, or a related field.
3+ years of hands-on experience building ETL pipelines in GCP.
Proficiency with Cloud Data Fusion, including Wrangler transformations.
Strong command of SQL, including performance tuning in BigQuery.
Experience managing Cloud Storage buckets and handling Parquet, Avro, CSV, JSON, and ORC formats.
Solid understanding of dimensional modeling: fact vs. dimension tables, Star and Snowflake schemas.
Familiarity with BigQuery data zones (raw, staging, curated) and dataset organization.
Experience with scheduling and orchestration tools (Cloud Composer, Airflow, or BigQuery scheduled queries).
Excellent problem-solving skills and attention to detail.

Preferred (Good to Have)
Exposure to Power BI data modeling and DAX.
Experience with other GCP services (Dataflow, Dataproc).
Familiarity with Git, CI/CD pipelines, and infrastructure as code (Terraform).
Knowledge of Python for custom transformations or orchestration scripts.
Understanding of data governance best practices and metadata management.
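To illustrate the kind of SQL-based staging-to-curated transformation mentioned above, here is a sketch that merges a staging table into a curated fact table via the BigQuery Python client; in practice the same statement could be registered as a BigQuery scheduled query. The dataset, table, and column names are placeholder assumptions.

```python
# Minimal sketch: merge a staging table into a curated fact table in BigQuery.
# Dataset/table/column names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

merge_sql = """
MERGE `my-project.curated.fact_sales` AS tgt
USING (
  SELECT
    order_id,
    customer_key,
    product_key,
    DATE(order_ts) AS order_date,
    amount
  FROM `my-project.staging.sales_raw`
) AS src
ON tgt.order_id = src.order_id
WHEN MATCHED THEN
  UPDATE SET amount = src.amount
WHEN NOT MATCHED THEN
  INSERT (order_id, customer_key, product_key, order_date, amount)
  VALUES (src.order_id, src.customer_key, src.product_key, src.order_date, src.amount)
"""

job = client.query(merge_sql)
job.result()  # wait for completion
print(f"Rows affected: {job.num_dml_affected_rows}")
```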
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
EPAM has a presence across 40+ countries globally, with 55,000+ professionals and numerous delivery centers. Key locations are North America, Eastern Europe, Central Europe, Western Europe, APAC, the Middle East, and development centers in India (Hyderabad, Pune & Bangalore).

Location: Gurgaon/Pune/Hyderabad/Bengaluru/Chennai
Work Mode: Hybrid (2-3 days office in a week)

Job Description:
5-14 years of experience in Big Data and data-related technologies
Expert-level understanding of distributed computing principles
Expert-level knowledge and experience in Apache Spark
Hands-on programming with Python
Proficiency with Hadoop v2, MapReduce, HDFS, Sqoop
Experience with building stream-processing systems, using technologies such as Apache Storm or Spark Streaming
Good understanding of Big Data querying tools, such as Hive and Impala
Experience with integration of data from multiple data sources such as RDBMS (SQL Server, Oracle), ERP, and files
Good understanding of SQL queries, joins, stored procedures, and relational schemas
Experience with NoSQL databases, such as HBase, Cassandra, MongoDB
Knowledge of ETL techniques and frameworks
Performance tuning of Spark jobs
Experience with native cloud data services on AWS/Azure/GCP
Ability to lead a team efficiently
Experience with designing and implementing Big Data solutions
Practitioner of AGILE methodology

WE OFFER
Opportunity to work on technical challenges that may impact across geographies
Vast opportunities for self-development: online university, knowledge sharing opportunities globally, learning opportunities through external certifications
Opportunity to share your ideas on international platforms
Sponsored Tech Talks & Hackathons
Possibility to relocate to any EPAM office for short and long-term projects
Focused individual development
Benefit package: health benefits, medical benefits, retirement benefits, paid time off, flexible benefits
Forums to explore beyond work passion (CSR, photography, painting, sports, etc.)
Posted 2 months ago
4 - 9 years
5 - 12 Lacs
Pune
Work from Office
Night Shift: 9:00 PM to 6:00 AM
Hybrid Mode: 3 days WFO & 2 days WFH

Job Overview
We are looking for a savvy Data Engineer to manage in-progress and upcoming data infrastructure projects. The candidate will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder using Python and a data wrangler who enjoys optimizing data systems and building them from the ground up. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

Responsibilities for Data Engineer
* Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements using Python and SQL / AWS / Snowflake.
* Identify, design, and implement internal process improvements through: automating manual processes using Python, optimizing data delivery, and re-designing infrastructure for greater scalability.
* Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL / AWS / Snowflake technologies.
* Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
* Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
* Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
* Work with data and analytics experts to strive for greater functionality in our data systems.

Qualifications for Data Engineer
* Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
* Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
* Strong analytic skills related to working with unstructured datasets.
* Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
* A successful history of manipulating, processing, and extracting value from large disconnected datasets.

Desired Skillset:
* 2+ years of experience in a Python scripting and data-specific role, with a Bachelor's degree.
* Experience with data processing and cleaning libraries (e.g., pandas, NumPy), web scraping / web crawling for automation of processes, and APIs and how they work (a small pandas cleaning sketch follows this posting).
* Debugging code if it fails and finding the solution. Should have basic knowledge of SQL Server job activity monitoring and of Snowflake.
* Experience with relational SQL and NoSQL databases, including PostgreSQL and Cassandra.
* Experience with most or all of the following cloud services: AWS, Azure, Snowflake, Google.
* Strong project management and organizational skills.
* Experience supporting and working with cross-functional teams in a dynamic environment.
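For a flavor of the Python/pandas data wrangling this role describes, here is a minimal cleaning sketch that normalizes types, handles missing values, de-duplicates, and writes the result to a relational table via SQLAlchemy. The file path, column names, and connection string are placeholder assumptions.

```python
# Minimal sketch: load a raw CSV, clean it with pandas, and persist to a database.
# File path, columns, and connection URL are placeholders.
import pandas as pd
from sqlalchemy import create_engine

raw = pd.read_csv("raw_orders.csv")  # placeholder input file

cleaned = (
    raw
    .rename(columns=str.lower)                      # normalise column names
    .assign(
        order_ts=lambda df: pd.to_datetime(df["order_ts"], errors="coerce"),
        amount=lambda df: pd.to_numeric(df["amount"], errors="coerce"),
    )
    .dropna(subset=["order_id", "order_ts"])        # drop rows missing keys
    .drop_duplicates(subset=["order_id"])           # de-duplicate on primary key
    .fillna({"amount": 0.0})                        # impute missing amounts
)

engine = create_engine("postgresql+psycopg2://user:pass@host/analytics")  # placeholder
cleaned.to_sql("orders_clean", engine, if_exists="replace", index=False)
print(f"Loaded {len(cleaned)} cleaned rows")
```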
Posted 2 months ago