5.0 - 9.0 years
6 - 10 Lacs
Pune, Chennai, Bengaluru
Work from Office
Expertise with SAP MDG configurations for data modelling, UI modelling, process modelling, rules and derivations, BRF+, and replication configurations. Technical knowledge of MDG workflows and custom developments for parallel splits, merges, and parallel-step logic. Technical knowledge of ERP tables and working experience with ABAP developments. Should have worked on an MDG implementation as a technical expert responsible for design, development, testing, and defect management of MDG WRICEF objects.
- Customizations of the data model and UI model, and BRF+ custom developments
- Web service installations and deployments
- Development of web services for MDG inbounds and outbounds
- Good knowledge of BRF+, Workflow, FPM, Web Dynpro, Enhancements, IDocs, and Proxies
- Working knowledge of transport management, best-practice deployment, and code review
- ABAP, ABAP workflows, OO ABAP
Location: PAN India, Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Posted 1 month ago
9.0 - 11.0 years
11 - 15 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Expertise with SAP MDG configurations for data modelling, UI modelling, process modelling, rules and derivations, BRF+, and replication configurations. Technical knowledge of MDG workflows and custom developments for parallel splits, merges, and parallel-step logic. Technical knowledge of ERP tables and working experience with ABAP developments. Should have worked on an MDG implementation as a technical expert responsible for design, development, testing, and defect management of MDG WRICEF objects.
- Customizations of the data model and UI model, and BRF+ custom developments
- Web service installations and deployments
- Development of web services for MDG inbounds and outbounds
- Good knowledge of BRF+, Workflow, FPM, Web Dynpro, Enhancements, IDocs, and Proxies
- Working knowledge of transport management, best-practice deployment, and code review
- ABAP, ABAP workflows, OO ABAP
Solution architecting for MDG:
- Work with the various technical teams to define the future-state solution architecture to support the SAP MDG implementation, covering areas such as source and consumer integration, application architecture, external data/vendor access, security, and performance and scalability strategies to support the various business capabilities
- Develop technical recommendations including integration strategy, synchronization mechanisms, external data capabilities, and technology alternatives
- Share perspectives on best practices and common technical issues & approaches
- Support roadmap creation
- Support high-level logical data model definition and discussions to ensure feasibility with SAP MDG
- Own the Solution Architecture deliverable for SAP MDG
Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, PAN India
Posted 1 month ago
7.0 - 12.0 years
19 - 25 Lacs
Bengaluru
Remote
Data Manager. Skills: SAP Analytics, data modelling.
Posted 1 month ago
9.0 - 14.0 years
50 - 85 Lacs
Noida
Work from Office
About the Role
We are looking for a Staff Engineer specialized in Master Data Management to design and develop our next-generation MDM platform. This role is ideal for engineers who have created or contributed significantly to MDM solutions. You'll lead the architecture and development of our core MDM engine, focusing on data modeling, matching algorithms, and governance workflows that enable our customers to achieve a trusted, 360-degree view of their critical business data.
A Day in the Life
- Collaborate with data scientists, product managers, and engineering teams to define system architecture and design.
- Architect and develop scalable, fault-tolerant MDM platform components that handle various data domains.
- Design and implement sophisticated entity matching and merging algorithms to create golden records across disparate data sources.
- Develop or integrate flexible data modeling frameworks that can adapt to different industries and use cases.
- Create robust data governance workflows, including approval processes, audit trails, and role-based access controls.
- Build data quality monitoring and remediation capabilities into the MDM platform.
- Collaborate with product managers, solution architects, and customers to understand industry-specific MDM requirements.
- Develop REST APIs and integration patterns for connecting the MDM platform with various enterprise systems.
- Mentor junior engineers and promote best practices in MDM solution development.
- Lead technical design reviews and contribute to the product roadmap.
What You Need
- 8+ years of software engineering experience, with at least 5 years focused on developing master data management solutions or components.
- Proven experience creating or significantly contributing to commercial MDM platforms, data integration tools, or similar enterprise data management solutions.
- Deep understanding of MDM concepts including data modeling, matching/merging algorithms, data governance, and data quality management.
- Strong expertise in at least one major programming language such as Java, Scala, Python, or Go.
- Experience with database technologies, including relational (Snowflake, Databricks, PostgreSQL) and NoSQL systems (MongoDB, Elasticsearch).
- Knowledge of data integration patterns and ETL/ELT processes.
- Experience designing and implementing RESTful APIs and service-oriented architectures.
- Understanding of cloud-native development and deployment on AWS or Azure.
- Familiarity with containerization (Docker) and orchestration tools (Kubernetes).
- Experience with event-driven architectures and messaging systems (Kafka, RabbitMQ).
- Strong understanding of data security and privacy considerations, especially for sensitive master data.
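The entity matching and golden-record merging this role centres on can be sketched in miniature with standard-library fuzzy matching. This is an illustrative sketch only, not the platform's actual algorithm: the field names, the 0.7 similarity threshold, and the longest-value survivorship rule are all assumptions.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match(rec_a: dict, rec_b: dict, threshold: float = 0.7) -> bool:
    """Assume two source records refer to the same entity if their names are similar enough."""
    return similarity(rec_a["name"], rec_b["name"]) >= threshold

def merge(records: list[dict]) -> dict:
    """Build a golden record: keep the longest non-empty value seen per attribute (assumed rule)."""
    golden: dict = {}
    for rec in records:
        for key, value in rec.items():
            if value and len(str(value)) > len(str(golden.get(key, ""))):
                golden[key] = value
    return golden

# Hypothetical records for the same customer from two source systems.
crm = {"name": "Acme Corp", "city": "Pune", "phone": ""}
erp = {"name": "ACME Corporation", "city": "Pune", "phone": "+91-20-5550123"}

if match(crm, erp):
    print(merge([crm, erp]))
```

A production matcher would weigh several attributes (address, tax ID, phone) and use trained or probabilistic models rather than a single string ratio, but the match-then-survive shape is the same.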
Posted 1 month ago
7 - 10 years
15 - 25 Lacs
Pune
Hybrid
Lead Data Engineer (Databricks)
Experience: 7 - 10 years
Salary: up to INR 25 Lacs per annum
Preferred Notice Period: within 30 days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Hybrid (Pune)
Placement Type: Permanent
(Note: This is a requirement for one of Uplers' clients.)
Must-have skills: AWS Glue, Databricks, Azure Data Factory, SQL, Python, Data Modelling, ETL
Good-to-have skills: Big Data Pipelines, Data Warehousing
Forbes Advisor (one of Uplers' clients) is looking for a Lead Data Engineer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you.
Role Overview
Position: Lead Data Engineer (Databricks)
Location: Pune, Ahmedabad
Required Experience: 7 to 10 years
Preferred: Immediate joiners
Job Overview: We are looking for an accomplished Lead Data Engineer with expertise in Databricks to join our dynamic team. This role is crucial for enhancing our data engineering capabilities, and it offers the chance to work with advanced technologies, including Generative AI.
Key Responsibilities:
- Lead the design, development, and optimization of data solutions using Databricks, ensuring they are scalable, efficient, and secure.
- Collaborate with cross-functional teams to gather and analyse data requirements, translating them into robust data architectures and solutions.
- Develop and maintain ETL pipelines, leveraging Databricks and integrating with Azure Data Factory as needed.
- Implement machine learning models and advanced analytics solutions, incorporating Generative AI to drive innovation.
- Ensure data quality, governance, and security practices are adhered to, maintaining the integrity and reliability of data solutions.
- Provide technical leadership and mentorship to junior engineers, fostering an environment of learning and growth.
- Stay updated on the latest trends and advancements in data engineering, Databricks, Generative AI, and Azure Data Factory to continually enhance team capabilities.
Qualifications:
- Bachelor's or master's degree in Computer Science, Information Technology, or a related field.
- 7 to 10 years of experience in data engineering, with a focus on Databricks.
- Proven expertise in building and optimizing data solutions using Databricks and integrating with Azure Data Factory/AWS Glue.
- Proficiency in SQL and programming languages such as Python or Scala.
- Strong understanding of data modelling, ETL processes, and Data Warehousing/Data Lakehouse concepts.
- Familiarity with cloud platforms, particularly Azure, and containerization technologies such as Docker.
- Excellent analytical, problem-solving, and communication skills.
- Demonstrated leadership ability, with experience mentoring and guiding junior team members.
Preferred Skills:
- Experience with Generative AI technologies and their applications.
- Familiarity with other cloud platforms, such as AWS or GCP.
- Knowledge of data governance frameworks and tools.
How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of getting shortlisted and meet the client for the interview!
About Our Client: At Inferenz, our team of innovative technologists and domain experts helps accelerate business growth through digital enablement, navigating industries with data, cloud, and AI services and solutions. We dedicate our resources to increasing efficiency and gaining greater competitive advantage by leveraging various next-generation technologies.
About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role is to help our talents find and apply for relevant product and engineering job opportunities and progress in their careers.
(Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 month ago
1 - 2 years
6 - 9 Lacs
Chennai
Work from Office
We are looking for an Associate Data Scientist to analyze and interpret complex datasets, applying advanced statistical and machine learning techniques to extract valuable insights and drive data-driven decision-making. You will work closely with cross-functional teams to identify business challenges and develop innovative solutions that optimize our products and services. In this role, you should be highly analytical with a knack for analysis, math, and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine learning and research.
Key Responsibilities:
- Utilize advanced analytics techniques to analyze large and complex datasets, identifying patterns, trends, and correlations to uncover valuable insights.
- Develop and implement machine learning models and algorithms for predictive and prescriptive analytics.
- Clean, pre-process, and validate data to ensure accuracy, completeness, and consistency for analysis purposes.
- Apply knowledge of Large Language Models (LLMs) to enhance text analysis and generation tasks.
- Explore and evaluate state-of-the-art LLMs, adapt them to specific tasks, and fine-tune models as necessary to improve performance.
- Build and deploy scalable text analysis pipelines to process and analyze text data efficiently.
- Communicate complex findings and insights to both technical and non-technical stakeholders through effective data visualization and storytelling techniques.
- Propose solutions and strategies to business challenges.
- Collaborate with engineering and product development teams.
The ideal candidate:
- Minimum 6 months of experience as a Data Scientist.
- Experience in data analysis and modeling.
- Solid understanding of statistical concepts and machine learning algorithms, with hands-on experience applying them to real-world problems.
- Strong programming skills in languages such as Python or R.
- Experience with machine learning frameworks and libraries such as scikit-learn, TensorFlow, or PyTorch.
- Knowledge and hands-on experience with Large Language Models (LLMs), such as GPT-3 and BERT.
- Familiarity with data visualization tools such as Tableau, Power BI, or Matplotlib to effectively communicate insights.
- B.E/B.Tech in Computer Science, Engineering, or a relevant field; a graduate degree in Data Science or another quantitative field is preferred.
Abilities and traits:
- Analytical mind and business acumen.
- Strong math skills (e.g. statistics, algebra).
- Problem-solving aptitude.
- Excellent communication and presentation skills.
What we offer:
- Competitive salary and benefits package.
- Opportunity to work with cutting-edge technologies.
- Flexible working hours and remote work options.
- Continuous learning and professional development opportunities.
- A collaborative and supportive work environment.
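The text-analysis pipelines mentioned above can be illustrated in miniature with only the Python standard library. This is a hedged sketch of the general shape (tokenize, clean, count), not the team's actual stack; the stop-word list and cleaning rules are invented for the example.

```python
import re
from collections import Counter

# Assumed minimal stop-word list; a real pipeline would use a proper NLP library.
STOP_WORDS = {"the", "a", "and", "of", "to", "in"}

def tokenize(text: str) -> list[str]:
    """Lowercase and split on runs of non-letter characters."""
    return [t for t in re.split(r"[^a-z]+", text.lower()) if t]

def clean(tokens: list[str]) -> list[str]:
    """Drop stop words and very short tokens."""
    return [t for t in tokens if t not in STOP_WORDS and len(t) > 2]

def top_terms(docs: list[str], k: int = 3) -> list[tuple[str, int]]:
    """Run the full pipeline over a corpus and return the k most common terms."""
    counts: Counter = Counter()
    for doc in docs:
        counts.update(clean(tokenize(doc)))
    return counts.most_common(k)

# Hypothetical two-document corpus.
docs = [
    "The model analyzes customer feedback to extract insights.",
    "Customer feedback drives the model improvements.",
]
print(top_terms(docs))
```

In practice the counting step would be replaced by embeddings or an LLM call, but scaling such a pipeline is mostly about making these stages streamable and parallel.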
Posted 1 month ago
8 - 13 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.
Relevant Experience: 8 - 15 years
Location: PAN India
Job Description: Minimum two years' experience in Boomi data modeling.
Interested candidates can share their resume to sankarspstaffings@gmail.com with the below inline details:
Overall Exp:
Relevant Exp:
Current CTC:
Expected CTC:
Notice Period:
Posted 1 month ago
10 - 15 years
15 - 18 Lacs
Hyderabad
Work from Office
Skilled in data modeling (ER/Studio, Erwin), MPP databases (Databricks, Snowflake), GitHub, CI/CD, metadata/lineage, agile/DevOps, SAP HANA/S4, and retail data (IRI, Nielsen). Mail: kowsalya.k@srsinfoway.com
Posted 1 month ago
15 - 24 years
22 - 37 Lacs
Pune
Remote
Responsibilities:
- Understand business objectives and develop models that help achieve them, along with metrics to track their progress.
- Analyze the ML algorithms that could be used to solve a given problem and rank them by their success probability.
- Determine and refine machine learning objectives.
- Design machine learning systems and self-running artificial intelligence (AI) software to automate predictive models.
- Transform data science prototypes and apply appropriate ML algorithms and tools.
- Explore and visualize data to gain an understanding of it, then identify differences in data distribution that could affect performance when deploying the model in the real world.
- Ensure that algorithms generate accurate user recommendations.
- Verify data quality, and/or ensure it via data cleaning.
- Supervise the data acquisition process if more data is needed.
- Define validation strategies.
- Define the pre-processing or feature engineering to be done on a given dataset.
- Solve complex problems with multi-layered data sets, and optimize existing machine learning libraries and frameworks.
- Develop ML algorithms to analyze huge volumes of historical data to make predictions.
- Run tests, perform statistical analysis, and interpret test results.
- Deploy models to production.
- Document machine learning processes.
- Keep abreast of developments in machine learning.
Preferred candidate profile:
- Bachelor's degree in computer science, data science, mathematics, or a related field.
- Experience as a machine learning engineer.
- Proficiency with deep learning frameworks and libraries such as TensorFlow, XGBoost, WaveNet, Keras, and NumPy.
- Advanced proficiency in writing Python, Java, and R code.
- Proficiency with Python and basic libraries for machine learning such as scikit-learn and pandas.
- Extensive knowledge of ML frameworks, libraries, data structures, data modeling, and software architecture for ANN, CNN, and RNN-with-LSTM models.
- Ability to select hardware to run an ML model with the required latency.
- In-depth knowledge of mathematics, statistics, and algorithms.
- Superb analytical and problem-solving abilities.
- Great communication and collaboration skills.
- Excellent time management and organizational abilities.
Benefits of working with OptimEyes:
1. Remote work opportunity (work from home)
2. Work with a top-notch team, cutting-edge technology, and the leadership of extremely successful experts
3. Monthly bonus along with salary
4. Yearly bonus
Posted 1 month ago
4 - 7 years
6 - 11 Lacs
Hyderabad
Work from Office
The position is based out of the CDK Pune office, with the team responsible for developing the Identity & Access Management platform. IAM / EIS is part of Foundational Platform Services within CDK, providing identity & authorization services to various internal & external CDK product platforms. Primary technical activities will relate to the development & design of product software using Core Java, the Spring framework, SQL / PL/SQL using JDBC / Hibernate, RESTful APIs, and Angular / ReactJS.
Skills Required:
- Strong knowledge of Core Java and object-oriented programming concepts.
- Strong knowledge of the Spring ecosystem (IoC concepts, Spring Boot, Spring Web).
- Strong understanding of RESTful web services and JSON.
- Functional expertise in the IAM domain would be preferable.
- Hands-on experience working with frameworks such as OAuth 2.0, SAML, and OpenID, and an understanding of the authentication & authorization domain, with exposure to OKTA, Azure, and the like, is a plus.
- Experience in messaging technologies & platforms such as RabbitMQ / Kafka.
- Expertise in Docker, Kubernetes, AWS, and microservices.
- Experience with relational databases (Aurora PostgreSQL), including designing queries, data modelling, and PL/SQL scripts.
- Experience with a UI framework, preferably Angular / ReactJS.
- Testing frameworks like JUnit, Selenium, Cypress, and Cucumber.
- Strong analytical skills, including the ability to understand complex business workflows and determine how they can be implemented within the system.
- Strong communication skills; able to work in a global delivery model and with counterparts across the globe.
- Good understanding of agile methodology for project delivery.
- Basic skills with software configuration management tools and integrated development environments.
Core Responsibilities:
- Work with product / technical managers and architects to understand the requirements.
- Come up with solution designs, evaluate alternate approaches, and build PoCs.
- Manage, write & test code as per requirements.
- Troubleshoot code issues, fix bugs, and apply a solution-driven approach.
- Execute the full software development lifecycle.
- Document and maintain application functionality, use cases, integration approach, etc.
- Comply with project plans and industry standards.
- Monitor and improve application performance.
Required Qualifications:
- 4-7 years of experience
- BE, BTech, or computer science graduate
Posted 1 month ago
0 - 1 years
1 - 1 Lacs
Gurugram
Work from Office
Key Responsibilities:
- Enter and update data in the company's database and systems accurately and efficiently.
- Verify data for accuracy and completeness.
- Process and maintain confidential information.
- Review and correct any discrepancies or errors in data.
- Collect, clean, and organize data from various sources to ensure accuracy and completeness.
- Analyze and interpret complex datasets to identify trends, patterns, and insights.
- Generate reports, dashboards, and visualizations to communicate findings effectively.
- Collaborate with business units to understand their needs and translate them into data requirements.
- Develop and implement data models and forecasts to predict future trends.
- Ensure the integrity and security of company data.
- Provide information about products, services, policies, and procedures.
- Identify opportunities for improvement and suggest improvements to enhance the customer experience.
- Stay up to date with the latest industry trends and technologies related to data analytics.
Posted 1 month ago
3 - 4 years
5 - 6 Lacs
Noida, Gurugram, Bengaluru
Work from Office
Senior Engineer: The TCA practice has experienced significant growth in demand for engineering & architecture roles from CST, driven by client needs that extend beyond traditional data & analytics architecture skills. There is an increasing emphasis on deep technical skills, such as strong expertise in Azure, Snowflake, Azure OpenAI, and Snowflake Cortex, along with a solid understanding of their respective functionalities. The individual will work on a robust pipeline of TCA-driven projects with pharma clients. This role offers significant opportunities for progression within the practice.
What You'll Do:
- Work on high-impact projects with leading clients, with exposure to complex technological initiatives.
- Receive learning support through organization-sponsored trainings & certifications.
- Enjoy a collaborative and growth-oriented team culture, with a clear progression path within the practice.
- Work on the latest technologies, with a continuous-learning mindset and certifications in newer areas.
- Deliver client projects successfully, partnering with project leads and AEEC leads to deliver complex projects & grow the TCA practice.
- Develop expert tech solutions for client needs, earning positive feedback from clients and team members.
What You'll Bring:
- 3-4 years of experience in RDF ontologies, RDF-based knowledge graphs (AnzoGraph DB preferred), data modelling, the Azure cloud, and data engineering.
- Understanding of ETL processes, data pulls using Azure services via a polling mechanism, and API/middleware development using Azure services.
- Strong ability to identify data anomalies, design data validation rules, and perform data cleanup to ensure high-quality data.
- Experience with pharma or life sciences data: familiarity with pharmaceutical datasets, including product, patient, or healthcare provider data, is a plus.
Posted 1 month ago
7 - 9 years
9 - 11 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking-domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.
Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Posted 1 month ago
10 - 14 years
37 - 40 Lacs
Pune
Work from Office
We are looking for an experienced SAP Analytics Cloud (SAC) Consultant to implement, configure, and optimize SAC solutions for data analytics, reporting, planning, and forecasting. The ideal candidate will work closely with business stakeholders.
Required candidate profile: Proficiency in SAP Analytics Cloud (SAC) features such as stories, data models, and planning functionalities. Experience with data modelling, reporting, and dashboarding.
Posted 1 month ago
4 - 9 years
18 - 25 Lacs
Bengaluru
Hybrid
Skill required: Data Engineers - Azure
Designation: Sr Analyst / Consultant
Job Location: Bengaluru
Qualifications: BE/BTech
Years of Experience: 4 - 11 years
Overall Purpose of Job: Understand client requirements and build ETL solutions using Azure Data Factory, Azure Databricks & PySpark. Build solutions in such a way that they can absorb clients' change requests very easily. Find innovative ways to accomplish tasks and handle multiple projects simultaneously and independently. Work with data & appropriate teams to effectively source required data. Identify data gaps and work with client teams to effectively communicate the findings to stakeholders/clients.
Responsibilities:
- Develop ETL solutions to populate a centralized repository by integrating data from various data sources.
- Create data pipelines, data flows, and data models according to business requirements.
- Implement all transformations according to business needs.
- Identify data gaps in the data lake and work with relevant data/client teams to get the data required for dashboarding/reporting.
- Strong experience working on the Azure data platform, Azure Data Factory, and Azure Databricks.
- Strong experience working on ETL components and scripting languages like PySpark and Python.
- Experience in creating pipelines, alerts, email notifications, and scheduling jobs.
- Exposure to development/staging/production environments.
- Provide support in creating, monitoring, and troubleshooting the scheduled jobs.
- Work effectively with the client and handle client interactions.
Skills Required:
- Bachelor's degree in Engineering or Science, or equivalent, with at least 4-11 years of overall experience in data management, including data integration, modeling & optimization.
- Minimum 4 years of experience working on the Azure cloud, Azure Data Factory, and Azure Databricks.
- Minimum 3-4 years of experience in PySpark, Python, etc. for data ETL.
- In-depth understanding of data warehouse and ETL concepts and modeling principles.
- Strong ability to design, build, and manage data.
- Strong understanding of data integration.
- Strong analytical and problem-solving skills.
- Strong communication & client interaction skills.
- Ability to design databases to store the huge volumes of data necessary for reporting & dashboarding.
- Willingness to acquire knowledge of new technologies, with good analytical and interpersonal skills and the ability to interact with individuals at all levels.
Interested candidates can reach out to Neha at 9599788568 or neha.singh@mounttalent.com
Posted 1 month ago
5 - 10 years
5 - 10 Lacs
Pune, Gurugram, Bengaluru
Work from Office
Hiring for a top management consulting organisation for the Data Architect - AI role; kindly go through the JD in detail.
As a key pillar of our organization, the Engineering Products team works across various fields from a data & AI perspective (Data Strategy, AI Strategy, Data Modelling, Data Architecture, Cloud Assessment, Industry & AI Value Strategy, etc.), helping our customers set up a strong data platform foundation & target roadmap and scale & evolve towards an AI/Gen AI & advanced analytics vision that meets the evolving future needs of technology advancement.
Location: Gurgaon/Bangalore/Pune/Hyderabad/Mumbai
Who are we looking for?
- Years of experience: candidates should typically have at least 5-10 years of experience in AI strategy, management, or a related field. This experience should include hands-on involvement in developing and implementing AI strategies for clients across various industries.
Desired experience:
- Minimum 6 years of experience working with clients in the products industry (Life Sciences, CPG, Industrial & Retail) that are heavily influenced by AI & Gen AI preferences and behaviors. Candidates who have a deep understanding of AI & Gen AI trends and market dynamics can provide valuable insights and strategic guidance to clients.
- Minimum 5 years of proven experience & deep expertise in developing and implementing AI strategy frameworks tailored to the specific needs and aims of clients within the LS, Industrial, CPG, and Retail sectors. The ability to craft innovative AI solutions that address industry-specific challenges and drive tangible business outcomes will set you apart.
- Minimum 6 years of strong consulting background, with a demonstrated ability to lead client engagements from start to completion. Consulting experience should encompass stakeholder management and effective communication to ensure successful project delivery.
Posted 1 month ago
12 - 18 years
14 - 24 Lacs
Hyderabad
Work from Office
Overview
Deputy Director - Data Engineering
PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation, unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences, and IoT. The key to winning in these areas is being able to leverage enterprise data foundations, built on PepsiCo's global business scale, to enable business insights, advanced analytics, and new product development. PepsiCo's Data Management and Operations team is tasked with developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation.
What PepsiCo Data Management and Operations does:
- Maintain a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company.
- Take responsibility for day-to-day data collection, transportation, maintenance/curation, and access to the PepsiCo corporate data asset.
- Work cross-functionally across the enterprise to centralize data and standardize it for use by business, data science, or other stakeholders.
- Increase awareness about available data and democratize access to it across the company.
As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build & operations and drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create & lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company.
As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems.
Responsibilities
This is the data engineering lead role for D&Ai data modernization (MDIP). Ideally, the candidate must be flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending on the coverage requirements of the job. The candidate can work with their immediate supervisor to change the work schedule on a rotational basis depending on the product and project requirements.
- Manage a team of data engineers and data analysts by delegating project responsibilities and managing their flow of work, as well as empowering them to realize their full potential.
- Design, structure, and store data in unified data models and link them together to make the data reusable for downstream products.
- Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products.
- Create reusable accelerators and solutions to migrate data from legacy data warehouse platforms such as Teradata to Azure Databricks and Azure SQL.
- Enable and accelerate standards-based development, prioritizing reuse of code; adopt test-driven development, unit testing, and test automation with end-to-end observability of data.
- Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality, performance, and cost.
- Collaborate with internal clients (product teams, sector leads, data science teams) and external partners (SI partners/data providers) to drive solutioning and clarify solution requirements.
- Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects to build and support the right domain architecture for each application, following well-architected design standards.
- Define and manage SLAs for data products and processes running in production.
- Create documentation for learnings and knowledge transfer to internal associates.

Qualifications

- 12+ years of overall technology experience, including at least 5+ years of hands-on software development, data engineering, and systems architecture.
- 8+ years of experience with Data Lakehouse, Data Warehousing, and Data Analytics tools.
- 6+ years of experience in SQL optimization and performance tuning on MS SQL Server, Azure SQL, or any other popular RDBMS.
- 6+ years of experience in Python/PySpark/Scala programming on big data platforms like Databricks.
- 4+ years of cloud data engineering experience in Azure or AWS. Fluent with Azure cloud services; Azure Data Engineering certification is a plus.
- Experience with integration of multi-cloud services with on-premises technologies.
- Experience with data modelling, data warehousing, and building high-volume ETL/ELT pipelines.
- Experience with data profiling and data quality tools like Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one business intelligence tool such as Power BI or Tableau.
- Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes.
- Experience with version control systems like ADO or GitHub and CI/CD tools for DevOps automation and deployments.
- Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools.
- Experience with statistical/ML techniques is a plus.
- Experience building solutions in the retail or supply chain space is a plus.
- Understanding of metadata management, data lineage, and data glossaries is a plus.
- BA/BS in Computer Science, Math, Physics, or other technical fields.

The candidate must be flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending on the product and project coverage requirements of the job. Candidates are expected to be in the office at the assigned location at least 3 days a week, with the in-office days coordinated with their immediate supervisor.

Skills, Abilities, Knowledge:
- Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management.
- Proven track record of leading and mentoring data teams.
- Strong change manager; comfortable with change, especially that which arises through company growth.
- Ability to understand and translate business requirements into data and technical requirements.
- High degree of organization and ability to manage multiple, competing projects and priorities simultaneously.
- Positive and flexible attitude, adjusting to different needs in an ever-changing environment.
- Strong leadership, organizational, and interpersonal skills; comfortable managing trade-offs.
- Fosters a team culture of accountability, communication, and self-management.
- Proactively drives impact and engagement while bringing others along.
- Consistently attains/exceeds individual and team goals.
- Ability to lead others without direct authority in a matrixed environment.
- Comfortable working in a hybrid environment with teams consisting of contractors as well as FTEs spread across multiple PepsiCo locations.
Domain knowledge in the CPG industry with a supply chain/GTM background is preferred.
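The monitoring responsibility above ("automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality") can be sketched with a minimal, hypothetical example in pure Python; the function and metric names are illustrative assumptions, not from the posting:

```python
from typing import Any

def batch_quality_metrics(rows: list[dict[str, Any]], columns: list[str]) -> dict[str, Any]:
    """Compute simple operational KPIs for one pipeline batch:
    total row count and per-column null rate."""
    total = len(rows)
    null_rates = {}
    for col in columns:
        nulls = sum(1 for r in rows if r.get(col) is None)
        null_rates[col] = nulls / total if total else 0.0
    return {"row_count": total, "null_rate": null_rates}

# Hypothetical batch with one missing customer_id
batch = [
    {"customer_id": 1, "amount": 10.0},
    {"customer_id": None, "amount": 5.5},
    {"customer_id": 3, "amount": 7.25},
]
metrics = batch_quality_metrics(batch, ["customer_id", "amount"])
```

In a production framework, metrics like these would typically be emitted to a monitoring store and compared against SLA thresholds before a batch is promoted downstream.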
Posted 1 month ago
8 - 10 years
15 - 30 Lacs
Hyderabad, Pune, Chennai
Work from Office
Warm Greetings from SP Staffing Services Pvt Ltd!

Experience: 8-10 yrs
Work Location: Hyderabad/Pune/Chennai

Job Description:
We are seeking a Senior Power BI Developer with over 5 years of experience in data analytics and business intelligence, skilled in Power BI development, data modeling, data visualization, DAX, and API integration. The ideal candidate will have a deep understanding of Power BI data modeling and data visualization, and will be responsible for designing, developing, and maintaining business intelligence solutions that help our organization make data-driven decisions.

Key Responsibilities:
- Design, develop, and maintain Power BI reports and dashboards.
- Collaborate with business stakeholders to understand their data needs and translate them into technical requirements.
- Create and optimize data models to support reporting and analytics.
- Integrate Power BI with various data sources, including databases, cloud services, and APIs.
- Ensure data accuracy and integrity in all reports and dashboards.
- Provide training and support to end users on Power BI tools and functionalities.
- Stay up to date with the latest Power BI features and best practices.

Interested candidates, kindly share your updated resume to ramya.r@spstaffing.in or contact 8667784354 (WhatsApp: 9597467601) to proceed further.
Posted 1 month ago
8 - 13 years
10 - 20 Lacs
Bengaluru
Work from Office
Hi, Greetings from Sun Technology Integrators!!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find below the job description for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP:
- C.CTC
- E.CTC
- Notice Period
- Current location
- Are you serving notice period / immediate
- Exp in Snowflake
- Exp in Matillion

Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food).

Please let me know if any of your friends are looking for a job change, and kindly share references. Only serving / immediate candidates can apply.

Interview Process: 1 round (virtual) + final round (F2F)
Please note: Work From Office only (no hybrid or Work From Home).

Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2
Preferred skills: SSRS, SSIS, Informatica, Shell Scripting

Venue Details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com

Thanks and Regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com
Posted 1 month ago
3 - 8 years
10 - 20 Lacs
Bengaluru
Work from Office
Hi, Greetings from Sun Technology Integrators!!

This is regarding a job opening with Sun Technology Integrators, Bangalore. Please find below the job description for your reference. Kindly let me know your interest and share your updated CV to nandinis@suntechnologies.com with the below details ASAP:
- C.CTC
- E.CTC
- Notice Period
- Current location
- Are you serving notice period / immediate
- Exp in Snowflake
- Exp in Matillion

Shift timings: 2:00 PM-11:00 PM (free cab drop facility + food).

Please let me know if any of your friends are looking for a job change, and kindly share references. Only serving / immediate candidates can apply.

Interview Process: 2 rounds (virtual) + final round (F2F)
Please note: Work From Office only (no hybrid or Work From Home).

Mandatory skills: Snowflake, SQL, ETL, Data Ingestion, Data Modeling, Data Warehouse, Python, Matillion, AWS S3, EC2
Preferred skills: SSRS, SSIS, Informatica, Shell Scripting

Venue Details:
Sun Technology Integrators Pvt Ltd
No. 496, 4th Block, 1st Stage, HBR Layout (a stop ahead of Nagawara towards K. R. Puram)
Bangalore 560043
Company URL: www.suntechnologies.com

Thanks and Regards,
Nandini S | Sr. Technical Recruiter
Sun Technology Integrators Pvt. Ltd.
nandinis@suntechnologies.com
www.suntechnologies.com
Posted 1 month ago
8 - 13 years
30 - 35 Lacs
Bengaluru
Work from Office
Power BI Architect - J48917

As a Data Engineer - BI Analytics & DWH, you will play a pivotal role in designing and implementing comprehensive business intelligence solutions that empower our organization to make data-driven decisions. You will leverage your expertise in Power BI, Tableau, and ETL processes to create scalable architectures and interactive visualizations. This position requires a strategic thinker with strong technical skills and the ability to collaborate effectively with stakeholders at all levels.

Key Responsibilities:
- BI Architecture & DWH Solution Design: Develop and design scalable BI analytical and DWH solutions that meet business requirements, leveraging tools such as Power BI and Tableau.
- Data Integration: Oversee ETL processes using SSIS to ensure efficient data extraction, transformation, and loading into data warehouses.
- Data Modelling: Create and maintain data models that support analytical reporting and data visualization initiatives.
- Database Management: Use SQL to write complex queries and stored procedures, and manage data transformations using joins and cursors.
- Visualization Development: Lead the design of interactive dashboards and reports in Power BI and Tableau, adhering to best practices in data visualization.
- Collaboration: Work closely with stakeholders to gather requirements and translate them into technical specifications and architecture designs.
- Performance Optimization: Analyse and optimize BI solutions for performance, scalability, and reliability.
- Data Governance: Implement best practices for data quality and governance to ensure accurate reporting and compliance.
- Team Leadership: Mentor and guide junior BI developers and analysts, fostering a culture of continuous learning and improvement.
- Azure Databricks: Leverage Azure Databricks for data processing and analytics, ensuring seamless integration with existing BI solutions.
Required Candidate Profile:
- Experience: 8 to 15 years
- Degree: BE - Computer Science / IT
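The database-management requirement above ("complex queries, stored procedures, joins and cursors") often comes down to preferring a single set-based join and aggregation over looping through a cursor row by row. A minimal, runnable sketch using Python's built-in sqlite3; the table and column names are illustrative assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical warehouse tables with a few sample rows
cur.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (customer_id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 10, 100.0), (2, 10, 50.0), (3, 20, 75.0);
    INSERT INTO customers VALUES (10, 'South'), (20, 'West');
""")

# Set-based join + aggregation in one statement, instead of iterating
# a cursor and summing per row in application code.
cur.execute("""
    SELECT c.region, SUM(o.amount) AS total_amount
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    GROUP BY c.region
    ORDER BY c.region
""")
totals = cur.fetchall()
conn.close()
```

The same join pattern carries over to SSIS data flows or Databricks SQL; the engine-side aggregation is what keeps large transformations performant.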
Posted 1 month ago