6.0 - 8.0 years
8 - 12 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Opening: Senior Data Engineer (Remote, Contract 6 Months)
Remote | Contract Duration: 6 Months | Experience: 6-8 Years

We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions.

#KeyResponsibilities
- Build scalable ETL pipelines and implement robust data solutions in Azure.
- Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults.
- Design and maintain a secure and efficient data lake architecture.
- Work with stakeholders to gather data requirements and translate them into technical specs.
- Implement CI/CD pipelines for seamless data deployment using Azure DevOps.
- Monitor data quality, performance bottlenecks, and scalability issues.
- Write clean, organized, reusable PySpark code in an Agile environment.
- Document pipelines, architectures, and best practices for reuse.

#MustHaveSkills
- Experience: 6+ years in Data Engineering
- Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults
- Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance
- Agile, SDLC, Containerization (Docker), Clean coding practices

#GoodToHaveSkills
- Event Hubs, Logic Apps
- Power BI
- Strong logic building and competitive programming background

Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
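As an illustration of the "clean, reusable" pipeline code the posting asks for, here is a minimal sketch of a common ETL cleanup step, keeping only the latest record per key. Plain Python dictionaries stand in for Spark DataFrames, and the column names (`customer_id`, `updated_at`) are hypothetical, not taken from the posting; in the actual role this would typically be a PySpark `Window`/`dropDuplicates` operation.

```python
def deduplicate_latest(rows, key="customer_id", ts="updated_at"):
    """Keep only the most recent record per key -- a common ETL cleanup step."""
    latest = {}
    for row in rows:
        k = row[key]
        # ISO-8601 date strings compare correctly as plain strings
        if k not in latest or row[ts] > latest[k][ts]:
            latest[k] = row
    return list(latest.values())

raw = [
    {"customer_id": 1, "updated_at": "2024-01-01", "city": "Mumbai"},
    {"customer_id": 1, "updated_at": "2024-03-01", "city": "Pune"},
    {"customer_id": 2, "updated_at": "2024-02-01", "city": "Delhi"},
]
clean = deduplicate_latest(raw)
```

Writing the transform as a small, pure function like this is what makes it testable and reusable across pipelines.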
Posted 2 months ago
7.0 - 9.0 years
9 - 14 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.

Required Skills & Qualifications:
- 7+ years of experience in Data Modelling and Data Analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with Dimensional Modelling and Data Warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
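To make the dimensional-modelling idea concrete: in a star schema, fact rows carry surrogate keys that resolve against dimension tables. A toy sketch in Python (the table and column names are hypothetical; in the role this would be Snowflake SQL joining a fact table to its dimensions):

```python
# Tiny star-schema lookup: a fact table joined to a date dimension via surrogate keys.
dim_date = {
    101: {"date": "2024-01-15", "quarter": "Q1"},
    102: {"date": "2024-07-20", "quarter": "Q3"},
}

fact_txn = [
    {"txn_id": "T1", "date_key": 101, "amount": 250.0},
    {"txn_id": "T2", "date_key": 102, "amount": 400.0},
]

def revenue_by_quarter(facts, dates):
    """Aggregate a fact measure by an attribute looked up from the dimension."""
    totals = {}
    for f in facts:
        q = dates[f["date_key"]]["quarter"]
        totals[q] = totals.get(q, 0.0) + f["amount"]
    return totals

result = revenue_by_quarter(fact_txn, dim_date)
```

The same shape in SQL would be a join from the fact table to `dim_date` with a `GROUP BY` on the quarter attribute.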
Posted 2 months ago
7.0 - 12.0 years
20 - 27 Lacs
Bengaluru
Remote
Profisee Consultant

Job Summary:
We are seeking a skilled and experienced Profisee Consultant to join our data management team. In this role, you will be responsible for designing, developing, and implementing Master Data Management (MDM) solutions using the Profisee platform. You'll work closely with business and IT stakeholders to ensure data integrity, governance, and usability across the enterprise.

Key Responsibilities:
- Lead and participate in the implementation of Profisee MDM solutions.
- Work with stakeholders to gather and analyze MDM requirements.
- Design and configure Profisee entities, hierarchies, workflows, and match/merge rules.
- Integrate Profisee with other enterprise systems (ERP, CRM, data warehouses).
- Develop and maintain data quality rules and governance frameworks.
- Provide ongoing support, troubleshooting, and optimization of MDM solutions.
- Deliver documentation, training, and knowledge transfer to internal teams.
- Ensure compliance with data governance, privacy, and security policies.

Required Qualifications:
- Proven experience with the Profisee MDM platform (3+ years preferred).
- Strong understanding of Master Data Management principles and best practices.
- Experience with data modeling, SQL Server, and integration tools (e.g., SSIS).
- Familiarity with data quality, data stewardship, and data governance concepts.
- Ability to gather requirements and translate them into technical solutions.
- Excellent problem-solving, communication, and stakeholder management skills.
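The match/merge rules mentioned above decide when two master records describe the same real-world entity and how to survive the best values into a golden record. In Profisee this is configured declaratively rather than coded; the sketch below is only a conceptual Python analogue, with the fields, threshold, and survivorship rule all hypothetical:

```python
from difflib import SequenceMatcher

def is_match(a, b, threshold=0.85):
    """Crude match rule: fuzzy name similarity plus an exact postal-code match."""
    name_sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    return name_sim >= threshold and a["postal"] == b["postal"]

def merge(a, b):
    """Survivorship: prefer the more recently updated non-empty value per field."""
    newer, older = (a, b) if a["updated"] >= b["updated"] else (b, a)
    return {k: newer[k] or older[k] for k in newer}

r1 = {"name": "Acme Corp.", "postal": "560001", "phone": "",         "updated": "2024-05-01"}
r2 = {"name": "ACME CORP",  "postal": "560001", "phone": "080-1234", "updated": "2024-01-01"}
```

Here `r1` and `r2` would be matched, and the merged record would take the newer name while backfilling the missing phone number from the older record.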
Posted 2 months ago
10.0 - 14.0 years
12 - 18 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to define and develop the Enterprise Data Structure, including Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening the modelling standards and business information.

Do
1. Define and develop a Data Architecture that aids the organization and clients in new/existing deals
   a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations that maximize the value of data and information assets and protect the organization from disruptions while also embracing innovation
   b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
   c. Create the data strategy and road maps for the Reference Data Architecture as required by clients
   d. Engage all stakeholders to implement data governance models and ensure that implementation is carried out for every change request
   e. Ensure that data storage and database technologies are supported by the enterprise's data management and infrastructure
   f. Develop, communicate, support, and monitor compliance with Data Modelling standards
   g. Oversee and monitor all frameworks to manage data across the organization
   h. Provide insights on database storage and platforms for ease of use and minimal manual work
   i. Collaborate with vendors to ensure integrity, objectives, and system configuration
   j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
   k. Present the data repository, objects, and source systems, along with data scenarios, for front-end and back-end usage
   l. Define high-level data migration plans to transition data from the source to the target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
   m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
   n. Oversee all data standards/references/papers for proper governance
   o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
   p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
   q. Provide solutions for RFPs received from clients and ensure overall implementation assurance
      i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives
      ii. Analyse the technology environment, enterprise specifics, and client requirements to set up a collaboration solution for big/small data
      iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
      iv. Define and understand current issues and problems and identify improvements
      v. Evaluate and recommend solutions that integrate with the overall technology ecosystem, keeping consistency throughout
      vi. Understand the root-cause problems in integrating business and product units
      vii. Validate the solution/prototype from the technology, cost-structure, and customer-differentiation points of view
      viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
      ix. Track industry and application trends and relate these to planning current and future IT needs
2. Build the enterprise technology environment for data architecture management
   a. Develop, maintain, and implement standard patterns for data layers, data stores, data hub & lake, and data management processes
   b. Evaluate all implemented systems to determine their viability in terms of cost-effectiveness
   c. Collect all structured and unstructured data from different places and integrate the data into one database form
   d. Work through every stage of data processing: analysing, creating physical data model designs, solutions, and reports
   e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
   f. Implement the best security practices across all databases based on accessibility and technology
   g. Demonstrate a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
   h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration
3. Enable delivery teams by providing optimal delivery solutions/frameworks
   a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
   b. Define database physical structure, functional capabilities, security, back-up, and recovery specifications
   c. Develop and establish relevant technical, business process, and overall support metrics (KPI/SLA) to drive results
   d. Monitor system capabilities and performance by performing tests and configurations
   e. Integrate new solutions and troubleshoot previously occurring errors
   f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
   g. Identify technical, process, and structural risks and prepare a risk mitigation plan for all projects
   h. Ensure quality assurance of all architecture or design decisions and provide technical mitigation support to the delivery teams
   i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
   j. Help the support and integration teams achieve better efficiency and client experience, including ease of use through AI methods
   k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
   l. Ensure architecture principles and standards are consistently applied to all projects
   m. Ensure optimal client engagement
      i. Support the pre-sales team in presenting the entire solution design and its principles to the client
      ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met
      iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

Mandatory Skills: Datacenter Architecting - Unix Stack.
Posted 2 months ago
3.0 - 7.0 years
30 - 32 Lacs
Mohali
Work from Office
We are seeking a highly skilled and experienced Senior Data Engineer to join our team. This role will be instrumental in designing, developing, and maintaining our data infrastructure, ensuring the effective processing and analysis of large datasets. The ideal candidate will have a strong background in data modeling and data architecture, and experience with a variety of data technologies.

Key Responsibilities:
- Design and implement robust data pipelines, ETL processes, and data warehouses to support our analytics and reporting needs.
- Develop and maintain data models, schemas, and metadata to ensure data quality and consistency.
- Collaborate with data scientists, analysts, and business stakeholders to understand their requirements and translate them into technical solutions.
- Optimize data pipelines for performance and scalability to handle large volumes of data.
- Stay up-to-date with the latest data technologies and trends to drive innovation and efficiency.

Responsibilities:
- Design, develop, and maintain scalable data architectures, pipelines, APIs, and integrations.
- Create and optimize data models to support efficient data processing and storage.
- Manage and maintain databases, including Postgres and SSIS, ensuring data integrity and performance.
- Develop, deploy, and manage ETL and EDI processes.
- Develop and maintain scripts and applications using Python for data processing and analysis.
- Ensure data security and compliance with relevant regulations and best practices.
- Leverage cloud services (e.g., Azure, AWS) for data storage, processing, and analytics.
- Collaborate with cross-functional teams to gather requirements and provide data-driven insights.
- Implement and manage caching solutions to improve data retrieval speeds.
- Create and maintain comprehensive documentation for all data processes and architectures.
- Utilize data visualization tools to create interactive dashboards and reports for stakeholders.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field (or equivalent work experience).
- Minimum of 5 years of experience in data engineering or a related field.
- Proficiency in data modeling, data architecture, and database management with Postgres or SSIS.
- Experience with electronic medical records (EMRs) and an understanding of the healthcare industry is strongly desired.
- Strong SQL skills and experience with common ETL tools.
- Proficiency in Python for data processing and automation.
- Experience with common caching solutions (e.g., Redis, Memcached).
- Expertise in data security best practices and regulatory compliance.
- Hands-on experience with cloud platforms like Azure and AWS.
- Proficiency with data visualization tools such as Power BI, Tableau, or similar.
- Excellent problem-solving skills and the ability to troubleshoot data issues effectively.
- Strong communication skills, both written and verbal, with the ability to explain complex technical concepts to non-technical stakeholders.

Desired Skills:
- Knowledge of data warehousing concepts and methodologies.
- Experience with Agile/Scrum methodologies.
- Familiarity with Power BI administration and deployment.
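The caching requirement above (Redis, Memcached) is about avoiding repeated expensive lookups. The same memoization idea can be shown in-process with the standard library; the function name and return value here are hypothetical stand-ins for a slow database or EMR query, and a real deployment would use a shared cache like Redis instead:

```python
from functools import lru_cache

calls = {"count": 0}  # tracks how often the expensive lookup actually runs

@lru_cache(maxsize=256)
def fetch_patient_summary(patient_id: int) -> str:
    """Stand-in for an expensive database/EMR lookup (hypothetical)."""
    calls["count"] += 1
    return f"summary-for-{patient_id}"

fetch_patient_summary(42)  # cache miss: executes the lookup
fetch_patient_summary(42)  # cache hit: served without re-querying
```

Redis or Memcached applies the same pattern across processes and machines, with the added concern of choosing an expiry (TTL) so cached data does not go stale.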
Posted 2 months ago
5.0 - 7.0 years
18 - 20 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
- Design, develop, and maintain Cognos dashboards and reports to visualize data insights and support business objectives.
- Collaborate with stakeholders to gather requirements and translate them into effective visualizations.
- Optimize Cognos performance and ensure scalability of dashboards for large datasets.
- Develop and maintain SQL queries for data transformation and analytics processes.
- Work closely with the data engineering team to ensure the availability and quality of data for reporting purposes.
- Conduct thorough testing and validation of Cognos dashboards to ensure accuracy and reliability of data visualizations.
- Provide technical support and troubleshooting assistance to end-users regarding Cognos dashboards and related data issues.
- Stay updated with the latest Cognos features, best practices, and industry trends to continuously improve reporting solutions.
- Solid RDBMS experience, with data modelling and advanced querying skills.
- Good to have: knowledge of other BI tools such as Tableau, and BI migration experience from a source BI tool to Tableau.
- Strong ability to extract information by questioning, active listening, and interviewing.
- Strong analytical and problem-solving skills.
- Excellent writing skills, with the ability to create clear requirements, specifications, and documentation.

Location: Remote, Hyderabad, Ahmedabad, Pune, Chennai, Kolkata
Posted 2 months ago
3.0 - 5.0 years
5 - 7 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to design, program, simulate, and test the automation product or process to achieve the required efficiency and effectiveness.

Do
1. Be instrumental in understanding the software requirements and design of the product
   - Analyze and understand the current technology architecture, system interdependencies, and application stacks
   - Formulate project plans by working with project management, outlining the steps required to develop the project, and submitting project plans to project management for approval
   - Understand current operating procedures by consulting with users/partners/clients and reviewing project objectives on a regular basis
   - Contribute to the automation roadmap design and testing process improvements by researching automation architectures and developing new automation solutions
   - Improve and maintain the automation framework to be used horizontally across our technology stacks, and build out reusable libraries across our business line verticals
2. Design and execute software development and reporting
   - Ensure the environment is ready for the execution process by designing test plans, developing test cases/scenarios/usage cases, and executing these cases
   - Develop technical specifications and plans and resolve complex technical design issues
   - Participate in and conduct design activities with the development team relating to testing of the automation processes for both functional and non-functional requirements
   - Implement, track, and report key metrics to assure full coverage of functional and non-functional requirements through automation
   - Eliminate errors by owning the testing and validation of code
   - Track problems, resolutions, and bug fixes throughout the project and create a comprehensive database of defects and successful mitigation techniques
   - Provide resolutions to problems by taking the initiative to use all available resources for research
   - Design and implement automated testing tools when possible, and update tools as needed to ensure efficiency and accuracy
   - Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases
   - Develop programs that run efficiently and adhere to WIPRO standards by using similar logic from existing applications, discussing best practices with team members, referencing textbooks and training manuals, documenting the code, and using accepted design patterns
3. Ensure a smooth flow of communication with customers and internal stakeholders
   - Work with Agile delivery teams to understand the product vision and product backlogs; develop robust, scalable, high-quality automated tests for functional, regression, and performance testing
   - Assist in creating acceptance criteria for user stories and generate a test automation backlog
   - Collaborate with the development team to create/improve continuous deployment practices by developing strategies, formalizing processes, and providing tools
   - Work closely with business subject matter experts to understand requirements for automation, then design, build, and deploy the application using automation tools
   - Ensure long-term maintainability of the system by documenting projects according to WIPRO guidelines
   - Ensure quality of communication by being clear and effective with test personnel, users, developers, and clients to facilitate quick resolution of problems and accurate documentation of successes
   - Provide assistance to testers and support personnel as needed to determine system problems
   - Perform backend/database programming for key projects
   - Stay up-to-date on industry standards and incorporate them appropriately

Performance Parameters:
1. Automation: quality of design / adherence to design; adherence to project plan; issue resolution and client escalation management; zero disruption/error in deployment; EWS on risks and deployment of mitigation measures
2. Documentation: complete documentation of the automation process, test cases, debug data, and performance review as per quality standards

Mandatory Skills: Telecom NMS Data Modelling South Bound.
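The test-case development described above boils down to small, automated checks of a function under test. A minimal sketch in Python, where both the function (a phone-number canonicalizer, loosely in the telecom spirit of this role) and its expected outputs are hypothetical examples, not part of the posting:

```python
def normalize_msisdn(raw: str) -> str:
    """Function under test (hypothetical): canonicalize a phone number
    by stripping formatting and prefixing '+'."""
    digits = "".join(ch for ch in raw if ch.isdigit())
    return "+" + digits

# Automated test cases: each one states an input and its expected output.
def test_strips_formatting():
    assert normalize_msisdn("(080) 1234-5678") == "+08012345678"

def test_international_prefix():
    assert normalize_msisdn("+91 98765 43210") == "+919876543210"

test_strips_formatting()
test_international_prefix()
```

In practice such cases would live in a test runner (e.g., pytest or unittest) and run in the CI pipeline so that regressions are caught on every change.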
Posted 2 months ago
10.0 - 20.0 years
25 - 40 Lacs
Noida, Pune, Bengaluru
Hybrid
Responsibilities:
- Lead functional and technical workshops, demonstrating leadership skills in designing, delivering, testing, and deploying Salesforce solutions.
- Bring expertise in Data Modeling, Apex design patterns, LWC, and other modern UI techniques.
- Design and architect scalable and secure Salesforce solutions that meet business requirements.
- Must have expertise in Salesforce Service Cloud, Einstein AI, Data Cloud, and Experience Cloud.
- Serve as a trusted advisor to the client, conducting conversations with their enterprise architects and business stakeholders to shape the architectural vision and establish an architectural roadmap program.
- Manage customer expectations; negotiate solutions to complex problems with both the customer and third-party stakeholders.
- Guide customers, partners, and implementation teams on how best to execute digital transformation with the Salesforce platform using Salesforce Industries.
- Establish trust with the customer's leadership, promoting and implementing best practices with Salesforce Industries and Salesforce.
- Ensure best practices in coding standards, design patterns, and integration processes are followed.
- Develop and maintain technical documentation for designed solutions.
- Build out sophisticated business processes using native Salesforce Industries technology and the toolkit of the Force.com platform and integration tools.
- Work closely with delivery managers, solution architects, and directly with clients to architect technology solutions that meet client needs.
- Highlight and manage risk areas in the solution proactively, committing to seeing issues through to completion.

Qualifications:
- Minimum 12-15 years of total experience in IT.
- Minimum 8 years of Salesforce experience in Salesforce architecture and integration.
- Minimum 5 years of experience developing Salesforce customizations (Apex/Lightning), integrations, and executing data migrations.
- Minimum of 3-5 years of experience creating the technical architecture for complex Salesforce implementations.
- 7+ years of experience in defining, designing, delivering, and deploying Salesforce-based technical solutions as the accountable or responsible contributor.
- Design and implement Salesforce solutions aligned with business strategy and objectives.
- Lead technical requirements sessions; architect and document technical solutions aligned with client business objectives.
- Translate business requirements into well-architected solutions that best leverage the Salesforce platform.
- Provide guidance on the deployment of Salesforce CRM implementations, integrations, and upgrades.
- Mandatory: at least one Developer Track certification (Platform Developer I), along with at least one Cloud Consultant certification from Community, Field Service, Sales, Service, or CPQ.
- Mandatory: either the System Architect or the Application Architect certification.
- Other relevant Salesforce certifications (Data Cloud, Experience Cloud) are a plus.
- Excellent communication (written and oral) and interpersonal skills, with the ability to present to a variety of audiences (executive to technically detailed).
- Excellent leadership and management skills.

Education / Certification:
- Bachelor's/University degree or equivalent experience.
- Salesforce certifications (e.g., Platform Developer I, System Architect, Application Architect) are preferred.
Posted 2 months ago
2.0 - 7.0 years
7 - 17 Lacs
Mumbai
Work from Office
Greetings! We have an opening with a reputed finance-industry client for the role of Data Management Business Analyst.

Experience: 2+ years

Role & responsibilities:
- Extract and analyze data from the MES system to identify trends, performance metrics, and areas for improvement.
- Elicit, analyze, specify, and verify business requirements.
- Create documents such as Functional Specification Documents (FSD) with table-column mapping.
- Engage with various stakeholders (Business, Data Science, Power BI team) for cross-functional data validation and support.
- Identify opportunities to optimize manufacturing processes based on data analysis and user feedback.

Preferred candidate profile:
- 2+ years of relevant experience in Data Modelling & Management.

Interested candidates can send their resume to josy@topgearconsultants.com
Posted 2 months ago
4.0 - 5.0 years
8 - 12 Lacs
Tamil Nadu
Work from Office
Duration: 12 months

Position Description:
- Develop RPA and chatbot solutions using Pega and Case Management solutions.
- Develop integrations for RPA and chatbot solutions using web services and APIs.
- Pair with other software engineers to cooperatively deliver user stories.
- Use the test-driven development methodology to realize the technical solution.
- Perform requirements gathering, business analysis, fit-gap analysis, system testing, documentation (FDD and TDD), and end-user training.
- Analyze integration requirements and work with legacy systems.
- Coordinate with business skill teams, PDO IT teams, architects, and the product vendor (Pega) in developing and deploying the automation solutions.
- Bot maintenance.

Skills Required:
- 4+ years of experience in Pega RPA development.
- Experience delivering global RPA or DPA projects using Agile methodology.
- Strong knowledge of C# scripting.
- Pega certification.
- Knowledge of AI or ML modeling is an added advantage.
- Hands-on experience with data modelling, stored procedures, and SQL Server.
- Strong understanding of Case Management, case hierarchy, flow rules, and data propagation.
- Good working knowledge of declarative rules, database integrations, connectors and services, decisioning, and ML.
- Working knowledge of UI design.
- Strong knowledge of application debugging, performance tuning, quality assurance, and packaging.
- Working knowledge of the Pega security framework.
- Working knowledge of RPA integration and Robot Manager portal features.

Experience Required: 4-5 years of Pega RPA development experience

Education Required: Bachelor's degree (B.E.) / MCA
Posted 2 months ago
3.0 - 5.0 years
14 - 18 Lacs
Pune
Work from Office
Required Skills:
- Proficient in T-SQL for complex database querying and optimization.
- Expertise in Power BI Desktop and Service for report/dashboard development.
- Hands-on experience with SQL Server database design and management.
- Strong data modeling skills, including dimensional modeling and star schema.
- Ability to transform raw data into meaningful, actionable information.

Preferred Skills ("Good to Have"):
- Experience with Azure Data Services (e.g., Azure SQL, Azure Synapse Analytics, Azure Data Factory).
- Knowledge of data warehousing concepts and best practices.
- Familiarity with ETL processes and data integration tools.
- Understanding of Power BI governance, security, and deployment strategies.
- Exposure to agile software development methodologies.
- Strong problem-solving and analytical skills.
- Excellent communication and stakeholder management abilities.

Key Responsibilities:
- Design and develop interactive, visually appealing Power BI dashboards and reports.
- Implement complex data models and DAX calculations to meet business requirements.
- Optimize SQL queries for high performance and scalability.
- Automate data refresh processes and implement data security protocols.
- Collaborate with business stakeholders to understand reporting needs.
- Provide technical guidance and training to end-users.
- Continuously improve dashboard design, functionality, and user experience.
- Stay up-to-date with the latest Power BI and MS SQL Server features and best practices.
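"Transforming raw data into meaningful, actionable information" usually means computing derived measures over the model, such as rankings and shares of a total. A toy sketch in Python (the product names and figures are invented; in the role this would be a DAX measure or a T-SQL window-function query):

```python
def top_n_with_share(sales, n=2):
    """Rank products by revenue and attach each one's share of the total --
    the kind of derived measure a dashboard would surface."""
    total = sum(sales.values())
    ranked = sorted(sales.items(), key=lambda kv: kv[1], reverse=True)[:n]
    return [(name, rev, round(rev / total, 2)) for name, rev in ranked]

sales = {"A": 500.0, "B": 300.0, "C": 200.0}
top = top_n_with_share(sales)
```

The DAX equivalent would combine something like `RANKX` with `DIVIDE` against an `ALL`-scoped total; the point is that the share is computed against the whole dataset, not just the visible rows.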
Posted 2 months ago
3.0 - 7.0 years
5 - 8 Lacs
Pune
Work from Office
Job Title: Senior Engineer for Data Management, Private Bank

Role Description
Our Data Governance and Architecture team is driving forward data management together with the Divisional Data Office for Private Bank. In close collaboration between business and IT, we assign data roles, manage the documentation of data flows, align data requirements between consumers and producers of data, report data quality, and coordinate Private Bank's data delivery through the group data hub. We support our colleagues in the group Chief Data Office to optimize Deutsche Bank's Data Policy and the associated processes and methods to manage and model data. As part of the team, you will be responsible for work streams from project planning to preparing reports to senior management. You combine regulatory compliance with data-driven business benefits for Deutsche Bank.

Your key responsibilities
- Establish and maintain the Private Bank contribution to the Deutsche Bank Enterprise Logical and Physical Data Models and ensure their usefulness for the Private Bank business
- Understand the requirements of the group functions risk, finance, treasury, and regulatory reporting, and cast them into data models in alignment with the producers of the data
- Co-own the Private Bank-relevant parts of the Deutsche Bank Enterprise Logical and Physical Data Models
- Support the Private Bank experts and stakeholders in delivering the relevant data
- Optimize requirements management and modelling processes together with the group Chief Data Office and Private Bank stakeholders
- Align your tasks with the team and the Private Bank Data Council priorities

Your skills and experience
- In-depth understanding of how data and data quality impact processes across the bank in the retail sector
- Hands-on experience with data modelling in the financial industry
- Extensive experience with data architecture and the challenges of harmonized data provisioning
- Project and stakeholder management capabilities
- Open-minded team player: making different people work together well across the world
- Fluent in English
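Data-quality reporting of the kind mentioned above is typically built from simple per-field metrics such as completeness. A minimal sketch in Python, where the record structure and field name are hypothetical illustrations, not taken from the posting:

```python
def completeness(rows, field):
    """Share of records with a non-null, non-empty value for `field` --
    one of the basic data-quality dimensions reported to data owners."""
    if not rows:
        return 0.0
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

# Hypothetical account records: two of four have an IBAN populated.
accounts = [
    {"iban": "DE89370400440532013000"},
    {"iban": ""},
    {"iban": None},
    {"iban": "DE75512108001245126199"},
]
score = completeness(accounts, "iban")
```

Tracked over time and per source system, such scores let the data office see whether producers are meeting the data requirements agreed with consumers.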
Posted 2 months ago
9.0 - 14.0 years
32 - 37 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
- Conduct requirement gathering sessions and estimations for the SAP MDG application.
- Be aware of the phases of the SAP Activate methodology for project execution.
- Work with the various technical teams to define the future-state solution architecture supporting the MDG implementation, covering areas such as source and consumer integration, application architecture, external data/vendor access, security, performance, and scalability strategies for the various business capabilities.
- Share perspectives on best practices and common technical issues and approaches.
- Support high-level logical data model definition and discussions to ensure feasibility with MDG.
- Full life cycle implementation experience: blueprinting, fit-gap analysis, configurations, data migrations, cutovers, and go-lives.
- Functional design for data modelling, UI modelling, rules and validations in BRF+ through configurations, replication modelling (DRFin and DRFout), IDocs, ALE, and SOAP services for SAP MDG.

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Posted 2 months ago
5.0 - 10.0 years
7 - 12 Lacs
Delhi / NCR, Bengaluru
Work from Office
- Conduct requirement gathering sessions and estimations for the SAP MDG application.
- Be aware of the phases of the SAP Activate methodology for project execution.
- Work with the various technical teams to define the future-state solution architecture supporting the MDG implementation, covering areas such as source and consumer integration, application architecture, external data/vendor access, security, performance, and scalability strategies for the various business capabilities.
- Share perspectives on best practices and common technical issues and approaches.
- Support high-level logical data model definition and discussions to ensure feasibility with MDG.
- Full life cycle implementation experience: blueprinting, fit-gap analysis, configurations, data migrations, cutovers, and go-lives.
- Functional design for data modelling, UI modelling, rules and validations in BRF+ through configurations, replication modelling (DRFin and DRFout), IDocs, ALE, and SOAP services for SAP MDG.

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Posted 2 months ago
5.0 - 10.0 years
8 - 14 Lacs
Tirunelveli
Work from Office
Should be able to conduct requirement-gathering sessions and estimations for the SAP MDG application, and be familiar with the phases of the SAP Activate methodology for project execution. Work with the various technical teams to define the future-state solution architecture for the MDG implementation, covering areas such as source and consumer integration, application architecture, external data/vendor access, security, performance, and scalability strategies to support the various business capabilities. Share perspectives on best practices and common technical issues and approaches. Support high-level logical data model definition and discussions to ensure feasibility with MDG. Full life-cycle implementation experience: blueprinting, fit-gap analysis, configurations, data migrations, cutovers, and go-lives. Functional design for data modelling, UI modelling, rules and validations in BRF+ through configurations, and replication modelling (DRF inbound and outbound, IDocs, ALE, SOAP services) for SAP MDG.
Posted 2 months ago
5.0 - 9.0 years
6 - 10 Lacs
Pune, Chennai, Bengaluru
Work from Office
Expertise with SAP MDG configurations for data modelling, UI modelling, process modelling, rules and derivations, BRF+, and replication configurations. Technical knowledge of MDG workflows and custom developments for parallel split, merge, and parallel step logic. Technical knowledge of ERP tables and working experience with ABAP developments. Should have worked on an MDG implementation as a technical expert responsible for design, development, testing, and defect management of MDG WRICEF objects. Customization of the data model and UI model; BRF+ custom developments; web service installations and deployments; development of inbound and outbound web services for MDG. Good knowledge of BRF+, Workflow, FPM, Web Dynpro, enhancements, IDocs, and proxies. Working knowledge of transport management, best-practice deployment, and code review. ABAP, ABAP workflows, OO ABAP. Location: PAN India, Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Posted 2 months ago
9.0 - 11.0 years
11 - 15 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Expertise with SAP MDG configurations for data modelling, UI modelling, process modelling, rules and derivations, BRF+, and replication configurations. Technical knowledge of MDG workflows and custom developments for parallel split, merge, and parallel step logic. Technical knowledge of ERP tables and working experience with ABAP developments. Should have worked on an MDG implementation as a technical expert responsible for design, development, testing, and defect management of MDG WRICEF objects. Customization of the data model and UI model; BRF+ custom developments; web service installations and deployments; development of inbound and outbound web services for MDG. Good knowledge of BRF+, Workflow, FPM, Web Dynpro, enhancements, IDocs, and proxies. Working knowledge of transport management, best-practice deployment, and code review. ABAP, ABAP workflows, OO ABAP. Solution architecting for MDG: work with the various technical teams to define the future-state solution architecture for the SAP MDG implementation, covering source and consumer integration, application architecture, external data/vendor access, security, performance, and scalability strategies to support the various business capabilities. Develop technical recommendations, including integration strategy, synchronization mechanisms, external data capabilities, and technology alternatives. Share perspectives on best practices and common technical issues and approaches. Supports roadmap creation and high-level logical data model definition and discussions to ensure feasibility with SAP MDG. Owns the Solution Architecture deliverable for SAP MDG. Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, PAN India
Posted 2 months ago
7.0 - 12.0 years
19 - 25 Lacs
Bengaluru
Remote
Role: Data Manager. Skills: SAP Analytics, data modelling.
Posted 2 months ago
9.0 - 14.0 years
50 - 85 Lacs
Noida
Work from Office
About the Role: We are looking for a Staff Engineer specialized in Master Data Management to design and develop our next-generation MDM platform. This role is ideal for engineers who have created or contributed significantly to MDM solutions. You'll lead the architecture and development of our core MDM engine, focusing on data modeling, matching algorithms, and governance workflows that enable our customers to achieve a trusted, 360-degree view of their critical business data. A Day in the Life: Collaborate with data scientists, product managers, and engineering teams to define system architecture and design. Architect and develop scalable, fault-tolerant MDM platform components that handle various data domains. Design and implement sophisticated entity matching and merging algorithms to create golden records across disparate data sources. Develop or integrate flexible data modeling frameworks that can adapt to different industries and use cases. Create robust data governance workflows, including approval processes, audit trails, and role-based access controls. Build data quality monitoring and remediation capabilities into the MDM platform. Collaborate with product managers, solution architects, and customers to understand industry-specific MDM requirements. Develop REST APIs and integration patterns for connecting the MDM platform with various enterprise systems. Mentor junior engineers and promote best practices in MDM solution development. Lead technical design reviews and contribute to the product roadmap. What You Need: 8+ years of software engineering experience, with at least 5 years focused on developing master data management solutions or components. Proven experience creating or significantly contributing to commercial MDM platforms, data integration tools, or similar enterprise data management solutions. Deep understanding of MDM concepts including data modeling, matching/merging algorithms, data governance, and data quality management.
Strong expertise in at least one major programming language such as Java, Scala, Python, or Go. Experience with database technologies, including relational systems (Snowflake, Databricks, PostgreSQL) and NoSQL systems (MongoDB, Elasticsearch). Knowledge of data integration patterns and ETL/ELT processes. Experience designing and implementing RESTful APIs and service-oriented architectures. Understanding of cloud-native development and deployment on AWS or Azure. Familiarity with containerization (Docker) and orchestration tools (Kubernetes). Experience with event-driven architectures and messaging systems (Kafka, RabbitMQ). Strong understanding of data security and privacy considerations, especially for sensitive master data.
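The entity matching, merging, and survivorship responsibilities above can be sketched in miniature. The following is a hedged, stdlib-only Python sketch, not this platform's actual engine: the record shape, field names, and similarity threshold are invented for the example, and production MDM engines use far richer blocking and probabilistic matching.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1] (crude stand-in for a real matcher)."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def is_match(rec_a: dict, rec_b: dict, threshold: float = 0.85) -> bool:
    """Match on identical email when both records have one, else fuzzy-match on name."""
    if rec_a.get("email") and rec_b.get("email"):
        return rec_a["email"] == rec_b["email"]
    return similarity(rec_a["name"], rec_b["name"]) >= threshold

def merge(records: list) -> dict:
    """Survivorship rule: the newest non-empty value wins per attribute."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):  # oldest first
        for key, value in rec.items():
            if value:  # newer records overwrite older non-empty values
                golden[key] = value
    return golden

source_a = {"name": "ACME Corp", "email": "", "phone": "555-0100", "updated": 1}
source_b = {"name": "Acme Corp.", "email": "ops@acme.example", "phone": "", "updated": 2}

if is_match(source_a, source_b):
    golden = merge([source_a, source_b])
    print(golden)
```

The two source records match on name similarity (source_a has no email), and the merged golden record keeps the phone from the older record and the email from the newer one.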
Posted 2 months ago
7 - 10 years
15 - 25 Lacs
Pune
Hybrid
Lead Data Engineer (Databricks) Experience: 7-10 years. Salary: up to INR 25 Lacs per annum. Preferred Notice Period: within 30 days. Shift: 10:00 AM to 7:00 PM IST. Opportunity Type: Hybrid (Pune). Placement Type: Permanent. (*Note: This is a requirement for one of Uplers' clients.) Must-have skills: AWS Glue, Databricks, Azure Data Factory, SQL, Python, Data Modelling, ETL. Good-to-have skills: Big Data Pipelines, Data Warehousing. Forbes Advisor (one of Uplers' clients) is looking for a Lead Data Engineer who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you. Role Overview: Position: Lead Data Engineer (Databricks). Location: Pune, Ahmedabad. Required Experience: 7 to 10 years. Preferred: immediate joiners. Job Overview: We are looking for an accomplished Lead Data Engineer with expertise in Databricks to join our dynamic team. This role is crucial for enhancing our data engineering capabilities, and it offers the chance to work with advanced technologies, including Generative AI. Key Responsibilities: Lead the design, development, and optimization of data solutions using Databricks, ensuring they are scalable, efficient, and secure. Collaborate with cross-functional teams to gather and analyse data requirements, translating them into robust data architectures and solutions. Develop and maintain ETL pipelines, leveraging Databricks and integrating with Azure Data Factory as needed. Implement machine learning models and advanced analytics solutions, incorporating Generative AI to drive innovation. Ensure data quality, governance, and security practices are adhered to, maintaining the integrity and reliability of data solutions. Provide technical leadership and mentorship to junior engineers, fostering an environment of learning and growth.
Stay updated on the latest trends and advancements in data engineering, Databricks, Generative AI, and Azure Data Factory to continually enhance team capabilities. Qualifications: Bachelor's or master's degree in computer science, information technology, or a related field. 7 to 10 years of experience in data engineering, with a focus on Databricks. Proven expertise in building and optimizing data solutions using Databricks and integrating with Azure Data Factory/AWS Glue. Proficiency in SQL and programming languages such as Python or Scala. Strong understanding of data modelling, ETL processes, and Data Warehouse/Data Lakehouse concepts. Familiarity with cloud platforms, particularly Azure, and containerization technologies such as Docker. Excellent analytical, problem-solving, and communication skills. Demonstrated leadership ability with experience mentoring and guiding junior team members. Preferred Skills: Experience with Generative AI technologies and their applications. Familiarity with other cloud platforms, such as AWS or GCP. Knowledge of data governance frameworks and tools. How to apply for this opportunity (easy 3-step process): 1. Click on Apply and register or log in on our portal. 2. Upload your updated resume and complete the screening form. 3. Increase your chances of getting shortlisted and meet the client for the interview! About Our Client: At Inferenz, our team of innovative technologists and domain experts helps accelerate business growth through digital enablement, navigating industries with data, cloud, and AI services and solutions. We dedicate our resources to increasing efficiency and gaining a greater competitive advantage by leveraging various next-generation technologies. About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers.
(Note: There are many more opportunities apart from this on the portal.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
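To give a flavor of the ETL responsibilities in the listing above, here is a pure-Python sketch of a bronze → silver → gold (medallion-style) flow. It is illustrative only: in the actual role this would be PySpark on Databricks, and the record shape and quarantine rule here are invented for the example.

```python
import json

# Bronze: raw events land as-is (here, JSON lines from a hypothetical source).
raw_lines = [
    '{"order_id": 1, "amount": "120.50", "country": "IN"}',
    '{"order_id": 2, "amount": "bad",    "country": "IN"}',
    '{"order_id": 3, "amount": "80.00",  "country": "US"}',
]

def to_silver(lines):
    """Silver: parse, validate, and type-cast; quarantine malformed records."""
    clean, quarantine = [], []
    for line in lines:
        rec = json.loads(line)
        try:
            rec["amount"] = float(rec["amount"])
            clean.append(rec)
        except ValueError:
            quarantine.append(rec)  # e.g. non-numeric amounts
    return clean, quarantine

def to_gold(clean):
    """Gold: aggregate to a business-level view (revenue per country)."""
    revenue = {}
    for rec in clean:
        revenue[rec["country"]] = revenue.get(rec["country"], 0.0) + rec["amount"]
    return revenue

silver, bad = to_silver(raw_lines)
print(to_gold(silver))  # {'IN': 120.5, 'US': 80.0}
print(len(bad))         # 1 quarantined record
```

The same three-stage shape (raw landing, cleaned/typed, aggregated) carries over directly to Delta tables and PySpark DataFrames.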
Posted 2 months ago
1 - 2 years
6 - 9 Lacs
Chennai
Work from Office
We are looking for an Associate Data Scientist to analyze and interpret complex datasets, applying advanced statistical and machine learning techniques to extract valuable insights and drive data-driven decision-making. You will work closely with cross-functional teams to identify business challenges and develop innovative solutions that optimize our products and services. In this role, you should be highly analytical with a knack for analysis, math, and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine learning and research. Key Responsibilities: Utilize advanced analytics techniques to analyze large and complex datasets, identifying patterns, trends, and correlations to uncover valuable insights. Develop and implement machine learning models and algorithms for predictive and prescriptive analytics. Clean, pre-process, and validate data to ensure accuracy, completeness, and consistency for analysis purposes. Apply knowledge of Large Language Models (LLMs) to enhance text analysis and generation tasks. Explore and evaluate state-of-the-art LLMs, adapt them to specific tasks, and fine-tune models as necessary to improve performance. Build and deploy scalable text analysis pipelines to process and analyze text data efficiently. Communicate complex findings and insights to both technical and non-technical stakeholders through effective data visualization and storytelling techniques. Propose solutions and strategies to business challenges. Collaborate with engineering and product development teams. The ideal candidate: Minimum 6 months of experience as a Data Scientist. Experience in data analysis and modeling. Solid understanding of statistical concepts and machine learning algorithms, with hands-on experience applying them to real-world problems.
Strong programming skills in languages such as Python or R. Experience with machine learning frameworks and libraries such as scikit-learn, TensorFlow, or PyTorch. Knowledge of and hands-on experience with Large Language Models (LLMs), such as GPT-3 and BERT. Familiarity with data visualization tools such as Tableau, Power BI, or Matplotlib to effectively communicate insights. B.E/B.Tech in Computer Science, Engineering, or a relevant field; a graduate degree in Data Science or another quantitative field is preferred. Abilities and traits: An analytical mind and business acumen. Strong math skills (e.g. statistics, algebra). Problem-solving aptitude. Excellent communication and presentation skills. What we offer: Competitive salary and benefits package. Opportunity to work with cutting-edge technologies. Flexible working hours and remote work options. Continuous learning and professional development opportunities. A collaborative and supportive work environment.
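The "scalable text analysis pipelines" responsibility above can be illustrated with a toy staged pipeline. This is a hedged stdlib sketch only: real pipelines would use scikit-learn, spaCy, or an LLM API, and the tokenizer here is deliberately crude.

```python
import re
from collections import Counter

def tokenize(text: str) -> list:
    """Lowercase, split on non-letters, and drop very short tokens."""
    return [t for t in re.split(r"[^a-z]+", text.lower()) if len(t) > 2]

def pipeline(docs: list, top_n: int = 3) -> list:
    """A staged flow: clean -> tokenize -> count -> rank top terms."""
    counts = Counter()
    for doc in docs:
        counts.update(tokenize(doc))
    return counts.most_common(top_n)

docs = [
    "Machine learning drives data-driven decision making.",
    "Data quality matters for machine learning models.",
]
print(pipeline(docs))
```

Each stage has a single responsibility, so any stage can be swapped out (e.g. the counter for a TF-IDF vectorizer or an embedding model) without touching the rest of the pipeline.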
Posted 2 months ago
8 - 13 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm greetings from SP Staffing Services Private Limited! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested. Relevant Experience: 8-15 years. Location: Pan India. Job Description: Minimum two years of experience in Boomi data modeling. Interested candidates can share their resume with sankarspstaffings@gmail.com with the below details inline: Overall Experience, Relevant Experience, Current CTC, Expected CTC, Notice Period.
Posted 2 months ago
10 - 15 years
15 - 18 Lacs
Hyderabad
Work from Office
Skilled in data modeling (ER/Studio, Erwin), MPP databases (Databricks, Snowflake), GitHub, CI/CD, metadata/lineage, agile/DevOps, SAP HANA/S4, and retail data (IRI, Nielsen). Mail: kowsalya.k@srsinfoway.com
Posted 2 months ago
15 - 24 years
22 - 37 Lacs
Pune
Remote
Responsibilities: Understanding business objectives and developing models that help to achieve them, along with metrics to track their progress. Analyzing the ML algorithms that could be used to solve a given problem and ranking them by their probability of success. Determining and refining machine learning objectives. Designing machine learning systems and self-running artificial intelligence (AI) software to automate predictive models. Transforming data science prototypes and applying appropriate ML algorithms and tools. Exploring and visualizing data to gain an understanding of it, then identifying differences in data distribution that could affect performance when deploying the model in the real world. Ensuring that algorithms generate accurate user recommendations. Verifying data quality, and/or ensuring it via data cleaning. Supervising the data acquisition process if more data is needed. Defining validation strategies. Defining the pre-processing or feature engineering to be done on a given dataset. Solving complex problems with multi-layered data sets, as well as optimizing existing machine learning libraries and frameworks. Developing ML algorithms to analyze huge volumes of historical data to make predictions. Running tests, performing statistical analysis, and interpreting test results. Deploying models to production. Documenting machine learning processes. Keeping abreast of developments in machine learning. Preferred candidate profile: Bachelor's degree in computer science, data science, mathematics, or a related field. Proven experience as a machine learning engineer. Proficiency with deep learning frameworks and libraries such as TensorFlow, XGBoost, WaveNet, Keras, and NumPy. Advanced proficiency in writing Python, Java, and R code. Proficiency with Python and basic libraries for machine learning such as scikit-learn and pandas. Extensive knowledge of ML frameworks, libraries, data structures, data modeling, and software architecture for ANN, CNN, and RNN-with-LSTM models.
Ability to select hardware to run an ML model with the required latency. In-depth knowledge of mathematics, statistics, and algorithms. Superb analytical and problem-solving abilities. Great communication and collaboration skills. Excellent time management and organizational abilities. Benefits of working with OptimEyes: 1. Remote work opportunity (work from home). 2. Work with a top-notch team, cutting-edge technology, and the leadership of extremely successful experts. 3. Monthly bonus along with salary. 4. Yearly bonus.
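To make the model-fitting and validation responsibilities above concrete, here is a dependency-free sketch of fitting a linear model by batch gradient descent with a held-out test split. The dataset, learning rate, and iteration count are invented for the example; real work in this role would use frameworks like TensorFlow or scikit-learn.

```python
import random

# Tiny synthetic dataset: y = 3x + 2 plus a little uniform noise.
random.seed(0)
data = [(x, 3 * x + 2 + random.uniform(-0.1, 0.1)) for x in range(20)]
train, test = data[:15], data[15:]  # hold out the last 5 points for validation

# Fit y = w*x + b by batch gradient descent on mean squared error.
w, b = 0.0, 0.0
lr = 0.005
for _ in range(5000):
    grad_w = sum(2 * (w * x + b - y) * x for x, y in train) / len(train)
    grad_b = sum(2 * (w * x + b - y) for x, y in train) / len(train)
    w -= lr * grad_w
    b -= lr * grad_b

# Evaluate on the held-out split; should recover w ~ 3, b ~ 2.
mse = sum((w * x + b - y) ** 2 for x, y in test) / len(test)
print(round(w, 2), round(b, 2), mse)
```

The validation-strategy bullet in the listing is the `train`/`test` split here; in practice it would be k-fold or time-based splits rather than a single holdout.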
Posted 2 months ago
4 - 7 years
6 - 11 Lacs
Hyderabad
Work from Office
The position is based out of the CDK Pune office, with the team responsible for developing the Identity & Access Management platform. IAM / EIS is part of Foundational Platform Services within CDK that provides identity and authorization services to various internal and external CDK product platforms. Primary technical activities will be related to the development and design of product software using Core Java, the Spring framework, SQL / PL/SQL via JDBC / Hibernate, RESTful APIs, and Angular / ReactJS. Skills Required: Strong knowledge of Core Java and object-oriented programming concepts. Strong knowledge of the Spring ecosystem (IoC concepts, Spring Boot, Spring Web). Strong understanding of RESTful web services and JSON. Functional expertise in the IAM domain is preferable. Hands-on experience with frameworks such as OAuth 2.0, SAML, and OpenID, and an understanding of the authentication and authorization domain with exposure to OKTA, Azure, and the like, is a plus. Experience in messaging technologies and platforms such as RabbitMQ / Kafka. Expertise in Docker, Kubernetes, AWS, and microservices. Experience with relational databases such as Aurora PostgreSQL, including designing queries, data modelling, and PL/SQL scripts. Experience with a UI framework, preferably Angular / ReactJS. Testing frameworks like JUnit, Selenium, Cypress, and Cucumber. Strong analytical skills required, including the ability to understand complex business workflows and determine how they can be implemented within the system. Strong communication skills; able to work in a global delivery model with counterparts across the globe. Good understanding of agile methodology for project delivery. Basic skills with software configuration management tools and integrated development environments. Core Responsibilities: Work with product / technical managers and architects to understand the requirements.
Produce solution designs, evaluate alternative approaches, and build PoCs. Manage, write, and test code as per requirements. Troubleshoot code issues, fix bugs, and apply a solution-driven approach. Execute the full software development lifecycle. Document and maintain application functionality, use cases, integration approach, etc. Comply with project plans and industry standards. Monitor and improve application performance. Required Qualifications: 4-7 years of experience. BE, BTech, or an equivalent computer science degree.
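The OAuth 2.0 / OpenID items in the listing above ultimately come down to issuing and verifying signed tokens. Below is a minimal stdlib sketch of an HS256-style signed token (JWT-shaped), for illustration only: real IAM services delegate this to an identity provider such as OKTA or Azure AD, and the hard-coded secret here is deliberately naive.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-secret"  # illustrative; production keys live in a vault / IdP

def b64url(data: bytes) -> str:
    """URL-safe base64 without padding, as used in JWTs."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def issue_token(subject: str, ttl_seconds: int = 300) -> str:
    """Build header.payload.signature, HMAC-SHA256 signed (HS256-style)."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(
        {"sub": subject, "exp": int(time.time()) + ttl_seconds}).encode())
    signing_input = f"{header}.{payload}".encode()
    signature = b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{signature}"

def verify_token(token: str) -> bool:
    """Constant-time signature check, then expiry check."""
    try:
        header, payload, signature = token.split(".")
    except ValueError:
        return False
    signing_input = f"{header}.{payload}".encode()
    expected = b64url(hmac.new(SECRET, signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(signature, expected):
        return False
    padded = payload + "=" * (-len(payload) % 4)  # restore base64 padding
    claims = json.loads(base64.urlsafe_b64decode(padded))
    return claims["exp"] > time.time()

token = issue_token("user-42")
print(verify_token(token))  # a freshly issued token verifies
```

Note the use of `hmac.compare_digest` for the signature check: a naive `==` comparison leaks timing information an attacker can exploit.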
Posted 2 months ago