5.0 - 10.0 years
6 - 10 Lacs
Kolkata
Work from Office
As a Consultant, you will serve as a client-facing practitioner who sells, leads, and implements expert services using the breadth of IBM's offerings and technologies. A successful Consultant is regarded by clients as a trusted business advisor who collaborates to provide innovative solutions to the most challenging business problems. You will develop solutions that excel at user experience, style, performance, reliability, and scalability to reduce costs and improve profit and shareholder value.

Your primary responsibilities include:
- Build, automate, and release solutions based on clients' priorities and requirements.
- Surface risks, resolve issues that affect release scope, schedule, and quality, and bring potential solutions to the table.
- Ensure that all integration solutions meet client specifications and are delivered on time.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Minimum 5 years of experience in the IT industry.
- Minimum 4 years of experience in Oracle Applications and Oracle Cloud in the technical domain.
- Two end-to-end implementations in Oracle Supply Chain Management Cloud as a Functional Consultant.
- Should have worked in Inventory, Order Management, Cost Management, GOP Cloud, Data Integration, FBDI, and ADFDI.
- Minimum 4 years of experience in BIP reporting.

Preferred technical and professional experience:
- You'll have access to all the technical and management training courses you need to become the expert you want to be.
- Minimum 3 years of relevant experience in Oracle Cloud Technical (Oracle Fusion) 12c development and implementation.
- Good knowledge of integrating with web services, XML (Extensible Markup Language), and other APIs (Application Programming Interfaces) to transfer data between source and target systems, in addition to databases (see the sketch after this listing).
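The web-service integration skill named above can be pictured with a short sketch. This is illustrative only: the pod URL, credentials, and resource path below are placeholders rather than details from the posting, and a real integration would use OAuth instead of basic auth.

```python
import requests

# Illustrative only: host, credentials, and resource path are placeholders.
BASE = "https://your-pod.fa.ocs.oraclecloud.com/fscmRestApi/resources/11.13.18.05"

def fetch_items(limit=25):
    """Pull a page of item master records for downstream integration."""
    resp = requests.get(
        f"{BASE}/itemsV2",
        params={"limit": limit, "onlyData": "true"},
        auth=("integration.user", "password"),  # use OAuth in practice
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("items", [])

for item in fetch_items():
    print(item.get("ItemNumber"), item.get("ItemDescription"))
```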
Posted 1 month ago
2.0 - 5.0 years
6 - 10 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results (see the sketch after this listing).

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Expertise in data warehousing, information management, data integration, and business intelligence using the ETL tool Informatica PowerCenter.
- Knowledge of cloud, Power BI, and data migration to cloud.
- Experience in Unix shell scripting and Python.
- Experience with relational SQL, big data, etc.

Preferred technical and professional experience:
- Knowledge of MS Azure Cloud.
- Experience in Informatica PowerCenter.
- Experience in Unix shell scripting and Python.
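Alongside Informatica, the posting asks for Python and relational SQL; a minimal cleanse-and-load sketch using pandas and SQLAlchemy. The file name, table, and connection string are hypothetical.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical source file and target; connection string is a placeholder.
engine = create_engine("postgresql://etl_user:secret@warehouse:5432/analytics")

raw = pd.read_csv("customers_raw.csv")

cleaned = (
    raw.drop_duplicates(subset=["customer_id"])
       .assign(email=lambda d: d["email"].str.strip().str.lower())
       .dropna(subset=["customer_id"])
)

# Load into a staging table, replacing any previous run.
cleaned.to_sql("stg_customers", engine, if_exists="replace", index=False)
print(f"Loaded {len(cleaned)} rows into stg_customers")
```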
Posted 1 month ago
2.0 - 5.0 years
14 - 17 Lacs
Pune
Work from Office
As a Big Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Developing, maintaining, evaluating, and testing big data solutions.
- Data engineering activities such as creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs (a PySpark sketch follows this listing).

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Big data development: Hadoop, Hive, Spark, PySpark, strong SQL.
- Ability to incorporate a variety of statistical and machine learning techniques.
- Basic understanding of cloud platforms (AWS, Azure, etc.).
- Ability to use programming languages such as Java, Python, and Scala to build pipelines that extract and transform data from a repository to a data consumer.
- Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed.
- Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java.

Preferred technical and professional experience:
- Basic understanding of or experience with predictive/prescriptive modeling.
- You thrive on teamwork and have excellent verbal and written communication skills.
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
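A source-to-target pipeline of the kind described, extracting from a repository, transforming, and loading for a consumer, might look like this in PySpark; the paths, tables, and columns are invented for illustration.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

orders = spark.read.parquet("s3://raw-zone/orders/")       # extract
customers = spark.read.table("curated.customers")

daily_revenue = (
    orders.filter(F.col("status") == "COMPLETE")            # transform
          .join(customers, "customer_id")
          .groupBy("order_date", "region")
          .agg(F.sum("amount").alias("revenue"))
)

(daily_revenue.write                                         # load
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("marts.daily_revenue"))
```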
Posted 1 month ago
2.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Expertise in data warehousing, information management, data integration, and business intelligence using the ETL tool Informatica PowerCenter.
- Knowledge of cloud, Power BI, and data migration to cloud.
- Experience in Unix shell scripting and Python.
- Experience with relational SQL, big data, etc.

Preferred technical and professional experience:
- Knowledge of MS Azure Cloud.
- Experience in Informatica PowerCenter.
- Experience in Unix shell scripting and Python.
Posted 1 month ago
2.0 - 5.0 years
6 - 10 Lacs
Pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our client's business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems.
- Implement data quality and validation processes within Ab Initio.
- Data modelling and analysis: collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes; analyse and model data to ensure optimal ETL design and performance.
- Utilize Ab Initio components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions (a rough analogue follows this listing).
- Implement best practices for reusable Ab Initio components.

Preferred technical and professional experience:
- Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization; conduct performance tuning and troubleshooting as needed.
- Collaboration: work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes; participate in design reviews and provide technical expertise to enhance overall solution quality.
- Documentation of graphs, ETL designs, and processes.
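Ab Initio graphs are built in the GUI rather than as plain code, but the core components named above have familiar dataframe analogues. A rough sketch of what Rollup (aggregate by key) and Join conceptually do, in pandas with made-up data:

```python
import pandas as pd

# Made-up inputs standing in for two source flows.
txns = pd.DataFrame({
    "account_id": [1, 1, 2, 2],
    "amount": [100.0, 50.0, 75.0, 25.0],
})
accounts = pd.DataFrame({"account_id": [1, 2], "branch": ["KOL", "PUN"]})

# Rollup: aggregate records sharing a key.
rolled = txns.groupby("account_id", as_index=False)["amount"].sum()

# Join: match the rolled-up flow against a reference flow on the key.
joined = rolled.merge(accounts, on="account_id", how="inner")
print(joined)
```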
Posted 1 month ago
3.0 - 5.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Educational requirements: Bachelor of Engineering
Service line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, so that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and professional requirements: Technology - Data Management - Data Integration - Ab Initio
Preferred skills: Technology - Data Management - Data Integration - Ab Initio
Posted 1 month ago
5.0 - 8.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational requirements: Bachelor of Engineering
Service line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional responsibilities:
- Knowledge of more than one technology
- Basics of architecture and design fundamentals
- Knowledge of testing tools
- Knowledge of agile methodologies
- Understanding of project life cycle activities on development and maintenance projects
- Understanding of one or more estimation methodologies; knowledge of quality processes
- Basics of the business domain, to understand the business requirements
- Analytical abilities, strong technical skills, good communication skills
- Good understanding of the technology and domain
- Ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods
- Awareness of the latest technologies and trends
- Excellent problem solving, analytical, and debugging skills

Technical and professional requirements: Technology - Data Management - Data Integration - Talend
Preferred skills: Technology - Data Management - Data Integration - Talend
Posted 1 month ago
3.0 - 5.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational requirements: Bachelor of Engineering
Service line: Data & Analytics Unit

Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, so that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand client requirements in detail and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and professional requirements: Technology - Data Management - Data Integration Administration - Informatica Administration
Preferred skills: Technology - Data Management - Data Integration Administration - Informatica Administration
Posted 1 month ago
3.0 - 5.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Educational requirements: Bachelor of Engineering
Service line: Data & Analytics Unit

Responsibilities:
- Minimum 3-5 years of work experience in SAS EG and SAS CI.
- Hands-on experience transferring data from different sources to a SAS database.
- Expertise in DATA step and PROC step programming, including the MERGE statement, PROC SQL, macros, and SAS functions.
- Experience in automation and SAS reporting.
- Good communication skills are a must. The candidate should independently deliver project work as well as deal with the client.

Location: Any Infosys DC in India

Preferred skills: Technology - ETL & Data Quality - SAS - SAS Data Integration Studio
Posted 1 month ago
2.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Educational requirements: Bachelor of Engineering, BCA, BTech, MTech, MCA, MBA
Service line: Application Development and Maintenance

Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and professional requirements: Healthcare Data Analyst; PL/SQL, SQL, data mapping, STTM creation, data profiling, reports
Preferred skills: Domain - Healthcare - ALL
Posted 1 month ago
2.0 - 5.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Overview

A Data Engineer will be responsible for designing, developing, and implementing machine learning solutions; implementing artificial/augmented intelligence systems, agentic workflows, and data engineering workflows; and performing statistical modelling and measurement by applying data engineering, feature engineering, statistical methods, ML modelling, and AI techniques to structured, unstructured, and diverse "big data" sources of machine-acquired data, to generate actionable insights and foresights for real-life business problem solutions and product feature development and enhancement. A strong understanding of databases, SQL, cloud technologies, and modern data integration and orchestration tools like Azure Data Factory (ADF) is required to succeed in this role.

Responsibilities:
- Integrates state-of-the-art machine learning algorithms and develops new methods.
- Develops tools to support analysis and visualization of large datasets.
- Develops and codes software programs; implements industry-standard AutoML models (speech, computer vision, text data, LLM), statistical models, relevant ML models (device/machine-acquired data), and AI models and algorithms.
- Identifies meaningful foresights based on predictive ML models from large data and metadata sources; interprets and communicates foresights, insights, and findings from experiments to product managers, service managers, business partners, and business managers.
- Makes use of rapid development tools (business intelligence tools, graphics libraries, data modelling tools) to effectively communicate research findings to relevant stakeholders using visual graphics, data models, machine learning model features, and feature engineering/transformations.
- Analyzes, reviews, and tracks trends and tools in the data science, machine learning, artificial intelligence, and IoT space.
- Interacts with cross-functional teams to identify questions and issues for data engineering and machine learning model feature engineering.
- Evaluates and makes recommendations to evolve data collection mechanisms to improve the efficacy of machine learning model predictions.
- Meets with customers, partners, product managers, and business leaders to present findings, predictions, and foresights; gathers customer-specific requirements for business problems/processes; identifies data collection constraints and alternatives for model implementation.

Required skills:
- Working knowledge of MLOps, LLMs, and agentic AI/workflows.
- Programming: proficiency in Python and experience with ML frameworks like TensorFlow and PyTorch.
- LLM expertise: hands-on experience in training, fine-tuning, and deploying LLMs.
- Foundational model knowledge: strong understanding of open-weight LLM architectures, including training methodologies, fine-tuning techniques, hyperparameter optimization, and model distillation.
- Data pipeline development: strong understanding of data engineering concepts, feature engineering, and workflow automation using Airflow or Kubeflow (a skeletal Airflow DAG follows this listing).
- Cloud & MLOps: experience deploying ML models in cloud environments like AWS, GCP (Google Vertex AI), or Azure using Docker and Kubernetes.
- Designs and implements predictive and optimisation models incorporating diverse data types; strong in SQL and Azure Data Factory (ADF).

Qualifications:
- Minimum education: Bachelor's, Master's, or Ph.D. degree in Computer Science or Engineering.
- Minimum work experience: 1+ years of experience programming with at least one of the following languages: Python, Scala, Go; 1+ years of experience in SQL and data transformation; 1+ years of experience developing distributed systems using open-source technologies such as Spark and Dask; 1+ years of experience with relational or NoSQL databases running in Linux environments (MySQL, MariaDB, PostgreSQL, MongoDB, Redis).

Key skills and competencies:
- Experience working with an AWS / Azure / GCP environment is highly desired.
- Experience with data models in the retail and consumer products industry is desired.
- Experience working on agile projects and understanding of agile concepts is desired.
- Demonstrated ability to learn new technologies quickly and independently.
- Excellent verbal and written communication skills, especially in technical communications.
- Ability to work toward stretch goals in a very innovative and fast-paced environment.
- Ability to work collaboratively in a diverse team environment.
- Ability to telework.
- Expected travel: not expected.
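The workflow-automation requirement above can be illustrated with a skeletal Airflow DAG (Airflow 2.x style). The DAG id and task bodies are placeholders; a real pipeline would launch Spark jobs, ADF pipelines, or a model-training step from these tasks.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies for illustration only.
def extract():
    print("pull raw events from the source system")

def build_features():
    print("run feature engineering on the fresh extract")

def train():
    print("fit or fine-tune the model on the new features")

with DAG(
    dag_id="ml_feature_pipeline",          # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_features = PythonOperator(task_id="build_features",
                                python_callable=build_features)
    t_train = PythonOperator(task_id="train", python_callable=train)

    t_extract >> t_features >> t_train
```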
Posted 1 month ago
2.0 - 6.0 years
5 - 7 Lacs
Ahmedabad, Aurangabad
Work from Office
Job Title: Alteryx Engineer
Location: Bangalore / Mumbai / Ahmedabad / Aurangabad
Experience Required: 2-5 years
Domain: Manufacturing

Job Description: We are seeking a highly skilled Alteryx Engineer with 2-5 years of experience, specifically within the manufacturing domain, to join our dynamic team. The ideal candidate will have a strong background in data preparation, blending, and advanced analytics, coupled with practical experience in the manufacturing industry. This role involves designing, developing, and deploying robust Alteryx workflows to automate data processes, generate insights, and support strategic business decision-making.

Key Responsibilities:
- Workflow development: design, develop, and maintain efficient and scalable Alteryx workflows to extract, transform, and load (ETL) data from various sources, ensuring data quality and integrity (a rough analogue of this pattern follows the listing).
- Data blending & transformation: perform complex data blending, cleansing, and transformation operations using Alteryx Designer to prepare data for analysis and reporting.
- Automation: implement and manage automated data pipelines and analytical processes using Alteryx Server to streamline data delivery and reduce manual effort.
- Data analysis: analyze complex datasets within Alteryx to identify trends, patterns, and insights that drive strategic decisions and operational improvements.
- Data integration: work with diverse data sources, including SAP, flat files (Excel, CSV), APIs, and other enterprise systems, to ensure accurate and timely data availability within Alteryx workflows.
- Collaboration: collaborate closely with business stakeholders, including production, supply chain, and quality assurance teams, to gather requirements, understand their data needs, and translate them into effective Alteryx solutions.
- Reporting & output: configure Alteryx workflows to generate various outputs, including data extracts for reporting tools, analytical datasets, and automated reports.
- Troubleshooting: promptly diagnose, resolve, and optimize issues related to Alteryx workflows, data connections, and performance.

Required Skills:
- Experience: 2-5 years of hands-on experience in Alteryx workflow development, data preparation, and automation, with a strong focus on the manufacturing domain.
- Technical proficiency: strong proficiency in Alteryx Designer for building complex analytical workflows; experience with Alteryx Server for deploying, scheduling, and managing workflows is highly desirable.
- Data management: hands-on experience with SQL and relational databases for querying, data extraction, and understanding database structures; experience extracting and integrating data from SAP systems using Alteryx connectors or other relevant methods is crucial.
- Analytical skills: strong analytical and problem-solving skills with the ability to interpret complex data, identify root causes, and provide actionable insights.
- Communication: excellent communication skills with the ability to present complex technical information clearly to both technical and non-technical audiences.
- Problem-solving: proven ability to troubleshoot issues, optimize workflow performance, and resolve data-related challenges effectively in a fast-paced environment.
- Domain knowledge: familiarity with manufacturing processes, operational metrics, supply chain data, and key performance indicators (KPIs) is highly desirable.

Preferred Skills:
- Alteryx certification (e.g., Alteryx Designer Core, Advanced, or Expert) is a significant plus.
- Knowledge of other BI tools (e.g., Tableau, Power BI) or data analysis techniques and programming languages (e.g., Python, R) for advanced analytics is advantageous.
- Experience with data governance and best practices in Alteryx development.
- Direct experience with SAP modules relevant to manufacturing (e.g., FICO, Production Planning, Materials Management, Sales and Distribution) is a strong asset.
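Alteryx workflows are assembled visually, but the Input, Filter, Summarize, Output pattern described above maps directly onto a few lines of pandas (the posting itself lists Python as an advantageous adjacent skill). The file and column names here are invented for illustration.

```python
import pandas as pd

# Hypothetical file and columns, mirroring Input -> Filter -> Summarize -> Output.
runs = pd.read_csv("production_runs.csv", parse_dates=["run_date"])

good = runs[runs["scrap_flag"] == 0]                       # Filter tool
throughput = (
    good.groupby(["plant", "line"], as_index=False)        # Summarize tool
        .agg(units=("units_produced", "sum"),
             avg_cycle_s=("cycle_time_s", "mean"))
)
throughput.to_csv("line_throughput.csv", index=False)      # Output Data tool
```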
Posted 1 month ago
3.0 - 7.0 years
5 - 8 Lacs
Ahmedabad, Aurangabad
Work from Office
Job Description: We are seeking a skilled Tableau Analyst/Developer with 2-5 years of experience, specifically within the manufacturing domain, to join our dynamic team. The ideal candidate will have a strong background in data visualization and analysis, coupled with practical experience in the manufacturing industry. This role involves designing and developing interactive dashboards and reports to support business decision-making processes.

Key Responsibilities:
- Dashboard development: design, develop, and maintain interactive dashboards and reports using Tableau to meet business requirements.
- Data analysis: analyze complex data sets to identify trends, patterns, and insights that drive strategic decisions.
- Data integration: work with data sources, including SQL databases, Excel, and other data management tools, to ensure accurate and timely data availability.
- Collaboration: collaborate with business stakeholders, including production, supply chain, and quality assurance teams, to gather requirements and understand their data needs.
- Reporting: generate ad-hoc reports and visualizations as requested by stakeholders to support ongoing projects and initiatives.
- Troubleshooting: promptly diagnose and resolve issues related to Tableau dashboards and data connections.

Required Skills:
- Experience: 2-5 years of experience in Tableau development and data visualization, with a focus on the manufacturing domain.
- Technical proficiency: strong proficiency in Tableau Desktop and Tableau Server, with experience in creating complex calculations, parameters, and visualizations (a server-scripting sketch follows this listing).
- Data management: hands-on experience with SQL and relational databases for querying and data extraction.
- Analytical skills: strong analytical skills with the ability to interpret data and provide actionable insights.
- Communication: excellent communication skills with the ability to present complex information clearly to both technical and non-technical audiences.
- Problem-solving: ability to troubleshoot issues and resolve them effectively in a fast-paced environment.
- Domain knowledge: familiarity with manufacturing processes, metrics, and KPIs is highly desirable.

Preferred Skills:
- Tableau certification (Desktop Specialist, Desktop Certified Associate, or higher) is a plus.
- Knowledge of other BI tools and data analysis techniques, such as Python or R, is advantageous.
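Server-side housekeeping around Tableau, such as listing content or kicking off extract refreshes, is often scripted with the tableauserverclient library; a minimal sketch, assuming a placeholder server URL, credentials, and workbook name.

```python
import tableauserverclient as TSC

# Placeholder server, site, and credentials.
auth = TSC.TableauAuth("analyst", "password", site_id="manufacturing")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    workbooks, _ = server.workbooks.get()
    for wb in workbooks:
        print(wb.name, wb.project_name)
        # Trigger an extract refresh for a hypothetical nightly-loaded workbook.
        if wb.name == "Plant KPIs":
            server.workbooks.refresh(wb)
```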
Posted 1 month ago
6.0 - 11.0 years
8 - 12 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
We are seeking a skilled Lead Data Engineer with extensive experience in Snowflake, ADF, SQL, and other relevant data technologies to join our team. As a key member of our data engineering team, you will play an instrumental role in designing, developing, and managing data pipelines, working closely with cross-functional teams to drive the success of our data initiatives.

Key Responsibilities:
- Design, implement, and maintain data solutions using Snowflake, ADF, and SQL Server to ensure data integrity, scalability, and high performance.
- Lead and contribute to the development of data pipelines, ETL processes, and data integration solutions, ensuring the smooth extraction, transformation, and loading of data from diverse sources (a Snowflake loading sketch follows this listing).
- Work with MSBI, SSIS, and Azure Data Lake Storage to optimize data flows and storage solutions.
- Collaborate with business and technical teams to identify project needs, estimate tasks, and set intermediate milestones to achieve final outcomes.
- Implement industry best practices related to business intelligence and data management, ensuring adherence to usability, design, and development standards.
- Perform in-depth data analysis to resolve data issues and improve overall data quality.
- Mentor and guide junior data engineers, providing technical expertise and supporting the development of their skills.
- Collaborate effectively with geographically distributed teams to ensure project goals are met in a timely manner.

Required Technical Skills:
- T-SQL, SQL Server, MSBI (SQL Server Integration Services, Reporting Services), Snowflake, Azure Data Factory (ADF), SSIS, Azure Data Lake Storage.
- Proficient in designing and developing data pipelines, data integration, and data management workflows.
- Strong understanding of cloud data solutions, with a focus on Azure-based tools and technologies.

Nice to Have:
- Experience with Power BI for data visualization and reporting.
- Familiarity with Azure Databricks for data processing and advanced analytics.
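Stage-to-warehouse loading of the kind the role describes is commonly scripted with the Snowflake Python connector; a minimal sketch, with placeholder credentials, stage, and table names. In practice secrets would come from a vault and the step would often be orchestrated from ADF.

```python
import snowflake.connector

# Placeholder connection details; do not hard-code credentials in real code.
conn = snowflake.connector.connect(
    account="xy12345.central-india.azure",
    user="ETL_SVC",
    password="...",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()

# Stage-to-table load, the core of many Snowflake ingestion pipelines.
cur.execute("""
    COPY INTO stg_orders
    FROM @landing_stage/orders/
    FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
""")
print(cur.fetchall())  # per-file load results

cur.close()
conn.close()
```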
Posted 1 month ago
5.0 - 7.0 years
25 - 35 Lacs
Bengaluru
Remote
This position is responsible for data management and the architecture of entities such as product data, customer data, and e-commerce transactional data, along with platform development and integration. The role works closely with all internal departments (Marketing, Sales, Finance, the Product Data Team, and outside vendors) to ensure that data integration with external partners and enterprise integrations work as expected. This team member will play a key role in various data strategies, building a high-functioning data warehouse system.

Responsibilities:
- Contribute your expertise and technical knowledge to develop and maintain an architectural roadmap for digital automation and data services, ensuring alignment with business strategies and standards.
- Develop data models and AI and ML algorithms to apply to data sets.
- Provide leadership and guidance on the design and management of data for data applications; formulate best practices and organize processes for data management, validation, and evolution.
- Maintain a comprehensive understanding of Master Data Management concepts as applied to customer data, including but not limited to data collection, unification, transformation, segmentation, and storage.
- Generate electronic product information data syndications for major LWTA customers.
- Implement solutions that enable a stable architecture for collecting robust data sets.
- Be responsible for the maintenance, improvement, cleaning, and manipulation of data in the data platform and analytics databases.
- Build processes and tools to maintain high data availability, quality, and maintainability.
- Articulate technology solutions and explain the competitive advantages of various technology alternatives.
- Design data flows, interfaces, CRUD operations, etc.; work on conceptual, logical, and physical data modeling.
- Work with product development/marketing and IT/Digital to ensure a smooth data interface is built.

Minimum Requirements:
- Bachelor's degree in mathematics, computer science, computer engineering, information management, or a related field, or equivalent relevant experience.
- 5 years of data architecture experience.
- At least 5 to 7 years' experience running projects using an Agile project methodology and developing high-performing applications.
- Strong experience with data modeling, especially conceptual, logical, and physical models.
- Excellent written and oral communication skills, with the ability to clearly define projects, objectives, goals, schedules, and assignments.
- Ability to work effectively with business personnel at all levels.
- Good knowledge of programming languages, software tools, and analytical methods.
- Experience with SQL or an equivalent database querying language.
- Minimum 3 years' prior experience focused on ETL design, development, review, and testing, with additional experience in two or more of the following areas: database development, data modeling, data architecture, data warehouse development, business intelligence, data profiling, database performance optimization.
- Critical problem solver who can successfully identify, fix, and solve problems.
- Ability to multitask and troubleshoot issues quickly.
- Self-starter, self-motivated, able to work independently, prioritize effectively, and perform multiple tasks under minimal supervision.
Posted 1 month ago
7.0 - 12.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Location: Bangalore / Hyderabad / Pune
Experience level: 7+ years

About the Role: We are seeking a highly skilled Snowflake Developer to join our team in Bangalore. The ideal candidate will have extensive experience in designing, implementing, and managing Snowflake-based data solutions. This role involves developing data architectures and ensuring the effective use of Snowflake to drive business insights and innovation.

Key Responsibilities:
- Design and implement scalable, efficient, and secure Snowflake solutions to meet business requirements.
- Develop data architecture frameworks, standards, and principles, including modeling, metadata, security, and reference data.
- Implement Snowflake-based data warehouses, data lakes, and data integration solutions.
- Manage data ingestion, transformation, and loading processes to ensure data quality and performance (a Snowpark sketch follows this listing).
- Collaborate with business stakeholders and IT teams to develop data strategies and ensure alignment with business goals.
- Drive continuous improvement by leveraging the latest Snowflake features and industry trends.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
- 8+ years of experience in data architecture, data engineering, or a related field.
- Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
- Must have exposure to Airflow.
- Proven track record of contributing to data projects and working in complex environments.
- Familiarity with cloud platforms (e.g., AWS, GCP) and their data services.
- Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is a plus.
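One common way to express Snowflake transformation logic from Python is the Snowpark API; a minimal sketch with placeholder connection parameters and invented table and column names.

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Placeholder connection parameters; use a secrets store in practice.
session = Session.builder.configs({
    "account": "xy12345", "user": "DEV_SVC", "password": "...",
    "warehouse": "DEV_WH", "database": "SALES", "schema": "PUBLIC",
}).create()

# Aggregate completed orders into a monthly revenue table.
monthly = (
    session.table("ORDERS")
           .filter(col("STATUS") == "COMPLETE")
           .group_by("ORDER_MONTH")
           .agg(sum_("AMOUNT").alias("REVENUE"))
)
monthly.write.save_as_table("MONTHLY_REVENUE", mode="overwrite")

session.close()
```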
Posted 1 month ago
6.0 - 8.0 years
12 - 18 Lacs
Hyderabad, Bengaluru
Work from Office
Job Title: Oracle Fusion Functional Consultant – Cash Management & Lease Accounting
Location: Hyderabad / Bangalore
Experience: 6-8 Years
Department: Oracle ERP – Finance

Job Summary: We are seeking an experienced Oracle Fusion Functional Consultant specializing in Cash Management and Lease Accounting, with strong functional knowledge and hands-on expertise in Fusion Financials. The ideal candidate should have 2–3 end-to-end implementation and/or support cycles, with a preference for candidates who have previously held Financial Functional Lead roles. Familiarity with Oracle Cloud tools, workflow processes, and prior EBS experience is highly desirable.

Key Responsibilities:
- Lead or support implementations of Oracle Fusion Cash Management and Lease Accounting modules.
- Collaborate with business stakeholders to gather requirements and translate them into functional specifications.
- Write functional design documents (MD50) and test scripts, and support OTBI reports & Fusion analytics.
- Work on Oracle workflow processes and assist technical teams with integrations and reporting needs.
- Leverage FSM (Functional Setup Manager) and ADF (Application Development Framework) for configurations and issue resolution.
- Use Data Integration (DI) tools for mass data uploads and validations.
- Engage in testing, data migration, UAT, and post-go-live support.
- Ensure compliance with Oracle Cloud best practices and security standards.

Required Skills & Experience:
- 2–3 implementations or support projects in Oracle Fusion Cash Management & Lease Accounting.
- Strong hands-on knowledge of Oracle Fusion Financials.
- Experience writing functional specs and working on OTBI (Oracle Transactional Business Intelligence) and Fusion Analytics.
- Solid understanding of workflow processes and how to configure them in Oracle Cloud.
- Familiarity with Oracle FSM, ADF tools, and Data Integration (DI) tools.
- Prior experience in Oracle EBS (Financials).
- Proven ability to work with cross-functional teams and technical counterparts.
- Strong communication, documentation, and stakeholder management skills.

Preferred Qualifications:
- Experience in a Financial Functional Lead role in past projects.
- Oracle Financials Cloud certification preferred (e.g., General Ledger, Payables, Receivables).
- Exposure to multi-currency, intercompany, and bank reconciliation processes.
- Familiarity with Agile/Hybrid project methodologies.
Posted 1 month ago
15.0 - 20.0 years
20 - 30 Lacs
Noida, Gurugram
Hybrid
Design architectures using Microsoft SQL Server and MongoDB. Develop ETL pipelines and data lakes. Integrate reporting tools such as Power BI, Qlik, and Crystal Reports into the data strategy. Implement AWS cloud services (PaaS, SaaS, IaaS), SQL and NoSQL databases, and data integration.
Posted 1 month ago
5.0 - 10.0 years
15 - 30 Lacs
Noida, Pune, Bengaluru
Work from Office
Description: The Data & Analytics Team is seeking a Data Engineer with a hybrid skillset in data integration and application development. This role is crucial for designing, engineering, governing, and improving our entire Data Platform, which serves customers, partners, and employees through self-service access. You'll demonstrate expertise in data & metadata management, data integration, data warehousing, data quality, machine learning, and core engineering principles.

Requirements:
- 5+ years of experience with system/data integration, development, or implementation of enterprise and/or cloud software.
- Strong experience with Web APIs (RESTful and SOAP).
- Strong experience setting up data warehousing solutions and associated pipelines, including ETL tools (preferably Informatica Cloud).
- Demonstrated proficiency with Python.
- Strong experience with data wrangling and query authoring in SQL and NoSQL environments for both structured and unstructured data.
- Experience in a cloud-based computing environment, specifically GCP.
- Expertise in documenting business requirement, functional, and technical documentation.
- Expertise in writing unit and functional test cases, test scripts, and run books.
- Expertise in incident management systems like Jira, ServiceNow, etc.
- Working knowledge of Agile software development methodology.
- Strong organizational and troubleshooting skills with attention to detail.
- Strong analytical ability, judgment, and problem analysis techniques.
- Excellent interpersonal skills with the ability to work effectively in a cross-functional team.

Job Responsibilities:
- Lead system/data integration, development, or implementation efforts for enterprise and/or cloud software.
- Design and implement data warehousing solutions and associated pipelines for internal and external data sources, including ETL processes.
- Perform extensive data wrangling and author complex queries in both SQL and NoSQL environments for structured and unstructured data.
- Develop and integrate applications, leveraging strong proficiency in Python and Web APIs (RESTful and SOAP); a sketch of this pattern follows the listing.
- Provide operational support for the data platform and applications, including incident management.
- Create comprehensive business requirement, functional, and technical documentation.
- Develop unit and functional test cases, test scripts, and run books to ensure solution quality.
- Manage incidents effectively using systems like Jira, ServiceNow, etc.
- Prepare change management packages and implementation plans for migrations across different environments.
- Actively participate in enterprise risk management processes.
- Work within an Agile software development methodology, contributing to team success.
- Collaborate effectively within cross-functional teams.

What We Offer:
- Exciting projects: we focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
- Collaborative environment: you can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
- Work-life balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
- Professional development: our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skill trainings.
- Excellent benefits: we provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
- Fun perks: we want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can have coffee or tea with your colleagues over a game of table tennis, plus discounts at popular stores and restaurants!
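The Python-plus-Web-API responsibility above can be pictured with a short sketch that pulls rows from a REST endpoint and streams them into BigQuery, matching the GCP environment the posting names. The API URL, project, and table are invented, the endpoint is assumed to return a JSON list of flat records, and application-default GCP credentials are assumed.

```python
import requests
from google.cloud import bigquery

# Hypothetical API and table; assumes application-default GCP credentials.
rows = requests.get("https://api.example.com/v1/tickets", timeout=30).json()

client = bigquery.Client()
table_id = "my-project.support.tickets_raw"

errors = client.insert_rows_json(table_id, rows)   # streaming insert
if errors:
    raise RuntimeError(f"BigQuery insert failed: {errors}")
print(f"Streamed {len(rows)} rows into {table_id}")
```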
Posted 1 month ago
12.0 - 14.0 years
25 - 30 Lacs
Chennai
Work from Office
The Solution Architect / Data Engineer will design, implement, and manage data solutions for the insurance business, leveraging expertise in Cognos, DB2, Azure Databricks, ETL processes, and SQL. The role involves working with cross-functional teams to design scalable data architectures and enable advanced analytics and reporting, supporting the company's finance, underwriting, claims, and customer service operations.

Key Responsibilities:
- Data architecture & design: design and implement robust, scalable data architectures and solutions in the insurance domain using Azure Databricks, DB2, and other data platforms.
- Data integration & ETL processes: lead the development and optimization of ETL pipelines to extract, transform, and load data from multiple sources, ensuring data integrity and performance (see the sketch after this listing).
- Cognos reporting: oversee the design and maintenance of Cognos reporting systems, developing custom reports and dashboards to support business users in finance, claims, underwriting, and operations.
- Data engineering: design, build, and maintain data models, data pipelines, and databases to enable business intelligence and advanced analytics across the organization.
- Cloud infrastructure: develop and manage data solutions on Azure, including Databricks for data processing, ensuring seamless integration with existing systems (e.g., DB2, legacy platforms).
- SQL development: write and optimize complex SQL queries for data extraction, manipulation, and reporting, with a focus on performance and scalability.
- Data governance & quality: ensure data quality, consistency, and governance across all data solutions, implementing best practices and adhering to industry standards (e.g., GDPR, insurance regulations).
- Collaboration: work closely with business stakeholders, data scientists, and analysts to understand business needs and translate them into technical solutions that drive actionable insights.
- Solution architecture: provide architectural leadership in designing data platforms, ensuring that solutions meet business requirements, are cost-effective, and can scale for future growth.
- Performance optimization: continuously monitor and tune the performance of databases, ETL processes, and reporting tools to meet service level agreements (SLAs).
- Documentation: create and maintain comprehensive technical documentation, including architecture diagrams, ETL process flows, and data dictionaries.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Solution Architect or Data Engineer in the insurance industry, with a strong focus on data solutions.
- Hands-on experience with Cognos (for reporting and dashboarding) and DB2 (for database management).
- Proficiency in Azure Databricks for data processing, machine learning, and real-time analytics.
- Extensive experience in ETL development, data integration, and data transformation processes.
- Strong knowledge of Python and SQL (advanced query writing, optimization, and troubleshooting).
- Experience with cloud platforms (Azure preferred) and hybrid data environments (on-premises and cloud).
- Familiarity with data governance and regulatory requirements in the insurance industry (e.g., Solvency II, IFRS 17).
- Strong problem-solving skills, with the ability to troubleshoot and resolve complex technical issues related to data architecture and performance.
- Excellent verbal and written communication skills, with the ability to work effectively with both technical and non-technical stakeholders.

Preferred Qualifications:
- Experience with other cloud-based data platforms (e.g., Azure Data Lake, Azure Synapse, AWS Redshift).
- Knowledge of machine learning workflows, leveraging Databricks for model training and deployment.
- Familiarity with insurance-specific data models and their use in finance, claims, and underwriting operations.
- Certifications in Azure Databricks, Microsoft Azure, DB2, or related technologies.
- Knowledge of additional reporting tools (e.g., Power BI, Tableau) is a plus.

Key Competencies:
- Technical leadership: ability to guide and mentor development teams in implementing best practices for data architecture and engineering.
- Analytical skills: strong analytical and problem-solving skills, with a focus on optimizing data systems for performance and scalability.
- Collaborative mindset: ability to work effectively in a cross-functional team, communicating complex technical solutions in simple terms to business stakeholders.
- Attention to detail: meticulous attention to detail, ensuring high-quality data output and system performance.
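As one illustration of the DB2-to-Databricks integration this role implies, here is a hedged PySpark sketch that reads a DB2 table over JDBC and lands it as a managed table. Host, credentials, and table names are placeholders, and the IBM DB2 JDBC driver would need to be installed on the cluster.

```python
from pyspark.sql import functions as F

# On Databricks a SparkSession named `spark` already exists; the DB2
# host, credentials, and tables below are placeholders.
claims = (spark.read.format("jdbc")
    .option("url", "jdbc:db2://db2-host:50000/INSDB")
    .option("driver", "com.ibm.db2.jcc.DB2Driver")  # driver jar on cluster
    .option("dbtable", "CLAIMS.CLAIM_HEADER")
    .option("user", "etl_user")
    .option("password", "...")
    .load())

open_claims = claims.filter(F.col("STATUS") == "OPEN")
open_claims.write.mode("overwrite").saveAsTable("silver.open_claims")
```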
Posted 1 month ago
7.0 - 8.0 years
12 - 15 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
We are looking for an experienced Informatica MDM Specialist with 7-8 years of expertise in Informatica Intelligent Cloud Services (IICS), including Cloud Application Integration (CAI) and Cloud Data Integration (CDI). The role involves developing business-critical Informatica entities, managing IDMC administration and architecture, and implementing application integration components. The ideal candidate will have hands-on experience in IDMC, CDI, CAI, and data integration best practices.

Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Posted 1 month ago
5.0 - 8.0 years
3 - 7 Lacs
Hyderabad
Work from Office
Role Purpose: The purpose of this role is to interpret data and turn it into information (reports, dashboards, interactive visualizations, etc.) that can offer ways to improve a business, thus affecting business decisions.

Responsibilities:

1. Manage the technical scope of the project in line with the requirements at all stages:
a. Gather information from various sources (data warehouses, databases, data integration and modelling) and interpret patterns and trends.
b. Develop record management processes and policies.
c. Build and maintain relationships at all levels within the client base and understand their requirements.
d. Provide sales data, proposals, data insights, and account reviews to the client base.
e. Identify areas to increase efficiency and automation of processes.
f. Set up and maintain automated data processes.
g. Identify, evaluate, and implement external services and tools to support data validation and cleansing.
h. Produce and track key performance indicators.

2. Analyze data sets and provide adequate information:
a. Liaise with internal and external clients to fully understand data content.
b. Design and carry out surveys, and analyze survey data as per customer requirements.
c. Analyze and interpret complex data sets relating to the customer's business, and prepare reports for internal and external audiences using business analytics reporting tools.
d. Create data dashboards, graphs, and visualizations to showcase business performance, and provide sector and competitor benchmarking.
e. Mine and analyze large datasets, draw valid inferences, and present them successfully to management using a reporting tool.
f. Develop predictive models and share insights with clients as per their requirements.

Deliverables:
- Performance parameter: analyses data sets and provides relevant information to the client.
- Measures: number of automations done, on-time delivery, CSAT score, zero customer escalations, data accuracy.

Mandatory Skills: Power BI visualization on cloud.
Experience: 5-8 years.
Posted 1 month ago
8.0 - 10.0 years
10 - 12 Lacs
Hyderabad
Work from Office
Details of the role: 8 to 10 years of experience as an Informatica Admin (IICS).

Key responsibilities:
- Understand the program's service catalog and document the list of tasks that have to be performed for each.
- Lead the design, development, and maintenance of ETL processes to extract, transform, and load data from various sources into our data warehouse.
- Implement best practices for data loading, ensuring optimal performance and data quality.
- Utilize your expertise in IDMC to establish and maintain data governance, data quality, and metadata management processes.
- Implement data controls to ensure compliance with data standards, security policies, and regulatory requirements.
- Collaborate with data architects to design and implement scalable and efficient data architectures that support business intelligence and analytics requirements.
- Work on data modeling and schema design to optimize database structures for ETL processes.
- Identify and implement performance optimization strategies for ETL processes, ensuring timely and efficient data loading.
- Troubleshoot and resolve issues related to data integration and performance bottlenecks.
- Collaborate with cross-functional teams, including data scientists, business analysts, and other engineering teams, to understand data requirements and deliver effective solutions.
- Provide guidance and mentorship to junior members of the data engineering team.
- Create and maintain comprehensive documentation for ETL processes, data models, and data flows, and keep it up to date with any changes to data architecture or ETL workflows.
- Use Jira for task tracking and project management.
- Implement data quality checks and validation processes to ensure data integrity and reliability (a sketch follows this listing).

Required Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Senior ETL Data Engineer, with a focus on IDMC/IICS.
- Strong proficiency in ETL tools and frameworks (e.g., Informatica Cloud, Talend, Apache NiFi).
- Expertise in IDMC principles, including data governance, data quality, and metadata management.
- Solid understanding of data warehousing concepts and practices.
- Strong SQL skills and experience working with relational databases.
- Excellent problem-solving and analytical skills.
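The data-quality-check responsibility is tool-agnostic; as an illustration outside Informatica itself, here is a small Python sketch of post-load validations. The thresholds, file name, and key column are made up.

```python
import pandas as pd

def quality_checks(df: pd.DataFrame, key: str) -> list:
    """Illustrative post-load validations; thresholds are made up."""
    issues = []
    if df.empty:
        issues.append("target table received zero rows")
    dupes = int(df[key].duplicated().sum())
    if dupes:
        issues.append(f"{dupes} duplicate values in key column '{key}'")
    null_pct = df.isna().mean().max()
    if null_pct > 0.10:
        issues.append(f"a column exceeds 10% nulls ({null_pct:.0%})")
    return issues

df = pd.read_parquet("warehouse_extract.parquet")  # placeholder extract
for issue in quality_checks(df, key="order_id"):
    print("DQ FAIL:", issue)
```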
Posted 1 month ago
5.0 - 8.0 years
5 - 9 Lacs
Chennai
Work from Office
GCP Engineer

- A GCP developer should have expertise in components such as Cloud Scheduler, Dataflow, BigQuery, Pub/Sub, and Cloud SQL (see the Pub/Sub sketch after this listing).
- Good understanding of the GCP cloud environment and services (IAM, networking, Pub/Sub, Cloud Run, Cloud Storage, Cloud SQL/PostgreSQL, Cloud Spanner, etc.) based on real migration projects.
- Knowledge of Java and Java frameworks; should have leveraged or worked with any or all of Spring Boot, Spring Batch, Spring Cloud, etc.
- Experience with API and microservice design principles, having applied them in actual project implementations for integration.
- Deep understanding of architecture and design patterns.
- Knowledge of implementing event-driven architecture, data integration, event streaming architecture, and API-driven architecture.
- Well versed in DevOps principles, with working experience in Docker/containerization.
- Experience in solutioning and execution of IaaS, PaaS, and SaaS-based deployments.
- Conceptual thinking to create 'out of the box' solutions.
- Good communication skills; able to work with both the customer and the development team to deliver an outcome.

Mandatory Skills: App-Cloud-Google.
Experience: 5-8 years.
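The posting centers on Java/Spring Boot, but the event-publishing pattern behind the Pub/Sub-driven architecture it describes is compact enough to sketch with the Python client library. The project and topic names are placeholders, and application-default credentials are assumed.

```python
import json
from google.cloud import pubsub_v1

# Placeholder project/topic; assumes application-default credentials.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "order-events")

event = {"order_id": "A-1001", "status": "CREATED"}
future = publisher.publish(topic_path, json.dumps(event).encode("utf-8"))
print("published message id:", future.result())  # blocks until acked
```

A subscriber (for example, a Cloud Run service) would consume these events to drive downstream data integration.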
Posted 1 month ago