
2471 Data Integration Jobs - Page 44

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

5 - 12 Lacs

Bengaluru

Work from Office

Position: Python Developer
Location: Bangalore (hybrid)
Experience: 5 to 8 years
Engagement: Contract to hire
Education: B.Tech/M.Tech/MCA/B.Sc/BCA

Job Description: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 5+ years of experience in ETL development using Python. Experience with open-source ETL tools and libraries such as Apache Airflow, PySpark, and Pandas. Experience developing database processes using Oracle SQL and PL/SQL in UNIX and/or Windows environments. Scripting experience such as Shell and Perl. Object-Oriented Analysis and Design (OOAD) and Service-Oriented Architecture (SOA). Strong understanding of data warehousing concepts, data modeling, and data integration. Excellent problem-solving skills and attention to detail. Ability to lead and mentor a team of developers. Strong communication and collaboration skills.

Preferred Qualifications: Experience with containerization and orchestration tools (e.g., Docker, Kubernetes). Familiarity with version control systems (e.g., Git). Experience with Agile/Scrum development methodologies.
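The core requirement here, Python-based ETL of the kind built with PySpark or Pandas, boils down to the extract-transform-load pattern. A minimal sketch of that pattern, using stdlib SQLite in place of Oracle and plain Python in place of PySpark; all table and column names are invented for illustration:

```python
import sqlite3

# Source and target schemas (invented for this sketch).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, amount REAL, currency TEXT);
    CREATE TABLE orders_reporting (order_id INTEGER, amount_cents INTEGER);
    INSERT INTO raw_orders VALUES (1, 10.5, 'USD'), (2, 8.0, 'EUR'), (3, 2.25, 'USD');
""")

def run_etl(conn):
    """Extract raw orders, transform (filter USD, derive cents), load a reporting table."""
    # Extract: pull raw rows from the source table.
    rows = conn.execute("SELECT order_id, amount, currency FROM raw_orders").fetchall()
    # Transform: keep USD orders and derive an integer amount in cents.
    transformed = [(oid, round(amt * 100)) for oid, amt, cur in rows if cur == "USD"]
    # Load: write into the target reporting table.
    conn.executemany(
        "INSERT INTO orders_reporting (order_id, amount_cents) VALUES (?, ?)",
        transformed,
    )
    conn.commit()
    return len(transformed)

print(run_etl(conn))  # 2 rows loaded
```

In a production stack the extract and load steps would be Airflow tasks and the transform a PySpark or Pandas job, but the shape of the pipeline is the same.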

Posted 1 month ago

Apply

4.0 - 7.0 years

9 - 19 Lacs

Noida, Greater Noida, Delhi / NCR

Work from Office

Experience Required: 3-6 years
Mandatory Skills: Informatica IDMC (Informatica Intelligent Data Management Cloud), Informatica PowerCenter
Shift: 12:00 PM to 08:30 PM
Contact: Aarushi.Shukla@coforge.com

Hands-on experience is required in the following (mandatory): Informatica IDMC, Informatica PowerCenter, Informatica CDQ, Informatica Data Integration, Unix scripting, SQL.

Posted 1 month ago

Apply

3.0 - 8.0 years

13 - 15 Lacs

Chennai

Work from Office

Focus on Power Pages, Power Apps, canvas apps, working with connectors and custom connectors, application security, authorization mechanisms, Azure, cloud services, Dataverse, data modelling, Copilot, SQL, Power Query, data integration, and data extraction.

Required candidate profile: BE/MCA/M.Sc. with 3+ years of experience as a Power Platform developer. Strong skills in Power Pages, Power Apps, application security, Azure, cloud services, Dataverse, Copilot, SQL, and Power Query. Must relocate to Chennai.

Perks and benefits: Excellent perks. Call Ms Divya @ 7010384865 now.

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Hubli, Mangaluru, Mysuru

Work from Office

Build an interim archive solution for EDI on the client's Maestr platform, a self-service, cloud-based platform on Azure Data Lake Storage. Experience in the storage and retrieval of EDI messages provided by the cloud-based Seeburger EDI platform. Experience as an Azure data integration engineer with expertise in Azure Data Factory, Databricks, Data Lake, Key Vault, and Azure Active Directory. Understanding of security/encryption considerations and options.

Posted 1 month ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Chennai

Work from Office

Project description: You will be working in a cutting-edge banking environment that is undergoing a thorough upgrade program. You will be responsible for translating business data into reusable and adjustable dashboards used by senior business managers.

Responsibilities: Design and develop complex T-SQL queries and stored procedures for data extraction, transformation, and reporting. Build and manage ETL workflows using SSIS to support data integration across multiple sources. Create interactive dashboards and reports using Tableau and SSRS for business performance monitoring. Develop and maintain OLAP cubes using SSAS for multidimensional data analysis. Collaborate with business and data teams to understand reporting requirements and deliver scalable BI solutions. Apply strong data warehouse architecture and modeling concepts to build efficient data storage and retrieval systems. Perform performance tuning and query optimization for large datasets and improve system responsiveness. Ensure data quality, consistency, and integrity through robust validation and testing processes. Maintain documentation for data pipelines, ETL jobs, and reporting structures. Stay updated with emerging Microsoft BI technologies and best practices to continuously improve solutions.

Skills (must have): At least 6 years of experience with T-SQL and SQL Server (SSIS and SSRS exposure is a must). Proficiency with Tableau, preferably with at least 4 years of experience creating dashboards. Experience working with businesses and delivering dashboards to senior management. Experience working within a data warehouse architecture is a must. Exposure to Microsoft BI.

Nice to have: N/A
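The T-SQL reporting work this posting describes (stored procedures that aggregate data for dashboards) follows a pattern that can be sketched with stdlib SQLite standing in for SQL Server; the schema and figures are invented for illustration:

```python
import sqlite3

# Invented sales table standing in for a warehouse fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL, sold_on TEXT);
    INSERT INTO sales VALUES
        ('East', 100.0, '2024-01-05'),
        ('East',  50.0, '2024-01-20'),
        ('West',  75.0, '2024-01-11'),
        ('West', 200.0, '2024-02-02');
""")

def monthly_sales_summary(conn, month):
    """Stand-in for a reporting stored procedure: per-region totals for one month."""
    return conn.execute(
        """
        SELECT region, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM sales
        WHERE strftime('%Y-%m', sold_on) = ?
        GROUP BY region
        ORDER BY region
        """,
        (month,),
    ).fetchall()

print(monthly_sales_summary(conn, "2024-01"))  # [('East', 2, 150.0), ('West', 1, 75.0)]
```

In SQL Server the parameterized function would be an actual stored procedure, and the result set would feed a Tableau or SSRS dashboard.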

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Pune

Work from Office

Provide expertise in analysis, requirements gathering, design, coordination, customization, testing, and support of reports in the client's environment. Develop and maintain a strong working relationship with business and technical members of the team. Maintain a relentless focus on quality and continuous improvement. Perform root cause analysis of report issues. Handle development and evolutionary maintenance of the environment's performance, capability, and availability. Assist in defining technical requirements and developing solutions. Ensure effective content and source-code management, troubleshooting, and debugging.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise: Cognos developer and admin experience required. Education: The resource should be a full-time MCA/M.Tech/B.Tech/B.E. graduate and should preferably have relevant certifications. Experience: The resource should have a minimum of 3 years of experience working on BI/DW projects in areas pertaining to reporting and visualization using Cognos, and should have worked on at least two projects involving reporting/visualization development. A good understanding of UNIX is required. The resource should be well conversant in English and should have excellent writing, MIS, communication, time management, and multitasking skills.

Preferred technical and professional experience: Experience with various cloud and integration platforms (e.g., AWS, Google, Azure). An agile mindset: the ability to handle changes in priorities and requests, ownership, and critical thinking. Experience with an ETL/data integration tool (e.g., IBM InfoSphere DataStage, Azure Data Factory, Informatica PowerCenter).

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools, and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact.

Responsibilities: Manage end-to-end feature development and resolve challenges faced in implementing it. Learn new technologies and apply them in feature development within the time frame provided. Manage debugging, root cause analysis, and fixing of issues reported on the content management back-end software system.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise: Strong SAS ETL programming skills with 5+ years of relevant SAS ETL programming experience in a banking project. Strong knowledge in areas of SAS development.

Preferred technical and professional experience: Strong knowledge of SAS development and the banking domain.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Gurugram

Work from Office

As a consultant, you will serve as a client-facing practitioner who sells, leads, and implements expert services utilizing the breadth of IBM's offerings and technologies. A successful consultant is regarded by clients as a trusted business advisor who collaborates to provide innovative solutions that solve the most challenging business problems. You will develop solutions that excel at user experience, style, performance, reliability, and scalability to reduce costs and improve profit and shareholder value.

Your primary responsibilities include: Build, automate, and release solutions based on clients' priorities and requirements. Explore and discover risks, resolve issues that affect release scope, schedule, and quality, and bring potential solutions to the table. Make sure that all integration solutions meet the client's specifications and are delivered on time.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise: Minimum 5+ years of experience in the IT industry. Minimum 4+ years of experience with Oracle Applications and Oracle Cloud in a technical domain. Two end-to-end implementations in Oracle Supply Chain Management Cloud as a functional consultant. Should have worked in Inventory, Order Management, Cost Management, GOP Cloud, Data Integration, FBDI, and ADFDI. Minimum 4+ years of experience in BIP reporting.

Preferred technical and professional experience: You'll have access to all the technical and management training courses you need to become the expert you want to be. Should have a minimum of 3 years of relevant experience in Oracle Cloud (Oracle Fusion) 12c technical development and implementation. Should have good knowledge of integrating with web services, XML (Extensible Markup Language), and other APIs (Application Programming Interfaces) to transfer data between source and target, in addition to the database.

Posted 1 month ago

Apply

2.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating source-to-target pipelines/workflows and implementing solutions that tackle the client's needs.

Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our client's business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise: Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems. Implement data quality and validation processes within Ab Initio. Data modelling and analysis: collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes; analyse and model data to ensure optimal ETL design and performance. Ab Initio components: utilize components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions; implement best practices for reusable Ab Initio components.

Preferred technical and professional experience: Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed. Collaboration: work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes. Participate in design reviews and provide technical expertise to enhance overall solution quality. Documentation.
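The Ab Initio components the posting names (Rollup, Join) have straightforward analogues in any language. A rough plain-Python sketch, with invented records, of what a Rollup (aggregate per key) and an inner Join do:

```python
from collections import defaultdict

def rollup(records, key, value):
    """Sum `value` per `key`: analogous to an Ab Initio Rollup component."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec[key]] += rec[value]
    return dict(totals)

def join(left, right, key):
    """Inner-join two record lists on `key`: analogous to a Join component."""
    index = {rec[key]: rec for rec in right}
    return [{**l, **index[l[key]]} for l in left if l[key] in index]

# Invented sample data for illustration.
orders = [
    {"cust": "a", "amount": 10.0},
    {"cust": "a", "amount": 5.0},
    {"cust": "b", "amount": 7.5},
]
customers = [{"cust": "a", "name": "Acme"}, {"cust": "b", "name": "Beta"}]

print(rollup(orders, "cust", "amount"))            # {'a': 15.0, 'b': 7.5}
print(join(orders, customers, "cust")[0]["name"])  # Acme
```

In Ab Initio these operations are configured graphically on record flows rather than coded, but the data semantics are the same.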

Posted 1 month ago

Apply

3.0 - 6.0 years

10 - 14 Lacs

Mumbai

Work from Office

As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include: Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise: Experience in the integration efforts between Alation and Manta, ensuring seamless data flow and compatibility. Collaborate with cross-functional teams to gather requirements and design solutions that leverage both Alation and Manta platforms effectively. Develop and maintain data governance processes and standards within Alation, leveraging Manta's data lineage capabilities. Analyze data lineage and metadata to provide insights into data quality, compliance, and usage patterns.

Preferred technical and professional experience: Lead the evaluation and implementation of new features and updates for both Alation and Manta platforms, ensuring alignment with organizational goals and objectives. Drive continuous improvement initiatives to enhance the efficiency and effectiveness of data management processes, leveraging Alation.

Posted 1 month ago

Apply

2.0 - 5.0 years

6 - 10 Lacs

Mumbai

Work from Office

As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include: Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise: Expertise in data warehousing, information management, data integration, and business intelligence using the ETL tool Informatica PowerCenter. Knowledge of cloud, Power BI, and data migration to the cloud. Experience in Unix shell scripting and Python. Experience with relational SQL, big data, etc.

Preferred technical and professional experience: Knowledge of MS Azure cloud. Experience in Informatica PowerCenter. Experience in Unix shell scripting and Python.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Mumbai

Work from Office

Act as an overall architect for project delivery, solutioning, and client-facing activities. Lead the definition and realization of architecture for omnichannel systems of engagement. Should be strong in software architecture fundamentals and assets, and tools using the Garage Method for Cloud. The practitioner should have at least 3 years of experience in API- and microservices-based architecture definition and implementation.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise: BE/B.Tech in any stream, or M.Sc. (Computer Science/IT)/MCA, with a minimum of 1-3 years of experience. Experience designing and building solutions to move data from operational and external environments to the business intelligence environment using Informatica. Experience with Ab Initio software and DataStage (formerly Ascential), IBM's WebSphere Data Integration Suite. Skills include designing and developing extract, transform, and load (ETL) processes. Experience includes full-lifecycle implementation of the technical components of a business intelligence solution.

Preferred technical and professional experience: Experience in software architecture fundamentals and assets, and tools using the Garage Method for Cloud. Experienced practitioner with at least 3 years of experience in API- and microservices-based architecture definition and implementation. Should be strong in designing and developing extract, transform, and load (ETL) processes.

Posted 1 month ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Hyderabad

Work from Office

As a consultant, you will serve as a client-facing practitioner who sells, leads, and implements expert services utilizing the breadth of IBM's offerings and technologies. A successful consultant is regarded by clients as a trusted business advisor who collaborates to provide innovative solutions that solve the most challenging business problems. You will develop solutions that excel at user experience, style, performance, reliability, and scalability to reduce costs and improve profit and shareholder value.

Your primary responsibilities include: Build, automate, and release solutions based on clients' priorities and requirements. Explore and discover risks, resolve issues that affect release scope, schedule, and quality, and bring potential solutions to the table. Make sure that all integration solutions meet the client's specifications and are delivered on time.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise: Minimum 5+ years of experience in the IT industry. Minimum 4+ years of experience with Oracle Applications and Oracle Cloud in a technical domain. Two end-to-end implementations in Oracle Supply Chain Management Cloud as a functional consultant. Should have worked in Inventory, Order Management, Cost Management, GOP Cloud, Data Integration, FBDI, and ADFDI. Minimum 4+ years of experience in BIP reporting.

Preferred technical and professional experience: You'll have access to all the technical and management training courses you need to become the expert you want to be. Should have a minimum of 3 years of relevant experience in Oracle Cloud (Oracle Fusion) 12c technical development and implementation. Should have good knowledge of integrating with web services, XML (Extensible Markup Language), and other APIs (Application Programming Interfaces) to transfer data between source and target, in addition to the database.

Posted 1 month ago

Apply

5.0 - 9.0 years

9 - 13 Lacs

Chennai

Work from Office

Your role
Location: Chennai
Work hours: 12 PM to 9 PM IST
Work mode: Onsite, 5 days a week

We are looking for a detail-oriented and proactive Senior Data Scientist for our Cross-Border line of business. In this pivotal role, you will work closely with the North America Quantitative Analytics leadership team to deliver high-impact data analysis and reporting in support of our Credit and Treasury functions. You will play a hands-on role in maintaining and enhancing our core reporting infrastructure, performing critical data analysis, and supporting operational excellence across our analytics efforts. This role is ideal for an experienced analyst ready to take ownership of high-impact deliverables and collaborate across global teams.

What you'll be doing: Own and deliver complex, business-as-usual (BAU) reports. Analyze large datasets to extract insights and trends and identify improvement opportunities. Collaborate with North American stakeholders to gather requirements and deliver data-driven solutions. Utilize SQL, Python, and Excel for data extraction, transformation, and reporting automation. Work with internal partners to resolve data issues and ensure accuracy. Proactively troubleshoot anomalies and communicate findings and resolutions. Support data integration from new sources and adapt to evolving reporting needs. Contribute to continuous improvements in reporting efficiency, accuracy, and control measures.

What you bring: 5-7 years of experience in data analysis, business intelligence, or a related role. Bachelor's or Master's degree in Computer Science, Mathematics, Engineering, Statistics, or a related field. Proficiency in SQL, Python, and Excel. Working knowledge of VBA, data visualization, or AI tools is a plus. Strong attention to detail and data quality. Eagerness to learn about compliance and risk in a data context. Ability to work independently, manage deadlines, and collaborate effectively. Experience working with global stakeholders. Familiarity with financial services is preferred but not required. Willingness to work in the office during the specified shift hours is mandatory.

About Corpay: Corpay is a global technology organisation that is leading the future of commercial payments with a culture of innovation that drives us to constantly create new and better ways to pay. Our specialized payment solutions help businesses control, simplify, and secure payment for fuel, general payables, toll, and lodging expenses. Millions of people in over 80 countries around the world use our solutions for their payments. All offers of employment made by Corpay (and its subsidiary companies) are subject to the successful completion of satisfactory pre-employment vetting by an independent supplier (Experian). This is in accordance with Corpay's resourcing policy and includes employment referencing, identity, adverse financial, criminal, and sanctions-list checks. We do this to meet our legal and regulatory requirements. Corpay is dedicated to encouraging a supportive and inclusive culture among our employees. It is in our best interest to promote diversity and eliminate discrimination in the workplace. We seek to ensure that all employees and job applicants are given equal opportunities.

Notice to agency and search firm representatives: Corpay will not accept unsolicited CVs from agencies and/or search firms for this job posting. Resumes submitted to any Corpay employee by a third-party agency and/or search firm without a valid written and signed search agreement will become the sole property of Corpay. No fee will be paid if a candidate is hired for this position as a result of an unsolicited agency or search firm referral. Thank you.
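One of the duties this posting lists, proactively troubleshooting anomalies in report data, often starts as a simple statistical check. A minimal stdlib-only sketch; the z-score approach, threshold, and sample volumes are invented for illustration:

```python
from statistics import mean, stdev

def flag_anomalies(values, threshold=2.0):
    """Return indices whose z-score exceeds `threshold`: the kind of quick
    sanity check used when a BAU report feed looks off."""
    if len(set(values)) == 1:
        return []  # no variation, nothing to flag
    mu, sigma = mean(values), stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

# Invented daily record counts; the last day spikes suspiciously.
daily_volumes = [100, 102, 98, 101, 99, 100, 250]
print(flag_anomalies(daily_volumes))  # [6]
```

In practice the values would come from a SQL extract, and a flagged index would trigger investigation rather than automatic correction.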

Posted 1 month ago

Apply

3.0 - 7.0 years

7 - 12 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Job Title: Senior ODI Developer
Company: KPI Partners
Location: Hyderabad, Telangana, India; Bengaluru, Karnataka, India; Pune, Maharashtra, India

Job Description: KPI Partners is seeking a highly skilled Senior ODI Developer to join our dynamic team. The ideal candidate will play a crucial role in the design, development, and implementation of Oracle Data Integrator (ODI) solutions that meet our clients' requirements. This position offers an exciting opportunity to work on diverse projects and contribute to our clients' success in data integration and management.

Key Responsibilities:
- Design, develop, and maintain data integration processes using Oracle Data Integrator.
- Collaborate with cross-functional teams to gather requirements and translate them into effective data integration solutions.
- Optimize ODI mappings and processes for enhanced performance and efficiency.
- Troubleshoot and resolve issues related to data extraction, transformation, and loading (ETL) processes.
- Create and maintain documentation for data integration workflows, including design specifications, technical documentation, and user guides.
- Ensure data quality and integrity throughout the integration process.
- Mentor and provide guidance to junior team members on best practices in ODI development.
- Stay updated with the latest trends and advancements in data integration technologies and ODI features.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience working as an ODI Developer with a strong background in ETL processes.
- Proficiency in Oracle Data Integrator and related tools.
- Strong SQL skills and experience with relational databases.
- Good understanding of data warehousing concepts and methodologies.
- Excellent problem-solving skills and attention to detail.
- Strong communication and interpersonal skills to collaborate effectively with team members and stakeholders.

Preferred Qualifications:
- Experience with Oracle databases and other ETL tools.
- Familiarity with cloud-based data integration solutions.
- Experience in OBIA/BI Apps is good to have.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional development and career advancement.
- A collaborative and innovative work environment.

If you are a motivated and experienced Senior ODI Developer looking to make a significant impact in a growing company, we encourage you to apply. Join KPI Partners and be part of our journey to deliver exceptional data integration solutions.

Posted 1 month ago

Apply

1.0 - 3.0 years

2 - 3 Lacs

Hyderabad

Work from Office

Role & responsibilities: The Data Migration Analyst will be responsible for the following: Execute data migrations from source systems into Salesforce using tools such as Salesforce Data Loader, DemandTools, Workbench, and cloud-based ETL platforms. Define and document source-to-target data mapping with a focus on Salesforce schema and best practices. Conduct data profiling, cleansing, standardization, and de-duplication prior to import. Perform pre- and post-migration data validations and audits to ensure completeness and accuracy. Support manual migration tasks when automation is not feasible. Collaborate with Salesforce admins, developers, and stakeholders to align on migration strategy. Maintain detailed data dictionaries, mapping documents, and audit logs. Document all changes and progress.

Preferred candidate profile: Ideal candidates will have the following qualifications and technical skills: 2-3 years of hands-on experience in data migration, with a strong Salesforce focus. Advanced Microsoft Excel skills (Power Query, pivot tables, macros/VBA). Proficiency with Salesforce Data Loader, Workbench, or cloud-based ETL tools such as Talend, Informatica Cloud, or Boomi. Solid understanding of Salesforce objects such as Accounts, Contacts, Opportunities, Leads, and custom objects. Basic SQL knowledge for performing data validation and profiling. Strong knowledge of data mapping, cleansing techniques, and audit procedures. Excellent verbal and written communication skills. Ability to work under pressure in fast-paced M&A environments.

Preferred qualifications: Additional qualifications that will set candidates apart include: Salesforce Administrator Certification. Experience with M&A data integrations or Salesforce org consolidations. Familiarity with data governance practices and master data management (MDM).

Soft skills & attributes: We value candidates who demonstrate the following qualities: Detail-oriented with strong analytical and critical-thinking abilities. A quality-focused mindset with a proactive problem-solving approach. Business-aware: able to interpret the real-world implications of data changes. Effective in team-based and independent work environments. Flexible and adaptable to shifting priorities and timelines.
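The cleansing and de-duplication step this posting describes can be illustrated with a small sketch. The field names mimic Salesforce contact fields, but the records and the last-write-wins rule are invented for illustration:

```python
def cleanse_contacts(rows):
    """Normalize emails and drop duplicates (last write wins): a typical
    pre-migration cleansing pass before loading contacts into a CRM."""
    by_email = {}
    for row in rows:
        email = row.get("Email", "").strip().lower()
        if not email:
            continue  # reject rows with no usable dedup key
        by_email[email] = {**row, "Email": email}
    return list(by_email.values())

# Invented source rows: mixed casing, a duplicate, and a missing key.
raw = [
    {"Email": " Jane@Example.com ", "LastName": "Doe"},
    {"Email": "jane@example.com",   "LastName": "Doe-Smith"},  # newer duplicate
    {"Email": "",                   "LastName": "NoEmail"},
    {"Email": "raj@example.com",    "LastName": "Patel"},
]
clean = cleanse_contacts(raw)
print(len(clean))  # 2
```

A real migration would also log the rejected and merged rows to an audit file so the post-migration validation can account for every source record.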

Posted 1 month ago

Apply

4.0 - 6.0 years

7 - 15 Lacs

Noida, Bhubaneswar, Greater Noida

Work from Office

4-6 years in data migration, ETL, and integration
SQL, Python/Shell scripting
Experience in GCP, AWS, Azure
RDBMS & NoSQL (PostgreSQL, MongoDB, etc.)
ETL tools (Talend, NiFi, Glue)
Airflow, Kafka, DBT, Dataflow
Strong problem-solving & communication skills
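Orchestration tools like Airflow, listed above, model a pipeline as a directed acyclic graph of tasks. A minimal sketch of the underlying idea using Python's stdlib graphlib; the task names are invented:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on, Airflow-style.
pipeline = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load": {"transform"},
    "notify": {"load"},
}

# static_order yields tasks in an order that respects every dependency.
order = list(TopologicalSorter(pipeline).static_order())
print(order)  # ['extract', 'validate', 'transform', 'load', 'notify']
```

Airflow adds scheduling, retries, and distributed execution on top, but a DAG run is at heart this topological ordering of tasks.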

Posted 1 month ago

Apply

5.0 - 8.0 years

5 - 10 Lacs

Chennai

Hybrid

As a Research Specialist III, you will be responsible for researching, verifying, and updating data for ZoomInfo's industry-leading sales intelligence platform. The right candidate for this role has an engaging personality, an eye for quality, and a drive to learn with us as we continue to improve the top-quality research processes that keep ZoomInfo ahead of our competition.

What you'll do: Data research: Conduct thorough research to collect and validate company firmographic data, including details such as company size, industry classification, and location. Executive contact data: Gather and verify executive contact information, including names, titles, emails, and phone numbers, ensuring data accuracy. Data integrity: Maintain a high level of attention to detail to uphold data quality and consistency standards. Adhere to standards: Adhere to research protocols and privacy laws, and maintain confidentiality to protect operations and ensure customer confidence. Collaboration: Collaborate effectively with cross-functional teams to contribute to the improvement and growth of our sales intelligence database.

What you bring: Minimum 5 to 8 years of previous experience in a data research role. Shift time and overlap: 1 PM IST to 10 PM IST; at times there might be an overlap with PST time zones as required to align with project needs. Excellent understanding of company size, structure, and location; classification of companies (industry, ownership type, and business); and a good understanding of corporate actions like mergers, acquisitions, and parent-subsidiary relationships. Ability to establish priorities and work independently with little supervision. Experience working with spreadsheets, and the ability to analyze data tables and draw conclusions. Attention to detail and numeracy. Ability to maintain a high level of accuracy while balancing changes in workload. This is a mandatory hybrid role (3 days work from office and 2 days work from home) in a general shift.

Designation: Research Specialist III
Location: Global Info City, Block A, 11th Floor, MGR Salai, Perungudi, Chennai
Reporting to: Team Leader, Data Research

Posted 1 month ago

Apply

5.0 - 10.0 years

14 - 19 Lacs

Hyderabad

Work from Office

We are looking for future Insighters who can demonstrate teamwork, results orientation, a growth mindset, disciplined execution, and a winning attitude to join our growing enterprise data team. We are seeking a dynamic Data Engineer with a specialized focus on Business Intelligence and a deep understanding of financial concepts. This role is pivotal in bridging the gap between our technical and business teams, ensuring that financial metrics such as recurring revenue, adjustments, renewals, and retention are accurately represented and easily understood across the organization. You will be instrumental in driving data-driven decision-making by translating complex financial data into actionable insights.

Location: Hyderabad (Remote)
Shift Timings: 5:00 pm IST (6:30 am EST / 7:30 am EDT) to 2:00 am IST (3:30 pm EST / 4:30 pm EDT)

Job Responsibilities
- Collaborates with the business to develop, debug, and maintain a comprehensive financial reporting suite; requires deep fluency in financial concepts to ensure reporting accuracy and relevance.
- Aligns initiatives between business and technical teams to refine the data models that feed business intelligence tools, acting as a fluent intermediary between the two groups and translating complex financial concepts into actionable insights for data-driven decision-making across the organization.
- Implements and improves processes and systems to monitor data quality, ensuring production data is accurate, timely, and available to key stakeholders.
- Continually contributes to and enhances data team documentation, focusing on clarity and the explanation of financial metrics and models.
- Performs complex data analysis to troubleshoot and resolve data-related issues, using an understanding of financial metrics to provide insights that drive business strategy.
- Works closely with a cross-functional team of frontend and backend engineers, product managers, and analysts, explaining financial concepts in plain terms so all team members understand the impact of data on business outcomes.
- Defines and manages company data assets, artifacts, and data models, ensuring they reflect current financial terminology and practice.

Required Qualifications and Skills
- 5 years of data engineering experience with a focus on business intelligence
- 5 years in financial reporting, with a strong grasp of financial concepts
- 5 years of experience with Power BI
- 3 years in BI architecture and modeling
- 2 years in cloud BI engineering
- Proficiency in Agile methodologies

Preferred Skills
- AWS or Azure data certifications
- Hands-on experience with Databricks, Spark, Python, and ML
- Experience with Salesforce for enhanced data integration and CRM insights

Additional Information
At this time insightsoftware is not able to offer sponsorship to candidates who are not eligible to work in the country where the position is located.
Hear From Our Team - InsightSoftware (wistia.com)
Background checks are required for employment with insightsoftware, where permitted by country, state/province.
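The role leans heavily on financial metrics such as recurring revenue, renewals, and retention. As a rough illustration of one such metric, here is net revenue retention computed from a cohort's ARR movements; the formula and field names follow generic SaaS conventions and are assumptions, not insightsoftware's own definitions:

```python
from dataclasses import dataclass

@dataclass
class ArrMovement:
    """One period's annual recurring revenue movements for a customer cohort."""
    starting_arr: float
    expansion: float    # upsells / renewals at a higher value
    contraction: float  # downgrades
    churn: float        # revenue from lost customers

def net_revenue_retention(m: ArrMovement) -> float:
    """Net Revenue Retention: the cohort's ending recurring revenue
    divided by its starting recurring revenue."""
    if m.starting_arr <= 0:
        raise ValueError("starting ARR must be positive")
    ending = m.starting_arr + m.expansion - m.contraction - m.churn
    return ending / m.starting_arr

# A cohort that started the year at $1.0M ARR, expanded by $150k,
# contracted by $30k, and churned $70k retains 105% of its revenue.
nrr = net_revenue_retention(ArrMovement(1_000_000, 150_000, 30_000, 70_000))
print(f"NRR: {nrr:.0%}")
```

The same movement breakdown also yields gross retention (ignore expansion), which is why reporting suites typically store the four components rather than a single ratio.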

Posted 1 month ago

Apply

7.0 - 9.0 years

7 - 11 Lacs

Bengaluru

Work from Office

As a Data Engineer on the Nokia Enterprise Data Platform (EDP) team, you will play a vital role in shaping our data-as-a-service platform, collaborating closely with a dynamic team of developers, architects, and product managers and tackling exciting challenges in data integration and performance monitoring in a fast-paced Agile environment. Your expertise will drive improvements in application performance while expanding our data capabilities amid growing complexity.

You have:
- Bachelor of Engineering in Computer Science, Information Technology, or Communication, with 7-9 years of relevant experience in a software development or equivalent role
- Proficiency in SQL, Python, Spark, and Azure data engineering tools
- Experience in designing and maintaining scalable Azure-based infrastructure
- Experience with Power BI, SAP, or Git
- Experience implementing security best practices for cloud infrastructure

It would be nice if you also had:
- Familiarity with Lakehouse architecture, Delta tables, and Databricks
- Motivation and drive to overcome challenges and succeed in a fast-growing technical landscape
- Knowledge of Agile/Scrum methodologies

Key Responsibilities:
- Develop, maintain, and optimize Nokia's Enterprise Data Platform to ensure seamless data delivery across the organization
- Collaborate with cross-functional teams to gather requirements, analyze demands, and implement data solutions aligned with business needs
- Monitor and troubleshoot data pipeline performance, ensuring efficient data integration and throughput
- Design and implement scalable Azure-based infrastructure, adhering to best practices for security and reliability
- Build and manage CI/CD pipelines using Azure DevOps to streamline software development and deployment
- Provide technical expertise in coding, testing, and system analysis, ensuring adherence to standard development practices
- Communicate effectively with stakeholders, articulating technical challenges and project updates clearly and concisely
- Foster a proactive, team-oriented environment, mentoring junior engineers and promoting knowledge sharing across the team
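One of the listed responsibilities is monitoring data pipeline performance and throughput. A minimal sketch of what such a check might look like; the `PipelineRun` fields, pipeline names, and SLA thresholds are illustrative assumptions, not Nokia's actual EDP tooling:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class PipelineRun:
    """Metadata recorded for one completed pipeline run."""
    pipeline: str
    started: datetime
    finished: datetime
    rows_loaded: int

def flag_degraded_runs(runs, max_duration: timedelta, min_rows: int):
    """Return (pipeline, reason) pairs for runs that breach either SLA."""
    alerts = []
    for r in runs:
        duration = r.finished - r.started
        if duration > max_duration:
            alerts.append((r.pipeline, f"slow: {duration}"))
        if r.rows_loaded < min_rows:
            alerts.append((r.pipeline, f"low volume: {r.rows_loaded} rows"))
    return alerts

runs = [
    PipelineRun("sap_orders", datetime(2024, 1, 1, 2), datetime(2024, 1, 1, 2, 40), 120_000),
    PipelineRun("crm_accounts", datetime(2024, 1, 1, 2), datetime(2024, 1, 1, 4, 10), 90),
]
# The second run both overruns the one-hour SLA and loads suspiciously few rows.
print(flag_degraded_runs(runs, max_duration=timedelta(hours=1), min_rows=1_000))
```

In a real deployment the run records would come from orchestration metadata (e.g. Azure Data Factory run history) and the alerts would feed a monitoring channel rather than stdout.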

Posted 1 month ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational: Bachelor of Engineering. Service Line: Data & Analytics Unit. Responsibilities: A day in the life of an Infoscion: as part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, assuring that our clients are satisfied with the high levels of service in the technology domain. You will gather requirements and specifications to understand client needs in detail and translate them into system requirements. You will play a key role in the overall estimation of work requirements, providing the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems, and if you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you! Additional Responsibilities: Knowledge of design principles and fundamentals of architecture Understanding of performance engineering Knowledge of quality processes and estimation techniques Basic understanding of the project domain Ability to translate functional/nonfunctional requirements into system requirements Ability to design and code complex programs Ability to write test cases and scenarios based on specifications Good understanding of SDLC and agile methodologies Awareness of the latest technologies and trends Technical and Professional: Primary skills: Technology - Data Management - Data Integration - Informatica Preferred Skills: Technology - Data Management - Data Integration - Informatica

Posted 1 month ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational: Bachelor of Engineering. Service Line: Data & Analytics Unit. Responsibilities: Team Management: lead and manage a team of technical professionals (data engineers and application developers), ensuring effective collaboration and productivity. Client Interaction: serve as the primary point of contact for clients, understanding their needs and ensuring successful project delivery. Status Reporting: prepare and present detailed status reports on ongoing projects to stakeholders, highlighting progress, risks, and mitigation plans. Issue Resolution: proactively identify and resolve any technical challenges or roadblocks faced by the team, ensuring smooth project execution. Project Management: oversee the entire project lifecycle, from planning to execution, ensuring timely delivery within scope and budget. Additional Responsibilities: Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase business profitability Good knowledge of software configuration management systems Awareness of the latest technologies and industry trends Logical thinking and problem-solving skills, along with an ability to collaborate Understanding of financial processes for various types of projects and the pricing models available Ability to assess current processes, identify improvement areas, and suggest technology solutions Knowledge of one or two industry domains Client-interfacing skills Project and team management Technical and Professional: In-depth knowledge of the Palantir Foundry platform, including data integration (Data Connections), data transformation (Code Repository and Pipeline Builder), analysis (Contour and Quiver), visualization (Workshop), and Ontology Manager. Must also have good knowledge of Spark (PySpark) and TypeScript. Knowledge of AIP is an added advantage. Preferred Skills: Technology - Analytics Packages - Python - Big Data; Technology - Big Data - Data Processing - Spark
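Foundry's Code Repositories and Pipeline Builder express logic as dataset-in/dataset-out transforms chained into a pipeline. The pattern can be sketched in plain Python; the `transform` decorator below is a hypothetical stand-in for illustration, not the Foundry API:

```python
def transform(*inputs):
    """Mark a function as a dataset-in/dataset-out transform and record
    which upstream datasets it reads (hypothetical decorator, not Foundry's)."""
    def wrap(fn):
        fn.inputs = inputs
        return fn
    return wrap

@transform("raw_orders")
def clean_orders(rows):
    # Drop records missing an order id and normalise amounts to float.
    return [{**r, "amount": float(r["amount"])} for r in rows if r.get("order_id")]

@transform("clean_orders")
def total_by_customer(rows):
    # Aggregate cleaned order amounts per customer.
    totals = {}
    for r in rows:
        totals[r["customer"]] = totals.get(r["customer"], 0.0) + r["amount"]
    return totals

raw = [
    {"order_id": "A1", "customer": "acme", "amount": "10.5"},
    {"order_id": None, "customer": "acme", "amount": "99"},
    {"order_id": "A2", "customer": "globex", "amount": "4"},
]
print(total_by_customer(clean_orders(raw)))
```

In Foundry the platform itself wires transforms together from their declared inputs and runs them on Spark; the point here is only the shape of the pattern, with each stage pure and independently testable.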

Posted 1 month ago

Apply

7.0 - 10.0 years

12 - 15 Lacs

Bengaluru

Work from Office

Core skills: Oracle DB, MongoDB, MySQL, MariaDB. Design and create database objects; replicate databases from the primary to the DR site; implement security, integrity, and backup/recovery; manage database permissions; perform data migration; install and configure test environments.
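Replicating a database to a DR site is only half the job; the shipped copy must also be verified. A minimal sketch of checksum-based verification of dump files; the paths and workflow are illustrative assumptions, not a specific vendor's procedure:

```python
import hashlib
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Checksum a dump file in chunks so large backups don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_replica_copy(primary_dump: Path, dr_dump: Path) -> bool:
    """A DR copy is trustworthy only if it is byte-identical to the primary dump."""
    return sha256_of(primary_dump) == sha256_of(dr_dump)

# Hypothetical files standing in for a dump produced on the primary site
# and the copy shipped to the DR site.
with tempfile.TemporaryDirectory() as d:
    primary, dr = Path(d, "primary.dmp"), Path(d, "dr.dmp")
    primary.write_bytes(b"backup payload")
    dr.write_bytes(b"backup payload")
    print(verify_replica_copy(primary, dr))  # True when the copies match
```

Native mechanisms (Oracle Data Guard, MySQL replication) validate their own streams; a check like this is for the file-shipping fallback paths.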

Posted 1 month ago

Apply

1.0 - 6.0 years

4 - 9 Lacs

Pune

Work from Office

We are looking for a developer experienced in XML, XSLT, and data integration to join our team supporting UKG Ready / Kronos projects. Experience with flat-file/CSV formats and designing solutions involving REST and SOAP APIs is required. You will work in the Human Capital Management (HCM) domain.
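A recurring task in this kind of integration work is converting flat-file exports into XML for a downstream interface. A minimal stdlib sketch; the tag names and CSV layout are assumptions, since a real UKG Ready / Kronos interface would dictate its own schema:

```python
import csv
import io
import xml.etree.ElementTree as ET

def csv_to_xml(csv_text: str, root_tag: str = "employees",
               row_tag: str = "employee") -> str:
    """Convert a flat CSV export into an XML document, one element per row,
    with one child element per column."""
    root = ET.Element(root_tag)
    for row in csv.DictReader(io.StringIO(csv_text)):
        el = ET.SubElement(root, row_tag)
        for field, value in row.items():
            ET.SubElement(el, field).text = value
    return ET.tostring(root, encoding="unicode")

flat_file = "id,name,hours\n101,Asha,40\n102,Ravi,36\n"
print(csv_to_xml(flat_file))
```

For schema-mandated output formats, the usual next step is an XSLT stylesheet over this intermediate XML rather than hand-shaping the tree in code.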

Posted 1 month ago

Apply

5.0 - 8.0 years

4 - 8 Lacs

Kolkata

Work from Office

We are seeking a highly skilled and experienced Hadoop Administrator to join our dynamic team. The ideal candidate will have extensive experience in managing and optimizing Hadoop clusters, ensuring high performance and availability. You will work with a variety of big data technologies and play a pivotal role in managing data integration, troubleshooting infrastructure issues, and collaborating with cross-functional teams to streamline data workflows. Key Responsibilities : - Install, configure, and maintain Hadoop clusters, ensuring high availability, scalability, and performance. - Manage and monitor various Hadoop ecosystem components, including HDFS, YARN, Hive, Impala, and other related technologies. - Oversee the integration of data from Oracle Flexcube and other source systems into the Cloudera Data Platform. - Troubleshoot and resolve complex issues related to Hadoop infrastructure, performance, and applications. - Collaborate with cross-functional teams including data engineers, analysts, and architects to optimize data workflows and processes. - Implement and manage data backup, recovery plans, and disaster recovery strategies for Hadoop clusters. - Perform regular health checks on the Hadoop ecosystem, including managing logs, capacity planning, and system updates. - Develop, test, and optimize scripts to automate system maintenance and data management tasks. - Ensure compliance with internal security policies and industry best practices for data protection. - Provide training and guidance to junior team members and help in knowledge sharing within the team. - Create and maintain documentation related to Hadoop administration processes, system configurations, troubleshooting steps, and best practices. - Stay updated with the latest trends in Hadoop technologies and suggest improvements and new tools as necessary. Qualifications : - Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 
- 5+ years of hands-on experience in Hadoop administration, with a preference for candidates from the banking or financial sectors. - Strong knowledge of Oracle Flexcube, Cloudera Data Platform, Hadoop, Hive, Impala, and other big data technologies. - Proven experience in managing and optimizing large-scale Hadoop clusters, including cluster upgrades and performance tuning. - Expertise in configuring and tuning Hadoop-related services (e.g., HDFS, YARN, MapReduce). - Strong understanding of data security principles and implementation of security protocols within Hadoop. - Excellent analytical, troubleshooting, and problem-solving skills. - Strong communication and interpersonal skills with the ability to work collaboratively within cross-functional teams. - Ability to work independently, manage multiple priorities, and meet deadlines. - Certification in Hadoop administration or related fields is a plus. - Experience with scripting languages such as Python, Shell, or Perl is desirable.
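Regular health checks on cluster capacity and availability are among the listed duties. A minimal sketch of such a check over DataNode metrics; the metric dicts and thresholds are illustrative assumptions, since real values would come from the NameNode's JMX endpoint or `hdfs dfsadmin -report` output:

```python
def hdfs_capacity_alerts(datanodes, used_pct_threshold=80.0, min_live_ratio=0.9):
    """Flag capacity and availability problems from per-DataNode metrics."""
    alerts = []
    live = [n for n in datanodes if n["state"] == "live"]
    # Availability check: too many dead nodes risks under-replicated blocks.
    if len(live) < min_live_ratio * len(datanodes):
        alerts.append(f"only {len(live)}/{len(datanodes)} DataNodes live")
    # Capacity check: flag any live node above the usage threshold.
    for n in live:
        used_pct = 100.0 * n["used_bytes"] / n["capacity_bytes"]
        if used_pct > used_pct_threshold:
            alerts.append(f"{n['host']} at {used_pct:.1f}% capacity")
    return alerts

nodes = [
    {"host": "dn1", "state": "live", "used_bytes": 850, "capacity_bytes": 1000},
    {"host": "dn2", "state": "live", "used_bytes": 400, "capacity_bytes": 1000},
    {"host": "dn3", "state": "dead", "used_bytes": 0, "capacity_bytes": 1000},
]
print(hdfs_capacity_alerts(nodes))
```

A production version would also watch under-replicated and corrupt block counts, which the same NameNode report exposes alongside capacity.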

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies