4.0 - 9.0 years
6 - 11 Lacs
Hyderabad
Work from Office
Design, develop, and maintain ETL processes using Talend. Manage and optimize data pipelines on Amazon Redshift. Implement data transformation workflows using DBT (Data Build Tool). Write efficient, reusable, and reliable code in PySpark. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions. Ensure data quality and integrity through rigorous testing and validation. Stay updated with the latest industry trends and technologies in data engineering. Requirements: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Proven experience as a Data Engineer or in a similar role. High proficiency in Talend. Strong experience with Amazon Redshift. Expertise in DBT and PySpark. Experience with data modeling, ETL processes, and data warehousing. Familiarity with cloud platforms and services. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Preferred Qualifications: Experience with other data engineering tools and frameworks. Knowledge of machine learning frameworks and libraries.
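This posting pairs PySpark transformations with Redshift and DBT. As a minimal, hypothetical sketch of the kind of reusable PySpark transformation such a role involves (the bucket, table, and column names are invented for illustration, not taken from the posting):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_cleanup").getOrCreate()

# Read raw order events landed in S3 (illustrative path and schema)
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Deduplicate, derive a date column, and drop invalid rows before
# the data is loaded into Redshift or modelled further in DBT
clean = (
    orders
    .dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_ts"))
    .filter(F.col("amount") > 0)
)

clean.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")
```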
Posted 1 month ago
4.0 - 9.0 years
6 - 11 Lacs
Pune
Work from Office
Design, develop, and maintain ETL processes using Ab Initio and other ETL tools. Manage and optimize data pipelines on AWS. Write and maintain complex PL/SQL queries for data extraction, transformation, and loading. Provide Level 3 support for ETL processes, troubleshooting and resolving issues promptly. Collaborate with data architects, analysts, and other stakeholders to understand data requirements and deliver effective solutions. Ensure data quality and integrity through rigorous testing and validation. Stay updated with the latest industry trends and technologies in ETL and cloud computing. Requirements: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. Certification in Ab Initio. Proven experience with AWS and cloud-based data solutions. Strong proficiency in PL/SQL and other ETL tools. Experience in providing Level 3 support for ETL processes. Excellent problem-solving skills and attention to detail. Strong communication and teamwork abilities. Preferred Qualifications: Experience with other ETL tools such as Informatica, Talend, or DataStage. Knowledge of data warehousing concepts and best practices. Familiarity with scripting languages (e.g., Python, Shell scripting).
Posted 1 month ago
12.0 - 17.0 years
14 - 19 Lacs
Hyderabad
Work from Office
Data Tester Highlights: 5+ years of experience in data testing. ETL Testing: Validating the extraction, transformation, and loading (ETL) of data from various sources. Data Validation: Ensuring the accuracy, completeness, and integrity of data in databases and data warehouses. SQL Proficiency: Writing and executing SQL queries to fetch and analyze data. Data Modeling: Understanding data models, data mappings, and architectural documentation. Test Case Design: Creating test cases, test data, and executing test plans. Troubleshooting: Identifying and resolving data-related issues. Dashboard Testing: Validating dashboards for accuracy, functionality, and user experience. Collaboration: Working with developers and other stakeholders to ensure data quality and functionality. Primary Responsibilities (Dashboard Testing Components): Functional Testing: Simulating user interactions and clicks to ensure dashboards are functioning correctly. Performance Testing: Evaluating dashboard responsiveness and load times. Data Quality Testing: Verifying that the data displayed on dashboards is accurate, complete, and consistent. Usability Testing: Assessing the ease of use and navigation of dashboards. Data Visualization Testing: Ensuring charts, graphs, and other visualizations are accurate and present data effectively. Security Testing: Verifying that dashboards are secure and protect sensitive data. Tools and Technologies: SQL: Used for querying and validating data. Hands-on Snowflake. ETL Tools: Tools like Talend, Informatica, or Azure Data Factory used for data extraction, transformation, and loading. Data Visualization Tools: Tableau, Power BI, or other BI tools used for creating and testing dashboards. Testing Frameworks: Frameworks like Selenium or JUnit used for automating testing tasks. Cloud Platforms: AWS platforms used for data storage and processing. Hands-on Snowflake experience. Healthcare domain knowledge is a plus. Secondary Skills: Automation frameworks, life science domain experience, UI testing, API testing, any other ETL tools.
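Much of this role is SQL-based reconciliation between source systems and the Snowflake warehouse that feeds the dashboards. A minimal sketch of that kind of check, assuming a Snowflake connection and illustrative table names (none of these identifiers come from the posting):

```python
import snowflake.connector

# Connection parameters are placeholders for illustration only
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="TEST_WH",
    database="ANALYTICS",
)

checks = {
    "row_count": "SELECT COUNT(*) FROM STAGING.ORDERS",
    "null_customer_ids": "SELECT COUNT(*) FROM STAGING.ORDERS WHERE CUSTOMER_ID IS NULL",
    "duplicate_order_ids": (
        "SELECT COUNT(*) FROM ("
        "SELECT ORDER_ID FROM STAGING.ORDERS GROUP BY ORDER_ID HAVING COUNT(*) > 1)"
    ),
}

cur = conn.cursor()
for name, sql in checks.items():
    cur.execute(sql)
    value = cur.fetchone()[0]
    # A real suite would compare these values against the source system
    print(f"{name}: {value}")
conn.close()
```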
Posted 1 month ago
15.0 - 20.0 years
10 - 14 Lacs
Pune
Work from Office
Project Role: Application Lead. Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must-have skills: Talend ETL. Good-to-have skills: NA. Minimum 5 year(s) of experience is required. Educational Qualification: 15 years of full-time education. Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will be pivotal in driving the success of application projects and fostering a collaborative environment among team members and other departments. Roles & Responsibilities: - Expected to be an SME. - Collaborate with and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute to key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Facilitate training and development opportunities for team members to enhance their skills. - Monitor project progress and implement adjustments as necessary to meet deadlines. Professional & Technical Skills: - Must-have skills: Proficiency in Talend ETL. - Good-to-have skills: Experience with data integration tools and methodologies. - Strong understanding of data warehousing concepts and practices. - Experience in performance tuning and optimization of ETL processes. - Familiarity with cloud-based data solutions and architectures. Additional Information: - The candidate should have a minimum of 5 years of experience in Talend ETL. - This position is based at our Pune office. - 15 years of full-time education is required. Qualification: 15 years of full-time education.
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Hyderabad
Work from Office
Sr. Developer with special emphasis and 6 to 8 years of experience in PySpark, Python, and SQL, along with ETL tools (Talend / Ab Initio / Informatica / similar). Should also have good exposure to ETL tools in order to understand existing flows, rewrite them in Python and PySpark, and execute the test plans. 3+ years of sound knowledge of PySpark to implement ETL logic. Proficiency in data modeling and design, including PL/SQL development. Creating test plans to understand the current ETL flow and rewriting it in PySpark. Providing ongoing support and maintenance for ETL applications, including troubleshooting and resolving issues. Expertise in practices like Agile, peer reviews, and continuous integration.
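A large part of this role is proving that a rewritten PySpark job reproduces the legacy ETL output. A minimal sketch of such a reconciliation check, assuming both outputs are available as files (paths and dataset names are illustrative assumptions, not from the posting):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl_rewrite_check").getOrCreate()

# Outputs of the legacy ETL job and the new PySpark rewrite (illustrative paths)
legacy = spark.read.parquet("s3://example-bucket/legacy/customers/")
rewrite = spark.read.parquet("s3://example-bucket/pyspark/customers/")

# Rows present in one output but not the other; both directions should be empty
missing_in_rewrite = legacy.exceptAll(rewrite)
extra_in_rewrite = rewrite.exceptAll(legacy)

print("rows only in legacy output:", missing_in_rewrite.count())
print("rows only in rewritten output:", extra_in_rewrite.count())
```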
Posted 1 month ago
8.0 - 13.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Combine interface design concepts with digital design and establish milestones to encourage cooperation and teamwork. Develop overall concepts for improving the user experience within a business webpage or product, ensuring all interactions are intuitive and convenient for customers. Collaborate with back-end web developers and programmers to improve usability. Conduct thorough testing of user interfaces across multiple platforms to ensure all designs render correctly and systems function properly. Convert jobs from Talend ETL to Python and convert Lead SQLs to Snowflake. Developers with Python and SQL skills are required. Developers should be proficient in Python (especially Pandas, PySpark, or Dask) for ETL scripting, with strong SQL skills to translate complex queries. They need expertise in Snowflake SQL for migrating and optimizing queries, as well as experience with data pipeline orchestration (e.g., Airflow) and cloud integration for automation and data loading. Familiarity with data transformation, error handling, and logging is also essential.
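For the Talend-to-Python and SQL-to-Snowflake migration described above, a typical building block is loading a transformed Pandas frame into Snowflake. A minimal sketch under assumed connection details and an invented table name (nothing here is specified by the posting):

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# A transformation step previously done in Talend, redone with Pandas (illustrative file)
df = pd.read_csv("leads.csv")
df["created_at"] = pd.to_datetime(df["created_at"])
df = df.drop_duplicates(subset=["lead_id"])

# Placeholder credentials; a real job would read these from a secrets manager
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    database="MIGRATION",
    schema="STAGING",
)

# Bulk-load the frame into an existing Snowflake table
success, _, nrows, _ = write_pandas(conn, df, table_name="LEADS")
print(f"loaded: {success}, rows written: {nrows}")
conn.close()
```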
Posted 1 month ago
7.0 - 12.0 years
4 - 8 Lacs
Pune
Work from Office
1. ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica. 2. Big Data: Experience of big data platforms such as Hadoop, Hive or Snowflake for data storage and processing. 3. Data Warehousing & Database Management: Understanding of data warehousing concepts, relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design. 4. Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures. 5. Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala. 6. DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management. Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>It, Data Profiler, Conduct>It, Control>Center, Continuous>Flows. Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs. Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls. Containerization: Fair understanding of containerization platforms like Docker, Kubernetes. File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta. Others: Basics of job schedulers like Autosys; basics of entitlement management. Certification on any of the above topics would be an advantage.
Posted 1 month ago
7.0 - 12.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Data Modeller JD: We are seeking a skilled Data Modeller to join our Corporate Banking team. The ideal candidate will have a strong background in creating data models for various banking services, including Current Account Savings Account (CASA), Loans, and Credit Services. This role involves collaborating with the Data Architect to define data model structures within a data mesh environment and coordinating with multiple departments to ensure cohesive data management practices. Data Modelling: Design and develop data models for CASA, Loan, and Credit Services, ensuring they meet business requirements and compliance standards. Create conceptual, logical, and physical data models that support the bank's strategic objectives. Ensure data models are optimized for performance, security, and scalability to support business operations and analytics. Collaboration with Data Architect: Work closely with the Data Architect to establish the overall data architecture strategy and framework. Contribute to the definition of data model structures within a data mesh environment. Data Quality and Governance: Ensure data quality and integrity in the data models by implementing best practices in data governance. Assist in the establishment of data management policies and standards. Conduct regular data audits and reviews to ensure data accuracy and consistency across systems. Data Modelling Tools: ERwin, IBM InfoSphere Data Architect, Oracle Data Modeler, Microsoft Visio, or similar tools. Databases: SQL, Oracle, MySQL, MS SQL Server, PostgreSQL, Neo4j (graph). Data Warehousing Technologies: Snowflake, Teradata, or similar. ETL Tools: Informatica, Talend, Apache NiFi, Microsoft SSIS, or similar. Big Data Technologies: Hadoop, Spark (optional but preferred). Cloud Technologies: Experience with data modelling on cloud platforms such as Microsoft Azure (Synapse, Data Factory).
Posted 1 month ago
6.0 - 11.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Job Title: Data Analyst / Technical Business Analyst. Job Summary: We are looking for a skilled Data Analyst to support a large-scale data migration initiative within the banking and insurance domain. The role involves analyzing, validating, and transforming data from legacy systems to modern platforms, ensuring regulatory compliance, data integrity, and business continuity. Key Responsibilities: Collaborate with business stakeholders, data architects, and IT teams to gather and understand data migration requirements. Analyze legacy banking and insurance systems (e.g., core banking, policy admin, claims, CRM) to identify data structures and dependencies. Work with large-scale datasets and understand big data architectures (e.g., Hadoop, Spark, Hive) to support scalable data migration and transformation. Perform data profiling, cleansing, and transformation using SQL and ETL tools, with the ability to understand and write complex SQL queries and interpret the logic implemented in ETL workflows. Develop and maintain data mapping documents and transformation logic specific to financial and insurance data (e.g., customer KYC, transactions, policies, claims). Validate migrated data against business rules, regulatory standards, and reconciliation reports. Support UAT by preparing test cases and validating migrated data with business users. Ensure data privacy and security compliance throughout the migration process. Document issues, risks, and resolutions related to data quality and migration. Required Skills & Qualifications: Bachelor's degree in Computer Science, Information Systems, Finance, or a related field. 5+ years of experience in data analysis or data migration projects in banking or insurance. Strong SQL skills and experience with data profiling and cleansing. Familiarity with ETL tools (e.g., Informatica, Talend, SSIS) and data visualization tools (e.g., Power BI, Tableau). Experience working with big data platforms (e.g., Hadoop, Spark, Hive) and handling large volumes of structured and unstructured data. Understanding of banking and insurance data domains (e.g., customer data, transactions, policies, claims, underwriting). Knowledge of regulatory and compliance requirements (e.g., AML, KYC, GDPR, IRDAI guidelines). Excellent analytical, documentation, and communication skills. Preferred Qualifications: Experience with core banking systems (e.g., Finacle, Flexcube) or insurance platforms. Exposure to cloud data platforms (e.g., AWS, Azure, GCP). Experience working in Agile/Scrum environments. Certification in Business Analysis (e.g., CBAP, CCBA) or Data Analytics.
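The profiling and cleansing work described here is often prototyped in Python before the rules are pushed into SQL or an ETL tool. A minimal, hypothetical profiling pass over a legacy extract (the file and column names are invented, not from the posting):

```python
import pandas as pd

# Legacy policy extract to be migrated (illustrative file name)
policies = pd.read_csv("legacy_policies.csv", dtype=str)

# Basic profile: volume, completeness, and duplicate business keys
profile = {
    "row_count": len(policies),
    "null_counts": policies.isna().sum().to_dict(),
    "duplicate_policy_numbers": int(policies["policy_number"].duplicated().sum()),
}
print(profile)

# Example cleansing rule: normalise dates and drop rows without a customer id
policies["start_date"] = pd.to_datetime(policies["start_date"], errors="coerce")
clean = policies.dropna(subset=["customer_id"])
print(f"rows after cleansing: {len(clean)}")
```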
Posted 1 month ago
14.0 - 19.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Skill: Data Engineer. Role: T2, T1. Key responsibility: Data Engineer. Must have 9+ years of experience in the below-mentioned skills. Must Have: Big Data concepts, Python (core Python, able to write code), SQL, Shell scripting, AWS S3. Good to Have: Event-driven/AWS SQS, microservices, API development, Kafka, Kubernetes, Argo, Amazon Redshift, Amazon Aurora.
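Since the must-have list centres on core Python plus AWS S3, here is a minimal sketch of the kind of S3 interaction involved, using boto3 with invented bucket and key names (purely illustrative, not from the posting):

```python
import boto3

s3 = boto3.client("s3")

# List the raw files waiting to be processed (illustrative bucket/prefix)
response = s3.list_objects_v2(Bucket="example-data-bucket", Prefix="incoming/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Upload a processed output file back to a curated prefix
s3.upload_file(
    Filename="daily_summary.csv",
    Bucket="example-data-bucket",
    Key="curated/daily_summary.csv",
)
```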
Posted 1 month ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In testing and quality assurance at PwC, you will focus on the process of evaluating a system or software application to identify any defects, errors, or gaps in its functionality. Working in this area, you will execute various test cases and scenarios to validate that the system meets the specified requirements and performs as expected. Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities. Skills: Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements. Role: ETL Tester Associate - Operate. Job Summary: A career in our Managed Services team will provide you with an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Data, Testing & Analytics as a Service team brings a unique combination of industry expertise, technology, data management and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while building your skills in exciting new directions. Have a voice at our table to help design, build and operate the next generation of software and services that manage interactions across all aspects of the value chain. Minimum Degree Required: Bachelor's degree. Preferred Fields of Study: Computer and Information Science, Management Information Systems. Minimum Years of Experience: 2 years. Certifications Preferred: US certification(s). Preferred Knowledge/Skills: As an ETL Tester, you will be responsible for designing, developing, and executing SQL scripts to ensure the quality and functionality of our ETL processes.
You will work closely with our development and data engineering teams to identify test requirements and drive the implementation of automated testing solutions. Key Responsibilities: Collaborate with data engineers to understand ETL workflows and requirements. Perform data validation and testing to ensure data accuracy and integrity. Create and maintain test plans, test cases, and test data. Identify, document, and track defects, and work with development teams to resolve issues. Participate in design and code reviews to provide feedback on testability and quality. Develop and maintain automated test scripts using Python for ETL processes. Ensure compliance with industry standards and best practices in data testing. Qualifications: Solid understanding of SQL and database concepts. Proven experience in ETL testing and automation. Strong proficiency in Python programming. Familiarity with ETL tools such as Apache NiFi, Talend, Informatica, or similar. Knowledge of data warehousing and data modeling concepts. Strong analytical and problem-solving skills. Excellent communication and collaboration abilities. Experience with version control systems like Git. Preferred Qualifications: Experience with cloud platforms such as AWS, Azure, or Google Cloud. Familiarity with CI/CD pipelines and tools like Jenkins or GitLab. Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
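The posting asks for automated Python test scripts around ETL workflows. A minimal sketch of what such a check might look like with pytest, using an in-memory SQLite database as a stand-in for the real source and target systems (the table names and rules are illustrative assumptions):

```python
import sqlite3
import pytest


@pytest.fixture
def conn():
    # In-memory stand-in for the warehouse; a real suite would connect to the target DB
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
    con.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
    con.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])
    con.executemany("INSERT INTO tgt_orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])
    yield con
    con.close()


def test_row_counts_match(conn):
    src = conn.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
    tgt = conn.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
    assert src == tgt


def test_no_negative_amounts_loaded(conn):
    bad = conn.execute("SELECT COUNT(*) FROM tgt_orders WHERE amount < 0").fetchone()[0]
    assert bad == 0
```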
Posted 1 month ago
5.0 years
0 Lacs
India
Remote
Job Title: Salesforce Administrator with Sales Cloud and CPQ Location: [Bangalore/Coimbatore/Hybrid/ Remote] - Time: EST time zone Employment Type: [Full-Time / Contract] About the role: Salesforce Administrator with over 5 years of experience managing and optimizing Salesforce environments to support business operations, sales processes, and customer engagement. Proven track record in user management, workflow automation, data integrity, and system customization to drive organizational efficiency. Adept at collaborating with cross-functional teams to translate business requirements into scalable Salesforce solutions. Skilled in leveraging tools such as CPQ, SFDC, DataLoader, and SFDMU, with strong command of SOQL and reporting dashboards to ensure data accuracy and performance insights. Experienced in maintaining system security, managing Data Extensions, and supporting integrations across the Salesforce ecosystem. Technical Skills: CRM & Platform Tools: Salesforce CRM, Salesforce CPQ, Salesforce Marketing Cloud, SFDC, Partner Portal (Digital Experience); Automation & Process Building: Flow Builder, Journey Builder, Automation Studio, Email-to-Case, Queues & Assignment Rules; Data Management & Integration: SFDMU (SFDX Data Move Utility), DataLoader, Talend Open Studio, Data Extensions, SOQL, SQL Queries; Scripting & Customization: AMPscript, SSJS (Server-Side JavaScript), HTML, CSS; Web & Content Tools: Content Builder, Web Studio, Cloud Pages; Integrations & Tools: DocuSign, Slack, SAP, HubSpot. Roles and Responsibilities: Designed and implemented the project database; Configured a new Salesforce instance according to business requirements; Set up security and access controls, including profiles, roles, OWD, sharing rules, and groups; Managed data quality through validation rules, Sales Path configuration, matching rules, and duplicate rules; Configured the existing Partner Portal (Digital Experience); Implemented Service Cloud functionality for tracking IT support issues and equipment requests; Configured Email-to-Case functionality; Created queues and assignment rules; Performed initial data upload from Salesforce to Salesforce using Talend; Prepared data migration plans, including field mapping and migration sequencing; Developed reports and dashboards based on client requirements; Automated processes using Flow Builder; Created custom record types; Integrated Slack with Salesforce; Designed business processes according to client requirements; Participated in regular conference calls with clients and demonstrated system functionality; Delivered solutions, provided customer support, and conducted training sessions; Maintained project documentation and guided the customer team. Why You’ll Love Working With Us: You’ll build automation that makes important processes faster, easier, and more secure. Collaborate with different teams to create solutions that really help the business. Keep learning new skills and grow your career in a supportive environment. Take ownership of key projects that improve security and compliance. Enjoy a flexible, hybrid work environment that values your ideas and effort. Health Insurance, EPFs
Posted 1 month ago
5.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We are seeking a skilled ETL Developer to join our team in Coimbatore. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes to support data warehousing and business intelligence initiatives. The ETL Developer will work closely with data analysts, database administrators, and business stakeholders to ensure accurate and efficient data integration. Key Responsibilities: Design, develop, and implement ETL workflows to extract, transform, and load data from various sources. Optimise ETL processes for performance, reliability, and scalability. Collaborate with data analysts and business teams to understand data requirements. Maintain and troubleshoot existing ETL jobs and pipelines. Ensure data quality, consistency, and security throughout the data lifecycle. Document ETL processes, data mappings, and workflows. Stay updated with the latest ETL tools and best practices. Installing and configuring Informatica components, including high availability; managing server activations and de-activations for all environments; ensuring that all systems and procedures adhere to organizational best practices Day-to-day administration of the Informatica Suite of services (PowerCenter, IDS, Metadata, Glossary and Analyst). Informatica capacity planning and ongoing monitoring (e.g. CPU, memory, etc.) to proactively increase capacity as needed. Qualifications & Skills: Bachelor's degree in computer science, information technology, or related field. 5 to 7 years of experience in ETL development, preferably with tools like Informatica, Talend, SSIS, or similar. Strong SQL skills and experience with relational databases (SQL Server, Oracle, etc.). Knowledge of data warehousing concepts and architecture. Familiarity with scripting languages such as Python or Shell scripting is a plus. Excellent problem-solving and communication skills. Ability to work independently and as part of a team.
Posted 1 month ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
Process Manager - GCP Data Engineer. Mumbai/Pune | Full-time (FT) | Technology Services. Shift Timings - EMEA (1pm-9pm) | Management Level - PM | Travel - NA. The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role requires identifying discrepancies and proposing optimal solutions by using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors. Process Manager roles and responsibilities: Participate in stakeholder interviews, workshops, and design reviews to define data models, pipelines, and workflows. Analyse business problems and propose data-driven solutions that meet stakeholder objectives. Experience working on-premise as well as on cloud platforms (AWS/GCP/Azure). Should have extensive experience in GCP with a strong focus on BigQuery, and will be responsible for designing, developing, and maintaining robust data solutions to support analytics and business intelligence needs (GCP is preferable over AWS & Azure). Design and implement robust data models to efficiently store, organize, and access data for diverse use cases. Design and build robust data pipelines (Informatica / Fivetran / Matillion / Talend) for ingesting, transforming, and integrating data from diverse sources. Implement data processing pipelines using various technologies, including cloud platforms, big data tools, and streaming frameworks (optional). Develop and implement data quality checks and monitoring systems to ensure data accuracy and consistency. Technical and Functional Skills: Bachelor's degree with 5+ years of experience, including 3+ years of relevant hands-on experience in GCP with BigQuery. Good knowledge of at least one database scripting platform (Oracle preferable). Work would involve analysis, development of code/pipelines at a modular level, reviewing peers' code, performing unit testing, and owning push-to-prod activities. 5+ years of work experience, having worked as an individual contributor for 5+ years. Direct interaction and deep diving with VPs of deployment. Should work with cross-functional teams/stakeholders. Participate in backlog grooming and prioritizing tasks. Worked on Scrum methodology. GCP certification desired.
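Given the BigQuery focus, a minimal sketch of running a query through the Python client library (the project, dataset, and table are invented for illustration; credentials are assumed to come from the environment):

```python
from google.cloud import bigquery

# Uses Application Default Credentials from the environment
client = bigquery.Client()

query = """
    SELECT order_date, SUM(amount) AS total_amount
    FROM `example_project.sales.orders`
    GROUP BY order_date
    ORDER BY order_date
"""

# Run the query and iterate over the result rows
for row in client.query(query).result():
    print(row["order_date"], row["total_amount"])
```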
With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience. About eClerx Technology: eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com. eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
Posted 1 month ago
3.0 years
0 Lacs
Hyderābād
On-site
Job Summary: We are seeking a highly experienced QA professional with over 3+ years of experience to join our Quality Assurance team for the data migration project. The ideal candidate will have a strong background in ETL testing, data validation, and migration projects, with expertise in creating test cases and test plans, as well as hands-on experience with data migration to cloud platforms like Snowflake. The role requires leadership capabilities to manage testing efforts, including coordinating with both on-shore and off-shore teams, ensuring seamless collaboration and delivery. Proficiency in ETL tools like Talend (preferred), Informatica PowerCenter, or DataStage is essential, along with a solid understanding of SQL and semi-structured data formats such as JSON and XML. Key Responsibilities: Develop and implement comprehensive test strategies and plans for data migration projects, ensuring full coverage of functional and non-functional requirements. Create detailed test cases, test plans, and test scripts for validating data migration processes and transformations. Conduct thorough data validation and verification testing, leveraging advanced SQL skills to write and execute complex queries for data accuracy, completeness, and consistency. Utilize ETL tools such as Talend, Informatica PowerCenter, or DataStage to design and execute data integration tests, ensuring successful data transformation and loading into target systems like Snowflake. Validate semi-structured data formats (JSON, XML), ensuring proper parsing, mapping, and integration within data migration workflows. Lead testing efforts for data migration to cloud platforms, ensuring seamless data transfer and integrity. Act as the QA Lead to manage and coordinate testing activities with on-shore and off-shore teams, ensuring alignment, timely communication, and delivery of quality outcomes. Document and communicate test results, defects, and issues clearly to the development and project teams, ensuring timely resolutions. Collaborate with cross-functional teams to create and maintain automated testing frameworks for ETL processes, improving testing efficiency and coverage. Monitor adherence to QA best practices and standards while driving process improvements. Stay updated on the latest QA tools, technologies, and methodologies to enhance project outcomes. Qualifications: 3+ years of experience in Quality Assurance, focusing on ETL testing, data validation, and data migration projects. Proven experience creating detailed test cases, test plans, and test scripts. Hands-on experience with ETL tools like Talend (preferred), Informatica PowerCenter, or DataStage. Proficiency in SQL for complex query writing and optimization for data validation and testing. Experience with cloud data migration projects, specifically working with databases like Snowflake. Strong understanding of semi-structured data formats like JSON and XML, with hands-on testing experience. Strong analytical and troubleshooting skills for resolving data quality and testing challenges. Preferred Skills: Experience with automated testing tools and frameworks, particularly for ETL processes. Knowledge of data governance and data quality best practices. Familiarity with AWS or other cloud-based ecosystems. ISTQB or equivalent certification in software testing.
Posted 1 month ago
6.0 years
12 - 15 Lacs
Bhopal
On-site
#Connections #hiring #Immediate #DataEngineer #Bhopal Hi Connections, We are hiring a Data Engineer for our client. Job Title: Data Engineer – Real-Time Streaming & Integration (Apache Kafka) Location: Bhopal, Madhya Pradesh Key Responsibilities: · Design, develop, and maintain real-time streaming data pipelines using Apache Kafka and Kafka Connect. · Implement and optimize ETL/ELT processes for structured and semi-structured data from various sources. · Build and maintain scalable data ingestion, transformation, and enrichment frameworks across multiple environments. · Collaborate with data architects, analysts, and application teams to deliver integrated data solutions that meet business requirements. · Ensure high availability, fault tolerance, and performance tuning for streaming data infrastructure. · Monitor, troubleshoot, and enhance Kafka clusters, connectors, and consumer applications. · Enforce data governance, quality, and security standards throughout the pipeline lifecycle. · Automate workflows using orchestration tools and CI/CD pipelines for deployment and version control. Required Skills & Qualifications: · Strong hands-on experience with Apache Kafka, Kafka Connect, and Kafka Streams. · Expertise in designing real-time data pipelines and stream processing architectures. · Solid experience with ETL/ELT frameworks using tools like Apache NiFi, Talend, or custom Python/Scala-based solutions. · Proficiency in at least one programming language: Python, Java, or Scala. · Deep understanding of message serialization formats (e.g., Avro, Protobuf, JSON). · Strong SQL skills and experience working with data lakes, warehouses, or relational databases. · Familiarity with schema registry, data partitioning, and offset management in Kafka. · Experience with Linux environments, containerization, and CI/CD best practices. Preferred Qualifications: · Experience with cloud-native data platforms (e.g., AWS MSK, Azure Event Hubs, GCP Pub/Sub). · Exposure to stream processing engines like Apache Flink or Spark Structured Streaming. · Familiarity with data lake architectures, data mesh concepts, or real-time analytics platforms. · Knowledge of DevOps tools like Docker, Kubernetes, Git, and Jenkins. Work Experience: · 6+ years of experience in data engineering with a focus on streaming data and real-time integrations. · Proven track record of implementing data pipelines in production-grade enterprise environments. Education Requirements: · Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. · Certifications in data engineering, Kafka, or cloud data platforms are a plus. Interested candidates, kindly share your updated profile to pavani@sandvcapitals.com or reach us on 7995292089. Thank you. Job Type: Full-time Pay: ₹1,200,000.00 - ₹1,500,000.00 per year Schedule: Day shift Experience: Data Engineer: 6 years (Required) ETL: 6 years (Required) Work Location: In person
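For the Kafka streaming pipelines described above, a minimal consumer sketch using the kafka-python client, with an invented topic and broker address (illustrative only; the posting does not specify these):

```python
import json
from kafka import KafkaConsumer

# Subscribe to an example topic on a local broker (placeholders for illustration)
consumer = KafkaConsumer(
    "orders-events",
    bootstrap_servers="localhost:9092",
    group_id="etl-enrichment",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # A real pipeline would enrich and forward the event; here we just filter and print
    if event.get("amount", 0) > 0:
        print(message.topic, message.partition, message.offset, event)
```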
Posted 1 month ago
4.0 - 9.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Responsibilities: Design, develop, and maintain data pipelines using Snowflake, DBT, and AWS. Collaborate with cross-functional teams to understand data requirements and deliver solutions. Optimize and troubleshoot existing data workflows to ensure efficiency and reliability. Implement best practices for data management and governance. Stay updated with the latest industry trends and technologies to continuously improve our data infrastructure. Required Skills: Proficiency in Snowflake, DBT, and AWS. Experience with data modeling, ETL processes, and data warehousing. Strong problem-solving skills and attention to detail. Excellent communication and teamwork abilities. Preferred Skills: Knowledge of Fivetran (HVR) and Python. Familiarity with data integration tools and techniques. Ability to work in a fast-paced and agile environment. Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
Posted 1 month ago
6.0 - 9.0 years
6 - 10 Lacs
Hyderabad
Work from Office
The candidate will be responsible for data preparation jobs, data analysis, and data reporting. Responsible for performing technical troubleshooting and providing technical expertise for ETL jobs. Designing and developing ETL mappings in Talend, SQL, PL/SQL, etc. Preparing documentation such as the mapping document, design document, and deployment document. Coordination with the support team, client, and third-party vendor on various aspects of the SDLC cycle, including requirement gathering. Attend the status/stand-up calls. Primary Skill: Talend DI, SQL, PL/SQL, AWS services. Secondary Skill: Python. Primary skill is Talend; please suggest more suitable Talend/SQL profiles.
Posted 1 month ago
5.0 - 10.0 years
5 - 9 Lacs
Hyderabad, Bengaluru
Hybrid
Immediate Openings: AWS Cloud Developer - Bengaluru - Contract. Skill: AWS Cloud Developer. Notice Period: Immediate. Employment Type: Contract. Job Description: Plan, design, and execute end-to-end data migration projects from various data sources to Amazon Aurora PostgreSQL and Amazon DynamoDB. Collaborate with cross-functional teams to understand data requirements, perform data analysis, and define migration strategies. Develop and implement data transformation and manipulation procedures using Talend, AWS Glue, and AWS DMS to ensure data accuracy and integrity during the migration process. Optimize data migration workflows for efficiency and reliability, and monitor performance to identify and address potential bottlenecks. Collaborate with database administrators, data engineers, and developers to troubleshoot and resolve data-related issues. Ensure adherence to best practices, security standards, and compliance requirements throughout the data migration process. Provide documentation, technical guidance, and training to team members and stakeholders on data migration procedures and best practices.
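One common slice of the Aurora-to-DynamoDB migration work described here is batch-writing validated rows into a DynamoDB table. A minimal sketch with boto3, where the table name, keys, and sample rows are invented for illustration (the posting does not define a schema):

```python
import boto3

# Target table is assumed to already exist with "customer_id" as its partition key
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("customers_migrated")

# Rows extracted from the source system (in practice read from Aurora, DMS, or Glue output)
rows = [
    {"customer_id": "C001", "name": "Asha", "segment": "retail"},
    {"customer_id": "C002", "name": "Ravi", "segment": "corporate"},
]

# batch_writer handles buffering and retries of unprocessed items
with table.batch_writer() as batch:
    for row in rows:
        batch.put_item(Item=row)

print(f"wrote {len(rows)} items to {table.name}")
```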
Posted 1 month ago
10.0 years
0 Lacs
India
Remote
Our Company We’re Hitachi Digital, a company at the forefront of digital transformation and the fastest growing division of Hitachi Group. We’re crucial to the company’s strategy and ambition to become a premier global player in the massive and fast-moving digital transformation market. Our group companies, including GlobalLogic, Hitachi Digital Services, Hitachi Vantara and more, offer comprehensive services that span the entire digital lifecycle, from initial idea to full-scale operation and the infrastructure to run it on. Hitachi Digital represents One Hitachi, integrating domain knowledge and digital capabilities, and harnessing the power of the entire portfolio of services, technologies, and partnerships, to accelerate synergy creation and make real-world impact for our customers and society as a whole. Imagine the sheer breadth of talent it takes to unleash a digital future. We don’t expect you to ‘fit’ every requirement – your life experience, character, perspective, and passion for achieving great things in the world are equally as important to us. Preferred job location: Bengaluru, Hyderabad, Pune, New Delhi or Remote The team Hitachi Digital is a leader in digital transformation, leveraging advanced AI and data technologies to drive innovation and efficiency across various operational companies (OpCos) and departments. We are seeking a highly experienced Lead Data Integration Architect to join our dynamic team and contribute to the development of robust data integration solutions. The role Lead the design, development, and implementation of data integration solutions using SnapLogic, MuleSoft, or Pentaho. Develop and optimize data integration workflows and pipelines. Collaborate with cross-functional teams to integrate data solutions into existing systems and workflows. Implement and integrate VectorAI and Agent Workspace for Google Gemini into data solutions. Conduct research and stay updated on the latest advancements in data integration technologies. Troubleshoot and resolve complex issues related to data integration systems and applications. Document development processes, methodologies, and best practices. Mentor junior developers and participate in code reviews, providing constructive feedback to team members. Provide strategic direction and leadership in data integration and technology adoption. What you’ll bring Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field. 10+ years of experience in data integration, preferably in the Banking or Finance industry. Extensive experience with SnapLogic, MuleSoft, or Pentaho (at least one is a must). Experience with Talend and Alation is a plus. Strong programming skills in languages such as Python, Java, or SQL. Technical proficiency in data integration tools and platforms. Knowledge of cloud platforms, particularly Google Cloud Platform (GCP). Experience with VectorAI and Agent Workspace for Google Gemini. Comprehensive knowledge of financial products, regulatory reporting, credit risk, and counterparty risk. Prior strategy consulting experience with a focus on change management and program delivery preferred. Excellent problem-solving skills and the ability to work independently and as part of a team. Strong communication skills and the ability to convey complex technical concepts to non-technical stakeholders. Proven leadership skills and experience in guiding development projects from conception to deployment. Preferred Qualifications Familiarity with data engineering tools and techniques. 
Previous experience in a similar role within a tech-driven company. About Us We’re a global, 1000-strong, diverse team of professional experts, promoting and delivering Social Innovation through our One Hitachi initiative (OT x IT x Product) and working on projects that have a real-world impact. We’re curious, passionate and empowered, blending our legacy of 110 years of innovation with our shaping our future. Here you’re not just another employee; you’re part of a tradition of excellence and a community working towards creating a digital future. Championing diversity, equity, and inclusion Diversity, equity, and inclusion (DEI) are integral to our culture and identity. Diverse thinking, a commitment to allyship, and a culture of empowerment help us achieve powerful results. We want you to be you, with all the ideas, lived experience, and fresh perspective that brings. We support your uniqueness and encourage people from all backgrounds to apply and realize their full potential as part of our team. How We Look After You We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing. We’re also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We’re always looking for new ways of working that bring out our best, which leads to unexpected ideas. So here, you’ll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with. We’re proud to say we’re an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran, age, disability status or any other protected characteristic. Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.
Posted 1 month ago
10.0 years
0 Lacs
India
On-site
About Fresh Gravity: Founded in 2015, Fresh Gravity helps businesses make data-driven decisions. We are driven by data and its potential as an asset to drive business growth and efficiency. Our consultants are passionate innovators who solve clients' business problems by applying best-in-class data and analytics solutions. We provide a range of consulting and systems integration services and solutions to our clients in the areas of Data Management, Analytics and Machine Learning, and Artificial Intelligence. In the last 10 years, we have put together an exceptional team and have delivered 200+ projects for over 80 clients ranging from startups to several fortune 500 companies. We are on a mission to solve some of the most complex business problems for our clients using some of the most exciting new technologies, providing the best of learning opportunities for our team. We are focused and intentional about building a strong corporate culture in which individuals feel valued, supported, and cared for. We foster an environment where creativity thrives, paving the way for groundbreaking solutions and personal growth. Our open, collaborative, and empowering work culture is the main reason for our growth and success. To know more about our culture and employee benefits, visit out website https://www.freshgravity.com/employee-benefits/ . We promise rich opportunities for you to succeed, to shine, to exceed even your own expectations. We are data driven. We are passionate. We are innovators. We are Fresh Gravity. Requirements What you'll do: Solid hands-on experience with Talend Open Studio for Data Integration, Talend Administration Centre, and Talend Data Quality ETL Process Design: Able to develop and design ETL jobs, ensuring they meet business requirements and follow best practices. Knowledge of SCD, normalization jobs Talend Configuration: Proficiency in Configuring Talend Studio, Job Server, and other Talend components. Data Mapping: Proficiency in creating and refining Talend mappings for data extraction, transformation, and loading. SQL: Possess Strong knowledge of SQL and experience. Able to develop complex SQL queries for data extraction and loading, especially when working with databases like Oracle, RedShift, Snowflake. Custom Scripting: Knowledge to implement custom Talend components using scripting languages like Python or Java. Shell scripting to automate tasks Reusable Joblets: Working knowledge to Design and create reusable joblets for various ETL tasks. ESB Integration and real-time data integration : Able to implement and manage integrations with ESB (Enterprise Service Bus) systems - Kafka/Azure Event Hub, including REST and SOAP web services. Desirable Skills and Experience: Experience with ETL/ELT, data transformation, data mapping, and data profiling Strong analytical and problem-solving skills Ability to work independently and as part of a team Ability to work with cross-functional teams to understand business requirements and design data Troubleshoot and resolve data integration issues in a timely manner Mentor junior team members and help them improve their Talend development skills Stay up to date with the latest Talend and data integration trends and technologies, integration solutions that meet those requirements Benefits In addition to a competitive package, we promise rich opportunities for you to succeed, to shine, to exceed even your own expectations. In keeping with Fresh Gravity's challenger ethos, we have developed the 5Dimensions (5D) benefits program. 
This program recognizes the multiple dimensions within each of us and seek to provide opportunities for deep development across these dimensions. Enrich Myself; Enhance My Client; Build my Company, Nurture My Family; and Better Humanity.
Posted 1 month ago
6.0 - 11.0 years
0 - 2 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities MUST HAVE MANDATORY SKILLS Minimum 6+ Year of experience with Talend Data Development in AWS ecosystem • Hands on experience with Python & Spark (PySpark) • Hands on experience with Distributed Data Storage including expertise in AWS S3 or other NoSQL storage systems • Prior experience with Data integration technologies, encompassing Spark, Kafka, eventing/streaming, Streamsets, NiFi, AWS Data Migration Services. *********************************************************************************************** Job Duties & Key Responsibilities: Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack. • Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure. • Demonstrate proficiency in coding skills, utilizing languages such as PySpark, Talend to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations. • Collaborate seamlessly across diverse technical stacks, including AWS. • Develop and deliver detailed presentations to effectively communicate complex technical concepts. • Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc. • Adhere to Agile practices throughout the solution development process. • Design, build, and deploy databases and data stores to support organizational Preferred candidate profile
Posted 1 month ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
ETL Tester Location : Chennai. Exp : 5 to 8 years. Key Responsibilities Review ETL design documents and understand data flows, mapping documents, and business requirements. Develop comprehensive test plans, test cases, and test scripts for validating ETL processes. Perform data validation and data quality testing at various stages of the ETL cycle. Write and execute SQL queries to verify data transformation logic, source-to-target data mapping, and business rules. Identify, troubleshoot, and document data anomalies, discrepancies, and system defects. Work closely with development teams to replicate, debug, and resolve issues. Participate in daily stand-ups, sprint planning, and defect triage meetings. Communicate clearly with stakeholders and provide timely updates on test status and results. Contribute to the development and maintenance of automated ETL testing solutions (optional, based on project). Ensure compliance with testing standards and best practices across data projects. Required Skills And Qualifications Bachelor's degree in Computer Science, Information Systems, or related field. 5+ years of hands-on experience in ETL testing or data validation roles. Strong knowledge of SQL and ability to write complex queries for data verification. Familiarity with ETL tools (e.g., Informatica, Talend, DataStage, SSIS, etc.) Experience working with large datasets and relational databases (Oracle, SQL Server, PostgreSQL, etc.) Excellent problem-solving skills with a keen eye for identifying data quality issues. Strong analytical and critical thinking skills. Clear and concise verbal and written communication skills for cross-functional collaboration. Ability to work in agile/scrum environments with fast-changing priorities. Nice To Have Experience with test automation for ETL pipelines using tools like Selenium, PyTest, or Apache Airflow validation scripts. Familiarity with cloud platforms such as AWS, Azure, or GCP. Exposure to BI tools like Power BI, Tableau, or Looker. Understanding of data warehousing and data lake concepts. (ref:hirist.tech)
Posted 1 month ago
5.0 years
0 Lacs
Kochi, Kerala, India
On-site
The selected candidate will primarily work on Databricks and Reltio projects, focusing on data integration and transformation tasks. This role requires a deep understanding of Databricks, ETL tools, and data warehousing/data lake concepts. Experience in the life sciences domain is preferred. Candidate with Databricks certification is preferred. Key Responsibilities: Design, develop, and maintain data integration solutions using Databricks. Collaborate with cross-functional teams to understand data requirements and deliver efficient data solutions. Implement ETL processes to extract, transform, and load data from various sources into data warehouses/data lakes. Optimize and troubleshoot Databricks workflows and performance issues. Ensure data quality and integrity throughout the data lifecycle. Provide technical guidance and mentorship to junior developers. Stay updated with the latest industry trends and best practices in data integration and Databricks. Required Qualifications: Bachelor’s degree in computer science or equivalent. Minimum of 5 years of hands-on experience with Databricks. Strong knowledge of any ETL tool (e.g., Informatica, Talend, SSIS). Well-versed in data warehousing and data lake concepts. Proficient in SQL and Python for data manipulation and analysis. Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services. Excellent problem-solving skills. Strong communication and collaboration skills. Preferred Qualifications: Certified Databricks Engineer. Experience in the life sciences domain. Familiarity with Reltio or similar MDM (Master Data Management) tools. Experience with data governance and data security best practices. IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
Posted 1 month ago