
1529 Talend Jobs - Page 32

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 6.0 years

7 - 8 Lacs

Kolkata

Work from Office

Use Talend Open Studio to design, implement, and manage data integration solutions. Develop ETL processes to ensure data is accurately extracted, transformed, and loaded into various systems for analysis.

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Mumbai

Work from Office

Design and optimize ETL workflows using Talend. Ensure data integrity and process automation.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Role: Data Engineer
Location: Indore (Hybrid)
Experience required: 5+ years

Job Description:
- Build and maintain data pipelines for ingesting and processing structured and unstructured data.
- Ensure data accuracy and quality through validation checks and sanity reports.
- Improve data infrastructure by automating manual processes and scaling systems.
- Support internal teams (Product, Delivery, Onboarding) with data issues and solutions.
- Analyze data trends and provide insights to inform key business decisions.
- Collaborate with program managers to resolve data issues and maintain clear documentation.

Must-Have Skills:
- Proficiency in SQL, Python (Pandas, NumPy), and R
- Experience with ETL tools (e.g., Apache NiFi, Talend, AWS Glue)
- Cloud experience with AWS (S3, Redshift, EMR, Athena, RDS)
- Strong understanding of data modeling, warehousing, and data validation
- Familiarity with data visualization tools (Tableau, Power BI, Looker)
- Experience with Apache Airflow, Kubernetes, Terraform, Docker
- Knowledge of data lake architectures, APIs, and custom data formats (JSON, XML, YAML)
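For a sense of the "validation checks and sanity reports" this listing asks for, here is a minimal Pandas sketch; the file name, column names, and rule are hypothetical illustrations, not taken from the posting.

```python
# Minimal sanity-report sketch using Pandas. "events.csv" and "order_id"
# are hypothetical stand-ins for whatever dataset the pipeline ingests.
import pandas as pd

def sanity_report(df: pd.DataFrame) -> pd.DataFrame:
    """Summarize null counts, null percentage, and dtype per column."""
    return pd.DataFrame({
        "nulls": df.isna().sum(),
        "null_pct": (df.isna().mean() * 100).round(2),
        "dtype": df.dtypes.astype(str),
    })

df = pd.read_csv("events.csv")        # hypothetical input extract
print(f"duplicate rows: {df.duplicated().sum()}")
print(sanity_report(df))

# Fail fast when a critical field breaches a quality rule.
assert df["order_id"].notna().all(), "order_id must never be null"
```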

Posted 1 month ago

Apply

5.0 years

0 Lacs

Andhra Pradesh, India

On-site

JD - Key Responsibilities:
- Lead the end-to-end migration of legacy data warehouses (e.g., Teradata, Oracle, SQL Server, Netezza, Redshift) to Snowflake.
- Assess current data architecture and define migration strategy, roadmap, and timelines.
- Develop ELT/ETL pipelines using tools such as dbt, Apache Airflow, Matillion, Talend, Informatica, etc.
- Optimize Snowflake configurations, including clustering, caching, and resource management for performance and cost efficiency.
- Implement security best practices, including role-based access, masking, and data encryption.
- Collaborate with data engineering, analytics, and business teams to ensure accurate and efficient data transfer.
- Create and maintain technical documentation, including migration plans, test scripts, and rollback procedures.
- Support validation, testing, and go-live activities.

Required Skills & Experience:
- 5+ years in data engineering or data platform roles, with at least 2+ years in Snowflake migration projects.
- Hands-on experience migrating large datasets from legacy data warehouses to Snowflake.
- Proficient in SQL, Python, and Snowflake scripting (SnowSQL, stored procedures, UDFs).
- Experience with data migration tools and frameworks (e.g., AWS SCT, Azure Data Factory, Fivetran).
- Strong knowledge of cloud platforms (AWS, Azure, or GCP).
- Familiarity with DevOps practices, CI/CD for data pipelines, and version control (Git).
- Excellent problem-solving and communication skills.

Preferred Qualifications:
- Snowflake certification(s): SnowPro Core or Advanced Architect.
- Experience with real-time data ingestion (e.g., Kafka, Kinesis, Pub/Sub).
- Background in data governance, data quality, and compliance (GDPR, HIPAA).
- Prior experience in Agile/Scrum delivery environments.
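Migration work of this kind usually comes down to repeated stage-and-COPY steps plus reconciliation; the sketch below shows one such step with the snowflake-connector-python package. The connection details, table, and extract file are illustrative assumptions, not project specifics.

```python
# One hypothetical legacy-to-Snowflake load step: stage a file extracted
# from the legacy warehouse, bulk-load it, then reconcile the row count.
import snowflake.connector

conn = snowflake.connector.connect(
    user="MIGRATION_USER",            # hypothetical credentials/account
    password="...",
    account="myorg-myaccount",
    warehouse="MIGRATION_WH",
    database="ANALYTICS",
    schema="STAGING",
)
cur = conn.cursor()
try:
    # Upload the extract to the target table's internal stage.
    cur.execute("PUT file:///tmp/orders_extract.csv @%ORDERS AUTO_COMPRESS=TRUE")
    # Bulk-load; the ON_ERROR policy matters for migration validation.
    cur.execute("""
        COPY INTO ORDERS
        FROM @%ORDERS
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"')
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    # Reconcile against the source system's row count as a validation step.
    cur.execute("SELECT COUNT(*) FROM ORDERS")
    print("rows loaded:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```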

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Senior Data Quality Assurance Analyst
Career Level: D

Introduction to Role:
Are you ready to play a pivotal role in transforming data quality within a global pharmaceutical environment? Join our Operations Global Data Office as a Senior Data Quality Assurance Analyst, where you'll be instrumental in developing solutions to enhance data quality within our SAP systems. Your expertise will drive the creation of data quality rules and dashboards, ensuring alignment with standards and governance policies. Collaborate with data stewards and business owners to develop code for monitoring and measuring data quality, focusing on root cause analysis and prevention of data issues. Are you solution-oriented and passionate about testing? This is your chance to make a significant impact!

Accountabilities:
- Develop and support the creation of data quality dashboards in Power BI by extracting data from various global SAP systems into Snowflake.
- Work extensively with collaborators to define requirements for continuous data quality monitoring.
- Provide extensive data analysis and profiling across a wide range of data objects.
- Develop and implement the data quality framework and operating model.
- Focus on high levels of process automation to ensure data and results are up to date.
- Conduct extensive data analysis to detect incorrect patterns in critical data early.
- Facilitate matching or linking multiple data sources together for continuous DQ monitoring.
- Embed ongoing data quality monitoring by setting up mechanisms to track issues and trends.
- Conduct root cause analysis to understand the causes of poor-quality data.
- Train, coach, and support data owners and stewards in managing data quality.

Essential Skills/Experience:
- Experience developing and supporting the creation of data quality dashboards in Power BI by extracting data from various global SAP systems into Snowflake, and developing rules for identifying DQ issues using Acceldata or similar.
- Demonstrated experience and domain expertise within data management disciplines, including the three pillars of data quality, data governance, and data architecture.
- Advanced programming skills in T-SQL or similar to support data quality rule creation.
- Advanced data profiling and analysis skills, evidenced by use of at least one data profiling tool, for example Adera, Dataiku, or Acceldata.
- Strong ETL automation and reconciliation experience; expert in extracting, manipulating, and joining data in all its various formats.
- Excellent visualization experience, using Power BI or similar, for monitoring and reporting data quality issues. A key aspect of the role is creating self-serve data quality dashboards for the business to use for defect remediation and trending.
- Excellent written and verbal communication skills, with the ability to influence others to achieve objectives.
- Experience with Snowflake or similar for data lakes.
- Strong desire to improve the quality of data and to identify the causes impacting good data quality.
- Experience of business and IT partnering for the implementation of data quality KPIs and visualisations.
- Strong team-member management skills with good attention to detail.
- Ability to work in a fast-paced, dynamic environment and manage multiple streams of work simultaneously.
- Experience of working in a global organisation, preferably within the pharmaceutical industry, and of working on global change projects.
- Extensive knowledge of data quality, with the ability to develop and mature the data quality operating model and framework.
- Knowledge of at least one standard data quality tool, for example Acceldata, Alteryx, Aperture, Trillium, Ataccama, or SAS Viya.

Desirable Skills/Experience:
- Use of a data lineage or governance tool, for example Talend or Collibra.
- Experience working in a complex MDG SAP data environment.
- Experience of data cleansing tools such as Winshuttle or Aperture.
- Working within a lean environment and knowledge of data governance methodologies and standards.
- Knowledge of automation and scheduling tools.
- Extensive knowledge of risk and data compliance.
- Experience in data observability using AI pattern detection.

When we put unexpected teams in the same room, we unleash bold thinking with the power to inspire life-changing medicines. In-person working gives us the platform we need to connect, work at pace, and challenge perceptions. That's why we work, on average, a minimum of three days per week from the office. But that doesn't mean we're not flexible. We balance the expectation of being in the office while respecting individual flexibility. Join us in our unique and ambitious world.

At AstraZeneca, innovation is at the heart of everything we do. We embrace change by trialling new solutions with patients and business needs in mind. Our diverse workforce is united by curiosity, sharing findings, and scaling fast. Be part of a digitally enabled journey that impacts society, the planet, and our business by delivering life-changing medicines. Ready to make a difference? Apply now!

Date Posted: 29-May-2025
Closing Date: 30-Jul-2025

AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
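As an illustration of the rule-based monitoring this role centers on, here is a minimal, self-contained sketch: each data quality rule is a SQL predicate that flags failing records, and the pass rate per rule is what a Power BI dashboard would chart. SQLite stands in for the real SAP/Snowflake sources; the table, columns, and rules are invented for the example.

```python
# Rule-based data quality scoring sketch. SQLite replaces the real
# SAP/Snowflake sources; table, columns, and rules are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE material_master (material_id TEXT, plant TEXT, base_unit TEXT);
    INSERT INTO material_master VALUES
        ('M001', 'DE01', 'EA'), ('M002', 'DE01', NULL), ('M003', '', 'KG');
""")

# Each rule: (name, predicate that is TRUE when a record FAILS the rule).
rules = [
    ("base_unit_completeness", "base_unit IS NULL"),
    ("plant_not_blank",        "plant IS NULL OR plant = ''"),
]

for name, fail_predicate in rules:
    total, failed = conn.execute(
        f"SELECT COUNT(*), SUM(CASE WHEN {fail_predicate} THEN 1 ELSE 0 END) "
        f"FROM material_master"
    ).fetchone()
    print(f"{name}: {100 * (total - failed) / total:.1f}% pass ({failed} defects)")
```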

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 13 Lacs

Pune, Chennai, Mumbai (All Areas)

Work from Office

Role: Talend Data Engineer
Experience: minimum 6+ years relevant experience in Talend
Location: any client location (Pune, Chennai, Mumbai, etc.); hybrid/remote (minimum 1 day per week on-site)
Shift Timing: 1:00 to 11:00 PM IST

Must-Have (Mandatory) Skills:
- Minimum 6+ years of experience with Talend data development in the AWS ecosystem
- Hands-on experience with Python and Spark (PySpark)
- Hands-on experience with distributed data storage, including expertise in AWS S3 or other NoSQL storage systems
- Prior experience with data integration technologies, encompassing Spark, Kafka, eventing/streaming, StreamSets, NiFi, and AWS Data Migration Services

Job Duties & Key Responsibilities:
- Design and implement tailored data solutions to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack.
- Provide thought leadership by recommending the most appropriate technologies and solutions for a given use case, covering the entire spectrum from the application layer to infrastructure.
- Demonstrate proficiency in coding skills, utilizing languages such as PySpark and Talend, to efficiently move solutions into production while prioritizing performance, security, scalability, and robust data integrations.
- Collaborate seamlessly across diverse technical stacks, including AWS.
- Develop and deliver detailed presentations to effectively communicate complex technical concepts.
- Generate comprehensive solution documentation, including sequence diagrams, class hierarchies, logical system views, etc.
- Adhere to Agile practices throughout the solution development process.
- Design, build, and deploy databases and data stores to support organizational requirements.

If you are interested in the opportunity, please share the following details along with your most up-to-date resume to geeta.negi@compunnel.com:
- Total experience
- Relevant experience
- Current CTC
- Expected CTC
- Notice period (last working day if you are serving notice)
- Current location
- Skill 1: rating out of 5
- Skill 2: rating out of 5
- Skill 3: rating out of 5 (mention the skill)
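Since the listing pairs Talend with hands-on PySpark and S3, a minimal PySpark sketch of that kind of pipeline step follows; the bucket paths and column names are illustrative assumptions, not project specifics.

```python
# Hypothetical S3-to-S3 PySpark step: deduplicate, type, filter, and write
# partitioned Parquet. Paths and columns are invented for illustration.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.json("s3a://raw-bucket/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

(cleaned.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3a://curated-bucket/orders/"))

spark.stop()
```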

Posted 1 month ago

Apply

6.0 - 11.0 years

9 - 13 Lacs

Hyderabad

Work from Office

GCP Data Engineer: BigQuery, SQL, Python, Talend ETL programmer; GCP or any cloud technology. Good experience in building pipelines of GCP components to load data into BigQuery and Cloud Storage buckets. Excellent data analysis skills. Good written and oral communication skills. Self-motivated and able to work independently.
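The "pipeline of GCP components to load data into BigQuery" typically reduces to jobs like the one sketched below with the google-cloud-bigquery client; the project, dataset, and bucket names are illustrative assumptions.

```python
# Hypothetical GCS-to-BigQuery load job using the official client library.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://my-landing-bucket/sales/*.csv",                 # hypothetical bucket
    "my-analytics-project.sales_ds.daily_sales",          # hypothetical table
    job_config=job_config,
)
load_job.result()  # block until the load completes

table = client.get_table("my-analytics-project.sales_ds.daily_sales")
print(f"Loaded {table.num_rows} rows.")
```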

Posted 1 month ago

Apply

5.0 - 10.0 years

8 - 14 Lacs

Hyderabad

Work from Office

Employment Type: Contract

1. 5+ years in the ETL domain (of which 3+ years in Talend)
2. Strong in writing SQL queries (mandatory)
3. Hands-on troubleshooting of SQL queries (mandatory)
4. Hands-on Talend deployment and development (mandatory)
5. Strong in DWH concepts (mandatory)

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description

Job Purpose:
Intercontinental Exchange, Inc. (ICE) presents a unique opportunity to work with cutting-edge technology to provide solutions to business challenges in the financial sector. ICE team members work across departments and traditional boundaries to innovate and respond to industry demand. We are seeking an Integration Developer to join our collaborative Enterprise Information Management team to support the delivery of solutions to various business organizations. This candidate will be a significant part of the Integration team, supporting cross-system application and data integrations, and will work with a team of experts in data, ETL, and integrations. This position requires technical proficiency as well as an eager attitude, professionalism, and solid communication skills. An Integration Developer is a member of the team who drives strategy for tools and development. This person will not have direct reports.

Responsibilities:
- Build, maintain, and support applications in a global software platform and various other corporate systems, tools, and scripts
- Collaborate with other internal groups to translate business and functional requirements into technical implementations for the automation of existing processes and the development of new applications
- Communicate with internal customers in non-technical terms, understand business requirements, and propose solutions
- Manage projects from specification gathering, to development, to QA, user acceptance testing, and deployment to production
- Document changes and follow proper SDLC procedures
- Enhance the team and coworkers through knowledge sharing and implementing best practices in day-to-day activities
- Take initiative to continually learn and enhance technical knowledge and skills

Knowledge and Experience:
- BS degree, preferably in CS or EE or a related discipline
- 2-3 years' experience as an integration developer using applications such as Talend, MuleSoft, or others
- Familiarity with building multi-threaded applications, and some understanding of distributed systems like Kafka and RabbitMQ
- Experience in developing REST-based services
- Familiarity with different data formats like JSON, XML, etc.
- High proficiency in RDBMS concepts and SQL
- Understanding of design patterns and object-oriented design concepts
- Experience with deployment automation tools such as Jenkins, Artifactory, Maven
- Strong written and verbal communication skills
- Ability to multitask and work independently on multiple projects

Preferred:
- Familiarity with Linux, Bash, SSH
- Experience with applications like Salesforce, ServiceNow, ORMB, and other financial applications
- Financial industry expertise

Posted 1 month ago

Apply

3.0 - 5.0 years

14 - 19 Lacs

Pune

Work from Office

Job Summary:
Synechron is seeking a skilled Qlik Sense Developer to design, develop, and optimize business intelligence solutions that enhance data-driven decision-making across the organization. The successful candidate will collaborate with cross-functional teams to understand business requirements, craft scalable dashboards, and enable stakeholders to derive insights efficiently. This role plays a vital part in supporting digital transformation and strategic initiatives through effective data visualization and analytics.

Required Software Skills:
- Strong proficiency in Qlik Sense (version 12 or later)
- Experience with QlikView or other BI tools (preferred)
- Knowledge of SQL and data querying techniques
- Familiarity with data modeling concepts
- Working knowledge of scripting languages (e.g., Python, R) is a plus
- Experience with cloud-based data platforms (Azure, AWS, GCP)

Preferred Software Skills:
- Integration with other BI/analytics tools
- Knowledge of ETL processes and tools (Informatica, Talend, etc.)
- Experience with scripting for automation and data manipulation

Overall Responsibilities:
- Collaborate with business users and technical teams to gather reporting and dashboard requirements
- Design, develop, and deploy interactive, user-friendly Qlik Sense dashboards and visualizations
- Optimize data models and scripts to improve performance and scalability
- Conduct thorough testing to ensure accuracy and usability of reports
- Implement best practices for security, version control, and documentation
- Stay updated with the latest features, techniques, and trends in BI and data visualization
- Provide ongoing support and enhancements to existing dashboards, resolving issues proactively
- Participate in cross-functional meetings to align data solutions with strategic goals

Performance Outcomes & Expectations:
- Timely delivery of high-quality BI solutions that meet user needs
- Improved data accessibility and insights derived from dashboards
- Reduction in report generation time through performance tuning
- Active contribution to knowledge sharing and best practices

Technical Skills (by Category):
- Programming languages: Essential: Qlik Sense scripting language, SQL. Preferred: Python, R (for advanced data manipulation and automation)
- Databases/data management: Essential: SQL Server, Oracle, PostgreSQL. Preferred: NoSQL (MongoDB, Cassandra)
- Cloud technologies: Preferred: cloud platforms (Azure, AWS, GCP) with experience integrating with BI tools
- Frameworks and libraries: Qlik Sense, QlikView, Qlik DataMarket; data modeling tools (dimensional modeling, star schemas)
- Development tools and methodologies: Qlik Sense, QlikView, SQL Server Management Studio; version control (Git, Bitbucket); Agile methodologies (Scrum, Kanban); data integration and ETL tools
- Security protocols: user access management within Qlik Sense; data security standards and encryption practices

Experience:
- Minimum of 3-5 years of hands-on experience designing and deploying Qlik Sense dashboards
- Proven track record of translating business requirements into visual insights
- Experience with data modeling, scripting, and data querying
- Exposure to cloud and on-premises environments
- Experience collaborating with cross-functional teams in dynamic settings

Day-to-Day Activities:
- Participate in daily stand-up meetings and sprint planning
- Gather dashboard requirements through stakeholder engagement
- Develop, test, and publish Qlik Sense visualizations
- Optimize data load scripts and data models for performance
- Troubleshoot and resolve data discrepancies or performance bottlenecks
- Conduct user training and documentation for dashboards
- Collaborate with data engineers, analysts, and business users for continuous improvement
- Keep abreast of the latest BI trends and features, recommending innovations

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field
- Professional certification in Qlik Sense or BI tools is a plus
- Continuous professional development in BI trends and data visualization techniques

Professional Competencies:
- Strong analytical and problem-solving skills
- Excellent written and verbal communication abilities
- Ability to work independently and collaboratively
- Detail-oriented with a focus on data accuracy
- Adaptability to changing requirements and technological updates
- Time management skills to prioritize deliverables

SYNECHRON'S DIVERSITY & INCLUSION STATEMENT
Diversity & inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative-action employer. Our Diversity, Equity, and Inclusion (DEI) initiative "Same Difference" is committed to fostering an inclusive culture promoting equality, diversity, and an environment that is respectful to all. We strongly believe that a diverse workforce helps build stronger, successful businesses as a global company. We encourage applicants from across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, or disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more. All employment decisions at Synechron are based on business needs, job requirements, and individual qualifications, without regard to the applicant's gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.

Posted 1 month ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Noida

Work from Office

Company Overview:
With 80,000 customers across 150 countries, UKG is the largest U.S.-based private software company in the world. And we're only getting started. Ready to bring your bold ideas and collaborative mindset to an organization that still has so much more to build and achieve? Read on. Here, we know that you're more than your work. That's why our benefits help you thrive personally and professionally, from wellness programs and tuition reimbursement to U Choose, a customizable expense reimbursement program that can be used for more than 200 needs that best suit you and your family, from student loan repayment, to childcare, to pet insurance. Our inclusive culture, active and engaged employee resource groups, and caring leaders value every voice and support you in doing the best work of your career. If you're passionate about our purpose (people), then we can't wait to support whatever gives you purpose. We're united by purpose, inspired by you.

Job Summary:
The Analytics Consultant I is a business-intelligence-focused expert who participates in the delivery of analytics solutions and reporting for various UKG products such as Pro, UKG Dimensions, and UKG Datahub. The candidate is also responsible for interacting with other business and technical project stakeholders to gather business requirements and ensure successful delivery. The candidate should be able to leverage the strengths and capabilities of the software tools to provide an optimized solution to the customer. The Analytics Consultant I will also be responsible for developing custom analytics solutions and reports to specifications provided and supporting the solutions delivered. The candidate must be able to effectively communicate ideas both verbally and in writing at all levels in the organization, from executive staff to technical resources. The role requires working with the Program/Project Manager, the Management Consultant, and the Analytics Consultants to deliver the solution based upon the defined design requirements and ensure it meets the scope and customer expectations.

Key Responsibilities:
- Interact with business and technical project stakeholders to gather business requirements
- Deploy and configure the UKG Analytics and Data Hub products based on the design documents
- Develop and deliver best-practice visualizations and dashboards using a BI tool such as Cognos, BIRT, or Power BI
- Put together a test plan, validate the solution deployed, and document the results
- Provide support during production cutover, and after go-live act as the first level of support for any requests that come through from the customer or other consultants
- Analyze the customer's data to spot trends and issues and present the results back to the customer

Required Qualifications:
- 1-3 years' experience designing and delivering analytical/business intelligence solutions
- Cognos, BIRT, Power BI, or other business intelligence toolset experience required
- ETL experience using Talend or other industry-standard ETL tools strongly preferred
- Advanced SQL proficiency is a plus
- Knowledge of Google Cloud Platform, Azure, or similar is desired but not required
- Knowledge of Python is desired but not required
- Willingness to learn new technologies and adapt quickly
- Strong interpersonal and problem-solving skills
- Flexibility to support customers in different time zones

Where We're Going:
UKG is on the cusp of something truly special. Worldwide, we already hold the #1 market share position for workforce management and the #2 position for human capital management. Tens of millions of frontline workers start and end their days with our software, with billions of shifts managed annually through UKG solutions today. Yet it's our AI-powered product portfolio designed to support customers of all sizes, industries, and geographies that will propel us into an even brighter tomorrow!

UKGCareers@ukg.com

Posted 1 month ago

Apply

8.0 - 13.0 years

6 - 9 Lacs

Bengaluru

Work from Office

In this role you will:
- Provide Tier 2/3 technical support for packet core network solutions, ensuring prompt resolution of customer issues.
- Diagnose and resolve network faults, performance issues, and service disruptions.
- Manage escalations from customers, partners, and internal teams.
- Collaborate with engineering teams to resolve complex issues effectively.
- Monitor packet core network performance and proactively address potential issues.
- Perform routine maintenance, software upgrades, and patch deployments.
- Serve as a primary technical point of contact for customer support requests.
- Communicate effectively with customers, providing clear guidance and updates.
- Maintain detailed records of issues, resolutions, and troubleshooting steps.

You should have:
- 8+ years of relevant experience and/or a graduate-equivalent (or higher) degree.
- Level 2 technical support (TAC) experience in Packet Core.
- Experience with troubleshooting tools, trace and log file analysis tools, and measuring equipment; PACO requires good knowledge of cloud and container infrastructure.
- Good knowledge of all technology call flows (2G, 3G, 4G, 5G) as per 3GPP standards.
- Fault identification, correction, and reporting skills on the relevant subsystem and equipment.
- Readiness to support emergencies on a 24/7 basis, supporting customer regions APAC & India, MEA & Europe, and NAM (emergency duty on a rotational basis, only for a limited time each month).
- Ability to plan and execute technical tasks requiring specialist skills in your own professional area, work independently with responsibility for solving customer request cases and reporting according to processes, and identify and solve technical problems.

It would be nice if you also had:
- Flexibility to work in different time zones.
- Excellent logical and analytical skills.
- Linux and cloud certification.

Posted 1 month ago

Apply

5.0 - 9.0 years

15 - 19 Lacs

Bengaluru

Work from Office

Project Description:
During the 2008 financial crisis, many big banks failed or faced issues due to liquidity problems. Lack of liquidity can kill any financial institution overnight. That's why it is so critical to constantly monitor liquidity risks and properly maintain collateral. We are looking for a number of talented developers who would like to join our team in Pune, which is building a liquidity risk and collateral management platform for one of the biggest investment banks in the world. The platform is a set of front-end tools and back-end engines. Our platform helps the bank to increase efficiency and scalability, reduce operational risk, and eliminate the majority of manual interventions in processing margin calls.

Responsibilities:
The candidate will work on the development of new functionality for the Liquidity Risk platform, working closely with other teams around the globe.

Skills

Must have:
- Big Data experience (6+ years)
- Java/Python, J2EE, Spark, Hive
- SQL databases
- UNIX shell
- Strong experience in Apache Hadoop, Spark, Hive, Impala, Yarn, Talend, Hue
- Big Data reporting, querying, and analysis

Nice to have:
- Spark calculators based on business logic/rules
- Basic performance tuning and troubleshooting knowledge
- Experience with all aspects of the SDLC
- Experience with complex deployment infrastructures
- Knowledge of software architecture, design, and testing
- Data flow automation (Apache NiFi, Airflow, etc.)
- Understanding of the difference between OOP and functional design approaches
- Understanding of event-driven architecture
- Spring, Maven, Git, uDeploy

Other:
- Languages: English B2 (Upper-Intermediate)
- Seniority: Senior
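To make the Spark/Hive reporting-and-querying requirement concrete, here is a minimal PySpark-on-Hive sketch; the database, table, columns, and threshold are invented for illustration and are not the bank's actual schema.

```python
# Hypothetical Spark SQL aggregation over a Hive table, the kind of
# reporting/querying work a liquidity-risk platform involves.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("liquidity-report")
         .enableHiveSupport()          # resolve tables via the Hive metastore
         .getOrCreate())

daily_exposure = spark.sql("""
    SELECT counterparty, trade_date, SUM(exposure_usd) AS total_exposure
    FROM risk_db.margin_calls          -- hypothetical Hive table
    WHERE trade_date = current_date()
    GROUP BY counterparty, trade_date
    HAVING SUM(exposure_usd) > 1e7     -- flag large exposures for review
""")

daily_exposure.show(truncate=False)
spark.stop()
```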

Posted 1 month ago

Apply

6.0 - 11.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Strong experience in:
- 7 years in the ETL domain (mandatory)
- Of those 7 years, 3+ years in Talend ETL (mandatory)
- Talend deployment and development (mandatory)
- Writing SQL queries (mandatory)
- Troubleshooting SQL queries (mandatory)
- DWH (data warehouse) concepts and data warehouse ETL (mandatory)
- Experience with any cloud database (mandatory), preferably Redshift, AWS Aurora PostgreSQL, Snowflake, etc.
- Dimensional data modeling (optional)

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 7 Lacs

Mumbai

Hybrid

PF detection is mandatory.
1. Minimum 5 years of experience in database development and ETL tools.
2. Strong expertise in SQL and database platforms (e.g., SQL Server, Oracle, PostgreSQL).
3. Proficiency in ETL tools (e.g., Informatica, SSIS, Talend, DataStage) and scripting languages (e.g., Python, Shell).
4. Experience with data modeling and schema design.
5. Familiarity with cloud databases and ETL tools (e.g., AWS Glue, Azure Data Factory, Snowflake).
6. Understanding of data warehousing concepts and best practices.

Posted 1 month ago

Apply

6.0 - 11.0 years

3 - 7 Lacs

Karnataka

Hybrid

PF detection is mandatory.
- Looking for a candidate with over 6 years of hands-on involvement in Snowflake.
- The primary expertise required is in Snowflake; the candidate must be capable of creating complex SQL queries for manipulating data and should excel at implementing complex scenarios within Snowflake.
- A strong foundation in Informatica PowerCenter, with proven proficiency in executing ETL processes.
- Strong hands-on experience in SQL and RDBMS.
- Strong hands-on experience in Unix shell scripting.
- Knowledge of data warehousing and cloud data warehousing.
- Good communication skills.

Posted 1 month ago

Apply

0 years

10 Lacs

Hyderābād

On-site

Job Description:
We are seeking a skilled Data Engineer to join our team, with a dual focus on infrastructure maintenance and seamless onboarding of data views. The ideal candidate will play a key role in ensuring stable data platform operations while enabling efficient data integration, especially in the context of complex upstream changes and fiscal year transitions.

Key Responsibilities:
- Perform infrastructure maintenance, including:
  - Azure subscription management
  - Azure infrastructure and platform operations
  - ETL pipeline monitoring
  - Source path validation and updates
  - Proactive issue identification and resolution
- Manage data onboarding activities, including:
  - Integration of new data sources and views
  - Adjustments for FY rollover and evolving upstream systems
- Collaborate with cross-functional teams to align platform readiness with business needs.

Required Skills & Qualifications:
- Education: Bachelor's or master's degree in computer science, engineering, information systems, or a related technical field.
- Programming languages: proficiency in one or more programming languages commonly used in data engineering (e.g., Python, Java, Scala, SQL).
- Database expertise: strong knowledge of database systems, data modeling techniques, and advanced SQL proficiency. Experience with NoSQL databases is often a plus.
- ETL tools & concepts: solid understanding of ETL/ELT processes and experience with relevant tools (e.g., Apache Airflow, Talend, Databricks, Azure Data Factory).

Job Type: Full-time
Pay: from ₹1,000,000.00 per year
Schedule: day shift
Work Location: in person

Posted 1 month ago

Apply

3.0 years

9 - 10 Lacs

Gurgaon

On-site

About the Role:
Grade Level (for internal use): 09

S&P Global Mobility

The Role: ETL Developer

The Team:
The ETL team forms an integral part of Global Data Operations (GDO) and caters to the North America & EMEA automotive business line. Core responsibilities include translating business requirements into technical design and ETL jobs, along with unit testing, integration testing, regression testing, deployments, and production operations. The team has an energetic and dynamic group of individuals, always looking to work through a challenge. Ownership, raising the bar, and innovation are what the team runs on!

The Impact:
The ETL team, being part of GDO, caters to the automotive business line and helps stakeholders with an optimum solution for their data needs. The role requires close coordination with global teams such as other development teams, research analysts, quality assurance analysts, architects, etc. The role is vital for the automotive business, as it involves providing highly efficient, highly accurate data solutions to various stakeholders. The role forms a bridge between the business and technical stakeholders.

What's in it for you:
- Constant learning, working in a dynamic and challenging environment!
- Total rewards: monetary, beneficial, and developmental rewards!
- Work-life balance: you can't do a good job if your job is all you do!
- Diversity & inclusion: HeForShe!
- Internal mobility: grow with us!

Responsibilities:
- Using prior experience with file loading, cleansing, and standardization, translate business requirements into ETL designs and efficient ETL solutions using Informatica PowerCenter (mandatory) and Talend Enterprise (preferred); knowledge of TIBCO is also preferred.
- Understand relational database technologies and data warehousing concepts and processes.
- Using prior experience with high-volume data processing, deal with complex technical issues.
- Work closely with all levels of management and employees across the automotive business line.
- Participate in cross-functional teams responsible for investigating issues, proposing solutions, and implementing corrective actions.
- Good communication skills are required for interfacing with various stakeholder groups; detail-oriented with analytical skills.

What We're Looking For:
The ETL development team within the Mobility domain is looking for a Software Engineer to work on design, development, and operations efforts in the ETL (Informatica) domain. Primary skills and qualifications required:
- Experience with Informatica and/or Talend ETL tools
- Bachelor's degree in Computer Science, with at least 3+ years of development and maintenance of ETL systems on Informatica PowerCenter and 1+ years of SQL experience
- 3+ years of Informatica design and architecture experience, and 1+ years of optimization and performance tuning of ETL code on Informatica
- 1+ years of Python development experience, plus SQL and XML experience
- Working knowledge or greater of cloud-based technologies, development, and operations is a plus

About S&P Global Mobility:
At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility .

What's In It For You?

Our Purpose:
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People:
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability, to analyzing energy transition across the supply chain, to building workflow solutions that make it easy to tap into insight and apply it, we are changing the way people see things and empowering them to make an impact on the world we live in. We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.

Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits:
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you, and your career, need to thrive at S&P Global. Our benefits include:
- Health & wellness: health care coverage designed for the mind and body.
- Flexible downtime: generous time off helps keep you energized for your time on.
- Continuous learning: access a wealth of resources to grow your career and learn valuable new skills.
- Invest in your future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family friendly perks: it's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the basics: from retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global:
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert:
If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, "pre-employment training" or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here .

Equal Opportunity Employer:
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf

20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority - Ratings - (Strategic Workforce Planning)

Job ID: 316976
Posted On: 2025-06-25
Location: Gurgaon, Haryana, India

Posted 1 month ago

Apply

5.0 - 9.0 years

15 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

JD
Must Have (any 4): AWS RDS, Redshift, Glue, Airflow, Python
Note: candidates should have any 4 of the must-have skills; only if that proves challenging are Redshift and AWS RDS negotiable, and the rest remain must-haves.
Good to Have: general AWS knowledge, SageMaker, QuickSight, Talend

Posted 1 month ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Minimum 3 to 5 years of Talend developer experience. Work on user stories and develop Talend jobs following best practices. Create detailed technical design documents for Talend job development work. Work with the SIT team on defect fixing for Talend components. Note: knowledge of the IBM Maximo tool would be an advantage for ConEd; otherwise it is OK.

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Description:
insightsoftware (ISW) is a growing, dynamic computer software company that helps businesses achieve greater levels of financial intelligence across their organization with our world-class financial reporting solutions. At insightsoftware, you will learn and grow in a fast-paced, supportive environment that will take your career to the next level. The Data Conversion Specialist is a member of the insightsoftware Project Management Office (PMO) who demonstrates teamwork, results orientation, a growth mindset, disciplined execution, and a winning attitude.

Job Description:
Location: Hyderabad (work from office, hybrid)
Working Hours: 2:30 PM to 11:30 PM IST for 3 days, and 5:00 PM to 2:00 AM IST or 6:00 PM to 3:00 AM IST for 2 days; should be comfortable working night shifts as required.

Position Summary:
The Senior Consultant will integrate and map customer data from client source system(s) to our industry-leading platform. The role will include, but is not limited to:
- Using strong technical data migration, scripting, and organizational skills to ensure client data is converted efficiently and accurately to the insightsoftware (ISW) platform.
- Performing extract, transform, load (ETL) activities to ensure accurate and timely data conversions.
- Providing in-depth research and analysis of complex scenarios to develop innovative solutions to meet customer needs while remaining within project governance.
- Mapping and maintaining business requirements to the solution design using tools such as requirements traceability matrices (RTM).
- Presenting findings, requirements, and problem statements for ratification by stakeholders and working groups.
- Identifying and documenting data gaps to allow change impact and downstream impact analysis to be conducted.

Qualifications:
- Experience assessing data and analytic requirements to establish mapping rules from source to target systems to meet business objectives.
- Experience with real-time, batch, and ETL processing for complex data conversions.
- Working knowledge of extract, transform, load (ETL) methodologies and tools such as Talend, Dell Boomi, etc.
- Ability to utilize data mapping tools to prepare data for data loads based on target system specifications.
- Working experience using various data applications/systems such as Oracle SQL, Excel, .csv files, etc.
- Strong SQL scripting experience.
- Ability to communicate with clients and/or the ISW Project Manager to scope, develop, test, and implement conversions/integrations, and to keep projects on target.
- Drive to continually improve the data migration process.
- Collaboration via phone and email with clients and/or the ISW Project Manager throughout the conversion/integration process.
- Demonstrated collaboration and problem-solving skills.
- Working knowledge of software development lifecycle (SDLC) methodologies, including but not limited to Agile, Waterfall, and others.
- Clear understanding of cloud and application integrations.
- Ability to work independently, prioritize tasks, and manage multiple tasks simultaneously.
- Ability to ensure the client's data is converted/integrated accurately and within deadlines established by the ISW Project Manager.
- Experience in customer SIT, UAT, migration, and go-live support.

Additional Information:
All your information will be kept confidential according to EEO guidelines.

At this time insightsoftware is not able to offer sponsorship to candidates who are not eligible to work in the country where the position is located.

About Us: Hear From Our Team - InsightSoftware (wistia.com)

Background checks are required for employment with insightsoftware, where permitted by country, state/province. At insightsoftware, we are committed to equal employment opportunity regardless of race, color, ethnicity, ancestry, religion, national origin, gender, sex, gender identity or expression, sexual orientation, age, citizenship, marital or parental status, disability, veteran status, or other class protected by applicable law. We are proud to be an equal opportunity workplace.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

Remote

Greetings from Synergy Resource Solutions, a leading recruitment consultancy firm.

Our client is an ISO 27001:2013 and ISO 9001 certified company and a pioneering web design and development company from India. The company has been voted among the top 10 mobile app development companies in India. It is a leading IT consulting and web solution provider for custom software, websites, games, custom web applications, enterprise mobility, mobile apps, and cloud-based application design and development. The company is ranked one of the fastest-growing web design and development companies in India, with 3900+ successfully delivered projects across the United States, UK, UAE, Canada, and other countries. A client retention rate of over 95% demonstrates their level of service and client satisfaction.

Position: Senior Database Administrator (WFO)
Experience: 5-8 years relevant experience
Education Qualification: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
Job Location: Ahmedabad
Shift: 11 AM - 8.30 PM
CTC: 18 to 25 Lacs

Key Responsibilities:
Our client is seeking an experienced and motivated Senior Data Engineer to join their AI & Automation team. The ideal candidate will have 5-8 years of experience in data engineering, with a proven track record of designing and implementing scalable data solutions. A strong background in database technologies, data modeling, and data pipeline orchestration is essential. Additionally, hands-on experience with generative AI technologies and their applications in data workflows will set you apart. In this role, you will lead data engineering efforts to enhance automation, drive efficiency, and deliver data-driven insights across the organization.

Job Description:
- Design, build, and maintain scalable, high-performance data pipelines and ETL/ELT processes across diverse database platforms.
- Architect and optimize data storage solutions to ensure reliability, security, and scalability.
- Leverage generative AI tools and models to enhance data engineering workflows, drive automation, and improve insight generation.
- Collaborate with cross-functional teams (data scientists, analysts, and engineers) to understand and deliver on data requirements.
- Develop and enforce data quality standards, governance policies, and monitoring systems to ensure data integrity.
- Create and maintain comprehensive documentation for data systems, workflows, and models.
- Implement data modeling best practices and optimize data retrieval processes for better performance.
- Stay up to date with emerging technologies and bring innovative solutions to the team.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 5-8 years of experience in data engineering, designing and managing large-scale data systems.

The mandatory skills are as follows:
- SQL
- NoSQL (MongoDB, Cassandra, or CosmosDB)
- One of the following: Snowflake, Redshift, BigQuery, or Microsoft Fabric
- Azure

Strong expertise in database technologies, including:
- SQL databases: PostgreSQL, MySQL, SQL Server
- NoSQL databases: MongoDB, Cassandra
- Data warehouse/unified platforms: Snowflake, Redshift, BigQuery, Microsoft Fabric

Also required:
- Hands-on experience implementing and working with generative AI tools and models in production workflows.
- Proficiency in Python and SQL, with experience in data processing frameworks (e.g., Pandas, PySpark).
- Experience with ETL tools (e.g., Apache Airflow, MS Fabric, Informatica, Talend) and data pipeline orchestration platforms.
- Strong understanding of data architecture, data modeling, and data governance principles.
- Experience with cloud platforms (preferably Azure) and associated data services.

Skills:
- Advanced knowledge of database management systems and ETL/ELT processes.
- Expertise in data modeling, data quality, and data governance.
- Proficiency in Python programming, version control systems (Git), and data pipeline orchestration tools.
- Familiarity with AI/ML technologies and their application in data engineering.
- Strong problem-solving and analytical skills, with the ability to troubleshoot complex data issues.
- Excellent communication skills, with the ability to explain technical concepts to non-technical stakeholders.
- Ability to work independently, lead projects, and mentor junior team members.
- Commitment to staying current with emerging technologies, trends, and best practices in the data engineering domain.

If your profile matches the requirements and you are interested in this job, please share your updated resume with details of your present salary, expected salary, and notice period.
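Since the listing names Apache Airflow among the orchestration tools, a minimal Airflow 2.x sketch of a daily extract-validate-load DAG follows; the DAG id and task bodies are placeholders, not the client's actual pipeline.

```python
# Minimal Airflow 2.x DAG sketch: three placeholder tasks chained daily.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    print("pull the day's increment from the source database")

def validate(**_):
    print("run data quality checks; raise an exception to fail the run")

def load(**_):
    print("upsert the validated increment into the warehouse")

with DAG(
    dag_id="daily_customer_pipeline",   # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_validate >> t_load
```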

Posted 1 month ago

Apply

0.0 - 4.0 years

8 - 12 Lacs

Chennai, Tamil Nadu

On-site

Senior ETL Developer

Job Summary:
- Experience: 5-8 years
- Hybrid mode
- Full time/contract
- Chennai
- Immediate joiner
- US shift timings

Job Overview:
We are looking for a Senior ETL Developer who can take ownership of projects end-to-end, lead technical implementation, and mentor team members in ETL, data integration, and cloud data workflows. The ideal candidate will have 5-8 years of hands-on experience with Talend, PostgreSQL, and AWS, must be comfortable in a Linux environment, and should have experience in data integration, cloud-based ETL pipelines, data versioning, and automation. The candidate should be ready to work in a hybrid setup from Chennai or Madurai, driving technical best practices while mentoring junior developers.

Responsibilities:
- Design and implement scalable ETL workflows using Talend and PostgreSQL.
- Handle complex data transformations and integrations across structured/unstructured sources.
- Develop automation scripts using Shell/Python in a Linux environment.
- Build and maintain stable ETL pipelines integrated with AWS services (S3, Glue, RDS, Redshift).
- Ensure data quality, governance, and version control using tools like Git and Quilt.
- Troubleshoot data pipeline issues and optimize for performance.
- Schedule and manage jobs using tools like Apache Airflow, Cron, or Jenkins.
- Mentor team members, review code, and promote technical best practices.
- Drive continuous improvement and training on modern data tools and techniques.

ETL & Integration:
- Must have: Talend (Open Studio / DI / Big Data)
- Also good: SSIS, SSRS, SAS
- Bonus: Apache NiFi, Informatica

Databases:
- Required: PostgreSQL (3+ years)
- Bonus: Oracle, SQL Server, MySQL

Cloud Platforms:
- Required: AWS (S3, Glue, RDS, Redshift)
- Bonus: Azure Data Factory, GCP
- Certifications: AWS / Azure (good to have)

OS & Scripting:
- Required: Linux, shell scripting
- Preferred: Python scripting

Data Versioning & Source Control:
- Required: Quilt, Git/GitHub/Bitbucket
- Bonus: DVC, LakeFS, Git LFS

Scheduling & Automation:
- Apache Airflow, Cron, Jenkins, Talend JobServer

Bonus Tools:
- REST APIs, JSON/XML, Spark, Hive, Hadoop

Visualization & Reporting:
- Power BI / Tableau (nice to have)

Soft Skills:
- Strong verbal and written communication.
- Proven leadership and mentoring capabilities.
- Ability to manage projects independently.
- Comfortable adopting and teaching new tools and methodologies.
- Willingness to work in a hybrid setup from Chennai or Madurai.

Job Types: Full-time, Contractual/Temporary
Pay: ₹800,000.00 - ₹1,200,000.00 per year
Benefits: flexible schedule
Schedule: evening shift, Monday to Friday, rotational shift, UK shift, US shift, weekend availability
Experience: ETL developer: 5 years (required); Talend/Informatica: 4 years (required)
Location: Chennai, Tamil Nadu (required)
Shift availability: day shift (preferred), night shift (preferred), overnight shift (preferred)
Work Location: in person

Posted 1 month ago

Apply

6.0 - 9.0 years

14 - 24 Lacs

Bengaluru

Hybrid

Maximus is hiring a Senior IT Engineer - Talend Developer (full time) for our Bangalore office.

Experience: 6 to 8 years
Website: https://maximus.com/
Office address: RMZ Infinity, Benniganahalli, Bangalore
Work mode: Hybrid (3 days WFO)

Job Description:
Summary: Act as an ETL engineer on agile service delivery teams of 5-10 engineers. The main responsibilities include providing overall development, testing, and operational support and maintenance of ETL jobs and MySQL databases on a cloud platform. The ETL engineer will work closely with the lead architect and other stakeholders to ensure timely delivery of features.

Skills Required:
Primary skill: ETL -> Talend
- Talend / ETL tool / data warehouse troubleshooting
- SQL experience
- Good knowledge of APIs and understanding of JSON structures
- Strong written and verbal communication skills
- ETL process management
- Data modeling
- Data warehouse architecture
- Data pipeline (ETL tools) development
- Able to create reusable Talend jobs
- Performance tuning for Talend jobs and ETL testing

Posted 1 month ago

Apply

7.5 years

0 Lacs

Navi Mumbai, Maharashtra, India

On-site

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Collibra Data Quality & Observability
Good-to-have skills: Collibra Data Governance
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years full-time education

Summary:
As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are functioning optimally. You will also engage in problem-solving activities, providing support and enhancements to existing applications while maintaining a focus on quality and efficiency.

Key Responsibilities:
- Configure and implement Collibra Data Quality (CDQ) rules, workflows, dashboards, and data quality scoring metrics.
- Collaborate with data stewards, data owners, and business analysts to define data quality KPIs and thresholds.
- Develop data profiling and rule-based monitoring using CDQ's native rule engine or integrations (e.g., with Informatica, Talend, or BigQuery).
- Build and maintain data quality dashboards and issue management workflows within Collibra.
- Integrate CDQ with Collibra Data Intelligence Cloud for end-to-end governance visibility.
- Drive root cause analysis and remediation plans for data quality issues.
- Support metadata and lineage enrichment to improve data traceability.
- Document standards, rule logic, and DQ policies in the Collibra Catalog.
- Conduct user training and promote data quality best practices across teams.

Required Skills and Experience:
- 3+ years of experience in data quality, metadata management, or data governance.
- Hands-on experience with the Collibra Data Quality & Observability (CDQ) platform.
- Knowledge of Collibra Data Intelligence Cloud, including Catalog, Glossary, and Workflow Designer.
- Proficiency in SQL and understanding of data profiling techniques.
- Experience integrating CDQ with enterprise data sources (Snowflake, BigQuery, Databricks, etc.).
- Familiarity with data governance frameworks and data quality dimensions (accuracy, completeness, consistency, etc.).
- Excellent analytical, problem-solving, and communication skills.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Collibra Data Quality & Observability.
- This position is based in Mumbai.
- 15 years of full-time education is required.

Posted 1 month ago

Apply