0 years
0 Lacs
Raurkela, Odisha, India
On-site
We are currently hiring for a Data Strategy Management role with a leading global company, based in Pune and Bangalore. We are looking for immediate joiners or candidates with up to 15 days' notice.

Key focus areas (1st preference):
- Master Data Management (MDM)
- Data Governance
- Data Quality

Additional desirable experience:
- Data Ingestion, ETL, Dimensional Modelling, Data Migration
- Data Warehousing, Data Modelling, Data Visualization
- Tools: Informatica MDM, IDQ, Informatica PC/IICS, Talend, Collibra
- Cloud tech: ADB, ADF, Synapse
- Experience implementing ETL/BI/Analytics/Data Management on cloud platforms

Preferred certifications:
- PMP / CSM
- Cloud certifications (Azure/AWS/GCP)
- CDMP

We're looking for someone with strong analytical, communication, and negotiation skills who can drive strategic data initiatives end to end. If this sounds interesting, I'd be happy to share more details and discuss how this could align with your career goals. Could we connect or schedule a quick call? Looking forward to hearing from you!
Posted 3 weeks ago
1.0 - 3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
**Position:** ETL Developer

**Client Requirements**
**Experience:** 1-3 years
**Location:** Gurgaon
**Employment Type:** Full time
**Budget:** Up to 35,000-40,000/-

We are looking for a passionate and detail-oriented **ETL Developer** with 1 to 3 years of experience in building, testing, and maintaining ETL processes. The ideal candidate should have a strong understanding of data warehousing concepts, ETL tools, and database technologies.

### **Key Responsibilities:**
✅ Design, develop, and maintain ETL workflows and processes using tools such as Informatica, Talend, SSIS, Pentaho, or custom ETL frameworks.
✅ Understand data requirements and translate them into technical specifications and ETL designs.
✅ Optimize and troubleshoot ETL processes for performance and scalability.
✅ Ensure data quality, integrity, and security across all ETL jobs.
✅ Perform data analysis and validation for business reporting.
✅ Collaborate with Data Engineers, DBAs, and Business Analysts to ensure smooth data operations.

---

### **Required Skills:**
* 1-3 years of hands-on experience with ETL tools (e.g., **Informatica, Talend, SSIS, Pentaho**, or equivalent).
* Proficiency in SQL and experience working with **RDBMS** (e.g., **SQL Server, Oracle, MySQL, PostgreSQL**).
* Good understanding of **data warehousing concepts** and **data modeling**.
* Experience in handling **large datasets** and performance tuning of ETL jobs.
* Ability to work in Agile environments and participate in code reviews.

---

### **Preferred Skills (Good to Have):**
* Experience with **cloud ETL solutions** (AWS Glue, Azure Data Factory, GCP Dataflow).
* Exposure to **big data ecosystems** (Hadoop, Spark).
* Basic knowledge of **Python / Shell scripting** for automation.
* Familiarity with **version control (Git)** and **CI/CD pipelines**.

🎓 Bachelor's degree in Computer Science, Engineering, Information Technology, or related field.
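A minimal sketch of the extract-transform-load cycle this posting describes, using only the Python standard library so it runs as-is; the `orders` table, its columns, and the cleansing rules are hypothetical stand-ins for a real source system and warehouse target.

```python
import sqlite3

# Illustrative source and target; in a real job these would be separate
# systems (e.g., an OLTP database and a warehouse staging schema).
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount TEXT, country TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, "10.50", "in"), (2, "n/a", "IN"), (3, "7.25", "us")])

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE orders_clean (id INTEGER, amount REAL, country TEXT)")

# Extract
rows = src.execute("SELECT id, amount, country FROM orders").fetchall()

# Transform: coerce amounts to numbers, standardize country codes,
# and reject rows that fail validation (a basic data-quality gate).
clean = []
for id_, amount, country in rows:
    try:
        clean.append((id_, float(amount), country.upper()))
    except ValueError:
        print(f"rejected row {id_}: bad amount {amount!r}")

# Load
tgt.executemany("INSERT INTO orders_clean VALUES (?, ?, ?)", clean)
tgt.commit()
print(tgt.execute("SELECT * FROM orders_clean").fetchall())
```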
Posted 3 weeks ago
5.0 years
0 Lacs
Gurugram, Haryana, India
Remote
🚀 We're Hiring: Senior ETL Tester (QA) – 5+ Years Experience
📍 Location: [GURGAON / Remote / Hybrid]
🕒 Employment Type: Full-Time
💼 Experience: 5+ Years
💰 Salary: [Based on Experience]
📅 Joining: Immediate

---

🔍 About the Role:
We're looking for a Senior ETL Tester (QA) with 5+ years of strong experience in testing data integration workflows, validating data pipelines, and ensuring data integrity across complex systems. You will play a critical role in guaranteeing that data transformation processes meet performance, accuracy, and compliance requirements.

---

✅ Key Responsibilities:
- Design, develop, and execute ETL test plans, test scenarios, and SQL queries to validate data quality.
- Perform source-to-target data validation, transformation logic testing, and reconciliation.
- Collaborate with data engineers, business analysts, and developers to review requirements and ensure complete test coverage.
- Identify, document, and manage defects using tools like JIRA, Azure DevOps, or similar.
- Ensure data quality, completeness, and consistency across large-scale data platforms.
- Participate in performance testing and optimize data testing frameworks.
- Maintain and enhance automation scripts for recurring ETL validations (if applicable).

---

💡 Required Skills:
- 5+ years of hands-on experience in ETL testing and data validation.
- Strong SQL skills for writing complex queries, joins, aggregations, and data comparisons.
- Experience working with ETL tools (e.g., Informatica, Talend, DataStage, SSIS).
- Knowledge of Data Warehousing concepts and Data Modeling.
- Familiarity with data visualization/reporting tools (e.g., Tableau, Power BI – optional).
- Experience with Agile/Scrum methodologies.
- Strong analytical and problem-solving skills.

---

⭐ Nice to Have:
- Exposure to big data platforms (e.g., Hadoop, Spark).
- Experience with test automation tools for ETL processes.
- Cloud data testing experience (AWS, Azure, or GCP).
- Basic scripting (Python, Shell) for test automation.

---

🙌 Why Join Us?
- Work with a fast-paced, dynamic team that values innovation and data excellence.
- Competitive salary, flexible work hours, and growth opportunities.
- Engage in large-scale, cutting-edge data projects.

---

📩 To Apply: Send your resume to ABHISHEK.RAJ@APPZLOGIC.COM.
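To make the source-to-target validation responsibilities concrete, here is a small Python sketch of three standard checks (row counts, aggregate checksums, orphan keys), assuming in-memory SQLite; the `src_orders`/`tgt_orders` tables are invented, and a real suite would run the same SQL against the actual source and target connections.

```python
import sqlite3

# One connection stands in for both systems here; in practice the source
# and target would be separate databases reached via their own drivers.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE src_orders (id INTEGER, amount REAL);
CREATE TABLE tgt_orders (id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0), (3, 30.5);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.0), (3, 30.5);
""")

def check(name, src_sql, tgt_sql):
    s = db.execute(src_sql).fetchone()[0]
    t = db.execute(tgt_sql).fetchone()[0]
    print(f"{name}:", "PASS" if s == t else f"FAIL (src={s}, tgt={t})")

# Completeness: every source row should have landed in the target.
check("row count",
      "SELECT COUNT(*) FROM src_orders",
      "SELECT COUNT(*) FROM tgt_orders")

# Accuracy: aggregate checksums catch silently corrupted values.
check("amount sum",
      "SELECT ROUND(SUM(amount), 2) FROM src_orders",
      "SELECT ROUND(SUM(amount), 2) FROM tgt_orders")

# Reconciliation: surface keys present on one side only.
orphans = db.execute("""
    SELECT s.id FROM src_orders s
    LEFT JOIN tgt_orders t ON t.id = s.id
    WHERE t.id IS NULL
""").fetchall()
print("missing in target:", [r[0] for r in orphans])
```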
Posted 3 weeks ago
4.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
We're Hiring: Talend Developer
Location: Pune / Nagpur
Experience: 4 to 8 Years
Notice Period: Immediate to 15 Days Joiners
Work Mode: Work from Office | 5 Days Working

Are you an ETL expert with strong hands-on experience in Talend? Join our team and be a key part of building scalable data solutions!

About the Role:
We're looking for a talented Talend Developer who can design, develop, and manage robust ETL pipelines to support business-critical reporting needs.

Key Responsibilities:
🔹 Develop and maintain ETL jobs using Talend
🔹 Create one-to-one pipelines (~40) for two schemas to transfer data from Fircosoft DB to ODS
🔹 Ensure data accuracy, performance, and scalability of ETL solutions
🔹 Collaborate with data analysts and stakeholders to meet reporting needs for FSK and Trust
🔹 Troubleshoot and resolve ETL-related issues proactively
🔹 Document ETL processes and follow best practices in data engineering

What We're Looking For:
✅ Strong hands-on experience with Talend
✅ Proven expertise in building and maintaining ETL pipelines
✅ Good understanding of relational databases and data flow concepts
✅ Experience with Fircosoft DB and ODS is a plus
✅ Excellent communication and problem-solving skills
✅ Ability to join immediately or within 15 days
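As a rough picture of what a one-to-one pipeline does (in Talend this would be built as a job per table rather than code), here is a Python sketch that copies a list of tables unchanged from a source database into an ODS. SQLite stands in for both systems, and the table names are hypothetical.

```python
import sqlite3

# Hypothetical table list; the posting describes ~40 one-to-one pipelines
# across two schemas, so a real job would enumerate those tables instead.
TABLES = ["fs_messages", "fs_hits"]

src = sqlite3.connect(":memory:")   # stands in for the Fircosoft DB
ods = sqlite3.connect(":memory:")   # stands in for the ODS

# Seed the stand-in source with a table and a row each.
for t in TABLES:
    src.execute(f"CREATE TABLE {t} (id INTEGER PRIMARY KEY, payload TEXT)")
    src.execute(f"INSERT INTO {t} VALUES (1, 'sample')")

for t in TABLES:
    # Mirror the source structure (simplified typing for the sketch)...
    cols = [c[1] for c in src.execute(f"PRAGMA table_info({t})")]
    ddl = ", ".join(f"{c} TEXT" for c in cols)
    ods.execute(f"CREATE TABLE IF NOT EXISTS {t} ({ddl})")
    # ...then move the rows across unchanged (the "one-to-one" part).
    rows = src.execute(f"SELECT * FROM {t}").fetchall()
    marks = ", ".join("?" for _ in cols)
    ods.executemany(f"INSERT INTO {t} VALUES ({marks})", rows)
    ods.commit()
    count = ods.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
    print(t, "->", count, "rows")
```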
Posted 3 weeks ago
15.0 - 20.0 years
0 Lacs
Ahmedabad, Gujarat
On-site
Solution Architect / Senior Solution Architect
Ahmedabad
13-20 years

Responsibilities
- Suggest solutions to existing systems to improve business processes, user experience, or ROI
- Design, modify, and test technical architecture
- Continually research current and emerging technologies and propose changes where needed
- Suggest innovative solutions to technical problems
- Assess the business impact of technical choices
- Provide updates to stakeholders on product development processes, costs, and budgets
- Meet with clients and suggest improvements to existing applications and architectures

Requirements and skills
- Proven work experience as a Solution Architect or similar role
- Minimum 15 years of experience in the software/IT field
- Knowledge of a breadth of technologies and tools
- Familiarity with upcoming technologies and tools
- In-depth understanding of various coding languages
- Sound knowledge of various operating systems and databases
- Efficient communication skills
- Nice to have skills & knowledge in one or more areas: C#, .NET Core, SQL Server, NoSQL databases, Cloud, DevOps, MuleSoft, Talend, Hyper Automation, Data Engineering, AI/ML, Angular, React, Mobile Development

Seniority Level: Experienced
Employment Type: Full Time
Posted 3 weeks ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Deliver on critical business priorities while ensuring alignment with the wider architectural vision
- Identify and help address potential risks in the data supply chain
- Follow and contribute to technical standards
- Design and develop analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology (4-year graduate course)
- 3 to 4 years' experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have)
- ETL: Hands-on experience of building data pipelines. Proficiency in at least one of the data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
- Big Data: Exposure to 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
- Others: Basics of job schedulers like Autosys; basics of entitlement management

Certification on any of the above topics would be an advantage.
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 3 weeks ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Title: Business Intelligence (BI) Architect
Location: Hyderabad, India
Job Type: Full time
Experience Level: 10+ years of experience

Job Summary:
We are seeking a skilled and strategic Business Intelligence (BI) Architect to design and implement scalable BI solutions that empower data-driven decision-making across the organization. The ideal candidate will have a deep understanding of data modeling, ETL processes, data warehousing, reporting, and visualization tools, with proven experience in architecting end-to-end BI platforms.

Key Responsibilities:
- Design and implement enterprise BI architecture, including data pipelines, data modeling, and reporting layers.
- Collaborate with business stakeholders to understand analytics and reporting requirements.
- Lead the development of data warehouses, data marts, and analytics models.
- Architect and optimize ETL/ELT workflows using tools such as Informatica, Talend, Azure Data Factory, etc.
- Design robust and scalable data models using dimensional and normalized techniques.
- Implement and manage BI tools such as Power BI, Tableau, Qlik, or others.
- Define and enforce BI governance policies, including data quality, security, and compliance.
- Provide technical leadership and mentorship to BI developers and analysts.
- Evaluate and recommend modern BI technologies and platforms to meet evolving business needs.
- Create documentation and architectural diagrams for BI processes, solutions, and systems.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field.
- 8+ years of experience in BI/DW development and architecture.
- Strong experience with BI tools (e.g., Power BI, Tableau, Looker).
- Hands-on expertise in data modeling, SQL, DAX, MDX, and performance optimization.
- Proficiency with cloud-based data platforms (e.g., Azure Synapse, Snowflake, AWS Redshift, Google BigQuery).
- Experience with data integration tools (ETL/ELT) and orchestration platforms.
- Knowledge of data governance, metadata management, and security frameworks.

Preferred Skills:
- Experience in Agile/Scrum methodologies.
- Exposure to AI/ML integration with BI platforms.
- BI/Analytics certifications (e.g., Microsoft Certified: Power BI, Tableau Desktop Certified Professional).

Soft Skills:
- Strong analytical thinking and problem-solving skills.
- Excellent communication and stakeholder management abilities.
- Ability to work independently and lead cross-functional teams.
- Passion for data and continuous improvement.
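For readers unfamiliar with the dimensional techniques mentioned above, here is a minimal star-schema sketch in Python with SQLite: two dimension tables, one fact table at a declared grain, and a typical slice-and-dice query. All table and column names are illustrative.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
-- Dimension: descriptive attributes, one row per product
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    product_name TEXT,
    category TEXT
);
-- Dimension: calendar attributes, one row per day
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,   -- e.g. 20240131
    full_date TEXT,
    month TEXT
);
-- Fact: measures at the grain of one sale line, keyed by surrogate keys
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key INTEGER REFERENCES dim_date(date_key),
    quantity INTEGER,
    revenue REAL
);
""")
db.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
db.execute("INSERT INTO dim_date VALUES (20240131, '2024-01-31', '2024-01')")
db.execute("INSERT INTO fact_sales VALUES (1, 20240131, 3, 29.97)")

# Typical BI query: slice a fact measure by dimension attributes.
print(db.execute("""
    SELECT p.category, d.month, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY p.category, d.month
""").fetchall())
```

Surrogate integer keys on the dimensions keep the fact table narrow and insulate it from changes in source-system identifiers, which is the usual rationale for this layout.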
Posted 3 weeks ago
4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Do you enjoy hands-on technical work? Do you enjoy being part of a team that ensures the highest quality?

Join our Digital Technology - Data & Analytics team
Baker Hughes' Digital Technology team provides and creates tech solutions to cater to the needs of our customers. As a global team we collaborate to provide cutting-edge solutions to solve our customers' problems. We support them by providing materials management, planning, inventory and warehouse solutions.

Take ownership for innovative Data Analytics projects
The Data Engineering team helps solve our customers' toughest challenges: making flights safer, power cheaper, and oil & gas production safer for people and the environment by leveraging data and analytics. The Data Architect will work on projects in a technical domain as a technical leader and architect in Data & Information, and will be responsible for data handling, data warehouse building and architecture.

As a Senior Data Engineer, you will be responsible for:
- Designing and implementing scalable and robust data pipelines to collect, process, and store data from various sources.
- Developing and maintaining data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.
- Optimizing and tuning the performance of data systems to ensure efficient data processing and analysis.
- Collaborating with product managers and analysts to understand data requirements and implement solutions for data modeling and analysis.
- Identifying and resolving data quality issues, ensuring data accuracy, consistency, and completeness.
- Implementing and maintaining data governance and security measures to protect sensitive data.
- Monitoring and troubleshooting data infrastructure, performing root cause analysis, and implementing necessary fixes.
- Ensuring the use of state-of-the-art methodologies to carry out the job in the most productive and effective way. This may include research activities to identify and introduce new technologies in the field of data acquisition and data analysis.

Fuel your passion
To be successful in this role you will:
- Have a Bachelor's in Computer Science or "STEM" majors (Science, Technology, Engineering and Math) with a minimum of 4 years of experience
- Have proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems
- Have proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle)
- Have experience building complex jobs for SCD-type mappings using ETL tools like Talend, Informatica, etc.
- Have experience with data visualization and reporting tools (e.g., Tableau, Power BI)
- Have strong problem-solving and analytical skills, with the ability to handle complex data challenges
- Have excellent communication and collaboration skills to work effectively in a team environment
- Have experience in data modeling, data warehousing and ETL principles
- Have familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery)
- Have advanced knowledge of distributed computing and parallel processing
- Have experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink) (good to have)
- Have knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes)
- Have certification in relevant technologies or data engineering disciplines
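A short illustration of the SCD-type mappings mentioned in the qualifications: the sketch below implements a Type 2 slowly changing dimension update (expire the current row, insert a new version) in Python with SQLite. The `dim_customer` table and its columns are hypothetical; ETL tools like Talend or Informatica express the same pattern as mapping components.

```python
import sqlite3
from datetime import date

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER)""")
db.execute(
    "INSERT INTO dim_customer VALUES (42, 'Pune', '2023-01-01', '9999-12-31', 1)")

def scd2_update(customer_id, new_city, today):
    cur = db.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur and cur[0] == new_city:
        return  # no change, nothing to version
    # Expire the current row...
    db.execute("""UPDATE dim_customer
                  SET valid_to=?, is_current=0
                  WHERE customer_id=? AND is_current=1""", (today, customer_id))
    # ...and insert the new version so history is preserved.
    db.execute("INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
               (customer_id, new_city, today))
    db.commit()

scd2_update(42, "Mumbai", str(date.today()))
for row in db.execute("SELECT * FROM dim_customer ORDER BY valid_from"):
    print(row)
```

Type 2 preserves history at the cost of extra rows; a Type 1 mapping would simply overwrite `city` in place.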
Work in a way that works for you
We recognize that everyone is different and that the way in which people want to work and deliver at their best is different for everyone too. In this role, we can offer the following flexible working patterns:
- Working flexible hours - flexing the times when you work in the day to help you fit everything in and work when you are the most productive

Working with us
Our people are at the heart of what we do at Baker Hughes. We know we are better when all of our people are developed, engaged and able to bring their whole authentic selves to work. We invest in the health and well-being of our workforce, train and reward talent and develop leaders at all levels to bring out the best in each other.

Working for you
Our inventions have revolutionized energy for over a century. But to keep going forward tomorrow, we know we have to push the boundaries today. We prioritize rewarding those who embrace change with a package that reflects how much we value their input. Join us, and you can expect:
- Contemporary work-life balance policies and wellbeing activities
- Comprehensive private medical care options
- Safety net of life insurance and disability programs
- Tailored financial programs
- Additional elected or voluntary benefits

About Us:
We are an energy technology company that provides solutions to energy and industrial customers worldwide. Built on a century of experience and conducting business in over 120 countries, our innovative technologies and services are taking energy forward – making it safer, cleaner and more efficient for people and the planet.

Join Us:
Are you seeking an opportunity to make a real difference in a company that values innovation and progress? Join us and become part of a team of people who will challenge and inspire you! Let's come together and take energy forward.

Baker Hughes Company is an Equal Opportunity Employer. Employment decisions are made without regard to race, color, religion, national or ethnic origin, sex, sexual orientation, gender identity or expression, age, disability, protected veteran status or other characteristics protected by law. R149189
Posted 3 weeks ago
7.0 - 12.0 years
5 - 9 Lacs
Pune
Work from Office
About The Role
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Databricks Unified Data Analytics Platform
Good to have skills: Apache Spark, PySpark
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. You will be responsible for creating efficient and scalable applications that align with the organization's goals and objectives. Your typical day will involve collaborating with cross-functional teams, analyzing business requirements, designing application architecture, and implementing solutions to meet user needs. You will also be involved in troubleshooting and resolving application issues, as well as ensuring the security and integrity of the applications.

Roles & Responsibilities:
- Expected to be an SME, collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Design, build, and configure applications to meet business process and application requirements.
- Analyze business requirements and translate them into technical specifications.
- Collaborate with cross-functional teams to gather requirements and understand user needs.
- Design application architecture and develop efficient and scalable solutions.
- Implement and test application functionality and ensure its alignment with business objectives.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Databricks Unified Data Analytics Platform.
- Good To Have Skills: Experience with Apache Spark, PySpark, Talend ETL.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full time education
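To ground the data-munging bullet, here is a brief PySpark sketch of cleaning, transformation, and normalization, assuming a local PySpark installation; the sample frame, its columns, and the cleansing rules are invented for illustration.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("munging-demo").getOrCreate()

# Hypothetical raw feed with the usual problems: mixed case, stray
# whitespace, duplicates, and numeric values arriving as strings.
raw = spark.createDataFrame(
    [("Alice ", "  in", "100"), ("BOB", "in", "x"), ("Alice ", "  in", "100")],
    ["name", "country", "score"],
)

clean = (
    raw.dropDuplicates()
       # normalize string columns
       .withColumn("name", F.initcap(F.trim("name")))
       .withColumn("country", F.upper(F.trim("country")))
       # cast yields null on failure; drop rows that failed the cast
       .withColumn("score", F.col("score").cast("double"))
       .dropna(subset=["score"])
)
clean.show()
spark.stop()
```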
Posted 3 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Hyderabad
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Data Analytics
Good to have skills: Talend ETL, SAP Native HANA SQL Modeling & Development
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing and best practices among team members.
- Monitor project progress and ensure timely delivery of application components.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Data Analytics.
- Good To Have Skills: Experience with SAP Native HANA SQL Modeling & Development, Talend ETL.
- Strong analytical skills to interpret complex data sets.
- Experience with data visualization tools to present insights effectively.
- Familiarity with database management and data warehousing concepts.

Additional Information:
- The candidate should have minimum 7.5 years of experience in Data Analytics.
- This position is based in Hyderabad.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 3 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Talend ETL
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Talend ETL.
- Strong understanding of data integration processes and methodologies.
- Experience with data warehousing concepts and practices.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have minimum 5 years of experience in Talend ETL.
- This position is based at our Bengaluru office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 3 weeks ago
4.0 - 9.0 years
16 - 31 Lacs
Hyderabad
Work from Office
We are looking for a talented Talend Developer with hands-on experience in Talend Management Console on Cloud and Snowflake to join our growing team. The ideal candidate will play a key role in building and optimizing ETL/ELT data pipelines, integrating complex data systems, and ensuring high performance across cloud environments. While experience with Informatica is a plus, it is not mandatory for this role.

As a Talend Developer, you will be responsible for designing, developing, and maintaining data integration solutions to meet the organization's growing data needs. You will collaborate with business stakeholders, data architects, and other data professionals to ensure the seamless and secure movement of data across platforms, ensuring scalability and performance.

Key Responsibilities:
- Develop and maintain ETL/ELT data pipelines using Talend Management Console on Cloud to integrate data from various on-premises and cloud-based sources.
- Design, implement, and optimize data flows for data ingestion, processing, and transformation in Snowflake to support analytical and reporting needs.
- Utilize Talend Management Console on Cloud to manage, deploy, and monitor data integration jobs, ensuring robust pipeline management and process automation.
- Collaborate with data architects to ensure that the data integration solutions align with business requirements and follow best practices.
- Ensure data quality, performance, and scalability of Talend-based data solutions.
- Troubleshoot, debug, and optimize existing ETL processes to ensure smooth and efficient data integration.
- Document data integration processes, including design specifications, mappings, workflows, and performance optimizations.
- Collaborate with the Snowflake team to implement best practices for data warehousing and data transformation.
- Implement error-handling and data validation processes to ensure high levels of accuracy and data integrity.
- Provide ongoing support for Talend jobs, including post-deployment monitoring, troubleshooting, and optimization.
- Participate in code reviews and collaborate in an agile development environment.

Required Qualifications:
- 2+ years of experience in Talend development, with a focus on using the Talend Management Console on Cloud for managing and deploying jobs.
- Strong hands-on experience with Snowflake data warehouse, including data integration and transformation.
- Expertise in developing ETL/ELT workflows for data ingestion, processing, and transformation.
- Experience with SQL and working with relational databases to extract and manipulate data.
- Experience working in cloud environments (e.g., AWS, Azure, or GCP) with integration of cloud-based data platforms.
- Strong knowledge of data integration, data quality, and performance optimization in Talend.
- Ability to troubleshoot and resolve issues in data integration jobs and processes.
- Solid understanding of data modeling concepts and best practices for building scalable data pipelines.

Preferred Qualifications:
- Experience with Informatica is a plus but not mandatory.
- Experience with scripting languages such as Python or Shell scripting for automation.
- Familiarity with CI/CD pipelines and working in DevOps environments for continuous integration of Talend jobs.
- Knowledge of data governance and data security practices in cloud environments.
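As a rough outline of the ELT pattern in Snowflake that this role centers on (Talend Management Console would schedule and monitor steps like these as jobs), here is a Python sketch using the snowflake-connector-python package. The account settings, stage, and table names are placeholders, so it will not run without a real Snowflake environment.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials; supply real values via environment variables
# or a secrets manager in an actual deployment.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()

# ELT pattern: land raw files first, transform inside Snowflake afterwards.
# The stage and tables below are hypothetical names for the sketch.
cur.execute("""
    COPY INTO staging.orders_raw
    FROM @landing_stage/orders/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# Upsert the curated table from staging in a single set-based MERGE.
cur.execute("""
    MERGE INTO core.orders t
    USING staging.orders_raw s ON t.order_id = s.order_id
    WHEN MATCHED THEN UPDATE SET t.amount = s.amount, t.status = s.status
    WHEN NOT MATCHED THEN INSERT (order_id, amount, status)
        VALUES (s.order_id, s.amount, s.status)
""")
conn.commit()
cur.close()
conn.close()
```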
Posted 3 weeks ago
5.0 - 10.0 years
4 - 8 Lacs
Hyderabad
Work from Office
About The Role
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: Talend ETL
Good to have skills: NA
Minimum 5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. Be involved in the end-to-end data management process.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the design and implementation of data solutions.
- Optimize and troubleshoot ETL processes.
- Conduct data analysis and provide insights for decision-making.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Talend ETL.
- Strong understanding of data modeling and database design.
- Experience with data integration and data warehousing concepts.
- Hands-on experience with SQL and scripting languages.
- Knowledge of cloud platforms and big data technologies.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Talend ETL.
- This position is based at our Hyderabad office.
- A 15 years full-time education is required.

Qualification: 15 years full time education
Posted 3 weeks ago
7.0 - 15.0 years
9 - 13 Lacs
Hyderabad
Work from Office
We are looking for an experienced and highly skilled Lead Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in data engineering, specifically with Talend, SQL, Data Warehousing (DWH), and Snowflake. This role will involve leading data engineering projects, managing data pipelines, and optimizing data workflows to support business analytics and data-driven decision-making.

Key Responsibilities:
- Lead the design, development, and optimization of scalable data pipelines using Talend for ETL processes.
- Manage the architecture and performance tuning of data systems, including Snowflake and other cloud-based data platforms.
- Develop, test, and maintain SQL scripts and queries to extract, transform, and load data into data warehouses.
- Ensure data integration from various source systems into a centralized Data Warehouse (DWH) while maintaining data quality and integrity.
- Collaborate with cross-functional teams, including business analysts, data scientists, and stakeholders, to identify and implement data solutions.
- Lead the optimization of data workflows and the automation of processes to improve efficiency and reduce latency.
- Mentor and guide junior data engineers and team members on best practices, tools, and techniques in data engineering.
- Troubleshoot, diagnose, and resolve data-related issues and improve the overall data architecture.
- Stay up-to-date with the latest trends and advancements in data technologies and methodologies.

Key Skills & Qualifications:
- Talend: Strong hands-on experience with Talend for ETL development and data integration tasks.
- SQL: Advanced proficiency in SQL for data manipulation, querying, and performance optimization.
- Snowflake: Deep experience with Snowflake as a cloud-based data platform, including performance tuning and optimization.
- Data Warehousing (DWH): Strong understanding of Data Warehouse architecture, design, and management.
- Proven experience with data modeling, data migration, and data transformation.
- Strong knowledge of cloud data platforms, preferably AWS, Azure, or Google Cloud.
- Excellent problem-solving skills, with the ability to troubleshoot complex data issues.
- Effective communication and leadership skills, with experience in leading cross-functional teams.
- Familiarity with Agile methodologies is a plus.

Education and Experience:
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field (Master's preferred).
- Minimum of 5 years of experience in data engineering, with at least 2 years in a leadership or senior role.
- Hands-on experience with Talend, SQL, Snowflake, and Data Warehousing in large-scale, high-volume environments.
Posted 3 weeks ago
4.0 - 8.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Analysis, design, and maintenance of data systems and databases. Good backend development skills: SQL, coding, and troubleshooting data-related issues. Mining data from primary and secondary sources and reorganizing data formats. Reviewing data to identify key insights into business data usage. Good communication skills to interact with key stakeholders. Talend tool knowledge and experience is preferable.
Posted 3 weeks ago
3.0 - 6.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Encore Software Services is looking for an ETL Developer to join our dynamic team and embark on a rewarding career journey.

- Consulting with data management teams to get a big-picture idea of the company's data storage needs.
- Presenting the company with warehousing options based on their storage needs.
- Designing and coding the data warehousing system to desired company specifications.
- Conducting preliminary testing of the warehousing environment before data is extracted.
- Extracting company data and transferring it into the new warehousing environment.
- Testing the new storage system once all the data has been transferred.
- Troubleshooting any issues that may arise.
- Providing maintenance support.

Skills: Real-time data ingestion, streaming data, Kafka, AWS cloud streaming tools, ETL, semi-structured data formats like JSON and XML.
Tools: Talend, Kafka, AWS EventBridge, Lambda, and strong SQL & Python.
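One possible shape of the real-time ingestion stack named above, sketched in Python with the kafka-python package: consume JSON events from a topic and apply a basic validation gate before loading. The topic name, broker address, and required fields are assumptions for the sketch.

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

# Topic and broker address are illustrative; point these at a real cluster.
consumer = KafkaConsumer(
    "orders-events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

REQUIRED = {"order_id", "amount"}  # hypothetical schema contract

for msg in consumer:
    event = msg.value
    # Basic semi-structured validation before handing off to the sink:
    # reject events missing required keys rather than loading bad data.
    if not REQUIRED.issubset(event):
        print("rejected:", event)
        continue
    print("ingest:", event["order_id"], event["amount"])
```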
Posted 3 weeks ago
3.0 - 8.0 years
10 - 14 Lacs
Hyderabad
Work from Office
About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Talend ETL
Good to have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various stakeholders to gather requirements, overseeing the development process, and ensuring that the applications meet the specified needs. You will also engage in problem-solving discussions, providing insights and solutions to enhance application performance and user experience. Your role will require you to stay updated with the latest technologies and methodologies to ensure the applications are built using best practices and are scalable for future needs.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Facilitate communication between technical teams and stakeholders to ensure alignment on project goals.
- Mentor junior team members, providing guidance and support in their professional development.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Talend ETL.
- Good To Have Skills: Experience with data integration tools and methodologies.
- Strong understanding of data warehousing concepts and best practices.
- Experience in performance tuning and optimization of ETL processes.
- Familiarity with cloud-based data solutions and architectures.

Additional Information:
- The candidate should have minimum 3 years of experience in Talend ETL.
- This position is based at our Hyderabad office.
- A 15 years full time education is required.

Qualification: 15 years full time education
Posted 3 weeks ago
10.0 years
0 Lacs
Gurgaon
On-site
Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 600 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide.

The company fosters a diverse, equitable and inclusive work environment, nurturing talent, regardless of race, gender, ethnicity or sexual orientation. Acuity was established as a separate business from Moody's Corporation in 2019, following its acquisition by Equistone Partners Europe (Equistone). In January 2023, funds advised by global private equity firm Permira acquired a majority stake in the business from Equistone, which remains invested as a minority shareholder. For more information, visit www.acuitykp.com

Position Title: Associate Director (Senior Architect – Data)
Department: IT
Location: Gurgaon / Bangalore

Job Summary
The Enterprise Data Architect will enhance the company's strategic use of data by designing, developing, and implementing data models for enterprise applications and systems at conceptual, logical, business area, and application layers. This role advocates data modeling methodologies and best practices. We seek a skilled Data Architect with deep knowledge of data architecture principles, extensive data modeling experience, and the ability to create scalable data solutions. Responsibilities include developing and maintaining enterprise data architecture, ensuring data integrity, interoperability, security, and availability, with a focus on ongoing digital transformation projects.

Key Responsibilities
1. Strategy & Planning
- Develop and deliver long-term strategic goals for data architecture vision and standards in conjunction with data users, department managers, clients, and other key stakeholders.
- Create short-term tactical solutions to achieve long-term objectives and an overall data management roadmap.
- Establish processes for governing the identification, collection, and use of corporate metadata; take steps to assure metadata accuracy and validity.
- Establish methods and procedures for tracking data quality, completeness, redundancy, and improvement.
- Conduct data capacity planning, life cycle, duration, usage requirements, feasibility studies, and other tasks.
- Create strategies and plans for data security, backup, disaster recovery, business continuity, and archiving.
- Ensure that data strategies and architectures are aligned with regulatory compliance.
- Develop a comprehensive data strategy in collaboration with different stakeholders that aligns with the transformational projects' goals.
- Ensure effective data management throughout the project lifecycle.
2. Acquisition & Deployment
- Ensure the success of enterprise-level application rollouts (e.g. ERP, CRM, HCM, FP&A, etc.)
- Liaise with vendors and service providers to select the products or services that best meet company goals.
3. Operational Management
- Assess and determine governance, stewardship, and frameworks for managing data across the organization.
- Develop and promote data management methodologies and standards.
- Document information products from business processes and create data entities.
- Create entity relationship diagrams to show the digital thread across the value streams and enterprise.
- Create data normalization across all systems and databases to ensure there is a common definition of data entities across the enterprise.
- Document enterprise reporting needs and develop the data strategy to enable a single source of truth for all reporting data.
- Address the regulatory compliance requirements of each country and ensure our data is secure and compliant.
- Select and implement the appropriate tools, software, applications, and systems to support data technology goals.
- Oversee the mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality.
- Collaborate with project managers and business unit leaders for all projects involving enterprise data.
- Address data-related problems regarding systems integration, compatibility, and multiple-platform integration.
- Act as a leader and advocate of data management, including coaching, training, and career development to staff.
- Develop and implement key components as needed to create testing criteria to guarantee the fidelity and performance of data architecture.
- Document the data architecture and environment to maintain a current and accurate view of the larger data picture.
- Identify and develop opportunities for data reuse, migration, or retirement.
4. Data Architecture Design:
- Develop and maintain the enterprise data architecture, including data models, databases, data warehouses, and data lakes.
- Design and implement scalable, high-performance data solutions that meet business requirements.
5. Data Governance:
- Establish and enforce data governance policies and procedures as agreed with stakeholders.
- Maintain data integrity, quality, and security within Finance, HR and other such enterprise systems.
6. Data Migration:
- Oversee the data migration process from legacy systems to the new systems being put in place.
- Define and manage data mappings, cleansing, transformation, and validation to ensure accuracy and completeness.
7. Master Data Management:
- Devise processes to manage master data (e.g., customer, vendor, product information) to ensure consistency and accuracy across enterprise systems and business processes.
- Provide data management (create, update and delimit) methods to ensure master data is governed.
8. Stakeholder Collaboration:
- Collaborate with various stakeholders, including business users and other system vendors, to understand data requirements.
- Ensure the enterprise system meets the organization's data needs.
9. Training and Support:
- Provide training and support to end-users on data entry, retrieval, and reporting within the candidate enterprise systems.
- Promote user adoption and proper use of data.
10. Data Quality Assurance:
- Implement data quality assurance measures to identify and correct data issues.
- Ensure the Oracle Fusion and other enterprise systems contain reliable and up-to-date information.
11. Reporting and Analytics:
- Facilitate the development of reporting and analytics capabilities within the Oracle Fusion and other systems.
- Enable data-driven decision-making through robust data analysis.
12. Continuous Improvement:
- Continuously monitor and improve data processes and the Oracle Fusion and other systems' data capabilities.
- Leverage new technologies for enhanced data management to support evolving business needs.
Technology and Tools:
- Oracle Fusion Cloud
- Data modeling tools (e.g., ER/Studio, ERwin)
- ETL tools (e.g., Informatica, Talend, Azure Data Factory)
- Data pipelines: understanding of data pipeline tools like Apache Airflow and AWS Glue.
- Database management systems: Oracle Database, MySQL, SQL Server, PostgreSQL, MongoDB, Cassandra, Couchbase, Redis, Hadoop, Apache Spark, Amazon RDS, Google BigQuery, Microsoft Azure SQL Database, Neo4j, OrientDB, Memcached
- Data governance tools (e.g., Collibra, Informatica Axon, Oracle EDM, Oracle MDM)
- Reporting and analytics tools (e.g., Oracle Analytics Cloud, Power BI, Tableau, Oracle BIP)
- Hyperscalers / cloud platforms (e.g., AWS, Azure)
- Big data technologies such as Hadoop, HDFS, MapReduce, and Spark
- Cloud platforms such as Amazon Web Services (including RDS, Redshift, and S3), Microsoft Azure services like Azure SQL Database and Cosmos DB, and Google Cloud Platform services such as BigQuery and Cloud Storage
- Programming languages (e.g., Java, J2EE, EJB, .NET, WebSphere, etc.)
- SQL: strong SQL skills for querying and managing databases.
- Python: proficiency in Python for data manipulation and analysis.
- Java: knowledge of Java for building data-driven applications.
- Data security and protocols: understanding of data security protocols and compliance standards.

Key Competencies

Qualifications:
Education:
- Bachelor's degree in Computer Science, Information Technology, or a related field. Master's degree preferred.
Experience:
- 10+ years overall and at least 7 years of experience in data architecture, data modeling, and database design.
- Proven experience with data warehousing, data lakes, and big data technologies.
- Expertise in SQL and experience with NoSQL databases.
- Experience with cloud platforms (e.g., AWS, Azure) and related data services.
- Experience with Oracle Fusion or similar ERP systems is highly desirable.
Skills:
- Strong understanding of data governance and data security best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal skills.
- Ability to work effectively in a collaborative team environment.
- Leadership experience with a track record of mentoring and developing team members.
- Excellent in documentation and presentations.
- Good knowledge of applicable data privacy practices and laws.
Certifications:
- Relevant certifications (e.g., Certified Data Management Professional, AWS Certified Big Data – Specialty) are a plus.
Behavioral:
- A self-starter, an excellent planner and executor and, above all, a good team player.
- Excellent communication and inter-personal skills are a must.
- Must possess organizational skills, including multi-task capability, priority setting and meeting deadlines.
- Ability to build collaborative relationships and effectively leverage networks to mobilize resources.
- Initiative to learn the business domain is highly desirable.
- Likes a dynamic and constantly evolving environment and requirements.
Posted 3 weeks ago
0 years
0 Lacs
Andhra Pradesh
On-site
Design and execute test plans for ETL processes, ensuring data accuracy, completeness, and integrity.
Develop automated test scripts using Python or R for data validation and reconciliation (a minimal sketch follows this listing).
Perform source-to-target data verification, transformation logic testing, and regression testing.
Collaborate with data engineers and analysts to understand business requirements and data flows.
Identify data anomalies and work with development teams to resolve issues.
Maintain test documentation, including test cases, test results, and defect logs.
Participate in performance testing and optimization of data pipelines.
Required Skills & Qualifications:
Strong experience in ETL testing across various data sources and targets.
Proficiency in Python or R for scripting and automation.
Solid understanding of SQL and relational databases.
Familiarity with data warehousing concepts and tools (e.g., Power BI, QlikView, Informatica, Talend, SSIS).
Experience with test management tools (e.g., JIRA, TestRail).
Knowledge of data profiling, data quality frameworks, and validation techniques.
Excellent analytical and communication skills.
About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects and opportunities, and work with state-of-the-art technologies throughout your career with us.
Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence.
Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by applicable law. All employment is decided on the basis of qualifications, merit, and business need.
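By way of a hedged sketch of the validation and reconciliation work described above (the DataFrames stand in for real source and target extracts, and the column names are hypothetical), a minimal Python example of source-to-target checks:

import pandas as pd

# Hypothetical source and target extracts; in practice these would come
# from SQL queries against the source system and the warehouse.
source = pd.DataFrame({"id": [1, 2, 3], "amount": [10.0, 20.0, 30.5]})
target = pd.DataFrame({"id": [1, 2, 3], "amount": [10.0, 20.0, 30.5]})

# Row-count reconciliation.
assert len(source) == len(target), "row counts differ"

# Column-level checksum: compare aggregates rather than row by row.
assert source["amount"].sum() == target["amount"].sum(), "amount totals differ"

# Full source-to-target diff on the business key.
merged = source.merge(target, on="id", suffixes=("_src", "_tgt"))
mismatches = merged[merged["amount_src"] != merged["amount_tgt"]]
print(f"{len(mismatches)} mismatching rows")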
Posted 3 weeks ago
8.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description
Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.
Outcomes
Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns, and reusing proven solutions.
Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
Document and communicate milestones/stages for end-to-end delivery.
Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
Validate results with user representatives, integrating the overall solution seamlessly.
Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
Influence and improve customer satisfaction through effective data solutions.
Measures of Outcomes
Adherence to engineering processes and standards
Adherence to schedule/timelines
Adherence to SLAs where applicable
Number of defects post delivery
Number of non-compliance issues
Reduction in recurrence of known defects
Quick turnaround of production bugs
Completion of applicable technical/domain certifications
Completion of all mandatory training requirements
Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
Average time to detect, respond to, and resolve pipeline failures or data issues
Number of data security incidents or compliance breaches
Outputs Expected
Code Development:
Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
Documentation:
Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
Configuration:
Define and govern the configuration management plan. Ensure compliance within the team.
Testing:
Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
Domain Relevance:
Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
Project Management:
Manage the delivery of modules effectively.
Defect Management:
Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation:
Create and provide input for effort and size estimation for projects.
Knowledge Management:
Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release Management:
Execute and monitor the release process to ensure smooth transitions.
Design Contribution:
Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
Customer Interface:
Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
Team Management:
Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
Certifications:
Obtain relevant domain and technology certifications to stay competitive and informed.
Skill Examples
Proficiency in SQL, Python, or other programming languages used for data manipulation.
Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
Ability to conduct tests on data pipelines and evaluate results against data quality and performance specifications.
Experience in performance tuning of data processes.
Expertise in designing and optimizing data warehouses for cost efficiency.
Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
Capacity to clearly explain and communicate design and development aspects to customers.
Ability to estimate time and resource requirements for developing and debugging features or components.
Knowledge Examples
Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, and Azure ADF and ADLF.
Proficiency in SQL for analytics, including windowing functions.
Understanding of data schemas and models relevant to various business contexts.
Familiarity with domain-related data and its implications.
Expertise in data warehousing optimization techniques.
Knowledge of data security concepts and best practices.
Familiarity with design patterns and frameworks in data engineering.
Additional Comments
UST is seeking a highly skilled and motivated Lead Data Engineer to join our Telecommunications vertical, leading impactful data engineering initiatives for US-based Telco clients. The ideal candidate will have 6–8 years of experience in designing and developing scalable data pipelines using Snowflake, Azure Data Factory, and Azure Databricks (a minimal sketch follows this listing). Proficiency in Python, PySpark, and advanced SQL is essential, with a strong focus on query optimization, performance tuning, and cost-effective architecture. A solid understanding of data integration, real-time and batch processing, and metadata management is required, along with experience in building robust ETL/ELT workflows. Candidates should demonstrate a strong commitment to data quality, validation, and consistency; working knowledge of data governance, RBAC, encryption, and compliance frameworks is considered a plus. Familiarity with Power BI or similar BI tools is also advantageous, enabling effective data visualization and storytelling.
The role demands the ability to work in a dynamic, fast-paced environment, collaborating closely with stakeholders and cross-functional teams, while also being capable of working independently. Strong communication skills and the ability to coordinate across multiple teams and stakeholders are critical for success. In addition to technical expertise, the candidate should bring experience in solution design and architecture planning, contributing to scalable and future-ready data platforms. A proactive mindset, eagerness to learn, and adaptability to the rapidly evolving data engineering landscape, including AI integration into data workflows, are highly valued. This is a leadership role that involves mentoring junior engineers, fostering innovation, and driving continuous improvement in data engineering practices.
Skills: Azure Databricks, Snowflake, Python, Data Engineering
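As a hedged illustration of the kind of pipeline this role describes (paths, column names, and the transformation are invented, and the Delta write assumes a Databricks-style runtime where the Delta Lake format is available), a minimal PySpark sketch:

from pyspark.sql import SparkSession, functions as F

# Hypothetical batch pipeline: ingest raw CSV, apply a transformation,
# and write the result as Delta.
spark = SparkSession.builder.appName("telco_usage_pipeline").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("/landing/usage/"))            # hypothetical landing path

curated = (raw
           .withColumn("usage_mb", F.col("usage_kb").cast("double") / 1024)
           .filter(F.col("usage_mb") > 0)                      # drop empty events
           .dropDuplicates(["subscriber_id", "event_date"]))   # dedupe on business key

(curated.write
 .format("delta")
 .mode("overwrite")
 .save("/curated/usage/"))                 # hypothetical curated path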
Posted 3 weeks ago
5.0 - 10.0 years
16 - 20 Lacs
Ahmedabad, Chennai, Bengaluru
Work from Office
Role & responsibilities: Talend, Snowflake
Posted 3 weeks ago
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company: Fives India Engineering & Projects Pvt. Ltd.
Job Title: Data Analyst/Senior Data Analyst (BI Developer)
Job Location: Chennai, Tamil Nadu, India
Job Department: IT
Educational Qualification: BE/B.Tech/MCA from a reputed institute in Computer Science or a related field
Work Experience: 4–8 years
Job Description
Fives is a global industrial engineering group based in Paris, France, that designs and supplies machines, process equipment, and production lines for the world's largest industrial sectors, including aerospace, automotive, steel, aluminium, glass, cement, logistics, and energy. Headquartered in Paris, Fives operates in about 25 countries with more than 9,000 employees.
Fives is seeking a Data Analyst/Senior Data Analyst for its office in Chennai, India. The position is an integral part of the Group IT development team working on custom software solutions for Group IT requirements. We are looking for an analyst specialized in BI development.
Required Skills
Applicants should have skills/experience in the following areas:
4–8 years of experience in Power BI development
Good understanding of data visualization concepts
Proficiency in writing DAX expressions and Power Query
Knowledge of SQL and database-related technologies
Source control, such as Git
Proficiency in building REST APIs to interact with data sources
Familiarity with ETL/ELT concepts and tools such as Talend is a plus
Good knowledge of programming, algorithms, and data structures
Ability to use Agile collaboration tools such as Jira
Good communication skills, both verbal and written
Willingness to learn new technologies and tools
Position Type: Full-Time/Regular
Posted 3 weeks ago
9.0 years
0 Lacs
India
Remote
Role: Data Management Lead
Work Mode: Remote
Hire Type: Contract
Experience & Skills:
7–9 years of experience in data management, data governance, or enterprise data architecture.
Strong understanding of data governance frameworks and industry best practices.
Hands-on experience with data management platforms such as Atlan, Collibra, Informatica, Talend, or similar tools.
Proficient in metadata management, data lineage, and master data integration.
Familiarity with cloud data platforms (e.g., AWS, Azure, GCP) and modern data architectures (e.g., data mesh, data fabric).
Strong working knowledge of SQL and Power BI for data querying and reporting.
Excellent problem-solving, analytical, and communication skills.
Proven ability to lead cross-functional teams and manage stakeholder expectations.
Education: Bachelor's or Master's degree in Computer Science, Information Systems, Data Management, or a related field.
Preferred Qualifications:
Certifications such as CDMP, DGSP, or equivalent in data governance or data management.
Experience in regulated industries (e.g., finance, healthcare, pharmaceuticals).
Multilingual communication skills — English required; Japanese or other languages are a plus.
If interested, share your resume at sadiya.mankar@leanitcorp.com
Posted 3 weeks ago
2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Greetings from Infosys BPM Ltd.
We are hiring for Test Automation using Java and Selenium, with knowledge of testing processes, SQL, ETL DB Testing, and ETL Testing Automation. Please walk in for an interview on 14th & 15th July 2025 at our Chennai location.
Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application, and mention your Candidate ID at the top of your resume.
https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-217871
Interview details
Interview Date: 14th & 15th July 2025
Interview Time: 10 AM till 1 PM
Interview Venue: TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, Tamil Nadu
Please find the job descriptions below for your reference:
Work from office. Rotational shifts. A minimum of 2 years of project experience is mandatory.
Job Description: Test Automation using Java and Selenium
Test automation using Java and Selenium, with knowledge of testing processes and SQL.
Java, Selenium automation, SQL, testing concepts, Agile.
Tools: Jira, ALM, IntelliJ.
Functional testing: UI test automation using Selenium and Java.
Financial domain experience.
Job Description: ETL DB Testing
Strong experience in ETL testing, data warehousing, and business intelligence.
Strong proficiency in SQL.
Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory).
Solid understanding of data warehousing concepts, database systems, and quality assurance.
Experience with test planning, test case development, and test execution.
Experience writing complex SQL queries and using SQL tools is a must, with exposure to various data analytical functions.
Familiarity with defect tracking tools (e.g., Jira).
Experience with cloud platforms like AWS, Azure, or GCP is a plus.
Experience with Python or other scripting languages for test automation is a plus.
Experience with data quality tools is a plus.
Experience in testing large datasets.
Experience in agile development is a must.
Understanding of Oracle Database and UNIX/VMC systems is a must.
Job Description: ETL Testing Automation
Strong experience in ETL testing and automation (a minimal sketch of such a check follows this listing).
Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server).
Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark).
Hands-on experience in developing and maintaining test automation frameworks.
Proficiency in at least one programming language (e.g., Python, Java).
Experience with test automation tools (e.g., Selenium, PyTest, JUnit).
Strong understanding of data warehousing concepts and methodologies.
Experience with CI/CD pipelines and version control systems (e.g., Git).
Experience with cloud-based data warehouses like Snowflake, Redshift, or BigQuery is a plus.
Experience with data quality tools is a plus.
REGISTRATION PROCESS:
The Candidate ID & SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to successfully complete the registration. (Candidates without registration & assessment will not be allowed to interview.)
Candidate ID registration process:
STEP 1: Visit https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared with your registered email ID.
SHL Test (AMCAT ID) registration process:
This assessment is proctored; candidates are evaluated on basic analytics, English comprehension, and writex (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, please make a note of the AMCAT ID (access your AMCAT ID by clicking the 3 dots at the top right corner of the screen).
NOTE: During registration, you'll be asked to provide the following information:
Personal details: name, email address, mobile number, PAN number.
Availability: acknowledgement of work schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
Employment details: current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
Candidate information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/Foundit, or direct), and location.
Interview mode: walk-in.
Attempt all questions in the SHL assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more window toggles, multiple faces detected, face not detected, or any malpractice will result in rejection. Once you have finished, submit the assessment and make a note of the 15-digit AMCAT ID used for the assessment.
Documents to carry:
Please have a note of your Candidate ID & AMCAT ID along with your registered email ID.
Please carry 2 sets of your updated resume/CV (hard copy).
Please carry original ID proof for security clearance.
Please carry individual headphones/Bluetooth for the interview.
Pointers to note:
Please do not carry laptops/cameras to the venue, as these will not be allowed due to security restrictions.
An original government ID card is a must for security clearance.
Regards,
Infosys BPM Recruitment team
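For the ETL Testing Automation profile above, a hedged sketch of what a PyTest-based source-to-target check can look like (SQLite stands in for the real source and target databases, and the table and column names are invented):

import sqlite3
import pytest

# Hypothetical stand-ins for the real source and target connections;
# in practice these would point at Oracle, Snowflake, etc.
@pytest.fixture
def conns():
    src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for c in (src, tgt):
        c.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
        c.executemany("INSERT INTO orders VALUES (?, ?)",
                      [(1, 10.0), (2, 20.0)])
    return src, tgt

def test_row_counts_match(conns):
    # Basic completeness check: no rows lost or duplicated in load.
    src, tgt = conns
    n_src = src.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    n_tgt = tgt.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    assert n_src == n_tgt

def test_amount_totals_match(conns):
    # Aggregate reconciliation: column totals agree within float tolerance.
    src, tgt = conns
    s = src.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
    t = tgt.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
    assert s == pytest.approx(t)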
Posted 3 weeks ago