8.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Description

You are a strategic thinker passionate about driving solutions in Data Governance. You have found the right team.

As a Data Governance Associate in our Finance team, you will spend each day defining, refining, and delivering set goals for our firm. In your role as a Senior Associate in the CAO – Data Governance Team, you will execute data quality initiatives and contribute to data governance practices, including data lineage, contracts, and classification. Under the guidance of the VP, you will ensure data integrity and compliance, utilizing cloud platforms, data analytics tools, and SQL expertise. You will be part of a team that provides resources and support to manage data risks globally, lead strategic data projects, and promote data ownership within JPMC’s Chief Administrative Office.

Job Responsibilities
- Collaborate with leadership and stakeholders to support the CAO Data Governance program by facilitating communication and ensuring alignment with organizational goals.
- Implement and maintain a data quality operating model, including standards, rules, and processes, to ensure prioritized data is fit for purpose and meets business needs.
- Manage the data quality issue management lifecycle, coordinating between the CDO, application owners, data owners, information owners, and other stakeholders to ensure timely resolution and continuous improvement.
- Align with evolving firmwide CDAO Data Quality policies, standards, and best practices, incorporating requirements into the CAO CDAO data governance framework to ensure compliance and consistency.
- Implement data governance frameworks on CAO Data Lake structures to enhance data accessibility, usability, and integrity across the organization.

Required Qualifications, Capabilities, and Skills
- 8+ years of experience in data quality management or data governance within financial services.
- Experience with data management tools such as Talend, Alteryx, Soda, and Collibra.
- Experience with visualization tools like Tableau and Qlik Sense.
- Experience with Agile/Scrum methodologies and tools (Confluence, Jira).
- Familiarity with Microsoft desktop productivity tools (Excel, PowerPoint, Visio, Word, SharePoint, Teams).

Preferred Qualifications, Capabilities, and Skills
- Lean/Six Sigma experience is a plus.
- Proficiency in cloud platforms like GCP and AWS, with data lake implementation experience.
- Experience with Databricks or similar platforms for data processing and analytics.

ABOUT US

JPMorganChase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management.

We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law.
We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

About The Team

Our professionals in our Corporate Functions cover a diverse range of areas from finance and risk to human resources and marketing. Our corporate teams are an essential part of our company, ensuring that we’re setting our businesses, clients, customers and employees up for success.
Posted 2 weeks ago
15.0 - 20.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must have skills: Tagetik Planning Budgeting and Forecasting
Good to have skills: NA
Minimum experience required: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing solutions that align with business objectives, and ensuring that applications are optimized for performance and usability. You will also engage in problem-solving activities, providing support and enhancements to existing applications while staying updated on industry trends and best practices.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must have skills: Proficiency in Tagetik Planning Budgeting and Forecasting.
- Strong analytical skills to interpret complex data and provide actionable insights.
- Experience in application development methodologies and best practices.
- Familiarity with integration techniques and tools to connect various applications.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Tagetik Planning Budgeting and Forecasting.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 2 weeks ago
13.0 - 23.0 years
50 - 55 Lacs
Hyderabad
Work from Office
Role: Snowflake Practice Lead / Architect / Solution Architect
Experience: 13+ Years
Work Location: Hyderabad

Position Overview:
We are seeking a highly skilled and experienced Snowflake Practice Lead to drive our data strategy, architecture, and implementation using Snowflake. This leadership role requires a deep understanding of Snowflake's cloud data platform, data engineering best practices, and enterprise data management. The ideal candidate will be responsible for defining best practices, leading a team of Snowflake professionals, and driving successful Snowflake implementations for clients.

Key Responsibilities:

Leadership & Strategy:
- Define and drive the Snowflake practice strategy, roadmap, and best practices.
- Act as the primary subject matter expert (SME) for Snowflake architecture, implementation, and optimization.
- Collaborate with stakeholders to understand business needs and align data strategies accordingly.

Technical Expertise & Solutioning:
- Design and implement scalable, high-performance data architectures using Snowflake.
- Develop best practices for data ingestion, transformation, modeling, and security within Snowflake.
- Guide clients on Snowflake migrations, ensuring a seamless transition from legacy systems.
- Optimize query performance, storage utilization, and cost efficiency in Snowflake environments.

Team Leadership & Mentorship:
- Lead and mentor a team of Snowflake developers, data engineers, and architects.
- Provide technical guidance, conduct code reviews, and establish best practices for Snowflake development.
- Train internal teams and clients on Snowflake capabilities, features, and emerging trends.

Client & Project Management:
- Engage with clients to understand business needs and design tailored Snowflake solutions.
- Lead end-to-end Snowflake implementation projects, ensuring quality and timely delivery.
- Work closely with data scientists, analysts, and business stakeholders to maximize data utilization.

Required Skills & Experience:
- 10+ years of experience in data engineering, data architecture, or cloud data platforms.
- 5+ years of hands-on experience with Snowflake in large-scale enterprise environments.
- Strong expertise in SQL, performance tuning, and cloud-based data solutions.
- Experience with ETL/ELT processes, data pipelines, and data integration tools (e.g., Talend, Matillion, dbt, Informatica).
- Proficiency in cloud platforms such as AWS, Azure, or GCP, particularly their integration with Snowflake.
- Knowledge of data security, governance, and compliance best practices.
- Strong leadership, communication, and client-facing skills.
- Experience in migrating from traditional data warehouses (Oracle, Teradata, SQL Server) to Snowflake.
- Familiarity with Python, Spark, or other big data technologies is a plus.

Preferred Qualifications:
- Snowflake SnowPro Certification (e.g., SnowPro Core, Advanced Architect, Data Engineer).
- Experience in building data lakes, data marts, and real-time analytics solutions.
- Hands-on experience with DevOps, CI/CD pipelines, and Infrastructure as Code (IaC) in Snowflake environments.

Why Join Us?
- Opportunity to lead cutting-edge Snowflake implementations in a dynamic, fast-growing environment.
- Work with top-tier clients across industries, solving complex data challenges.
- Continuous learning and growth opportunities in cloud data technologies.
- Competitive compensation, benefits, and a collaborative work culture.
Posted 2 weeks ago
15.0 - 20.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: Talend ETL
Good to have skills: NA
Minimum experience required: 5 year(s)
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.

Professional & Technical Skills:
- Must have skills: Proficiency in Talend ETL.
- Strong understanding of data integration processes and methodologies.
- Experience with data warehousing concepts and practices.
- Familiarity with SQL and database management systems.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Talend ETL.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Posted 2 weeks ago
8.0 - 12.0 years
7 - 11 Lacs
Pune
Work from Office
- Experience with ETL processes and data warehousing
- Proficient in SQL and Python/Java/Scala
- Team lead experience
Posted 2 weeks ago
2.0 - 4.0 years
4 - 8 Lacs
Pune
Work from Office
- Experience with ETL processes and data warehousing
- Proficient in SQL
Posted 2 weeks ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Senior Software Engineer - Snowflake

As a Senior Software Engineer – Snowflake at Convera, with experience in Snowflake, SQL, and AWS, you will be responsible for installing, configuring, and maintaining Snowflake environments across development, testing, and production.

Responsibilities

Snowflake Administration & Maintenance:
- Install, configure, and maintain Snowflake environments across development, testing, and production.
- Manage roles, users, access controls, and permissions to enforce security best practices.
- Monitor and optimize compute resources, storage usage, and query performance.
- Set up and manage virtual warehouses to balance cost and efficiency.

Data Management & Integration:
- Design and optimize schemas, tables, views, and materialized views for performance.
- Implement data ingestion pipelines using Snowpipe, COPY commands, and external tables.
- Integrate Snowflake with ETL/ELT tools (Informatica, DBT, Airflow, Fivetran, etc.).
- Work with cloud storage (AWS S3, Azure Blob, Google Cloud Storage) for data integration.

Performance Tuning & Optimization:
- Monitor and tune query performance using Query Profile and warehouse best practices.
- Optimize clustering, caching, and auto-scaling for cost efficiency.
- Implement data partitioning, pruning, and compression for better performance.

Security & Compliance:
- Ensure compliance with data governance policies and industry regulations (GDPR, HIPAA, etc.).
- Implement role-based access control (RBAC) and multi-factor authentication (MFA).
- Configure data masking, row-level security, and encryption for sensitive data.
- Set up auditing and logging to track user activities and data access.

Backup, Recovery, and Disaster Planning:
- Manage Time Travel and Fail-safe features for data recovery.
- Implement backup and retention policies to meet business continuity requirements.
- Develop strategies for disaster recovery and high availability.

Automation & Scripting:
- Automate administrative tasks using Python, SQL, PowerShell, or Bash.
- Develop and manage CI/CD pipelines for Snowflake using DevOps tools.
- Implement Infrastructure as Code (IaC) using Terraform or CloudFormation.

Support & Documentation:
- Provide technical support and troubleshoot Snowflake-related issues.
- Collaborate with data engineers, analysts, and business users to optimize workflows.
- Document best practices, guidelines, and operational procedures.

You Should Apply If You Have
- Strong experience with Snowflake administration, architecture, and security.
- Proficiency in SQL and query optimization.
- Knowledge of cloud platforms (AWS, Azure, GCP).
- Familiarity with ETL/ELT tools (dbt, Talend, Informatica, Airflow, etc.).
- Experience with BI tools (Tableau, Power BI, Looker) is a plus.
- Scripting skills in Python, PowerShell, or Shell scripting.
- Understanding of data modeling and warehousing concepts (Star Schema, Snowflake Schema).
- Strong problem-solving and troubleshooting skills.
- Excellent communication and collaboration with cross-functional teams.
- Ability to work independently and in an Agile environment.

Preferred Qualifications:
- Bachelor’s degree in Computer Science, Data Science, Information Systems, or a related field.
- Snowflake SnowPro Core or Advanced Certification (preferred).
- Experience with Kubernetes, Terraform, or CI/CD tools is a plus.
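To make the security and compliance duties above concrete, here is a minimal, hypothetical sketch of provisioning a read-only role and a column masking policy through the snowflake-connector-python package. The account, role, database, and policy names are illustrative assumptions rather than any real configuration, and masking policies require Snowflake Enterprise Edition or higher.

```python
import snowflake.connector

# Connect with an administrative role; credentials are placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="admin_user",
    password="***",
    role="SECURITYADMIN",
)
cur = conn.cursor()

# Create a read-only reporting role and grant it access to one schema.
cur.execute("CREATE ROLE IF NOT EXISTS REPORTING_RO")
cur.execute("GRANT USAGE ON DATABASE ANALYTICS TO ROLE REPORTING_RO")
cur.execute("GRANT USAGE ON SCHEMA ANALYTICS.MART TO ROLE REPORTING_RO")
cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MART TO ROLE REPORTING_RO")

# Mask email addresses for everyone except a privileged PII role.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS ANALYTICS.MART.EMAIL_MASK
    AS (val STRING) RETURNS STRING ->
    CASE WHEN CURRENT_ROLE() = 'PII_READER' THEN val ELSE '***MASKED***' END
""")
cur.execute("""
    ALTER TABLE ANALYTICS.MART.CUSTOMERS
    MODIFY COLUMN EMAIL SET MASKING POLICY ANALYTICS.MART.EMAIL_MASK
""")
conn.close()
```

In practice the same statements would be version-controlled and applied through a CI/CD pipeline rather than run ad hoc, in line with the automation responsibilities listed above.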
About Convera

Convera is the largest non-bank B2B cross-border payments company in the world. Formerly Western Union Business Solutions, we leverage decades of industry expertise and technology-led payment solutions to deliver smarter money movements to our customers – helping them capture more value with every transaction. Convera serves more than 30,000 customers, ranging from small business owners to enterprise treasurers to educational institutions to financial institutions to law firms to NGOs.

Our teams care deeply about the value we bring to our customers, which makes Convera a rewarding place to work. This is an exciting time for our organization as we build our team with growth-minded, results-oriented people who are looking to move fast in an innovative environment.

As a truly global company with employees in over 20 countries, we are passionate about diversity; we seek and celebrate people from different backgrounds, lifestyles, and unique points of view. We want to work with the best people and ensure we foster a culture of inclusion and belonging.

We offer an abundance of competitive perks and benefits, including:
- Competitive salary
- Opportunity to earn an annual bonus
- Great career growth and development opportunities in a global organization
- A flexible approach to work

There are plenty of amazing opportunities at Convera for talented, creative problem solvers who never settle for good enough and are looking to transform Business to Business payments. Apply now if you’re ready to unleash your potential.
Posted 2 weeks ago
4.0 - 9.0 years
10 - 20 Lacs
Hyderabad
Work from Office
We are looking for a talented Talend Developer with hands-on experience in Talend Management Console on Cloud and Snowflake to join our growing team. The ideal candidate will play a key role in building and optimizing ETL/ELT data pipelines, integrating complex data systems, and ensuring high performance across cloud environments. While experience with Informatica is a plus, it is not mandatory for this role.

As a Talend Developer, you will be responsible for designing, developing, and maintaining data integration solutions to meet the organization's growing data needs. You will collaborate with business stakeholders, data architects, and other data professionals to ensure the seamless and secure movement of data across platforms, ensuring scalability and performance.

Key Responsibilities:
- Develop and maintain ETL/ELT data pipelines using Talend Management Console on Cloud to integrate data from various on-premises and cloud-based sources.
- Design, implement, and optimize data flows for data ingestion, processing, and transformation in Snowflake to support analytical and reporting needs.
- Utilize Talend Management Console on Cloud to manage, deploy, and monitor data integration jobs, ensuring robust pipeline management and process automation.
- Collaborate with data architects to ensure that data integration solutions align with business requirements and follow best practices.
- Ensure data quality, performance, and scalability of Talend-based data solutions.
- Troubleshoot, debug, and optimize existing ETL processes to ensure smooth and efficient data integration.
- Document data integration processes, including design specifications, mappings, workflows, and performance optimizations.
- Collaborate with the Snowflake team to implement best practices for data warehousing and data transformation.
- Implement error-handling and data validation processes to ensure high levels of accuracy and data integrity.
- Provide ongoing support for Talend jobs, including post-deployment monitoring, troubleshooting, and optimization.
- Participate in code reviews and collaborate in an agile development environment.

Required Qualifications:
- 2+ years of experience in Talend development, with a focus on using the Talend Management Console on Cloud for managing and deploying jobs.
- Strong hands-on experience with the Snowflake data warehouse, including data integration and transformation.
- Expertise in developing ETL/ELT workflows for data ingestion, processing, and transformation.
- Experience with SQL and working with relational databases to extract and manipulate data.
- Experience working in cloud environments (e.g., AWS, Azure, or GCP) with integration of cloud-based data platforms.
- Strong knowledge of data integration, data quality, and performance optimization in Talend.
- Ability to troubleshoot and resolve issues in data integration jobs and processes.
- Solid understanding of data modeling concepts and best practices for building scalable data pipelines.

Preferred Qualifications:
- Experience with Informatica is a plus but not mandatory.
- Experience with scripting languages such as Python or Shell scripting for automation.
- Familiarity with CI/CD pipelines and working in DevOps environments for continuous integration of Talend jobs.
- Knowledge of data governance and data security practices in cloud environments.
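As a hedged illustration of the Snowflake side of such an ingestion pipeline (a Talend job would orchestrate equivalent steps), this sketch loads staged files with COPY INTO and inspects the per-file load results. The stage, table, and connection details are assumed placeholders.

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="RAW", schema="LANDING",
)
cur = conn.cursor()

# Load all new files from an external S3 stage; ON_ERROR = 'CONTINUE'
# keeps good rows and records bad ones so the job can surface
# data-quality issues instead of failing the whole batch.
cur.execute("""
    COPY INTO RAW.LANDING.ORDERS
    FROM @RAW.LANDING.S3_ORDERS_STAGE
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    ON_ERROR = 'CONTINUE'
""")

# COPY INTO returns one row per file: file, status, rows parsed/loaded, ...
for file_name, status, rows_parsed, rows_loaded, *_ in cur.fetchall():
    print(f"{file_name}: {status}, loaded {rows_loaded}/{rows_parsed} rows")
conn.close()
```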
Posted 2 weeks ago
7.0 - 10.0 years
15 - 25 Lacs
Hyderabad
Work from Office
Role & Responsibilities:
- Lead the offshore team.
- Design, develop, and maintain data integration and processing solutions using Talend and Snowflake.
- Build robust ETL pipelines, transform data, and manage cloud-based data warehouses to support business reporting, analytics, and operational needs.
- Design and develop APIs for real-time data integration and ensure seamless system connectivity.
- Optimize ETL jobs and API performance to meet data integration and throughput requirements.
- Automate data integration tasks where possible to improve efficiency and reduce manual errors.
- Create detailed documentation for all data integration processes, including ETL jobs and API endpoints.
- Develop and execute a data integration strategy that aligns with the objectives and adheres to the bank's data governance policies.
Posted 2 weeks ago
4.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
This role is for one of Weekday's clients.
Minimum Experience: 4 years
Location: Bengaluru
Job Type: Full-time

Key Responsibilities
As a Data Engineer, you will play a crucial role in designing and maintaining scalable and high-performance data systems. Your responsibilities will include:

Data Pipeline Development and Management
- Design, build, test, and maintain efficient data pipelines and data management systems.
- Develop and manage ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes to integrate data from diverse sources such as databases, APIs, and real-time streams.

Data Modeling and Architecture
- Design data models and implement schemas for data warehouses and data lakes to support analytics and business operations.
- Optimize data storage, access, and performance for scalability and maintainability.

Data Quality and Integrity
- Implement validation, cleansing, and monitoring to maintain data accuracy, consistency, and reliability.
- Define and enforce best practices and standards for data governance and quality.

Infrastructure Management
- Manage and monitor key data infrastructure components including databases, data lakes, and distributed computing environments.
- Apply data security protocols and ensure proper access controls are in place.

Automation and Optimization
- Automate data workflows and pipelines to improve reliability and performance (see the orchestration sketch after this posting).
- Continuously monitor and fine-tune systems for operational efficiency.

Collaboration and Support
- Partner with data scientists, analysts, software engineers, and business stakeholders to gather requirements and provide scalable data solutions.
- Document processes, workflows, and system designs; support cross-functional teams with technical guidance.

Technology Evaluation
- Stay current with emerging tools and technologies in the data engineering space.
- Evaluate and recommend new solutions to enhance data capabilities and performance.

Education and Experience
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, Data Science, or a related field.
- 5 to 7 years of experience in data engineering, software development, or a similar domain.

Required Skills & Qualifications

Technical Proficiency
- Programming: Strong experience in Python and SQL.
- Databases: Proficient in relational (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra) databases.
- Data Warehousing & Lakes: Hands-on experience with platforms like Snowflake, Redshift, BigQuery.
- ETL/ELT Tools: Proficiency with tools like Apache Airflow, AWS Glue, Azure Data Factory, Talend.
- Big Data: Working knowledge of Apache Spark or similar big data technologies.
- Cloud Platforms: Experience with AWS, Azure, or GCP for data engineering workflows.
- Data Modeling: Strong understanding of modeling techniques and best practices.
- API Integration: Ability to build and consume APIs for data integration.
- Version Control: Experience with Git or other version control systems.

Soft Skills
- Analytical mindset with a strong problem-solving approach.
- Excellent communication skills for both technical and non-technical audiences.
- Team player with a collaborative work ethic.
- Detail-oriented with a commitment to data quality.
- Adaptability to new technologies and changing project requirements.

Key Skills: ETL, Data Modeling, Data Architecture, Cloud Data Platforms, Python, SQL, Big Data, Data Warehousing, API Integration
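For illustration, here is a minimal Apache Airflow DAG (assuming Airflow 2.x) sketching the extract-transform-load pattern this posting describes. The task logic, schedule, and identifiers are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Pull raw records from a source API or database (placeholder data).
    return [{"id": 1, "amount": "42.5"}]


def transform(ti, **context):
    rows = ti.xcom_pull(task_ids="extract")
    # Cast types before loading; real code would also validate and drop bad rows.
    return [{**r, "amount": float(r["amount"])} for r in rows]


def load(ti, **context):
    rows = ti.xcom_pull(task_ids="transform")
    print(f"would load {len(rows)} rows into the warehouse")


with DAG(
    dag_id="orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)
    extract_t >> transform_t >> load_t  # linear dependency chain
```

XCom is used here only because the sample payload is tiny; for real volumes the tasks would pass references (table names, S3 paths) rather than data.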
Posted 2 weeks ago
9.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description

Role: Lead Scheduler Administrator
Location: Offshore / India

Who are we looking for?
We are looking for a candidate with 9+ years of administrator experience in automation job scheduler technologies such as Oracle, Autosys, Airflow, Talend, Cognos, Crontab, and CA7. Administrators play a critical role in managing and maintaining scheduling jobs on different platforms. You will provide 24/7 production support, troubleshoot issues, monitor system health, optimize performance, and collaborate with cross-functional teams to maintain a reliable platform.

Technical Skills:
- Strong administrator knowledge of Oracle and Autosys Scheduler, Crontab, CloudWatch, and CA7.
- Strong administrator knowledge of Airflow, Talend, and Cognos batch job monitoring.
- Performance Optimization: Monitor and validate job performance and system resource utilization (CPU, memory, storage), and troubleshoot bottlenecks.
- Problem Resolution: Diagnose and resolve issues.
- Software Updates & Patches: Apply software updates, patches, and service packs to ensure the environment is current and secure.
- Collaboration: Work closely with batch job administrators, DataStage developers, data engineers, database administrators, and other teams to ensure proper job monitoring.
- Security & Compliance: Enhance security practices, implement best practices for batch monitoring, and ensure compliance with enterprise technology standards and regulatory expectations.
- Documentation: Maintain clear technical documentation, including runbooks, disaster recovery plans, job run reports, and batch monitoring procedures.
- Monitoring & Reporting: Monitor job executions, generate reports on job runs, and create tickets for any issues.
- Troubleshooting: Identify and resolve issues affecting job execution and performance.
- Communication: Excellent communication skills, both written and verbal, for interacting with users, other administrators, and potentially vendors.
- Ability to work in a 24/7 support rotation and handle urgent production issues.

Responsibilities:
- Understand batch processing principles and system dependencies.
- Plan daily, weekly, monthly, and special batch cycles.
- Develop and test batch schedules for both testing and production environments.
- Set up new schedules or modify existing job schedules.
- Define calendars as required by business needs.
- Oversee automated batch jobs and processes, ensuring they start, run, and complete as planned.
- Monitor executing, waiting, failed, and long-running jobs (a simple monitoring sketch follows this list).
- Identify failures and address job abends (abnormal ends) or other issues.
- Escalate critical incidents to appropriate support teams.
- Participate in major incident triage and resolution.
- Maintain detailed records of batch execution, performance data, and troubleshooting steps.
- Update related documentation and schedules.
- Generate operational reports as required by business needs.
- Document resolutions to ensure continuous improvement.
- Experience in diagnosing, isolating, and debugging scheduling problems; ability to log and analyze errors.
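As a small illustration of the monitoring duties above, here is a hedged Python sketch that flags jobs running past a threshold. In practice the job inventory would come from the scheduler's own API or logs (Autosys, Airflow, cron); the job names and times here are invented.

```python
from datetime import datetime, timedelta

# Placeholder inventory: job name -> start time. A real implementation
# would query the scheduler (e.g. autorep output, Airflow REST API).
RUNNING_JOBS = {
    "daily_gl_extract": datetime(2024, 1, 1, 2, 0),
    "intraday_feed": datetime(2024, 1, 1, 6, 30),
}
THRESHOLD = timedelta(hours=2)


def long_running(now: datetime) -> list[str]:
    """Return jobs exceeding the runtime threshold, for escalation."""
    return [
        name
        for name, started in RUNNING_JOBS.items()
        if now - started > THRESHOLD
    ]


if __name__ == "__main__":
    # Could itself be scheduled, e.g. via cron: */15 * * * * python monitor.py
    overdue = long_running(datetime(2024, 1, 1, 8, 0))
    for job in overdue:
        print(f"ALERT: {job} exceeded {THRESHOLD}; escalate per runbook")
```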
Posted 2 weeks ago
2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In testing and quality assurance at PwC, you will focus on the process of evaluating a system or software application to identify any defects, errors, or gaps in its functionality. Working in this area, you will execute various test cases and scenarios to validate that the system meets the specified requirements and performs as expected.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Apply a learning mindset and take ownership for your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.

Job Summary
A career in our Managed Services team will provide you with an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Data, Testing & Analytics as a Service team brings a unique combination of industry expertise, technology, data management and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while building your skills in exciting new directions. Have a voice at our table to help design, build and operate the next generation of software and services that manage interactions across all aspects of the value chain.

Minimum Degree Required: Bachelor's degree
Preferred Field(s) of Study: Computer and Information Science, Management Information Systems
Minimum Year(s) of Experience: Minimum of 2 years of experience

Required Knowledge/Skills
As an ETL Tester, you will be responsible for designing, developing, and executing SQL scripts to ensure the quality and functionality of our ETL processes.
You will work closely with our development and data engineering teams to identify test requirements and drive the implementation of automated testing solutions (a sample automated check follows this posting).

Key Responsibilities
- Collaborate with data engineers to understand ETL workflows and requirements.
- Perform data validation and testing to ensure data accuracy and integrity.
- Create and maintain test plans, test cases, and test data.
- Identify, document, and track defects, and work with development teams to resolve issues.
- Participate in design and code reviews to provide feedback on testability and quality.
- Develop and maintain automated test scripts using Python for ETL processes.
- Ensure compliance with industry standards and best practices in data testing.

Qualifications
- Solid understanding of SQL and database concepts.
- Proven experience in ETL testing and automation.
- Strong proficiency in Python programming.
- Familiarity with ETL tools such as Apache NiFi, Talend, Informatica, or similar.
- Knowledge of data warehousing and data modeling concepts.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Experience with version control systems like Git.

Preferred Qualifications
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with CI/CD pipelines and tools like Jenkins or GitLab.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
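A hedged example of the kind of automated ETL check described above: a pytest module comparing source and target tables. It uses in-memory SQLite as a stand-in for real source and target connections, and the table and column names are illustrative.

```python
import sqlite3

import pytest


@pytest.fixture
def conns():
    # Stand-ins for real source/target connections (e.g. via pyodbc/snowflake).
    src, tgt = sqlite3.connect(":memory:"), sqlite3.connect(":memory:")
    for db in (src, tgt):
        db.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
        db.executemany(
            "INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.5)]
        )
    return src, tgt


def test_row_counts_match(conns):
    src, tgt = conns
    src_n = src.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    tgt_n = tgt.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    assert src_n == tgt_n, f"row count drift: source={src_n} target={tgt_n}"


def test_amount_totals_match(conns):
    # A column-level aggregate check catches silent value corruption
    # that row counts alone would miss.
    src, tgt = conns
    src_sum = src.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
    tgt_sum = tgt.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
    assert src_sum == pytest.approx(tgt_sum)
```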
Posted 2 weeks ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. In testing and quality assurance at PwC, you will focus on the process of evaluating a system or software application to identify any defects, errors, or gaps in its functionality. Working in this area, you will execute various test cases and scenarios to validate that the system meets the specified requirements and performs as expected.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
- Apply a learning mindset and take ownership for your own development.
- Appreciate diverse perspectives, needs, and feelings of others.
- Adopt habits to sustain high performance and develop your potential.
- Actively listen, ask questions to check understanding, and clearly express ideas.
- Seek, reflect, act on, and give feedback.
- Gather information from a range of sources to analyse facts and discern patterns.
- Commit to understanding how the business works and building commercial awareness.
- Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.

Job Summary
A career in our Managed Services team will provide you with an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Analytics and Insights Managed Services team brings a unique combination of industry expertise, technology, data management and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights while building your skills in exciting new directions. Have a voice at our table to help design, build and operate the next generation of software and services that manage interactions across all aspects of the value chain.

Job Description
To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. To help us achieve this we have the PwC Professional, our global leadership development framework. It gives us a single set of expectations across our lines, geographies and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future.

JD for ETL Tester at Associate level
As an ETL Tester, you will be responsible for designing, developing, and executing SQL scripts to ensure the quality and functionality of our ETL processes.
You will work closely with our development and data engineering teams to identify test requirements and drive the implementation of automated testing solutions.

Minimum Degree Required: Bachelor's degree
Degree Preferred: Bachelor's in Computer Engineering
Minimum Years of Experience: 7 year(s) of IT experience
Certifications Required: NA
Certifications Preferred: Automation Specialist for TOSCA, LambdaTest certifications

Required Knowledge/Skills
- Collaborate with data engineers to understand ETL workflows and requirements.
- Perform data validation and testing to ensure data accuracy and integrity.
- Create and maintain test plans, test cases, and test data.
- Identify, document, and track defects, and work with development teams to resolve issues.
- Participate in design and code reviews to provide feedback on testability and quality.
- Develop and maintain automated test scripts using Python for ETL processes.
- Ensure compliance with industry standards and best practices in data testing.

Qualifications
- Solid understanding of SQL and database concepts.
- Proven experience in ETL testing and automation.
- Strong proficiency in Python programming.
- Familiarity with ETL tools such as Apache NiFi, Talend, Informatica, or similar.
- Knowledge of data warehousing and data modeling concepts.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Experience with version control systems like Git.

Preferred Knowledge/Skills
Demonstrates extensive knowledge and/or a proven record of success in the following areas:
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with CI/CD pipelines and tools like Jenkins or GitLab.
- Knowledge of big data technologies such as Hadoop, Spark, or Kafka.
Posted 2 weeks ago
2.0 - 6.0 years
0 Lacs
Pune, Maharashtra
On-site
The Applications Development Intermediate Programmer Analyst position is an intermediate-level role where you will be responsible for contributing to the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. Your main objective will be to assist in applications systems analysis and programming activities. Based in India, you will report directly to the team lead.

Your primary responsibilities will include developing projects based on detailed business requirements, working through solutions, managing execution and rollout of solutions within a consistent global platform, and creating T-SQL queries, stored procedures, functions, and triggers using SQL Server 2014 and 2017 (a brief illustration follows this posting). You will also need an understanding of basic data warehousing concepts, and you will design and develop SSIS packages to pull data from various source systems and load it to target tables. Additionally, you may be required to develop dashboards and reports using SSRS and work on BAU JIRAs and L3 support-related activities.

As an Applications Development Intermediate Programmer Analyst, you will need to provide detailed analysis and documentation of processes and flows, consult with users, clients, and other technology groups on issues, recommend programming solutions, and analyze applications to identify vulnerabilities and security issues. You should be able to operate with a limited level of direct supervision, exercise independence of judgment and autonomy, and appropriately assess risks when making business decisions.

To qualify for this role, you should have 4-8 years of overall IT experience, with at least 2 years in the financial services industry. A strong understanding of Microsoft SQL Server, SSIS, SSRS, and Autosys is required. Experience in any ETL tool, preferably SSIS, and some knowledge of Python can be beneficial. Other desired qualifications include being highly motivated, strong analytical and problem-solving skills, good knowledge of database fundamentals, experience with reporting and job scheduling tools, and familiarity with the finance industry and the Software Development Life Cycle.

The education requirement for this position is a Bachelor's degree or equivalent experience. Please note that this job description provides a high-level overview of the work performed, and other job-related duties may be assigned as required.
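For illustration, here is a minimal sketch of the T-SQL stored-procedure work this role mentions, driven from Python via pyodbc. The server, database, schema, and procedure names are hypothetical placeholders.

```python
import pyodbc

# Connection string is a placeholder; real deployments use managed credentials.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
    "DATABASE=Finance;Trusted_Connection=yes"
)
cur = conn.cursor()

# A simple parameterized procedure aggregating daily positions.
cur.execute("""
CREATE PROCEDURE dbo.usp_DailyPositionTotals
    @AsOfDate DATE
AS
BEGIN
    SET NOCOUNT ON;
    SELECT AccountId, SUM(MarketValue) AS TotalValue
    FROM dbo.Positions
    WHERE AsOfDate = @AsOfDate
    GROUP BY AccountId;
END
""")
conn.commit()

# Invoke it with a bound parameter rather than string concatenation.
cur.execute("EXEC dbo.usp_DailyPositionTotals @AsOfDate = ?", "2024-01-31")
for account_id, total in cur.fetchall():
    print(account_id, total)
conn.close()
```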
Posted 2 weeks ago
3.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
You are invited to join our team as a Lead / Senior ETL & Data Migration QA Engineer in Hyderabad, India. In this role, you will play a crucial part in a significant data migration project, bringing your expertise in ETL testing, data validation, and cloud migration. Your primary responsibilities will include designing and implementing test strategies, executing test cases, validating data accuracy, and ensuring the integrity of large-scale data transformations. You will lead QA efforts across global teams, collaborating to deliver high-quality results.

Your key responsibilities will involve developing robust test strategies for data migration and ETL processes, executing detailed test cases for data validation, performing SQL-based data testing, and using ETL tools like Talend for data pipeline validation. Additionally, you will lead QA activities for cloud data migration projects, coordinate testing efforts across teams, document test results and defects, and contribute to the development of automated testing frameworks.

To qualify for this role, you should have at least 3 years of experience in QA with a focus on ETL testing, data validation, and data migration. Proficiency in SQL, hands-on experience with ETL tools such as Talend, Informatica PowerCenter or DataStage, familiarity with cloud data platforms like Snowflake, and an understanding of semi-structured data formats like JSON and XML are essential requirements. Strong analytical and problem-solving skills, experience in leading QA efforts, and the ability to work in distributed teams are also necessary.

Preferred skills for this position include experience with automated testing tools for ETL processes, knowledge of data governance and quality standards, familiarity with cloud ecosystems like AWS, and certification in software testing such as ISTQB.

If you are passionate about quality assurance, data migration, and ETL processes, and are looking to make a significant impact in a dynamic work environment, we encourage you to apply for this role and be a part of our team dedicated to driving continuous improvement and excellence in QA practices.
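As a hedged sketch of the SQL-based data validation described above, the following compares row-level fingerprints between a migration's source and target. The sample rows and key layout are invented for illustration; real code would fetch both sides in keyed batches.

```python
import hashlib


def row_hash(row: tuple) -> str:
    """Stable fingerprint of a row, tolerant of None values."""
    joined = "|".join("" if v is None else str(v) for v in row)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()


def reconcile(source_rows, target_rows):
    """Return (rows missing/changed in target, rows unexpected in target)."""
    src = {row_hash(r) for r in source_rows}
    tgt = {row_hash(r) for r in target_rows}
    return src - tgt, tgt - src


if __name__ == "__main__":
    source = [(1, "alice", 100.0), (2, "bob", 250.0)]
    target = [(1, "alice", 100.0), (2, "bob", 250.5)]  # drifted amount
    missing, extra = reconcile(source, target)
    print(f"{len(missing)} rows missing/changed, {len(extra)} unexpected")
```

Hashing trades detail for speed: it pinpoints which rows differ in one pass, after which a targeted column-by-column diff of just those rows identifies the offending fields.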
Posted 2 weeks ago
12.0 - 17.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Overview
We are seeking an experienced and strategic leader to join our Business Intelligence & Reporting organization as Deputy Director, BI Governance. This role will lead the design, implementation, and ongoing management of BI governance frameworks across sectors and capability centres. The ideal candidate will bring deep expertise in BI governance, data stewardship, demand management, and stakeholder engagement to ensure a standardized, scalable, and value-driven BI ecosystem across the enterprise.

Key Responsibilities

Governance Leadership
- Define and implement the enterprise BI governance strategy, policies, and operating model.
- Drive consistent governance processes across sectors and global capability centers.
- Set standards for the BI solution lifecycle, metadata management, report rationalization, and data access controls.

Stakeholder Management
- Serve as a trusted partner to sector business leaders, IT, data stewards, and COEs to ensure alignment with business priorities.
- Lead governance councils, working groups, and decision forums to drive adoption and compliance.

Policy and Compliance
- Establish and enforce policies related to report publishing rights, tool usage, naming conventions, and version control.
- Implement approval and exception processes for BI development outside the COE.

Demand and Intake Governance
- Lead the governance of BI demand intake and prioritization processes.
- Ensure transparency and traceability of BI requests and outcomes across business units.

Metrics and Continuous Improvement
- Define KPIs and dashboards to monitor BI governance maturity and compliance.
- Identify areas for process optimization and lead continuous improvement efforts.

Qualifications
- Experience: 12+ years in Business Intelligence, Data Governance, or related roles, with at least 4+ years in a leadership capacity.
- Domain Expertise: Strong understanding of BI platforms (Power BI, Tableau, etc.), data management practices, and governance frameworks.
- Strategic Mindset: Proven ability to drive change, influence at senior levels, and align governance initiatives with enterprise goals.
- Operational Excellence: Experience managing cross-functional governance processes and balancing centralized control with local flexibility.
- Education: Bachelor's degree required; MBA or Master's in Data/Analytics preferred.
Posted 2 weeks ago
8.0 - 13.0 years
5 - 8 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
- Experience in the Talend tool and TMC (Talend Management Console)
- Solid understanding of Snowflake, data warehousing, and SQL queries
- Experience in data security (masking), metadata management, etc.
Posted 2 weeks ago
10.0 - 15.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: SQL, AWS Redshift, PostgreSQL
Experience: 10-15 Years
Location: Bangalore
Skills: SQL, AWS Redshift, PostgreSQL
Posted 2 weeks ago
0.0 - 5.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer - DBT (Data Build Tool)
Experience: 0-5 Years
Location: Bengaluru

Job Responsibilities
- Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on AWS.
- Requirements definition, source data analysis and profiling, the logical and physical design of the data lake and data warehouse, as well as the design of data integration and publication pipelines.
- Develop Snowflake deployment and usage best practices.
- Help educate the rest of the team members on the capabilities and limitations of Snowflake.
- Build and maintain data pipelines adhering to suggested enterprise architecture principles and guidelines.
- Design, build, test, and maintain data management systems.
- Work in sync with internal and external team members like data architects, data scientists, and data analysts to handle all sorts of technical issues.
- Act as a technical leader within the team.
- Work in an Agile/Lean model.
- Deliver quality deliverables on time.
- Translate complex functional requirements into technical solutions.

EXPERTISE AND QUALIFICATIONS

Essential Skills, Education and Experience
- Should have a B.E. / B.Tech. / MCA or equivalent degree along with 4-7 years of experience in Data Engineering.
- Strong experience in DBT concepts like model building and configurations, incremental load strategies, macros, and DBT tests.
- Strong experience in SQL.
- Strong experience in AWS.
- Creation and maintenance of optimum data pipeline architecture for ingestion and processing of data.
- Creation of necessary infrastructure for ETL jobs from a wide range of data sources using Talend, DBT, S3, and Snowflake.
- Experience in data storage technologies like Amazon S3, SQL, and NoSQL.
- Data modeling technical awareness.
- Experience in working with stakeholders in different time zones.

Good to Have
- AWS data services development experience.
- Working knowledge of big data technologies.
- Experience in collaborating with data quality and data governance teams.
- Exposure to reporting tools like Tableau.
- Apache Airflow, Apache Kafka (nice to have).
- Payments domain knowledge; in-depth understanding of CRM, Accounting, etc.
- Regulatory reporting exposure.

Other Skills
- Good communication skills.
- Team player.
- Problem solver.
- Willing to learn new technologies, share your ideas, and assist other team members as needed.
- Strong analytical and problem-solving skills; ability to define problems, collect data, establish facts, and draw conclusions.
Posted 2 weeks ago
0 years
0 Lacs
Hyderabad
On-site
Overview:
At PepsiCo, we’re accelerating our digital transformation by building the next generation of intelligent, connected, and agile systems. As part of this journey, we're seeking a CTO ASSOC Manager to drive the vision, design, and implementation of enterprise-wide integration strategies. This role is critical in ensuring our systems and data flows are connected, secure, and future-ready, supporting both global scale and localized agility. You’ll partner closely with cross-functional stakeholders, product teams, and enterprise architects to define scalable integration blueprints that align with business priorities and our evolving IT roadmap. If you have a passion for modern architecture, cloud-native technologies, and unlocking value through connected systems, this is your opportunity to make a global impact.

Responsibilities:

Integration Strategy & Architecture
- Define the enterprise integration strategy, aligning with business goals and IT roadmaps.
- Design scalable, resilient, and secure integration architectures using industry best practices.
- Develop API-first and event-driven integration strategies.
- Establish governance frameworks, integration patterns, and best practices.

Technology Selection & Implementation
Evaluate and recommend the right integration technologies, such as:
- Middleware & ESB: TIBCO, MuleSoft, WSO2, IBM Integration Bus
- Event Streaming & Messaging: Apache Kafka, RabbitMQ, IBM MQ
- API Management: Apigee, Kong, AWS API Gateway, MuleSoft
- ETL & Data Integration: Informatica, Talend, Apache NiFi
- iPaaS (Cloud Integration): Dell Boomi, Azure Logic Apps, Workato
Lead the implementation and configuration of these platforms.

API & Microservices Architecture
- Design and oversee API-led integration strategies.
- Implement RESTful APIs, GraphQL, and gRPC for real-time and batch integrations.
- Define API security standards (OAuth, JWT, OpenID Connect, API Gateway).
- Establish API versioning, governance, and lifecycle management.

Enterprise Messaging & Event-Driven Architecture (EDA)
Design real-time, event-driven architectures using:
- Apache Kafka for streaming and pub/sub messaging
- RabbitMQ, IBM MQ, TIBCO EMS for message queuing
- Event-driven microservices using Kafka Streams, Flink, or Spark Streaming
Ensure event sourcing, CQRS, and eventual consistency in distributed systems. (A minimal producer/consumer sketch appears after this responsibilities list.)

Cloud & Hybrid Integration
- Develop hybrid integration strategies across on-premises, cloud, and SaaS applications.
- Utilize cloud-native integration tools like AWS Step Functions, Azure Event Grid, Google Cloud Pub/Sub.
- Integrate enterprise applications (ERP, CRM, HRMS) across SAP, Oracle, Salesforce, Workday.

Security & Compliance
- Ensure secure integration practices, including encryption, authentication, and authorization.
- Implement zero-trust security models for APIs and data flows.
- Maintain compliance with industry regulations (GDPR, HIPAA, SOC 2).

Governance, Monitoring & Optimization
- Establish enterprise integration governance frameworks.
- Use observability tools for real-time monitoring (Datadog, Splunk, New Relic).
- Optimize integration performance and troubleshoot bottlenecks.

Leadership & Collaboration
- Collaborate with business and IT stakeholders to understand integration requirements.
- Work with DevOps and cloud teams to ensure CI/CD pipelines for integration.
- Provide technical guidance to developers, architects, and integration engineers.
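To ground the event-driven pattern referenced above, here is a minimal producer/consumer sketch using the kafka-python client. The broker address, topic, and consumer group are assumptions, and production code would add error handling, retries, and schema management.

```python
import json

from kafka import KafkaConsumer, KafkaProducer

BROKERS = ["localhost:9092"]  # placeholder bootstrap servers
TOPIC = "orders.created"

# Producer side: emit an event after the business action commits.
producer = KafkaProducer(
    bootstrap_servers=BROKERS,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"order_id": "o-123", "amount": 99.5})
producer.flush()

# Consumer side: a downstream service reacts to the event asynchronously,
# decoupled from the producer by the topic.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKERS,
    group_id="billing-service",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)
for message in consumer:
    event = message.value
    print(f"billing order {event['order_id']} for {event['amount']}")
    break  # demo: handle one event and stop
```

The consumer group gives horizontal scalability (partitions are shared across instances), which is the property that makes the pub/sub style attractive for the microservices listed above.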
Qualifications:
- Extensive experience designing and executing enterprise-grade integration architectures.
- Hands-on expertise with integration tools such as Informatica, WebLogic, TIBCO, and Apache Kafka.
- Proven track record in API management, microservices architecture, and event-driven systems.
- Strong command of cloud integration patterns and hybrid deployment models.
- Deep understanding of security protocols and regulatory compliance in large-scale environments.
- Effective communicator and leader with the ability to influence across technical and non-technical audiences.
- Experience in a global, matrixed enterprise is a strong plus.
Posted 2 weeks ago
4.0 - 9.0 years
13 - 23 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Preferred Candidate Profile
- Experience: A minimum of 4-10 years of experience in data integration/orchestration services, service architecture, and providing data-driven solutions for client requirements.
- Experience with Microsoft Azure cloud and Snowflake SQL, and database query/performance tuning.
- Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus.
- Strong data warehousing concepts; an ETL tool such as Talend Cloud Data Integration is a must.
- Exposure to financial domain knowledge is considered a plus.
- Cloud managed services such as source control (GitHub) and MS Azure/DevOps are considered a plus.
- Prior experience with State Street and Charles River Development (CRD) is considered a plus.
- Experience in tools such as Visio, PowerPoint, and Excel.
- Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus.
- Strong SQL knowledge and debugging skills are a must.

Responsibilities:
- As a Data Integration Developer / Sr. Developer, be hands-on with ETL/ELT data pipelines, the Snowflake DWH, CI/CD deployment pipelines, and data-readiness (data quality) design, development, and implementation, and address code or data issues.
- Experience in designing and implementing modern data pipelines for a variety of data sets, including internal/external data sources, complex relationships, various data formats, and high volume.
- Experience and understanding of ETL job performance techniques, exception handling, query performance tuning/optimizations, and data loads meeting the runtime/schedule-time SLAs for both batch and real-time data use cases.
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions.
- Demonstrate strong collaborative experience across regions (APAC, EMEA, and NA) to come up with design standards, high-level design solution documents, cross-training, and resource onboarding activities.
- Good understanding of the SDLC process, governance clearance, peer code reviews, unit test results, code deployments, code security scanning, and Confluence/Jira Kanban stories.
- Strong attention to detail during root cause analysis, SQL query debugging, and defect/issue resolution by working with multiple business/IT stakeholders.
Posted 2 weeks ago
175.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you’ll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

How will you make an impact in this role?
- Build the NextGen data strategy, data virtualization, and data lakes/warehousing.
- Transform and improve performance of existing reporting and analytics use cases with more efficient, state-of-the-art data engineering solutions.
- Drive analytics development to realize the advanced analytics vision and strategy in a scalable, iterative manner.
- Deliver software that provides superior user experiences, linking customer needs and business drivers together through innovative product engineering.
- Cultivate an environment of engineering excellence and continuous improvement, leading changes that drive efficiencies into existing engineering and delivery processes.
- Own accountability for all quality aspects and metrics of the product portfolio, including system performance, platform availability, operational efficiency, risk management, information security, data management and cost effectiveness.
- Work with key stakeholders to drive software solutions that align to strategic roadmaps, prioritized initiatives and strategic technology directions.
- Work with peers, staff engineers and staff architects to assimilate new technology and delivery methods into scalable software solutions.

Minimum Qualifications:
- Bachelor’s degree in Computer Science, Computer Science Engineering, or a related field required; advanced degree preferred.
- 5+ years of hands-on experience in implementing large data-warehousing projects, with strong knowledge of the latest NextGen BI and data strategy and BI tools.
- Proven experience in Business Intelligence, reporting on large datasets, data virtualization tools, Big Data, GCP, Java, and microservices.
- Strong systems integration architecture skills and a high degree of technical expertise across a number of technologies, with a proven track record of turning new technologies into business solutions.
- Should be proficient in one programming language (Python or Java) and have a good understanding of data structures.
- GCP/cloud knowledge is an added advantage.
- Good knowledge and understanding of Power BI, Tableau, and Looker.
- Outstanding influencing and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross-communication.
- Experience managing in a fast-paced, complex, and dynamic global environment.
5+ years of hands-on experience in implementing large data-warehousing projects, strong knowledge of latest NextGen BI & Data Strategy & BI Tools Proven experience in Business Intelligence, Reporting on large datasets, Oracle Business Intelligence (OBIEE), Tableau, MicroStrategy, Data Virtualization Tools, Oracle PL/SQL, Informatica, Other ETL Tools like Talend, Java Should be good in one programming language python/Java. Should be good data structures and reasoning. GCP knowledge has added advantage or cloud knowledge. PowerBI, Tableau and looker good knowledge and understanding. Strong systems integration architecture skills and a high degree of technical expertise, ranging across several technologies with a proven track record of turning new technologies into business solutions. Outstanding influential and collaboration skills; ability to drive consensus and tangible outcomes, demonstrated by breaking down silos and fostering cross communication process. We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
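As a small illustration of the reporting-on-large-datasets skill set this role calls for, the sketch below (not part of the posting) aggregates a large fact table into a compact reporting extract with the google-cloud-bigquery Python client; the analytics.transactions table and its columns are hypothetical placeholders.

from google.cloud import bigquery  # pip install google-cloud-bigquery

def daily_spend_summary() -> list:
    """Aggregate a large fact table down to a small reporting extract."""
    client = bigquery.Client()  # uses application-default credentials
    sql = """
        SELECT card_product, DATE(txn_ts) AS txn_date, SUM(amount) AS total_spend
        FROM `analytics.transactions`  -- hypothetical dataset and table
        WHERE txn_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
        GROUP BY card_product, txn_date
        ORDER BY txn_date
    """
    return list(client.query(sql).result())  # result() blocks until the job finishes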
Posted 2 weeks ago
6.0 - 10.0 years
16 - 25 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
6-10 years of experience in ETL testing, Snowflake, and DWH concepts.
Strong SQL knowledge and debugging skills are a must; a minimal reconciliation-test sketch follows this listing.
Experience with Azure and Snowflake testing is a plus.
Experience with Qlik Replicate and Compose (Change Data Capture) tools is considered a plus.
Strong data warehousing concepts and experience with ETL tools such as Talend Cloud Data Integration and Pentaho/Kettle.
Experience with the JIRA and Xray defect management tools is good to have.
Exposure to the financial domain is considered a plus.
Test data-readiness (data quality) and address code or data issues.
Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions.
Strong collaborative experience across regions (APAC, EMEA, and NA) to effectively and efficiently identify the root cause of code/data issues and arrive at a permanent solution.
Prior experience with State Street and Charles River Development (CRD) is considered a plus.
Experience in tools such as PowerPoint, Excel, and SQL.
Exposure to third-party data providers such as Bloomberg, Reuters, MSCI, and other rating agencies is a plus.

Key attributes include:
Team player with a professional and positive approach.
Creative, innovative, and able to think outside the box.
Strong attention to detail during root cause analysis and defect resolution.
Self-motivated and self-sufficient.
Effective communicator, both written and verbal.
Brings a high level of energy and enthusiasm to generate excitement and motivate the team.
Able to work under pressure with tight deadlines and/or multiple projects.
Experience in negotiation and conflict resolution.
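A common first-line ETL test of the kind described above is reconciling the target warehouse against the source system. Below is a minimal, tool-agnostic sketch (not from the posting) in Python using two DB-API cursors; the table and column names are hypothetical placeholders.

def reconcile(src_cur, tgt_cur, table: str = "POSITIONS") -> list:
    """Return a list of mismatches between source and target copies of a table."""
    checks = {
        "row_count": f"SELECT COUNT(*) FROM {table}",
        "total_qty": f"SELECT SUM(quantity) FROM {table}",  # hypothetical numeric column
    }
    failures = []
    for name, sql in checks.items():
        src_cur.execute(sql)
        tgt_cur.execute(sql)
        src_val, tgt_val = src_cur.fetchone()[0], tgt_cur.fetchone()[0]
        if src_val != tgt_val:
            # Capture the delta so the defect ticket (e.g. in JIRA/Xray) has evidence attached.
            failures.append(f"{name}: source={src_val} target={tgt_val}")
    return failures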
Posted 2 weeks ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
We’re Hiring | Talend Developer – Chennai (Immediate Joiner)

We’re looking for a Talend Developer who can join immediately and is available to attend the final round of interviews in person at our Chennai office.

🔹 Must-Have Skills
2–5 years of experience with Talend Studio & Talend components
Strong understanding of ETL processes and data integration
Proficiency in SQL and data transformation logic
Experience with relational and cloud databases (MySQL, Snowflake, etc.)

🔹 Nice to Have
Exposure to AWS or Azure pipelines
Basic Python/Shell scripting skills

💼 Why Join Us?
Work on cutting-edge data projects with a collaborative team
Fast-track onboarding for immediate contributors
Competitive compensation and career growth

📍 Location: Chennai (on-site)
📅 Start Date: ASAP
🧾 Note: Only candidates who are in Chennai or can attend an in-person final interview are eligible.
📧 Send your resumes to: chandralega@fipsar.com
🔗 More details at: www.fipsar.com
Posted 2 weeks ago
7.0 - 12.0 years
10 - 18 Lacs
Hyderabad
Work from Office
Role & Responsibilities

Job Description:
We are seeking a Technical Lead with strong expertise in Talend, SQL, Snowflake, and AWS to lead and deliver enterprise-grade data integration and transformation solutions. The ideal candidate will have a proven track record in data engineering, leading technical teams, and implementing scalable cloud-based data platforms.

Key Responsibilities:
Lead end-to-end data integration and ETL/ELT solution design using Talend.
Architect and implement scalable data pipelines and workflows on AWS.
Oversee development and deployment of data solutions on the Snowflake cloud data warehouse.
Guide a team of data engineers and testers in the delivery of high-quality data products.
Collaborate with business stakeholders, data architects, and analysts to translate requirements into technical solutions.
Optimize SQL queries for performance and accuracy across large datasets.
Ensure adherence to data security, governance, and quality best practices.
Conduct code reviews and enforce engineering standards across the team.

Required Skills:
Talend: Strong hands-on experience with the Talend Data Integration or Talend Big Data platform.
SQL: Advanced proficiency in writing and optimizing complex SQL queries.
Snowflake: Experience with the Snowflake data warehouse, schema design, and data loading techniques.
AWS: Proficiency in AWS services such as S3, Redshift, Lambda, Glue, and EC2.
Strong understanding of data modeling, ETL frameworks, and data pipeline orchestration.
Excellent communication, team leadership, and stakeholder management skills.
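As an illustration of how the Talend, S3, and Snowflake pieces above typically fit together, here is a minimal sketch (not from the posting) of one pipeline step in Python using boto3; the bucket, stage, and table names are hypothetical placeholders.

import boto3  # pip install boto3

def stage_extract_to_s3(local_path: str, bucket: str = "acme-data-landing") -> str:
    """Upload an ETL extract to the S3 landing zone that a Snowflake external stage reads from."""
    key = f"landing/{local_path.rsplit('/', 1)[-1]}"
    s3 = boto3.client("s3")
    s3.upload_file(local_path, bucket, key)  # bucket name is a hypothetical placeholder
    return f"s3://{bucket}/{key}"

# A Snowflake external stage pointed at s3://acme-data-landing/landing/ could then
# ingest the file with a statement along these lines:
#   COPY INTO CURATED.POSITIONS FROM @LANDING_STAGE FILE_FORMAT = (TYPE = PARQUET)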
Posted 2 weeks ago