1529 Talend Jobs - Page 21

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 years

4 - 6 Lacs

Chennai

On-site

Job Title: Data Engineer – C11/Officer (India)

The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Delivering on critical business priorities while ensuring alignment with the wider architectural vision
- Identifying and helping address potential risks in the data supply chain
- Following and contributing to technical standards
- Designing and developing analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology (4-year graduate course)
- 5 to 8 years' experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have)
- ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
- Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of Data Warehousing concepts; Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g. GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
- Others: Basics of job schedulers like Autosys; basics of entitlement management

Certification on any of the above topics would be an advantage.

- Job Family Group: Technology
- Job Family: Digital Software Engineering
- Time Type: Full time
- Most Relevant Skills: Please see the requirements listed above.
- Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
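For illustration of the "hands-on experience of building data pipelines" requirement above, here is a minimal PySpark sketch of an extract-transform-load step. All paths and column names are hypothetical, not from the posting.

```python
# Minimal PySpark pipeline sketch; paths, schema and columns are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_pipeline").getOrCreate()

# Extract: read raw CSV transactions (hypothetical location).
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/transactions/")

# Transform: type-cast, drop bad rows, derive a partition column.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .withColumn("business_date", F.to_date("event_ts"))
)

# Load: write partitioned Parquet for downstream analytical consumers.
clean.write.mode("overwrite").partitionBy("business_date").parquet(
    "s3://example-bucket/curated/transactions/"
)
```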

Posted 3 weeks ago

Apply

0 years

3 - 5 Lacs

Chennai

On-site

- Talend: designing and developing technical architecture, data pipelines, and performance scaling, using integration tooling to integrate data and ensure data quality in a big data environment
- Very strong on PL/SQL: queries, procedures, JOINs
- Snowflake SQL: writing SQL queries against Snowflake and developing scripts (Unix, Python, etc.) to Extract, Load, and Transform data
- Talend knowledge and hands-on experience is good to have; candidates who have worked in PROD support would be preferred
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures
- Perform data analysis, troubleshoot data issues, and provide technical support to end users
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity
- Complex problem-solving capability and a continuous-improvement approach
- Talend / Snowflake certification is desirable
- Excellent SQL coding skills
- Excellent communication and documentation skills
- Familiar with the Agile delivery process
- Must be analytical, creative and self-motivated
- Work effectively within a global team environment

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
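To make the Snowflake-utilities bullet concrete, here is a minimal sketch of querying Snowflake from Python with the snowflake-connector-python package, covering a Time Travel read and a change-tracking stream. The account, credentials and table names are hypothetical assumptions.

```python
# Sketch of Snowflake Time Travel and Streams usage; all identifiers are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",        # hypothetical account identifier
    user="ETL_USER",
    password="...",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# Time Travel: query the table as it looked one hour ago (offset in seconds).
cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
print(cur.fetchone())

# Streams: capture row-level changes on a table for incremental ETL.
cur.execute("CREATE STREAM IF NOT EXISTS orders_stream ON TABLE orders")
cur.execute("SELECT * FROM orders_stream WHERE METADATA$ACTION = 'INSERT'")
for row in cur.fetchall():
    print(row)
```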

Posted 3 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

- Talend: designing and developing technical architecture, data pipelines, and performance scaling, using integration tooling to integrate data and ensure data quality in a big data environment
- Very strong on PL/SQL: queries, procedures, JOINs
- Snowflake SQL: writing SQL queries against Snowflake and developing scripts (Unix, Python, etc.) to Extract, Load, and Transform data
- Talend knowledge and hands-on experience is good to have; candidates who have worked in PROD support would be preferred
- Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures
- Perform data analysis, troubleshoot data issues, and provide technical support to end users
- Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity
- Complex problem-solving capability and a continuous-improvement approach
- Talend / Snowflake certification is desirable
- Excellent SQL coding skills
- Excellent communication and documentation skills
- Familiar with the Agile delivery process
- Must be analytical, creative and self-motivated
- Work effectively within a global team environment

Posted 3 weeks ago

Apply

3.0 - 6.0 years

1 - 4 Lacs

Jaipur

On-site

Unlock yourself. Take your career to the next level. At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science — empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation for our customers places us uniquely in the market.

Who are you?
You are smart, collaborative, and take ownership to get things done. You love to learn and are intellectually curious about business and technology tools, platforms and languages. You are energized by solving complex problems and bored when you don't have something to do. You love working in teams, and are passionate about pulling your weight to make sure the team succeeds.

What will you be doing at Atrium?
In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what's possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering. As a Senior Data Engineering Consultant, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.

The Senior Data Engineering Consultant will:
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional / non-functional business requirements
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, AWS, and Big Data technologies
- Develop ETL processes to ensure timely delivery of required data for customers
- Implement data quality measures to ensure accuracy, consistency, and integrity of data
- Design, implement, and maintain data models that can support the organization's data storage and analysis needs
- Deliver technical and functional specifications to support data governance and knowledge sharing

In this role, you will have:
- B.Tech degree in Computer Science, Software Engineering, or equivalent combination of relevant work experience and education
- 3-6 years of experience delivering consulting services to medium and large enterprises; implementations must have included a combination of the following experiences: Data Warehousing or Big Data consulting for mid-to-large-sized organizations
- Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate those into a data architecture
- Strong experience with Snowflake and Data Warehouse architecture; SnowPro Core certification is highly desired
- Hands-on experience with Python (Pandas, DataFrames, functions)
- Hands-on experience with SQL (stored procedures, functions), including debugging, performance optimization, and database design
- Strong experience with Apache Airflow and API integrations
- Solid experience in any one of the ETL tools (Informatica, Talend, SAP BODS, DataStage, Dell Boomi, MuleSoft, Fivetran, Matillion, etc.)
- Nice to have: experience in Docker, dbt, data replication tools (SLT, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or big data technologies
- Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment
- Enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions
- Strong presentation and communication skills

Next Steps
Our recruitment process is highly personalized. Some candidates complete the hiring process in one week, others may take longer, as it's important we find the right position for you. It's all about timing and can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective - after all, deciding to join a company is a big decision!

At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment.
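As a concrete illustration of the Apache Airflow requirement above, here is a minimal Airflow 2.x DAG sketch with a single Python task. The DAG id, schedule and task body are hypothetical assumptions, not part of the posting.

```python
# Minimal Apache Airflow 2.x DAG sketch; all names and the schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder for pulling data from an API and loading it into a warehouse.
    print("extract + load step")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # runs once per day after start_date
    catchup=False,               # skip backfilling historical runs
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```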

Posted 3 weeks ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About the Role
Brace Infotech is looking for a Java Backend Developer with strong expertise in SQL and hands-on experience with ETL tools. The ideal candidate will be responsible for backend development, database handling, and data integration workflows.

Required Skills
- Proficient in Java (Core and/or Spring Boot)
- Strong working knowledge of SQL (query optimization, joins, stored procedures)
- Experience with ETL tools (e.g., Informatica, Talend, Apache NiFi, or others)
- Good understanding of data flow, transformation logic, and loading mechanisms
- Familiarity with version control tools like Git
- Excellent problem-solving and communication skills

Qualifications
- 4-6 years of experience (can be adjusted based on need)
- Immediate joiners only

Preferred Skills
- Experience working with cloud platforms (AWS, Azure, or GCP)
- Knowledge of data warehousing concepts
- Familiarity with REST APIs

Pay range and compensation package
Full-time position with a competitive salary based on experience.

Equal Opportunity Statement
Brace Infotech is committed to diversity and inclusivity in the workplace.

Posted 3 weeks ago

Apply

1.0 - 2.0 years

0 Lacs

Pune, Maharashtra, India

On-site

JR0124664 Junior Associate, Solution Engineering (Data Science) – Pune, India

Are you excited by the opportunity of using your knowledge of development to lead a team to success? Are you interested in joining a globally diverse organization where our unique contributions are recognized and celebrated, allowing each of us to thrive? Then it's time to join Western Union as a Junior Associate, Solution Engineering.

Western Union powers your pursuit
You will be working with a team of professionals with a broad range of responsibilities, including all aspects of software engineering: requirements understanding and validation, solution design, detailed design, development, testing and software configuration management. Build products, systems, and services that are optimized, well organized and maintainable, and have a high impact on our end users.

Role Responsibilities
- Apply Data Science methods in solving business use cases, preferably in the Banking and Financial Services, Payments and Fintech domains
- Demonstrate strong capabilities in assessing business needs while providing creative and effective solutions in conformance with emerging technology standards
- Partner with key business stakeholders through projects and ensure smooth knowledge transfer for better utilization of the end work product in business decisioning
- Work with cross-functional teams to develop and implement AI/ML solutions for frequent usage in decisioning
- Design, build, deploy, and measure performance of AI/ML solutions that align with stakeholder expectations, using the most appropriate and effective ML methods
- Utilize expertise in Analytics, Machine Learning and AI to build solutions that leverage relevant data sources
- Strong emphasis on customer journey, product quality, performance tuning, troubleshooting, and continuous development
- Break down problems into their constituent parts; evaluate the available solution options while solving problems
- Prioritize individual tasks based on project criticality, proactively plan based on critical inputs from historical data, analyze work output and plan for contingencies

Role Requirements
- 1-2 years of extensive experience in Machine Learning (both supervised and unsupervised), Classification, Data/Text Mining, NLP, Decision Trees, Random Forest, Model Explainability, Bias Detection, and ML model deployment
- Hands-on expertise in building and deploying machine learning models using tools like Python, AWS SageMaker, and Dataiku
- Familiarity with building data pipelines in ETL tools like Matillion/Talend to support AWS model deployments would be a plus
- Extra points for experience building GenAI solutions for business use cases using AWS services like Bedrock, with a good understanding of the LLM models commonly used today
- Team player with strong analytical, verbal, and written communication skills
- Being curious, inquisitive and creative is what will help you excel in the role
- Ability to work in a fast-paced, iterative development environment, adapt to changing business priorities and thrive under pressure

We make financial services accessible to humans everywhere. Join us for what's next. Western Union is positioned to become the world's most accessible financial services company — transforming lives and communities. We're a diverse and passionate customer-centric team of over 8,000 employees serving 200 countries and territories, reaching customers and receivers around the globe. More than moving money, we design easy-to-use products and services for our digital and physical financial ecosystem that help our customers move forward. Just as we help our global customers prosper, we support our employees in achieving their professional aspirations. You'll have plenty of opportunities to learn new skills and build a career, as well as receive a great compensation package. If you're ready to help drive the future of financial services, it's time for Western Union. Learn more about our purpose and people at https://careers.westernunion.com/.

Benefits
You will also have access to short-term incentives, multiple health insurance options, accident and life insurance, and access to best-in-class development platforms, to name a few (https://careers.westernunion.com/global-benefits/). Please see the location-specific benefits below, and note that your Recruiter may share additional role-specific benefits during your interview process or in an offer of employment.

Your India-specific benefits include:
- Employees Provident Fund [EPF]
- Gratuity Payment
- Public holidays
- Annual Leave, Sick leave, Compensatory leave, and Maternity / Paternity leave
- Annual Health Check-up
- Hospitalization Insurance Coverage (Mediclaim)
- Group Life Insurance, Group Personal Accident Insurance Coverage, Business Travel Insurance
- Cab Facility
- Relocation Benefit

Western Union values in-person collaboration, learning, and ideation whenever possible. We believe this creates value through common ways of working and supports the execution of enterprise objectives which will ultimately help us achieve our strategic goals. By connecting face-to-face, we are better able to learn from our peers, solve problems together, and innovate. Our Hybrid Work Model categorizes each role into one of three categories. Western Union has determined the category of this role to be Hybrid. This is defined as a flexible working arrangement that enables employees to divide their time between working from home and working from an office location. The expectation for Hybrid roles in the Philippines is to work from the office at least 70% of the employee's working days per month.

We are passionate about diversity. Our commitment is to provide an inclusive culture that celebrates the unique backgrounds and perspectives of our global teams while reflecting the communities we serve. We do not discriminate based on race, color, national origin, religion, political affiliation, sex (including pregnancy), sexual orientation, gender identity, age, disability, marital status, or veteran status. The company will provide accommodation for applicants, including those with disabilities, during the recruitment process, following applicable laws.

Estimated Job Posting End Date: 07-13-2025
This application window is a good-faith estimate of the time that this posting will remain open. This posting will be promptly updated if the deadline is extended or the role is filled.
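To illustrate the supervised-learning and Random Forest requirements above, here is a minimal scikit-learn sketch on synthetic data. The dataset and hyperparameters are illustrative assumptions standing in for a real payments/fintech classification problem.

```python
# Minimal supervised-learning sketch with scikit-learn; the data is synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a labeled fraud/decisioning dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate with ROC AUC, a common metric for imbalanced classification.
probs = model.predict_proba(X_test)[:, 1]
print(f"ROC AUC: {roc_auc_score(y_test, probs):.3f}")
```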

Posted 3 weeks ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description
We are looking for a highly skilled and experienced Snowflake Architect to lead the design, development, and deployment of enterprise-grade cloud data solutions. The ideal candidate will have a strong background in data architecture, cloud data platforms, and Snowflake implementation, with hands-on experience in end-to-end data pipeline and data warehouse design. This role requires strategic thinking, technical leadership, and the ability to work collaboratively across cross-functional teams.

Responsibilities
- Lead the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions
- Define data modeling standards, best practices, and governance frameworks
- Design and optimize ETL/ELT pipelines using tools like Snowpipe, Azure Data Factory, Informatica, or DBT
- Collaborate with stakeholders to understand data requirements and translate them into robust architectural solutions
- Implement data security, privacy, and role-based access controls within Snowflake
- Guide development teams on performance tuning, query optimization, and cost management in Snowflake
- Ensure high availability, fault tolerance, and compliance across data platforms
- Mentor developers and junior architects on Snowflake capabilities

Skills & Experience
- 8+ years of overall experience in data engineering, BI, or data architecture, with at least 3+ years of hands-on Snowflake experience
- Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization
- Strong experience with SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP)
- Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion
- Good understanding of data lakes, data mesh, and modern data stack principles
- Experience with CI/CD for data pipelines, DevOps, and data quality frameworks
- Solid knowledge of data governance, metadata management, and cataloging

Good to Have
- Snowflake certification (e.g., SnowPro Core/Advanced Architect)
- Familiarity with Apache Airflow, Kafka, or event-driven data ingestion
- Knowledge of data visualization tools such as Power BI, Tableau, or Looker
- Experience in healthcare, BFSI, or retail domain projects

(ref:hirist.tech)
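As an illustration of the role-based access control responsibility above, here is a minimal sketch of provisioning a read-only analyst role in Snowflake through snowflake-connector-python. The role, warehouse, database and user names are hypothetical, and the statements follow standard Snowflake RBAC SQL.

```python
# Sketch: read-only RBAC provisioning in Snowflake; all identifiers are hypothetical.
# Run under a role with sufficient privileges (e.g., SECURITYADMIN plus grants from SYSADMIN).
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="ADMIN_USER", password="...", role="SECURITYADMIN"
)
cur = conn.cursor()

for stmt in [
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON WAREHOUSE COMPUTE_WH TO ROLE ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.PUBLIC TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.PUBLIC TO ROLE ANALYST_RO",
    "GRANT ROLE ANALYST_RO TO USER REPORT_USER",
]:
    cur.execute(stmt)  # each statement is safe to re-run in a provisioning script
```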

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Supply Chain Data Integration Consultant Senior

The opportunity
We're looking for Senior Level Consultants with expertise in Data Modelling, Data Integration, Data Manipulation, and analysis to join the Supply Chain Technology group of our GDS consulting Team. This is a fantastic opportunity to be part of a leading firm while being instrumental in the growth of a new service offering. This role demands a highly technical, extremely hands-on Data Warehouse Modelling consultant who will work closely with our EY Partners and external clients to develop new business as well as drive other initiatives on different business needs. The ideal candidate must have a good understanding of the value of data warehouses and ETL, with Supply Chain industry knowledge and proven experience in delivering solutions to different lines of business and technical leadership.

Your key responsibilities
- A minimum of 5+ years of experience in BI/Data integration/ETL/DWH solutions in cloud and on-premises platforms such as Informatica/PC/IICS/Alteryx/Talend/Azure Data Factory (ADF)/SSIS/SSAS/SSRS, and experience with any reporting tool like Power BI, Tableau, OBIEE, etc.
- Performing Data Analysis and Data Manipulation as per client requirements
- Expert in Data Modelling to simplify business concepts; create extensive ER diagrams to help the business in decision-making
- Working experience with large, heterogeneous datasets in building and optimizing data pipelines, pipeline architectures, and integrated datasets using data integration technologies
- Should be able to develop sophisticated workflows and macros (batch, iterative, etc.) in Alteryx with enterprise data
- Design and develop ETL workflows and datasets in Alteryx to be used by the BI reporting tool
- Perform end-to-end data validation to maintain the accuracy of data sets
- Support client needs by developing SSIS packages in Visual Studio (version 2012 or higher) or Azure Data Factory (extensive hands-on experience implementing data migration and data processing using Azure Data Factory)
- Support client needs by delivering various integrations with third-party applications
- Experience in pulling data from a variety of data source types using appropriate connection managers as per client needs
- Develop, customize, deploy, and maintain SSIS packages as per client business requirements
- Should have thorough knowledge of creating dynamic packages in Visual Studio with multiple concepts such as reading multiple files, error handling, archiving, configuration creation, package deployment, etc.
- Experience working with clients throughout various parts of the implementation lifecycle
- Proactive with a solution-oriented mindset, ready to learn new technologies for client requirements
- Analyzing and translating business needs into long-term solution data models
- Evaluating existing data warehouses or systems
- Strong knowledge of database structure systems and data mining

Skills and attributes for success
- Deliver large/medium DWH programs; demonstrate expert core consulting skills and an advanced level of Informatica, SQL, PL/SQL, Alteryx, ADF, SSIS, Snowflake, and Databricks knowledge, and industry expertise to support delivery to clients
- Demonstrate management and an ability to lead projects or teams individually
- Experience in team management, communication, and presentation

To qualify for the role, you must have
- 5+ years of ETL experience as Lead/Architect
- Expertise in ETL mappings and Data Warehouse concepts; should be able to design a Data Warehouse and present solutions as per client needs
- Thorough knowledge of Structured Query Language (SQL) and experience working on SQL Server
- Experience in SQL tuning and optimization using explain plans and SQL trace files
- Experience in developing SSIS batch jobs, deployment, job scheduling, etc.
- Building Alteryx workflows for data integration, modeling, optimization, and data quality
- Knowledge of Azure components like ADF, Azure Data Lake, and Azure SQL DB
- Knowledge of data modeling and ETL design
- Design and develop complex mappings, process flows, and ETL scripts
- In-depth experience in designing databases and data modeling

Ideally, you'll also have
- Strong knowledge of ELT/ETL concepts, design, and coding
- Expertise in data handling to resolve any data issues as per client needs
- Experience in designing and developing DB objects such as tables, views, indexes, materialized views, and analytical functions
- Experience creating complex SQL queries for retrieving, manipulating, checking, and migrating complex datasets in a DB
- Good knowledge of ETL technologies/tools such as Alteryx, SSAS, SSRS, Azure Analysis Services, and Azure Power Apps
- Good verbal and written communication in English; strong interpersonal, analytical, and problem-solving abilities
- Experience interacting with customers to understand business requirement documents and translate them into ETL specifications and high- and low-level design documents
- Candidates with additional knowledge of BI tools such as Power BI, Tableau, etc. will be preferred
- Experience with cloud databases and multiple ETL tools

What we look for
The incumbent should be able to drive ETL infrastructure-related developments. Additional knowledge of complex source system data structures, preferably in the Financial Services industry, and reporting-related developments will be an advantage. An opportunity to be part of a market-leading, multi-disciplinary team of 10,000+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY GDS consulting practices globally with leading businesses across a range of industries.

What working at EY offers
At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

As a Data Quality Tester at ValueLabs, you will play a crucial role in ensuring the accuracy and integrity of data by validating data sources, applying transformation logic, and uploading data into target tables. Your responsibilities will include not only validating your own work but also ensuring the quality of the entire development team's output.

To excel in this role, you should have a solid foundation in SQL, Python, and JavaScript. You will be expected to identify test approaches, coverage gaps, and potential risks, as well as develop strategies for risk mitigation. Additionally, you will be involved in data conversion testing, data maintenance, and authoring SQL queries to extract and analyze data across various databases.

Your expertise in database-related testing and ETL testing will be essential for this position. Familiarity with data quality tools such as Microsoft SQL Server Data Quality Services, Talend, or Informatica is highly preferred. You should also possess strong data quality skills, including data discovery, profiling, lineage analysis, root-cause analysis, and issue remediation.

Collaboration with stakeholders to address data quality issues will be a key aspect of your role. You will be involved in all stages of data quality management, from identifying issues to implementing end-to-end data quality projects. Your ability to review data models, data mappings, and architectural documentation will be critical in creating and executing effective system integration testing (SIT) plans and test cases.

If you have over 5 years of experience in data testing and are ready to take on this challenging role, we encourage you to apply. We are looking for immediate joiners who can contribute to our team within 15 days. Please send your CV to imranmohammed.1@valuelabs.com if you are interested in exploring this opportunity further.
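To make the data-validation responsibilities above concrete, here is a minimal sketch of SQL-driven data quality checks in Python with pandas: row-count reconciliation, a null-rate check, and a lineage gap check between a staging extract and a target table. The database, tables and columns are hypothetical assumptions (sqlite3 stands in for any DB-API connection).

```python
# Minimal data-quality check sketch; table and column names are hypothetical.
import sqlite3  # stand-in for any DB-API connection (SQL Server, Oracle, etc.)

import pandas as pd

conn = sqlite3.connect("example.db")

source = pd.read_sql("SELECT id, amount FROM staging_orders", conn)
target = pd.read_sql("SELECT id, amount FROM dw_orders", conn)

# Check 1: no rows lost between staging and the warehouse.
assert len(source) == len(target), f"Row-count mismatch: {len(source)} vs {len(target)}"

# Check 2: the target must not contain NULL amounts.
null_rate = target["amount"].isna().mean()
assert null_rate == 0, f"{null_rate:.1%} of target rows have NULL amount"

# Check 3: ids present in source but missing from target (lineage gap).
missing = set(source["id"]) - set(target["id"])
print(f"Missing ids (first 10): {sorted(missing)[:10]}")
```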

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Data Engineer – C11/Officer (India)

The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities
- Developing and supporting scalable, extensible, and highly available data solutions
- Delivering on critical business priorities while ensuring alignment with the wider architectural vision
- Identifying and helping address potential risks in the data supply chain
- Following and contributing to technical standards
- Designing and developing analytical data models

Required Qualifications & Work Experience
- First Class Degree in Engineering/Technology (4-year graduate course)
- 5 to 8 years' experience implementing data-intensive solutions using agile methodologies
- Experience of relational databases and using SQL for data querying, transformation and manipulation
- Experience of modelling data for analytical consumers
- Ability to automate and streamline the build, test and deployment of data pipelines
- Experience in cloud native technologies and patterns
- A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
- Excellent communication and problem-solving skills

Technical Skills (Must Have)
- ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
- Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing
- Data Warehousing & Database Management: Understanding of Data Warehousing concepts; Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
- Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
- Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala
- DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management

Technical Skills (Valuable)
- Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets, e.g. GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows
- Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
- Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
- Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
- File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
- Others: Basics of job schedulers like Autosys; basics of entitlement management

Certification on any of the above topics would be an advantage.

- Job Family Group: Technology
- Job Family: Digital Software Engineering
- Time Type: Full time
- Most Relevant Skills: Please see the requirements listed above.
- Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Pune

Work from Office

Role Purpose
The purpose of the role is to provide effective technical support to the process and actively resolve client issues, directly or through timely escalation, to meet process SLAs.

Do
- Support the process by managing transactions as per required quality standards
- Field all incoming help requests from clients via telephone and/or email in a courteous manner
- Document all pertinent end-user identification information, including name, department, contact information and nature of the problem or issue
- Update own availability in the RAVE system to ensure productivity of the process
- Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions
- Follow standard processes and procedures to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Access and maintain internal knowledge bases, resources and frequently asked questions to aid in and provide effective problem resolution to clients
- Identify and learn appropriate product details to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends to prevent future problems
- Maintain and update self-help documents for customers to speed up resolution time
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by complying with service agreements
- Deliver excellent customer service through effective diagnosis and troubleshooting of client queries
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Assist clients with navigating product menus and facilitate better understanding of product features
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Maintain logs and records of all customer queries as per standard procedures and guidelines
- Accurately process and record all incoming calls and emails using the designated tracking software
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract/SLAs
- Build capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Partner with team leaders to brainstorm and identify training themes and learning issues to better serve the client
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: Pentaho DI - Kettle
Experience: 3-5 Years

Posted 3 weeks ago

Apply

6.0 - 11.0 years

15 - 19 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Project description
During the 2008 financial crisis, many big banks failed or faced distress due to liquidity problems. A lack of liquidity can kill a financial institution overnight. That is why it is so critical to constantly monitor liquidity risks and properly maintain collateral. We are looking for a number of talented developers who would like to join our team in Pune, which is building a liquidity risk and collateral management platform for one of the biggest investment banks in the world. The platform is a set of front-end tools and back-end engines. It helps the bank increase efficiency and scalability, reduce operational risk and eliminate the majority of manual interventions in processing margin calls.

Responsibilities
The candidate will work on development of new functionality for the Liquidity Risk platform, working closely with other teams across the globe.

Skills
Must have:
- Big Data experience (6+ years)
- Java/Python, J2EE, Spark, Hive
- SQL databases
- UNIX shell
- Strong experience in Apache Hadoop, Spark, Hive, Impala, Yarn, Talend, Hue
- Big data reporting, querying and analysis

Nice to have:
- Spark calculators based on business logic/rules
- Basic performance tuning and troubleshooting knowledge
- Experience with all aspects of the SDLC
- Experience with complex deployment infrastructures
- Knowledge of software architecture, design and testing
- Data flow automation (Apache NiFi, Airflow, etc.)
- Understanding of the difference between OOP and functional design approaches
- Understanding of event-driven architecture
- Spring, Maven, GIT, uDeploy
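For context on the Spark-plus-Hive stack in the must-have list, here is a minimal PySpark sketch that queries a Hive-registered table with Spark SQL. The database, table and column names are hypothetical assumptions, not details of the actual platform.

```python
# Minimal PySpark-with-Hive sketch; database/table/column names are hypothetical.
from pyspark.sql import SparkSession

# enableHiveSupport() lets Spark read tables registered in the Hive metastore.
spark = (
    SparkSession.builder
    .appName("collateral_report_example")
    .enableHiveSupport()
    .getOrCreate()
)

# Aggregate a (hypothetical) Hive table, e.g. total collateral by desk per day.
daily = spark.sql("""
    SELECT business_date, desk, SUM(collateral_value) AS total_collateral
    FROM risk_db.collateral_positions
    GROUP BY business_date, desk
""")
daily.show(10)
```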

Posted 3 weeks ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Pune

Work from Office

Role Purpose
The purpose of this role is to design, test and maintain software programs for operating systems or applications which need to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Do
1. Be instrumental in understanding the requirements and design of the product/software
- Develop software solutions by studying information needs, systems flow, data usage and work processes
- Investigate problem areas, following the software development life cycle
- Facilitate root-cause analysis of system issues and problem statements
- Identify ideas to improve system performance and impact availability
- Analyze client requirements and convert requirements to feasible designs
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements
- Confer with project managers to obtain information on software capabilities

2. Perform coding and ensure optimal software/module development
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development and proposed software
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases, and executing these cases
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces
- Analyze information to recommend and plan the installation of new systems or modifications to an existing system
- Ensure that code is error-free, with no bugs or test failures
- Prepare reports on programming project specifications, activities and status
- Ensure all issues are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns
- Compile timely, comprehensive and accurate documentation and reports as requested
- Coordinate with the team on daily project status and progress, and document it
- Provide feedback on usability and serviceability, trace results to quality risk and report to concerned stakeholders

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution
- Capture all requirements and clarifications from the client for better-quality work
- Take feedback on a regular basis to ensure smooth and on-time delivery
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code
- Document all necessary details and reports formally for proper understanding of the software, from client proposal to implementation
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally

Mandatory Skills: Talend DI
Experience: 5-8 Years

Posted 3 weeks ago

Apply

2.0 - 7.0 years

27 - 32 Lacs

Pune

Work from Office

In Scope of Position-based Promotions (INTERNAL only)

Job Title: Global Reporting GRH
Location: Pune, India

Role Description
It is crucial for the bank to understand how profitable each business activity is, and Finance has a responsibility to understand precisely the resource commitment the bank makes to any given client or transaction, e.g. cost, capital, funding, liquidity and risk. Finance is playing a central role in keeping the bank focused on simplification and financial resource management. With our diverse teams in 47 countries, we offer a broad portfolio of capabilities. Our key functions range from Group Finance, Treasury, Planning and Performance Management, and Investor Relations to enabling functions such as Finance Change and Administration. These teams make sure we cover all Finance-specific aspects for our internal and external stakeholders such as shareholders, employees, clients and regulators. Together, it is the role of Finance to oversee all financial details for Deutsche Bank globally. Sound financial principles are at the core of everything we do. That's why Finance is vital to the way we run our business. In a global marketplace that's constantly evolving, being adaptable, decisive and accurate is critical.

The primary objective of the role is to produce and distribute LCR/NSFR reports for local entities within Deutsche Bank, with regular product-level and metric-level analytics before final distribution of the metrics to regulators.

What we'll offer you
- 100% reimbursement under the childcare assistance benefit (gender neutral)
- Sponsorship for industry-relevant certifications and education
- Accident and term life insurance

Your key responsibilities
Position-specific responsibilities and accountabilities:
- Partner with cross-functional teams to define and implement strategic reporting and automation solutions
- Drive business adoption of reporting tools to support cost optimization and improve reporting usability across the organization
- Evaluate and recommend tools based on cost, infrastructure readiness, onboarding complexity, and resource availability
- Apply deep knowledge of reporting platforms and domain-specific problem statements to deliver impactful solutions
- Lead initiatives to standardize reporting frameworks and ensure alignment with enterprise data governance and compliance standards

Preferred Tools & Technologies
- Reporting & Visualization: SAP Business Objects, SAP Lumira, SAP Analytics Cloud
- ETL & Automation: ETL tools (e.g., Informatica, Talend), scripting for automation
- Data Visualization: Tableau, Power BI, or equivalent platforms

Your skills and experience
- Strong data analysis skills and attention to detail
- Strong communication skills, both oral and written
- Strong IT skills, primarily in SAP Business Objects (Web Intelligence, Lumira) and SAP Analytics Cloud
- Good experience in ETL and visualization tools
- Knowledge of Financial Planning and Performance within a business or infrastructure function in a banking environment
- Experience in leading implementations involving multiple dependencies and stakeholder groups would be beneficial
- Ability to challenge in a constructive way to ensure optimal outcomes
- History of taking initiative, being proactive and working independently
- Open mindset, willing to work collaboratively with the team to problem-solve and brainstorm, and open to feedback
- Educated to bachelor's degree level in a relevant financial discipline or an engineering degree, or equivalent qualification / work experience

Education / Qualifications
Bachelor's degree or equivalent qualification.

How we'll support you

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Hyderabad

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends to prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: Talend Big Data
Experience: 5-8 Years

Posted 3 weeks ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description: Technical Business Analyst (COUPA)
Location: LTIM Pan India
Experience Required: 10+ Years
Notice Period: Immediate Joiner to 1 Month

This role acts as a technical business analyst, leveraging in-depth technical understanding of the Coupa Platform (herein referred to as the "platform") and its integrations. The role acts as the technical SME of the platform and is accountable for scaling/change/programme management by working with the necessary stakeholders. Key areas: Technical Ownership, Integration & Middleware Oversight, Data Migration Execution, Delivery Management, Stakeholder Management & Governance, DevOps.

Essential Skills:
- Coupa certified
- 10+ years of overall IT experience with a strong background in Coupa technical delivery roles, with proven experience in large-scale data migration and middleware integration
- Experience with integration technologies such as MuleSoft, Dell Boomi, Azure Integration Services, Kafka, or similar
- Proficient in ETL tools and practices (e.g., Informatica, Talend, SQL-based scripting)
- Familiarity with cloud platforms (AWS, Azure, GCP) and hybrid integration strategies
- Strong problem-solving and analytical skills in technical and data contexts
- Ability to translate complex technical designs into business-aligned delivery outcomes
- Leadership in cross-functional and cross-technology environments
- Effective communicator capable of working with developers, data engineers, testers, and business stakeholders
- Experienced with IT Service Management tools like ServiceNow & Jira
- Experience in managing and developing third-party business relationships

Educational Qualifications
- UG - B.Tech/B.E. or other equivalent technical qualifications

Personal Attributes:

Posted 3 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations and developing technical capability within the Production Specialists.

Do
- Oversee and support the process by reviewing daily transactions on performance parameters
- Review the performance dashboard and the scores for the team
- Support the team in improving performance parameters by providing technical support and process guidance
- Record, track, and document all queries received, problem-solving steps taken and total successful and unsuccessful resolutions
- Ensure standard processes and procedures are followed to resolve all client queries
- Resolve client queries as per the SLAs defined in the contract
- Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting
- Document and analyze call logs to spot the most frequently occurring trends to prevent future problems
- Identify red flags and escalate serious client issues to the Team Leader in cases of untimely resolution
- Ensure all product information and disclosures are given to clients before and after the call/email requests
- Avoid legal challenges by monitoring compliance with service agreements
- Handle technical escalations through effective diagnosis and troubleshooting of client queries
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements
- If unable to resolve an issue, escalate it to TA & SES in a timely manner
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions
- Troubleshoot all client queries in a user-friendly, courteous and professional manner
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs
- Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client
- Mentor and guide Production Specialists on improving technical knowledge
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists
- Develop and conduct trainings (triages) within products for Production Specialists as per target
- Inform the client about the triages being conducted
- Undertake product trainings to stay current with product features, changes and updates
- Enroll in product-specific and any other trainings per client requirements/recommendations
- Identify and document the most common problems and recommend appropriate resolutions to the team
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks

Mandatory Skills: Pentaho DI - Kettle
Experience: 5-8 Years

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Role
The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Mantas Scenario Developer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team.
Responsibilities
Developing and supporting scalable, extensible, and highly available data solutions
Deliver on critical business priorities while ensuring alignment with the wider architectural vision
Identify and help address potential risks in the data supply chain
Follow and contribute to technical standards
Design and develop analytical data models
Required Qualifications & Work Experience
First Class Degree in Engineering/Technology (4-year graduate course)
5 to 8 years' experience implementing data-intensive solutions using agile methodologies
Experience of relational databases and using SQL for data querying, transformation and manipulation
Experience of modelling data for analytical consumers
Hands-on Mantas expertise throughout the full development life cycle, including requirements analysis, functional design, technical design, programming, testing, documentation, implementation, and ongoing technical support
Translate business needs (BRD) into effective technical solutions and documents (FRD/TSD)
Ability to automate and streamline the build, test and deployment of data pipelines
Experience in cloud native technologies and patterns
A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training
Excellent communication and problem-solving skills
Technical Skills (Must Have)
ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica
Mantas: Expert in Oracle Mantas/FCCM, Scenario Manager, Scenario Development; thorough knowledge and hands-on experience in Mantas FSDM, DIS, Batch Scenario Manager
Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing
Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures
Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala
DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management
Technical Skills (Valuable)
Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler and Conduct>IT, Control>Center, Continuous>Flows
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls
Containerization: Fair understanding of containerization platforms like Docker, Kubernetes
File Formats: Exposure to working with Event/File/Table formats such as Avro, Parquet, Protobuf, Iceberg, Delta
Others: Basics of job schedulers like Autosys. Basics of entitlement management
Certification on any of the above topics would be an advantage.
- Job Family Group: Technology
- Job Family: Digital Software Engineering
- Time Type: Full time
- Most Relevant Skills
Please see the requirements listed above.
- Other Relevant Skills
For complementary skills, please see above and/or contact the recruiter.
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
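As a rough illustration of the pipeline-building and analytical-modelling work this posting describes (not of Oracle's proprietary Mantas toolset itself), here is a minimal PySpark sketch. Paths, column names, and the volume threshold are all hypothetical.

```python
from pyspark.sql import SparkSession, functions as F

# File paths, column names, and thresholds below are illustrative only.
spark = SparkSession.builder.appName("txn-feature-build").getOrCreate()

# Ingest raw transactions (e.g. a daily batch landed by an upstream feed).
txns = spark.read.parquet("/data/raw/transactions/")

# Wrangle: standardise types and derive analytical attributes.
features = (
    txns
    .withColumn("txn_date", F.to_date("txn_ts"))
    .groupBy("account_id", "txn_date")
    .agg(
        F.count("*").alias("txn_count"),
        F.sum("amount").alias("total_amount"),
    )
    .withColumn("high_volume_flag", (F.col("txn_count") > 100).cast("int"))
)

# Publish an analytical model for downstream consumers (e.g. scenario logic).
features.write.mode("overwrite").partitionBy("txn_date").parquet(
    "/data/curated/account_daily_features/"
)
```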

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimisation to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve continuous improvement and optimisation of the managed services process, tools and services.
Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow.
Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
Respond effectively to the diverse perspectives, needs, and feelings of others.
Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems.
Use critical thinking to break down complex concepts.
Understand the broader objectives of your project or role and how your work fits into the overall strategy.
Develop a deeper understanding of the business context and how it is changing.
Use reflection to develop self-awareness, enhance strengths and address development areas.
Interpret data to inform insights and recommendations.
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.
Role: Senior Associate – Data Engineer
Tower: Data Analytics & Insights Managed Service
Experience: 6 - 10 years
Key Skills: Data Engineering
Educational Qualification: Bachelor's degree in computer science/IT or relevant field
Work Location: Bangalore AC
Job Description
As a Managed Services - Data Engineer Senior Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution using Data, Analytics & Insights skills. PwC Professional skills and responsibilities for this management level include but are not limited to:
Use feedback and reflection to develop self-awareness, personal strengths, and address development areas.
Proven track record as an SME in the chosen domain.
Mentor junior resources within the team, conduct KSS and lessons learnt.
Flexible to work in stretch opportunities/assignments.
Demonstrate critical thinking and the ability to bring order to unstructured problems.
Ticket quality and deliverables review.
Status reporting for the project.
Adherence to SLAs; experience in incident management, change management and problem management.
Review your work and that of others for quality, accuracy, and relevance.
Know how and when to use tools available for a given situation and can explain the reasons for this choice.
Seek and embrace opportunities which give exposure to different situations, environments, and perspectives.
Use straightforward communication, in a structured way, when influencing and connecting with others.
Able to read situations and modify behavior to build quality relationships.
Uphold the firm's code of ethics and business conduct.
Demonstrate leadership capabilities by working with clients directly and leading the engagement.
Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
Good team player.
Take up cross-competency work and contribute to COE activities.
Escalation/risk management.
Position Requirements
Required Skills:
Primary Skill: ETL/ELT, SQL, SSIS, SSMS, Informatica, Python
Secondary Skill: Azure/AWS/GCP (preferably any one), Power BI, Advanced Excel, Excel Macro
Data Ingestion Senior Associate
Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc.
Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes; a minimal sketch of this pattern follows the posting.
Should have experience in building efficient ETL/ELT processes using industry-leading tools like Informatica, SSIS, SSMS, AWS, Azure, ADF, GCP, Snowflake, Spark, SQL, Python, etc.
Should have hands-on experience with data analytics tools like Informatica, Hadoop, Spark, etc.
Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage.
Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data.
Should have experience in creating visually impactful dashboards in Tableau for data reporting.
Extract, interpret and analyze data to identify key metrics and transform raw data into meaningful, actionable information.
Good understanding of formulas/DAX, measures, establishing hierarchies, data refresh, row/column/report-level security, report governance, complex visualizations, level-of-detail (LOD) expressions, etc.
Ability to create and replicate functionalities like parameters (for top fields, sheet switching), interactive buttons to switch between dashboards, burger menus, etc.
Participate in requirement gathering with the business and evaluate the data as per the requirement.
Coordinate and manage data analytics & reporting activities with stakeholders.
Expertise in writing and analyzing complex SQL queries.
Excellent problem-solving, design, debugging, and testing skills; competency in Excel (macros, pivot tables, etc.).
Good to have: a minimum of 5 years' hands-on experience of delivering Managed Data and Analytics programs (managed services and managed assets).
Strong communication, problem-solving, quantitative and analytical abilities.
Effectively communicate with project team members and sponsors throughout the project lifecycle (status updates, gaps/risks, roadblocks, testing outcomes).
Nice To Have
Certification in any cloud platform
Experience in data ingestion technology using any of the industry tools like Informatica, Talend, DataStage, etc.
Managed Services - Data, Analytics & Insights
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment. Within our global Managed Services platform, we provide Data, Analytics & Insights Managed Services, where we focus on the evolution of our clients' data, analytics, insights and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective. As a member of our Data, Analytics & Insights Managed Services team, we are looking for candidates who thrive working in a high-paced work environment, capable of working on a mix of critical Application Evolution Service offerings and engagements including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort in helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.
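As referenced above, a minimal sketch of the ETL pattern this role builds, using pandas and SQLAlchemy. The connection string, file layout, and staging-table name are placeholders; a production pipeline would add logging, retries, and monitoring.

```python
import pandas as pd
from sqlalchemy import create_engine

# Connection string, file path, and schema are placeholders.
engine = create_engine("postgresql://user:pass@warehouse-host/analytics")

def extract(path: str) -> pd.DataFrame:
    return pd.read_csv(path, parse_dates=["order_date"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.dropna(subset=["order_id", "customer_id"])   # basic quality gate
    df["order_month"] = df["order_date"].dt.to_period("M").astype(str)
    df["net_amount"] = df["gross_amount"] - df["discount"]
    return df

def load(df: pd.DataFrame) -> None:
    # Append into a staging table; a separate controlled step would merge
    # it into the reporting model.
    df.to_sql("stg_orders", engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract("/landing/orders_2024-06-01.csv")))
```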

Posted 3 weeks ago

Apply

2.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. As a Guidewire developer at PwC, you will specialise in developing and customising applications using the Guidewire platform. Guidewire is a software suite that provides insurance companies with tools for policy administration, claims management, and billing. You will be responsible for designing, coding, and testing software solutions that meet the specific needs of insurance organisations.
Focused on relationships, you are building meaningful client connections, and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise and awareness of your strengths. You are expected to anticipate the needs of your teams and clients, and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn’t clear, you ask questions, and you use these moments as opportunities to grow.
Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
Respond effectively to the diverse perspectives, needs, and feelings of others.
Use a broad range of tools, methodologies and techniques to generate new ideas and solve problems.
Use critical thinking to break down complex concepts.
Understand the broader objectives of your project or role and how your work fits into the overall strategy.
Develop a deeper understanding of the business context and how it is changing.
Use reflection to develop self-awareness, enhance strengths and address development areas.
Interpret data to inform insights and recommendations.
Uphold and reinforce professional and technical standards (e.g. refer to specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.
A career in our Managed Services team will provide you an opportunity to collaborate with a wide array of teams to help our clients implement and operate new capabilities, achieve operational efficiencies, and harness the power of technology. Our Application Evolution Services (formerly Application Managed Services) team will provide you with the opportunity to help organizations harness the power of their enterprise applications by optimizing the technology while driving transformation and innovation to increase business performance. We assist our clients in capitalizing on technology improvements, implementing new capabilities, and achieving operational efficiencies by managing and maintaining their application ecosystems. We help our clients maximize the value of their investment by managing the support and continuous transformation of their solutions in the areas of sales, finance, supply chain, engineering, manufacturing and human capital.
To really stand out and make us fit for the future in a constantly changing world, each and every one of us at PwC needs to be a purpose-led and values-driven leader at every level. To help us achieve this we have the PwC Professional, our global leadership development framework. It gives us a single set of expectations across our lines, geographies and career paths, and provides transparency on the skills we need as individuals to be successful and progress in our careers, now and in the future.
Responsibilities
As an Associate, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution. PwC Professional skills and responsibilities for this management level include but are not limited to:
Encourage everyone to have a voice and invite opinion from all, including quieter members of the team.
Deal effectively with ambiguous and unstructured problems and situations.
Initiate open and candid coaching conversations at all levels.
Move easily between big picture thinking and managing relevant detail.
Anticipate stakeholder needs, and develop and discuss potential solutions, even before the stakeholder realizes they are required.
Contribute functional knowledge in your area of expertise.
Contribute to an environment where people and technology thrive together to accomplish more than they could apart.
Navigate the complexities of cross-border and/or diverse teams and engagements.
Initiate and lead open conversations with teams, clients and stakeholders to build trust.
Uphold the firm's code of ethics and business conduct.
Within our global Managed Services platform, we provide Application Evolution Services (formerly Application Managed Services), where we focus on the evolution of our clients' applications and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions.
Basic Qualifications
Job Requirements and Preferences:
Minimum Degree Required: Bachelor's Degree or equivalent
Preferred Qualifications
Preferred Knowledge/Skills: Demonstrates expert abilities in, and extensive experience with, application managed service projects and solutioning Datahub integration with the Guidewire suite of applications, on premises and SaaS, with proven success executing and leading all aspects of complex engagements within the Datahub application, achieving on-time and on-budget delivery, as well as the following:
2+ years of experience as a data analyst for Datahub and its integration and reporting tools
Strong understanding of data warehouse concepts and data mapping for integrations
Good knowledge of SQL queries, analytical services, reporting services
Experience with one or more SDLC methodologies
Expertise related to metadata management, data modeling, data model rationalization, and database products
Understands the context of the project within the larger portfolio
Demonstrated strong attention to detail
Possesses strong analytical skills
Demonstrated a strong sense of ownership and commitment to program goals
Strong verbal and written communication skills
Identifies and captures operational database requirements and proposed enhancements in support of requested application development or business functionality
Develops and translates business requirements into detailed data designs
Maps between systems
Assists development teams and QA teams as application data analyst and supports production implementations
Identifies entities, attributes, and referential relationships for data models using a robust enterprise information engineering approach
Participates in data analysis, archiving, database design, and development activities for migration of existing data as needed
Develops ETL interfaces from source databases and systems to the data warehouse
Works closely with application development teams to ensure quality interaction with the database
Job Functions
To be responsible for providing functional guidance or solutions
To develop and guide the team members in enhancing their functional understanding and increasing productivity
To ensure process compliance in the assigned module, and participate in functional discussions or reviews
To prepare and submit status reports for minimizing exposure and risks on the project or closure of escalations
Technologies
Guidewire Datahub, integration with the Guidewire suite of applications, and conversion ETL tools
SQL competence and a grasp of database structure are required
Understanding of data modeling concepts
Knowledge of at least one ETL tool (Informatica, Snowflake, SSIS, Talend, etc.)
At PwC, our work model includes three ways of working: virtual, in-person, and flex (a hybrid of in-person and virtual). Visit the following link to learn more: https://pwc.to/ways-we-work. PwC does not intend to hire experienced or entry level job seekers who will need, now or in the future, PwC sponsorship through the H-1B lottery, except as set forth within the following policy: https://pwc.to/H-1B-Lottery-Policy. All qualified applicants will receive consideration for employment at PwC without regard to race; creed; color; religion; national origin; sex; age; disability; sexual orientation; gender identity or expression; genetic predisposition or carrier status; veteran, marital, or citizenship status; or any other status protected by law. PwC is proud to be an affirmative action and equal opportunity employer. For positions based in San Francisco, consideration of qualified candidates with arrest and conviction records will be in a manner consistent with the San Francisco Fair Chance Ordinance. For positions in Colorado, visit the following link for information related to Colorado's Equal Pay for Equal Work Act: https://pwc.to/coloradoadvisoryseniormanager.
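To make the source-to-target mapping work in this posting concrete, here is a small Python sketch of a mapping-driven transform. The legacy column names, target schema, and types are invented for illustration.

```python
import pandas as pd

# A declarative source-to-target mapping of the kind documented in an FRD;
# every column name and type here is invented for illustration.
MAPPING = {
    "POL_NBR":       ("policy_number", "string"),
    "EFF_DT":        ("effective_date", "datetime"),
    "WRTN_PREM_AMT": ("written_premium", "float"),
}

def apply_mapping(source: pd.DataFrame) -> pd.DataFrame:
    """Rename, retype, and select columns according to MAPPING."""
    target = pd.DataFrame(index=source.index)
    for src_col, (tgt_col, tgt_type) in MAPPING.items():
        if src_col not in source.columns:
            raise KeyError(f"expected source column missing: {src_col}")
        col = source[src_col]
        if tgt_type == "datetime":
            target[tgt_col] = pd.to_datetime(col)
        elif tgt_type == "float":
            target[tgt_col] = pd.to_numeric(col)
        else:
            target[tgt_col] = col.astype(str)
    return target

# Example: one legacy policy row flowing through the mapping.
legacy = pd.DataFrame([{"POL_NBR": "P-001", "EFF_DT": "2024-01-01",
                        "WRTN_PREM_AMT": "1250.50"}])
print(apply_mapping(legacy).dtypes)
```

Keeping the mapping as data rather than code makes it reviewable against the design document and easy to extend as new source columns are onboarded.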

Posted 3 weeks ago

Apply

5.0 - 7.0 years

8 - 10 Lacs

Thiruvananthapuram

On-site

5 - 7 Years | 1 Opening | Trivandrum
Role description
Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.
Outcomes:
Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance and performance using design patterns and reusing proven solutions.
Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
Document and communicate milestones/stages for end-to-end delivery.
Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
Validate results with user representatives, integrating the overall solution seamlessly.
Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
Influence and improve customer satisfaction through effective data solutions.
Measures of Outcomes:
Adherence to engineering processes and standards
Adherence to schedule/timelines
Adherence to SLAs where applicable
Number of defects post delivery
Number of non-compliance issues
Reduction of recurrence of known defects
Quick turnaround of production bugs
Completion of applicable technical/domain certifications
Completion of all mandatory training requirements
Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
Average time to detect, respond to, and resolve pipeline failures or data issues
Number of data security incidents or compliance breaches
Outputs Expected:
Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
Project Management: Manage the delivery of modules effectively.
Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation: Create and provide input for effort and size estimation for projects.
Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release Management: Execute and monitor the release process to ensure smooth transitions.
Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.
Skill Examples:
Proficiency in SQL, Python, or other programming languages used for data manipulation.
Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
Experience in performance tuning of data processes.
Expertise in designing and optimizing data warehouses for cost efficiency.
Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
Capacity to clearly explain and communicate design and development aspects to customers.
Ability to estimate time and resource requirements for developing and debugging features or components.
Knowledge Examples:
Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLS.
Proficiency in SQL for analytics, including windowing functions.
Understanding of data schemas and models relevant to various business contexts.
Familiarity with domain-related data and its implications.
Expertise in data warehousing optimization techniques.
Knowledge of data security concepts and best practices.
Familiarity with design patterns and frameworks in data engineering.
Additional Comments:
UST is seeking a highly skilled and motivated Lead Data Engineer to join our Telecommunications vertical, leading impactful data engineering initiatives for US-based Telco clients. The ideal candidate will have 6–8 years of experience in designing and developing scalable data pipelines using Snowflake, Azure Data Factory, and Azure Databricks. Proficiency in Python, PySpark, and advanced SQL is essential, with a strong focus on query optimization, performance tuning, and cost-effective architecture. A solid understanding of data integration, real-time and batch processing, and metadata management is required, along with experience in building robust ETL/ELT workflows. Candidates should demonstrate a strong commitment to data quality, validation, and consistency; working knowledge of data governance, RBAC, encryption, and compliance frameworks is considered a plus. Familiarity with Power BI or similar BI tools is also advantageous, enabling effective data visualization and storytelling.
The role demands the ability to work in a dynamic, fast-paced environment, collaborating closely with stakeholders and cross-functional teams while also being capable of working independently. Strong communication skills and the ability to coordinate across multiple teams and stakeholders are critical for success. In addition to technical expertise, the candidate should bring experience in solution design and architecture planning, contributing to scalable and future-ready data platforms. A proactive mindset, eagerness to learn, and adaptability to the rapidly evolving data engineering landscape, including AI integration into data workflows, are highly valued. This is a leadership role that involves mentoring junior engineers, fostering innovation, and driving continuous improvement in data engineering practices.
Skills: Azure Databricks, Snowflake, Python, Data Engineering
About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
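One cost-control pattern this description alludes to (query optimization and incremental rather than full loads) can be sketched with the Snowflake Python connector. Account details, credentials, table, and watermark column below are placeholders.

```python
import snowflake.connector  # snowflake-connector-python

# Account, credentials, and object names are placeholders.
conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

def incremental_extract(last_watermark: str) -> list[tuple]:
    """Pull only rows changed since the previous run, a common pattern for
    keeping warehouse cost and run time down."""
    cur = conn.cursor()
    try:
        cur.execute(
            "SELECT order_id, status, updated_at "
            "FROM raw_orders WHERE updated_at > %s "
            "ORDER BY updated_at",
            (last_watermark,),
        )
        return cur.fetchall()
    finally:
        cur.close()

rows = incremental_extract("2024-06-01 00:00:00")
print(f"fetched {len(rows)} changed rows")
```

In a real pipeline the watermark would be persisted (e.g. in a control table) and advanced only after a successful load.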

Posted 3 weeks ago

Apply

4.0 - 6.0 years

10 - 15 Lacs

Noida

On-site

Job Title: Data Migration Engineer
Location: Noida
Experience: 4 to 6 Years
Key Responsibilities:
Design and implement scalable ETL pipelines for structured/unstructured data.
Perform data ingestion and transformation using Google BigQuery.
Work with SQL, Python/Shell scripting, and tools like Talend, NiFi, Informatica, or AWS Glue.
Handle migrations across cloud platforms (AWS, GCP, Azure).
Apply best practices in data modeling, schema design, and performance optimization.
Mandatory Key Skills:
ETL & Data Migration
SQL, Python, Shell
Google BigQuery
Talend / NiFi / Informatica / AWS Glue
PostgreSQL, Oracle, MongoDB, Cassandra
AWS / GCP / Azure
Kafka / Debezium / Dataflow (preferred)
Apache Airflow / DBT (preferred)
Data Security & Compliance
For more information, interested candidates can contact 7433085125 directly.
Job Types: Full-time, Permanent
Pay: ₹1,000,000.00 - ₹1,500,000.00 per year
Benefits: Flexible schedule, Health insurance, Life insurance, Provident Fund
Schedule: Day shift, Fixed shift, Monday to Friday
Supplemental Pay: Performance bonus
Work Location: In person
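A minimal sketch of a BigQuery batch ingestion step of the kind this role performs, using the google-cloud-bigquery client library. The project, bucket, and table IDs are placeholders; a migration job would typically pin an explicit schema rather than rely on autodetection.

```python
from google.cloud import bigquery  # google-cloud-bigquery client library

# Project, dataset, bucket, and schema choices below are illustrative.
client = bigquery.Client(project="my-migration-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # let BigQuery infer the schema for a first pass
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://legacy-exports/customers/*.csv",
    "my-migration-project.staging.customers",
    job_config=job_config,
)
load_job.result()  # block until the load job finishes (raises on failure)

table = client.get_table("my-migration-project.staging.customers")
print(f"loaded {table.num_rows} rows")
```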

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Responsibilities
Collect and understand report requirements by working with different SMEs
Create custom tables/views using ETL tools like SSIS or any cloud-based ETL tool to support report/dashboard development
Create datasets using SQL queries and stored procedures, via SSMS or similar tools, with the best performance
Develop reports/dashboards using Power BI
Deploy the reports/DB/ETL objects to the production environment
Monitor the deployed reports/DB/ETL objects and change them in response to source application changes
Communicate the overall report design with both technical and business stakeholders
Establish quality processes to deliver stable and reliable reporting solutions to the client
Production monitoring and support
Qualifications
Excellent SQL skills - write SQL queries, create stored procedures, fix performance issues
Excellent reporting skills - hands-on experience in the Power BI reporting tool
Good ETL knowledge - hands-on experience in any ETL tool like SSIS/Informatica/Talend
Hands-on experience in Snowflake, a cloud-based data platform, is preferred
Understanding of data warehouses, fact and dimension tables, star and snowflake schemas
Banking knowledge (preferred: understanding of deposits and loans)
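As an illustration of the dataset-creation step, here is a short Python sketch that executes a hypothetical SQL Server stored procedure via pyodbc and sanity-checks the result before it would feed a Power BI dataset. The DSN, procedure name, and parameter are invented.

```python
import pandas as pd
import pyodbc

# Connection string, procedure, and parameter are placeholders for the
# client environment.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=reporting-db;DATABASE=ReportingDW;Trusted_Connection=yes;"
)

cursor = conn.cursor()
cursor.execute("EXEC dbo.usp_DepositBalancesByBranch @AsOfDate = ?",
               "2024-06-30")
rows = cursor.fetchall()
columns = [col[0] for col in cursor.description]

# Materialise the result so it can be validated before a report refresh.
df = pd.DataFrame.from_records([tuple(r) for r in rows], columns=columns)
assert not df.empty, "stored procedure returned no rows"
print(df.head())
```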

Posted 3 weeks ago

Apply

3.0 - 6.0 years

0 Lacs

Jaipur, Rajasthan, India

On-site

Unlock yourself. Take your career to the next level. At Atrium, we live and deliver at the intersection of industry strategy, intelligent platforms, and data science, empowering our customers to maximize the power of their data to solve their most complex challenges. We have a unique understanding of the role data plays in the world today and serve as market leaders in intelligent solutions. Our data-driven, industry-specific approach to business transformation for our customers places us uniquely in the market.
Who are you? You are smart, collaborative, and take ownership to get things done. You love to learn and are intellectually curious about business and technology tools, platforms, and languages. You are energized by solving complex problems and bored when you don't have something to do. You love working in teams and are passionate about pulling your weight to make sure the team succeeds.
What will you be doing at Atrium? In this role, you will join the best and brightest in the industry to skillfully push the boundaries of what's possible. You will work with customers to make smarter decisions through innovative problem-solving using data engineering, analytics, and systems of intelligence. You will partner to advise, implement, and optimize solutions through industry expertise, leading cloud platforms, and data engineering. As a Senior Data Engineering Consultant, you will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You will support the software developers, database architects, data analysts, and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects.
The Senior Data Engineering Consultant will:
Create and maintain optimal data pipeline architecture
Assemble large, complex data sets that meet functional/non-functional business requirements
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, AWS, and Big Data technologies
Develop ETL processes to ensure timely delivery of required data for customers
Implement data quality measures to ensure accuracy, consistency, and integrity of data
Design, implement, and maintain data models that can support the organization's data storage and analysis needs
Deliver technical and functional specifications to support data governance and knowledge sharing
In this role, you will have:
B.Tech degree in Computer Science, Software Engineering, or an equivalent combination of relevant work experience and education
3-6 years of experience delivering consulting services to medium and large enterprises. Implementations must have included a combination of the following experiences: Data Warehousing or Big Data consulting for mid-to-large-sized organizations.
Strong analytical skills with a thorough understanding of how to interpret customer business needs and translate those into a data architecture
Strong experience with Snowflake and Data Warehouse architecture; SnowPro Core certification is highly desired
Hands-on experience with Python (Pandas, DataFrames, functions)
Hands-on experience with SQL (stored procedures, functions), including debugging, performance optimization, and database design
Strong experience with Apache Airflow and API integrations (see the sketch after this posting)
Solid experience in any one of the ETL tools (Informatica, Talend, SAP BODS, DataStage, Dell Boomi, MuleSoft, Fivetran, Matillion, etc.)
Nice to have: experience in Docker, DBT, data replication tools (SLT, HVR, Qlik, etc.), shell scripting, Linux commands, AWS S3, or Big Data technologies
Strong project management, problem-solving, and troubleshooting skills with the ability to exercise mature judgment
Enthusiastic, professional, and confident team player with a strong focus on customer success who can present effectively even under adverse conditions
Strong presentation and communication skills
Next Steps
Our recruitment process is highly personalized. Some candidates complete the hiring process in one week, others may take longer, as it's important we find the right position for you. It's all about timing, and it can be a journey as we continue to learn about one another. We want to get to know you and encourage you to be selective; after all, deciding to join a company is a big decision!
At Atrium, we believe a diverse workforce allows us to match our growth ambitions and drive inclusion across the business. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment.
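Since the role calls for Airflow plus API integrations, here is a minimal Airflow 2.x DAG sketch. The DAG id, schedule, and task bodies are placeholders, not a prescribed design.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Task logic is stubbed; in a real project each callable would live in a
# tested module, and connections would come from Airflow's secrets backend.
def extract_from_api(**context):
    print("pull increment from source API")

def load_to_warehouse(**context):
    print("stage and merge into the warehouse")

with DAG(
    dag_id="daily_api_to_warehouse",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_from_api)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load
```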

Posted 3 weeks ago

Apply

5.0 - 10.0 years

25 - 30 Lacs

Chennai

Work from Office

Job Summary: We are seeking a highly skilled Data Engineer to design, develop, and maintain robust data pipelines and architectures. The ideal candidate will transform raw, complex datasets into clean, structured, and scalable formats that enable analytics, reporting, and business intelligence across the organization. This role requires strong collaboration with data scientists, analysts, and cross-functional teams to ensure timely and accurate data availability and system performance.
Key Responsibilities
Design and implement scalable data pipelines to support real-time and batch processing.
Develop and maintain ETL/ELT processes that move, clean, and organize data from multiple sources.
Build and manage modern data architectures that support efficient storage, processing, and access.
Collaborate with stakeholders to understand data needs and deliver reliable solutions.
Perform data transformation, enrichment, validation, and normalization for analysis and reporting.
Monitor and ensure the quality, integrity, and consistency of data across systems.
Optimize workflows for performance, scalability, and cost-efficiency.
Support cloud and on-premise data integrations, migrations, and automation initiatives.
Document data flows, schemas, and infrastructure for operational and development purposes.
Apply best practices in data governance, security, and compliance.
Required Qualifications & Skills:
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
Proven 6+ years' experience in data engineering, ETL development, or data pipeline management.
Proficiency with tools and technologies such as: SQL, Python, Spark, Scala; ETL tools (e.g., Apache Airflow, Talend); cloud platforms (e.g., AWS, GCP, Azure); Big Data tools (e.g., Hadoop, Hive, Kafka); data warehouses (e.g., Snowflake, Redshift, BigQuery).
Strong understanding of data modeling, data architecture, and data lakes.
Experience with CI/CD, version control, and working in Agile environments.
Preferred Qualifications:
Experience with data observability and monitoring tools.
Knowledge of data cataloging and governance frameworks.
AWS/GCP/Azure data certification is a plus.
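A minimal Spark Structured Streaming sketch of the real-time half of the pipeline work described above, reading from Kafka and landing to a raw storage layer. Broker address, topic, and paths are placeholders, and the spark-sql-kafka connector package must be on the Spark classpath.

```python
from pyspark.sql import SparkSession

# Broker addresses, topic, and paths are placeholders.
spark = SparkSession.builder.appName("orders-stream").getOrCreate()

raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "orders")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers bytes; cast the payload and keep the event timestamp.
events = raw.selectExpr("CAST(value AS STRING) AS payload", "timestamp")

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/bronze/orders/")
    .option("checkpointLocation", "/chk/orders/")
    .trigger(processingTime="1 minute")
    .start()
)
query.awaitTermination()
```

The checkpoint location is what gives the stream exactly-once file output on restart; batch consumers would then pick the landed files up on their own schedule.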

Posted 3 weeks ago

Apply