12.0 - 17.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 12 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer Lead, you will design, develop, and maintain data solutions for data generation, collection, and processing. You will create data pipelines, ensure data quality, and implement ETL processes to migrate and deploy data across systems. A typical day involves working on data solutions and ETL processes.

Roles & Responsibilities: Expected to be an SME. Collaborate with and manage the team to perform. Responsible for team decisions. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Lead data architecture design. Implement data integration solutions. Optimize ETL processes.

Professional & Technical Skills:
Must-have skills: Proficiency in Talend ETL. Strong understanding of data modeling. Experience with SQL and database management. Knowledge of cloud platforms like AWS or Azure. Hands-on experience with data warehousing.
Good-to-have skills: Experience with data visualization tools.

Additional Information: The candidate should have a minimum of 12 years of experience in Talend ETL. This position is based at our Hyderabad office. A 15-year full-time education is required.
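The extract-transform-load pattern this role centers on can be sketched in a few lines. Talend itself generates jobs from a visual designer, so this is only an illustrative plain-Python analogue; the field names and cleansing rules are hypothetical, not from any real pipeline.

```python
# Minimal ETL sketch: extract rows from a source, apply data-quality
# rules, load the survivors into a target. In-memory CSV buffers stand
# in for the source and target systems purely for illustration.
import csv
import io

def extract(source):
    """Read raw rows from a CSV source (any file-like object)."""
    return list(csv.DictReader(source))

def transform(rows):
    """Cleanse rows: drop records missing the key, normalize names."""
    out = []
    for row in rows:
        if not row.get("customer_id"):  # data-quality gate: key required
            continue
        row["name"] = row["name"].strip().title()
        out.append(row)
    return out

def load(rows, target):
    """Write the cleaned rows to the target (here: a CSV sink)."""
    writer = csv.DictWriter(target, fieldnames=["customer_id", "name"])
    writer.writeheader()
    writer.writerows(rows)

src = io.StringIO("customer_id,name\n1,  alice SMITH\n,bob jones\n")
dst = io.StringIO()
load(transform(extract(src)), dst)
```

In a real Talend job the same three stages appear as input, processing, and output components wired together in the designer; the value of the pattern is that quality rules sit between extract and load, so bad records never reach the target system.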
Posted 2 months ago
7.0 - 12.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As a Data Engineer, you will be responsible for designing, developing, and maintaining data solutions for data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across systems.

Roles & Responsibilities: Expected to be an SME with deep knowledge and experience. Engage with multiple teams and contribute to key decisions. Expected to provide solutions to problems that apply across multiple teams. Create data pipelines to extract, transform, and load data across systems. Implement ETL processes to migrate and deploy data across systems. Ensure data quality and integrity throughout the data lifecycle.

Professional & Technical Skills:
Required skill: Expert proficiency in Talend Big Data. Strong understanding of data engineering principles and best practices. Experience with data integration and data warehousing concepts. Experience with data migration and deployment. Proficiency in SQL and database management. Knowledge of data modeling and optimization techniques.

Additional Information: The candidate should have a minimum of 5 years of experience in Talend Big Data.
Posted 2 months ago
6.0 - 10.0 years
12 - 15 Lacs
Mumbai, Gurugram, Bengaluru
Work from Office
Skill/Operating Group: Technology Consulting
Level: Manager
Location: Gurgaon/Mumbai/Bangalore
Travel Percentage: Expected travel could be anywhere between 0-100%

Principal Duties and Responsibilities: Working closely with our clients, Consulting professionals design, build and implement strategies that can help enhance business performance. They develop specialized expertise (strategic, industry, functional, technical) in a diverse project environment that offers multiple opportunities for career growth. The opportunities to make a difference within exciting client initiatives are limitless in this ever-changing business landscape. Here are just a few of your day-to-day responsibilities:
Identifying, assessing, and solving complex business problems for your area of responsibility, where analysis of situations or data requires an in-depth evaluation of variable factors.
Overseeing the production and implementation of solutions covering multiple cloud technologies, the associated infrastructure/application architecture, development, and operating models.
Applying your solid understanding of data, data on cloud, and disruptive technologies.
Implementing programs/interventions that prepare the organization for the roll-out of new business processes.
Assisting our clients to build the capabilities for growth and innovation needed to sustain high performance.
Managing multi-disciplinary teams to shape, sell, communicate, and implement programs.
Participating in client presentations and orals for proposal defense.
Effectively communicating the target state, architecture, and topology on cloud to clients.
Providing thought leadership to downstream teams for developing offerings and assets, grounded in a deep understanding of industry best practices in data governance and management.

Qualifications: Bachelor's degree; MBA from a Tier-1 college (preferable). 6-10 years of large-scale consulting experience and/or work with high-tech companies in data governance and data management. DAMA (Data Management) certification.

Experience: We are looking for experienced professionals with information strategy, data governance, data quality, data management, and MDM experience across all stages of the innovation spectrum, with a remit to build the future in real time. The candidate should have practical industry expertise in one of these areas: Financial Services, Retail, Consumer Goods, Telecommunications, Life Sciences, Transportation, Hospitality, Automotive/Industrial, or Mining and Resources.

Key Competencies and Skills: The right candidate should have competency and skills aligned to one or more of these archetypes:
Data SME: Experience in deal shaping and strong presentation skills; experience leading proposals and customer orals; technical understanding of data platforms, data-on-cloud strategy, data strategy, data operating models, change management of data transformation programs, and data modeling.
MDM/DQ/DG Architect: Data governance and management SME for areas including data quality, MDM, metadata, data lineage, and data catalogs; experience with one or more technologies in this space: Collibra, Talend, Informatica, SAP MDG, Stibo, Alteryx, Alation, etc.
Exceptional interpersonal and presentation skills, with the ability to convey technology and business value propositions to stakeholders, and the capacity to develop high-impact thought leadership that articulates a forward-thinking view of the market.

Other desired skills: Strong desire to work in technology-driven business transformation. Strong knowledge of technology trends across IT and digital and how they can be applied to address real-world problems and opportunities. Comfort conveying both high-level and detailed information, adjusting the presentation of ideas to suit varying social styles and audiences. Experience leading proof-of-concept and/or pilot implementations and defining plans to scale them across multiple technology domains. Flexibility to accommodate client travel requirements. Published thought leadership (whitepapers, POVs).
Posted 2 months ago
5.0 - 8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description

The Role: The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities: Develop and support scalable, extensible, and highly available data solutions. Deliver on critical business priorities while ensuring alignment with the wider architectural vision. Identify and help address potential risks in the data supply chain. Follow and contribute to technical standards. Design and develop analytical data models.

Required Qualifications & Work Experience: First Class Degree in Engineering/Technology/MCA. 5 to 8 years' experience implementing data-intensive solutions using agile methodologies. Experience of relational databases and using SQL for data querying, transformation and manipulation. Experience of modelling data for analytical consumers. Ability to automate and streamline the build, test and deployment of data pipelines. Experience in cloud-native technologies and patterns. A passion for learning new technologies, and a desire for personal growth through self-study, formal classes, or on-the-job training. Excellent communication and problem-solving skills.

Technical Skills (Must Have):
ETL: Hands-on experience of building data pipelines, with proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica.
Big Data: Experience of 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing.
Data Warehousing & Database Management: Understanding of data warehousing concepts and of relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design.
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures.
Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala.
DevOps: Exposure to concepts and enablers such as CI/CD platforms, version control, and automated quality control management.

Technical Skills (Valuable):
Ab Initio: Experience developing Co>Op graphs and the ability to tune them for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows.
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc., with a demonstrable understanding of the underlying architectures and trade-offs.
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls.
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes.
File Formats: Exposure to event/file/table formats such as Avro, Parquet, Protobuf, Iceberg and Delta.
Others: Basics of job schedulers like Autosys and of entitlement management.
Certification on any of the above topics would be an advantage.
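The "data quality and controls" skill mentioned above is, at its core, running every record through a set of named rules before loading it. A minimal sketch, with illustrative rule names and record fields that are assumptions rather than any bank's actual schema:

```python
# Hedged sketch of data validation controls: each rule is a named
# predicate; a record's violations are the rules it fails, and the
# batch is split into loadable rows and rejects with reasons.
def validate(record, rules):
    """Return the names of the rules the record violates."""
    return [name for name, check in rules.items() if not check(record)]

rules = {
    "amount_non_negative": lambda r: r["amount"] >= 0,
    "currency_known":      lambda r: r["currency"] in {"USD", "EUR", "INR"},
}

batch = [
    {"amount": 120.0, "currency": "USD"},
    {"amount": -5.0,  "currency": "XXX"},
]

clean   = [r for r in batch if not validate(r, rules)]
rejects = [(r, validate(r, rules)) for r in batch if validate(r, rules)]
```

Keeping the rules in a data structure rather than inline `if` statements is what lets a controls framework report *why* each record was rejected, which is the piece regulators and data stewards actually consume.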
Job Family Group: Technology
Job Family: Digital Software Engineering
Time Type:
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 2 months ago
3.0 - 4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
The Role: The Data Engineer is accountable for developing high-quality data products to support the Bank's regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence, they will contribute to business outcomes on an agile team.

Responsibilities: Develop and support scalable, extensible, and highly available data solutions. Deliver on critical business priorities while ensuring alignment with the wider architectural vision. Identify and help address potential risks in the data supply chain. Follow and contribute to technical standards. Design and develop analytical data models.

Required Qualifications & Work Experience: First Class Degree in Engineering/Technology/MCA. 3 to 4 years' experience implementing data-intensive solutions using agile methodologies. Experience of relational databases and using SQL for data querying, transformation and manipulation. Experience of modelling data for analytical consumers. Ability to automate and streamline the build, test and deployment of data pipelines. Experience in cloud-native technologies and patterns. A passion for learning new technologies, and a desire for personal growth through self-study, formal classes, or on-the-job training. Excellent communication and problem-solving skills.

Technical Skills (Must Have):
ETL: Hands-on experience of building data pipelines, with proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend or Informatica.
Big Data: Exposure to 'big data' platforms such as Hadoop, Hive or Snowflake for data storage and processing.
Data Warehousing & Database Management: Understanding of data warehousing concepts and of relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design.
Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures.
Languages: Proficient in one or more programming languages commonly used in data engineering, such as Python, Java or Scala.
DevOps: Exposure to concepts and enablers such as CI/CD platforms, version control, and automated quality control management.

Technical Skills (Valuable):
Ab Initio: Experience developing Co>Op graphs and the ability to tune them for performance; demonstrable knowledge across the full suite of Ab Initio toolsets, e.g., GDE, Express>IT, Data Profiler, Conduct>IT, Control>Center, Continuous>Flows.
Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc., with a demonstrable understanding of the underlying architectures and trade-offs.
Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls.
Containerization: Fair understanding of containerization platforms like Docker and Kubernetes.
File Formats: Exposure to event/file/table formats such as Avro, Parquet, Protobuf, Iceberg and Delta.
Others: Basics of job schedulers like Autosys and of entitlement management.
Certification on any of the above topics would be an advantage.
Job Family Group: Technology
Job Family: Digital Software Engineering
Time Type:
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 2 months ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Description
Enphase Energy is a global energy technology company and leading provider of solar, battery, and electric vehicle charging products. Founded in 2006, Enphase transformed the solar industry with our revolutionary microinverter technology, which turns sunlight into a safe, reliable, resilient, and scalable source of energy to power our lives. Today, the Enphase Energy System helps people make, use, save, and sell their own power. Enphase is also one of the fastest-growing and most innovative clean energy companies in the world, with approximately 68 million products installed across more than 145 countries. We are building teams that are designing, developing, and manufacturing next-generation energy technologies, and our work environment is fast-paced, fun, and full of exciting new projects. If you are passionate about advancing a more sustainable future, this is the perfect time to join Enphase!

About the role: The Enphase 'Analyst – Procurement' will be involved in the claims process, component capacity and inventory analysis, supplier risk assessments, and other procurement-related analytics. The role is to understand the existing processes in detail and implement an RPA model wherever applicable, and to research the latest processes and procedures available for the procurement function and automate/digitize them. It is a highly challenging role in which you will interact with many stakeholders to solve operational issues. You will be part of the Global Sourcing & Procurement team, reporting to a Lead Analyst.

What you will do:
Perform detailed analysis of component inventory against demand, on-hand, and open-order quantities: use data analytics tools like Power BI or Tableau to visualize inventory data, and apply predictive analytics to forecast demand more accurately.
Automate the consolidation of input data from different contract manufacturers: use ETL (Extract, Transform, Load) tools like Alteryx or Talend, and implement APIs to pull data directly from manufacturers' systems.
Prepare and submit a monthly STD cost file to finance per the corporate calendar: create a standardized template, automate data entry using Excel macros or Python scripts, and set up reminders and workflows in a project management tool to ensure timely submission.
Act as a program manager driving the component qualification process with cross-functional teams so qualifications complete on time and planned cost savings are achieved: use project management software like Jira to track progress and deadlines, and hold regular cross-functional meetings to ensure alignment and address roadblocks.
Finalize the quarterly CBOM (Costed Bill of Materials) and quote files from all contract manufacturers per the CBOM calendar: implement a centralized database to store and manage CBOM data, and use version control to track changes and ensure accuracy.
Manage the claims process with contract manufacturers and suppliers: develop a standardized claims validation process, effectively track and manage claims, and regularly review and update the process to improve efficiency.
Research new processes and best practices in procurement and assess how they can be leveraged in the existing process.
Perform and maintain detailed supplier risk assessments with the help of third-party vendors: regularly review and update risk assessment criteria based on changing market conditions.
Compile supplier pricing trend analysis to support Commodity Managers in their QBRs: create dashboards in BI tools to visualize pricing trends and support decision-making.
Work closely with Commodity Managers to identify potential or NPI suppliers to be evaluated for risk assessment: maintain a database of potential suppliers and their risk assessment results.
Maintain and manage the item master pricing list, refreshing the data at regular intervals without errors: use data validation techniques and automated scripts to ensure accuracy, and implement a regular review process to update and verify pricing data.

Who you are and what you bring:
Any Bachelor's degree, preferably in Engineering, with a minimum of 5 years of experience in supply chain analytics. Very good analytical and problem-solving skills. Hands-on experience with Excel-based automation using MS Power Query, Excel VBA, and Gen AI. Open-minded, with a strong sense of ownership. Strong verbal communication and presentation skills. Strong professional relationship management with internal and external interfaces. Strong interpersonal skills, with a proven ability to communicate effectively, both verbally and in writing, with internal customers and suppliers. Ability to perform effectively and independently in a virtual environment, and to manage job responsibilities with minimal supervision.
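The inventory analysis described above boils down to one recurring calculation: projected net position per component is on-hand stock plus open orders minus demand, with negatives flagging shortfalls. A minimal sketch, using made-up component IDs and quantities:

```python
# Illustrative inventory-vs-demand analysis: net position per component.
# Negative values indicate a projected shortfall to escalate to sourcing.
from collections import defaultdict

def net_positions(on_hand, open_orders, demand):
    """Return projected net quantity per component (negative = shortfall)."""
    net = defaultdict(float)
    for comp, qty in on_hand.items():
        net[comp] += qty
    for comp, qty in open_orders.items():
        net[comp] += qty          # inbound supply already on order
    for comp, qty in demand.items():
        net[comp] -= qty          # forecast consumption
    return dict(net)

positions = net_positions(
    on_hand={"MCU-001": 500, "CAP-220": 1200},
    open_orders={"MCU-001": 300},
    demand={"MCU-001": 1000, "CAP-220": 900},
)
shortfalls = {c: q for c, q in positions.items() if q < 0}
```

In practice the three input dictionaries would be loaded from the consolidated contract-manufacturer feeds rather than hard-coded, and the shortfall list would feed the Power BI or Tableau dashboards the role maintains.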
Posted 2 months ago
4.0 - 6.0 years
13 - 18 Lacs
Pune
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you'll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS.

At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about.

What You'll Do
Take primary ownership in driving both self and team efforts across all phases of the project lifecycle, ensuring alignment with business objectives. Translate business requirements into technical specifications and lead team efforts to design, build, and manage technology solutions that effectively address business problems. Develop and apply advanced statistical models and leverage analytic techniques to utilize data for guiding decision-making for clients and internal teams. Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely project completion. Partner with project and program leads to deliver projects and assist in project management responsibilities, including project planning, people management, staffing, and risk mitigation. Collaborate with team members globally, ensuring seamless communication, sharing responsibilities, and undertaking tasks effectively. Manage a diverse team of skill sets (programmers, cloud analysts, BI developers, reporting, operations, etc.), mentoring and coaching junior members to enhance their skills and capabilities. Lead task planning and distribution across team members, ensuring timely completion with high quality and providing accurate status reports to senior management. Design custom analyses in programming languages (e.g., R, Python), data visualization tools (e.g., Tableau), and other analytical platforms (e.g., SAS, Visual Basic, Excel) to address client needs. Synthesize and communicate results to clients and internal teams through compelling oral and written presentations. Create project deliverables and implement solutions, while exhibiting a continuous-improvement mindset and the capability to learn new technologies, business domains, and project management processes. Guide and mentor Associates within teams, fostering a collaborative environment and enhancing team performance. Demonstrate advanced problem-solving skills, ensuring the team continuously improves its capabilities and approaches to challenges. Exhibit a proactive approach to decision-making, considering the broader picture, especially regarding technical nuances and strategic planning.

What You'll Bring
Education: Bachelor's or Master's degree in Computer Science, Engineering, MIS, or related fields, with strong academic performance, especially in analytic and quantitative coursework.
Experience: 4-6 years of relevant consulting experience, ideally in medium-to-large-scale technology solution delivery projects.
Technical Skills: 1+ year of hands-on experience in data processing solutions and data modeling, with experience in ETL technologies (e.g., Hadoop, Spark, PySpark, Informatica, Talend, SSIS). Proficiency in programming languages like Python, SQL, Java, and Scala, and an understanding of data structures. Experience with cloud platforms such as AWS, Azure, or GCP, and exposure to distributed computing. Deep expertise in SQL and data management best practices, with a focus on data analytics and visualization.
Consulting/Project Leadership: Proven experience leading project teams and managing end-to-end delivery, mentoring team members, and maintaining high standards. Ability to translate complex data and analytics concepts into accessible presentations and frameworks for both technical and non-technical stakeholders. Deep understanding of data management best practices and data analytics methodologies, ensuring high-quality data insights. Effective in a global team environment, with a readiness to travel as needed.

Perks & Benefits
ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections.

Travel
Travel is a requirement at ZS for client-facing ZSers; the business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures.

Considering applying?
At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law.

To Complete Your Application
Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE.

Find out more at www.zs.com
Posted 2 months ago
2.0 - 7.0 years
10 - 14 Lacs
Gurugram
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it , our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning; bold ideas; courage an d passion to drive life-changing impact to ZS. Our most valuable asset is our people . At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. What you’ll do Build complex solutions for clients using Programing languages, ETL service platform, Cloud, etc. 
Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of project lifecycle to solve business problems across one or more client engagements; Apply appropriate development methodologies (e.g.agile, waterfall) and best practices (e.g.mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments; Collaborate with other team members to leverage expertise and ensure seamless transitions; Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management; Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, and technical architecture (if needed), test cases, and operations management; Bring transparency in driving assigned tasks to completion and report accurate status; Bring Consulting mindset in problem solving, innovation by leveraging technical and business knowledge/ expertise and collaborate across other teams; Assist senior team members, delivery leads in project management responsibilities What you’ll bring Bachelor's degree with specialization in Computer Science, IT or other computer related disciplines with record of academic success; Up to 2 years of relevant consulting industry experience working on small/medium-scale technology solution delivery engagements Experience in ETL interfacing technologies like Informatica, Talend, SSIS, etc. Experience in data warehousing & SQL Exposure to Cloud Platforms will be a plus - AWS, Azure, GCP. 
Additional Skills Strong verbal and written communication skills with ability to articulate results and issues to internal and client teams; Proven ability to work creatively and analytically in a problem-solving environment; Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects; Willingness to travel to other global offices as needed to work with client or other internal project teams. Perks & Benefits ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options and internal mobility paths and collaborative culture empowers you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel Travel is a requirement at ZS for client facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. 
If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At www.zs.com
Posted 2 months ago
2.0 - 7.0 years
10 - 14 Lacs
Pune
Work from Office
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS. Our most valuable asset is our people. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. What you'll do Build complex solutions for clients using programming languages, ETL service platforms, cloud, etc. 
Undertake complete ownership in accomplishing activities and assigned responsibilities across all phases of the project lifecycle to solve business problems across one or more client engagements; Apply appropriate development methodologies (e.g., agile, waterfall) and best practices (e.g., mid-development client reviews, embedded QA procedures, unit testing) to ensure successful and timely completion of assignments; Collaborate with other team members to leverage expertise and ensure seamless transitions; Exhibit flexibility in undertaking new and challenging problems and demonstrate excellent task management; Assist in creating project outputs such as business case development, solution vision and design, user requirements, prototypes, technical architecture (if needed), test cases, and operations management; Bring transparency in driving assigned tasks to completion and report accurate status; Bring a consulting mindset to problem solving and innovation by leveraging technical and business knowledge/expertise, and collaborate across other teams; Assist senior team members and delivery leads in project management responsibilities What you'll bring Bachelor's degree with specialization in Computer Science, IT or other computer-related disciplines with a record of academic success; Up to 2 years of relevant consulting industry experience working on small/medium-scale technology solution delivery engagements; Experience in ETL interfacing technologies like Informatica, Talend, SSIS, etc.; Experience in data warehousing & SQL; Exposure to cloud platforms will be a plus - AWS, Azure, GCP. 
Strong verbal and written communication skills with ability to articulate results and issues to internal and client teams; Proven ability to work creatively and analytically in a problem-solving environment; Ability to work within a virtual global team environment and contribute to the overall timely delivery of multiple projects; Willingness to travel to other global offices as needed to work with client or other internal project teams. Perks & Benefits ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working. A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel Travel is a requirement at ZS for client facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. 
If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At www.zs.com
Posted 2 months ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Title: Data Quality Lead Department: Data Governance / IT Location: Pune / Bangalore Experience: 6-8 yrs Notice period: 30 days Key Responsibilities: Lead the development and implementation of the enterprise-wide Data Quality strategy. Define and monitor key data quality metrics across various business domains. Collaborate with IT and data governance teams to establish and enforce data governance policies and frameworks. Conduct regular data quality assessments to identify gaps and areas for improvement. Implement data cleansing, validation, and enrichment processes to enhance data accuracy and reliability. Preferred Skills: Experience with tools like Informatica, Talend, Collibra, or similar. Familiarity with regulatory requirements. Certification in Data Management or Data Governance.
Posted 2 months ago
5.0 - 8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Role The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team. Responsibilities Developing and supporting scalable, extensible, and highly available data solutions Deliver on critical business priorities while ensuring alignment with the wider architectural vision Identify and help address potential risks in the data supply chain Follow and contribute to technical standards Design and develop analytical data models Required Qualifications & Work Experience First Class Degree in Engineering/Technology/MCA 5 to 8 years’ experience implementing data-intensive solutions using agile methodologies Experience of relational databases and using SQL for data querying, transformation and manipulation Experience of modelling data for analytical consumers Ability to automate and streamline the build, test and deployment of data pipelines Experience in cloud native technologies and patterns A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training Excellent communication and problem-solving skills Technical Skills (Must Have) ETL: Hands on experience of building data pipelines. 
Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica Big Data: Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management Technical Skills (Valuable) Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler and Conduct>IT, Control>Center, Continuous>Flows Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls Containerization: Fair understanding of containerization platforms like Docker, Kubernetes File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta Others: Basics of job schedulers like Autosys. Basics of entitlement management Certification on any of the above topics would be an advantage. 
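The "hands-on experience of building data pipelines" skill above boils down to extract, transform, and load steps. A minimal sketch using Python's built-in sqlite3 as a stand-in for the Oracle/MSSQL/Spark stacks the posting names (table and column names are invented for illustration):

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Toy ETL step: extract raw trades, transform (filter + aggregate), load a summary table."""
    cur = conn.cursor()
    # Extract: a raw source table (in practice read from a source database or files)
    cur.execute("CREATE TABLE IF NOT EXISTS raw_trades (account TEXT, amount REAL, status TEXT)")
    cur.executemany(
        "INSERT INTO raw_trades VALUES (?, ?, ?)",
        [("A1", 100.0, "SETTLED"), ("A1", 50.0, "FAILED"), ("A2", 200.0, "SETTLED")],
    )
    # Transform + Load: aggregate settled trades per account into a target table
    cur.execute("DROP TABLE IF EXISTS trade_summary")
    cur.execute(
        """CREATE TABLE trade_summary AS
           SELECT account, SUM(amount) AS total_settled
           FROM raw_trades
           WHERE status = 'SETTLED'
           GROUP BY account"""
    )
    conn.commit()
    return cur.execute("SELECT COUNT(*) FROM trade_summary").fetchone()[0]

conn = sqlite3.connect(":memory:")
rows = run_etl(conn)
```

Production platforms like Ab Initio or Spark add parallelism, restartability, and lineage on top, but the extract/transform/load shape is the same.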
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: ------------------------------------------------------ Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi") invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View the "EEO is the Law" poster. View the EEO is the Law Supplement. View the EEO Policy Statement. View the Pay Transparency Posting
Posted 2 months ago
3.0 - 4.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Role The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team. Responsibilities Developing and supporting scalable, extensible, and highly available data solutions Deliver on critical business priorities while ensuring alignment with the wider architectural vision Identify and help address potential risks in the data supply chain Follow and contribute to technical standards Design and develop analytical data models Required Qualifications & Work Experience First Class Degree in Engineering/Technology/MCA 3 to 4 years’ experience implementing data-intensive solutions using agile methodologies Experience of relational databases and using SQL for data querying, transformation and manipulation Experience of modelling data for analytical consumers Ability to automate and streamline the build, test and deployment of data pipelines Experience in cloud native technologies and patterns A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training Excellent communication and problem-solving skills Technical Skills (Must Have) ETL: Hands on experience of building data pipelines. 
Proficiency in at least one data integration platform such as Ab Initio, Apache Spark, Talend or Informatica Big Data: Exposure to ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing Data Warehousing & Database Management: Understanding of Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design Data Modeling & Design: Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures Languages: Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala DevOps: Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management Technical Skills (Valuable) Ab Initio: Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler and Conduct>IT, Control>Center, Continuous>Flows Cloud: Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs Data Quality & Controls: Exposure to data validation, cleansing, enrichment and data controls Containerization: Fair understanding of containerization platforms like Docker, Kubernetes File Formats: Exposure to working with event/file/table formats such as Avro, Parquet, Protobuf, Iceberg, Delta Others: Basics of job schedulers like Autosys. Basics of entitlement management Certification on any of the above topics would be an advantage. 
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity and affirmative action employer. Qualified applicants will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran. Citigroup Inc. and its subsidiaries ("Citi") invite all qualified interested applicants to apply for career opportunities. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View the "EEO is the Law" poster. View the EEO is the Law Supplement. View the EEO Policy Statement. View the Pay Transparency Posting
Posted 2 months ago
3.0 years
0 Lacs
India
Remote
🚀 We’re Hiring: Data Engineer | Join Our Team! Location: Remote We’re looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics. 🧠 What You'll Do 🔹 Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, and Azure Data Factory 🔹 Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable 🔹 Work on modern data lakehouse architectures and contribute to data governance and quality frameworks 🎯 Tech Stack ☁️ Azure | 🧱 Databricks | 🐍 PySpark | 📊 SQL 👤 What We’re Looking For ✅ 3+ years of experience in data engineering or analytics engineering ✅ Hands-on with cloud data platforms and large-scale data processing ✅ Strong problem-solving mindset and a passion for clean, efficient data design Job Description: Minimum 3 years of experience in modern data engineering/data warehousing/data lakes technologies on cloud platforms like Azure, AWS, GCP, Databricks etc. Azure experience is preferred over other cloud platforms. 5 years of proven experience with SQL, schema design and dimensional data modelling Solid knowledge of data warehouse best practices, development standards and methodologies Experience with ETL/ELT tools like ADF, Informatica, Talend etc., and data warehousing technologies like Azure Synapse, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery etc. Strong experience with big data tools (Databricks, Spark etc.) and programming skills in PySpark and Spark SQL. Be an independent self-learner with a “let’s get this done” approach and the ability to work in a fast-paced and dynamic environment. Excellent communication and teamwork abilities. Nice-to-Have Skills: Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB knowledge. SAP ECC/S4 and HANA knowledge. 
Intermediate knowledge of Power BI; Azure DevOps and CI/CD deployments; cloud migration methodologies and processes.
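The "schema design and dimensional data modelling" requirement in the posting above refers to star-schema design: descriptive attributes live in dimension tables, measures and foreign keys in a fact table. A minimal sketch using Python's built-in sqlite3 (table names and figures are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Dimensions hold descriptive attributes; the fact table holds measures + foreign keys
cur.executescript("""
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, day TEXT);
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (
    date_id    INTEGER REFERENCES dim_date(date_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    revenue    REAL
);
INSERT INTO dim_date VALUES (1, '2024-01-01'), (2, '2024-01-02');
INSERT INTO dim_product VALUES (10, 'widget'), (20, 'gadget');
INSERT INTO fact_sales VALUES (1, 10, 99.0), (1, 20, 25.0), (2, 10, 50.0);
""")
# A typical star join: slice the fact table by a dimension attribute
result = cur.execute("""
    SELECT p.name, SUM(f.revenue) AS total
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
```

The same shape scales to warehouse engines such as Synapse, Redshift, Snowflake or BigQuery; only the DDL dialect and distribution/partitioning options change.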
Posted 2 months ago
5.0 - 7.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Job Title: Assistant Manager - Data Engineer Location: Andheri (Mumbai) Job Type: Full-Time Department: IT Position Overview: The Assistant Manager - Data Engineer will play a pivotal role in the design, development, and maintenance of data pipelines that ensure the efficiency, scalability, and reliability of our data infrastructure. This role will involve optimizing and automating ETL/ELT processes, as well as developing and refining databases, data warehouses, and data lakes. As an Assistant Manager, you will also mentor junior engineers and collaborate closely with cross-functional teams to support business goals and drive data excellence. Key Responsibilities: Data Pipeline Development: Design, build, and maintain efficient, scalable, and reliable data pipelines to support data analytics, reporting, and business intelligence initiatives. Database and Data Warehouse Management: Develop, optimize, and manage databases, data warehouses, and data lakes to enhance data accessibility and business decision-making. ETL/ELT Optimization: Automate and optimize data extraction, transformation, and loading (ETL/ELT) processes, ensuring efficient data flow and improved system performance. Data Modeling & Architecture: Develop and maintain data models to enable structured data storage, analysis, and reporting in alignment with business needs. Workflow Management Systems: Implement, optimize, and maintain workflow management tools (e.g., Apache Airflow, Talend) to streamline data engineering tasks and improve operational efficiency. Team Leadership & Mentorship: Guide, mentor, and support junior data engineers to enhance their skills and contribute effectively to projects. Collaboration with Cross-Functional Teams: Work closely with data scientists, analysts, business stakeholders, and IT teams to understand requirements and deliver solutions that align with business objectives. 
Performance Optimization: Continuously monitor and optimize data pipelines and storage solutions to ensure maximum performance and cost efficiency. Documentation & Process Improvement: Create and maintain documentation for data models, workflows, and systems. Contribute to the continuous improvement of data engineering practices. Qualifications: Educational Background: B.E., B.Tech., MCA Professional Experience: At least 5 to 7 years of experience in a data engineering or similar role, with hands-on experience in building and optimizing data pipelines, ETL processes, and database management. Technical Skills: Proficiency in Python and SQL for data processing, transformation, and querying. Experience with modern data warehousing solutions (e.g., Amazon Redshift, Snowflake, Google BigQuery, Azure Data Lake). Strong background in data modeling (dimensional, relational, star/snowflake schema). Hands-on experience with ETL tools (e.g., Apache Airflow, Talend, Informatica) and workflow management systems. Familiarity with cloud platforms (AWS, Azure, Google Cloud) and distributed data processing frameworks (e.g., Apache Spark). Data Visualization & Exploration: Familiarity with data visualization tools (e.g., Tableau, Power BI) for analysis and reporting. Leadership Skills: Demonstrated ability to manage and mentor a team of junior data engineers while fostering a collaborative and innovative work environment. Problem-Solving & Analytical Skills: Strong analytical and troubleshooting skills with the ability to optimize complex data systems for performance and scalability. Experience in Pharma/Healthcare (preferred but not required): Knowledge of the pharmaceutical industry and experience with data in regulated environments. Desired Skills: Familiarity with industry-specific data standards and regulations. Experience working with machine learning models or data science pipelines is a plus. 
Strong communication skills with the ability to present technical data to non-technical stakeholders. Why Join Us: Impactful Work: Contribute to the pharmaceutical industry by improving data-driven decisions that impact public health. Career Growth: Opportunities to develop professionally in a fast-growing industry and company. Collaborative Environment: Work with a dynamic and talented team of engineers, data scientists, and business stakeholders. Competitive Benefits: Competitive salary, health benefits, and more.
Posted 2 months ago
5.0 - 8.0 years
9 - 9 Lacs
Hyderābād
On-site
Title: Data Integration Developer – Assistant Manager Department: Alpha Data Platform Reports To: Data Integration Lead, Engineering Summary: State Street Global Alpha Data Platform lets you load, enrich and aggregate investment data. Alpha Clients will be able to manage multi-asset class data from any service provider or data vendor for a more holistic and integrated view of their holdings. This platform reflects State Street’s years of experience servicing complex instruments for our global client base and our investments in building advanced data management technologies. Reporting to the Alpha Development delivery manager in <
Posted 2 months ago
4.0 years
6 - 9 Lacs
Hyderābād
On-site
About Citco Citco is a global leader in fund services, corporate governance and related asset services with staff across 80 offices worldwide. With more than $1.7 trillion in assets under administration, we deliver end-to-end solutions and exceptional service to meet our clients’ needs. For more information about Citco, please visit www.citco.com About the Team & Business Line: Citco Fund Services is a division of the Citco Group of Companies and is the largest independent administrator of Hedge Funds in the world. Our continuous investment in learning and technology solutions means our people are equipped to deliver a seamless client experience. This position reports to the Loan Services Business Line. As a core member of our Loan Services Data and Reporting team, you will be working with some of the industry’s most accomplished professionals to deliver award-winning services for complex fund structures that our clients can depend upon. Job Duties in Brief: Your Role: Develop and execute database queries and conduct data analyses Create scripts to analyze and modify data, import/export scripts and execute stored procedures Model data by writing SQL queries/Python code to support data integration and dashboard requirements Develop data pipelines that provide fast, optimized, and robust end-to-end solutions Leverage and contribute to design/building relational database schemas for analytics. Handle and manipulate data in various structures and repositories (data cube, data mart, data warehouse, data lake) Analyze, implement and contribute to building APIs to improve the data integration pipeline Perform data preparation tasks including data cleaning, normalization, deduplication, type conversion etc. Perform data integration through extracting, transforming and loading (ETL) data from various sources. 
Identify opportunities to improve processes and strategies with technology solutions and identify development needs in order to improve and streamline operations Create tabular reports, matrix reports, parameterized reports, and visual reports/dashboards in a reporting application such as Power BI Desktop/Cloud or Qlik Integrating PBI/Qlik reports into other applications using embedded analytics like Power BI service (SaaS), or by API automation is also an advantage Implementation of NLP techniques for text representation, semantic extraction techniques, data structures and modelling Contribute to deployment and maintenance of machine learning solutions in production environments Building and designing cloud applications using Microsoft Azure/AWS cloud technologies. About You: Background / Qualifications Bachelor’s Degree in technology/related field or equivalent work experience 4+ Years of SQL and/or Python experience is a must Strong knowledge of data concepts and tools and experience in RDBMS such as MS SQL Server, Oracle etc. Well-versed with concepts and techniques of Business Intelligence and Data Warehousing. Strong database designing and SQL skills; objects development, performance tuning and data analysis In-depth understanding of database management systems, OLAP & ETL frameworks Familiarity or hands-on experience working with REST or SOAP APIs Well versed with concepts for API Management and Integration with various data sources in cloud platforms, to help with connecting to traditional SQL and new-age data sources, such as Snowflake Familiarity with Machine Learning concepts like feature selection/deep learning/AI and ML/DL frameworks (like TensorFlow or PyTorch) and libraries (like scikit-learn, StatsModels) is an advantage Familiarity with BI technologies (e.g. 
Microsoft Power BI, Oracle BI) is an advantage Hands-on experience in at least one ETL tool (SSIS, Informatica, Talend, Glue, Azure Data Factory) and associated data integration principles is an advantage Minimum 1+ years of experience with cloud platform technologies (AWS/Azure), including Azure Machine Learning, is desirable. The following AWS experience is a plus: Implementing identity and access management (IAM) policies Managing user accounts with IAM Knowledge of writing infrastructure as code (IaC) using CloudFormation or Terraform Implementing cloud storage using Amazon Simple Storage Service (S3) Experience with serverless approaches using AWS Lambda, e.g., AWS SAM Configuring Amazon Elastic Compute Cloud (EC2) instances Previous Work Experience: Experience querying databases and strong programming skills: Python, SQL, PySpark etc. Prior experience supporting ETL production environments & web technologies such as XML is an advantage Previous working experience on Azure Data Services including ADF, ADLS, Blob, Databricks, Hive, Python, Spark and/or features of Azure ML Studio, ML Services and MLOps is an advantage Experience with dashboard and reporting applications like Qlik, Tableau, Power BI Other: Well-rounded individual possessing a high degree of initiative Proactive person willing to accept responsibility with very little hand-holding A strong analytical and logical mindset Demonstrated proficiency in interpersonal and communication skills including oral and written English. 
Ability to work in fast-paced, complex Business & IT environments Knowledge of Loan Servicing and/or Loan Administration is an advantage Understanding of Agile/Scrum methodology as it relates to the software development lifecycle What We Offer: A rewarding and challenging environment that spans multiple geographies and multiple business lines Great working environment, competitive salary and benefits, and opportunities for educational support Be part of an industry-leading global organisation, renowned for excellence Opportunities for personal and professional career development Our Benefits Your well-being is of paramount importance to us, and central to our success. We provide a range of benefits, training and education support, and flexible working arrangements to help you achieve success in your career while balancing personal needs. Ask us about specific benefits in your location. We embrace diversity, prioritizing the hiring of people from diverse backgrounds. Our inclusive culture is a source of pride and strength, fostering innovation and mutual respect. Citco welcomes and encourages applications from people with disabilities. Accommodations are available upon request for candidates taking part in all aspects of the selection process.
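The data-preparation duties listed in this posting (cleaning, normalization, deduplication, type conversion) amount to steps like the following stdlib-only sketch; real pipelines would use pandas or PySpark as the posting notes, and the field names here are invented for illustration:

```python
from datetime import datetime

# Hypothetical raw loan records as they might arrive from an upstream extract
raw_rows = [
    {"loan_id": " L-001 ", "balance": "1,250.50", "as_of": "2024-06-30"},
    {"loan_id": "L-001",   "balance": "1,250.50", "as_of": "2024-06-30"},  # duplicate
    {"loan_id": "L-002",   "balance": "980.00",   "as_of": "2024-06-30"},
]

def prepare(rows):
    """Clean, deduplicate, and type-convert raw string records."""
    cleaned, seen = [], set()
    for row in rows:
        loan_id = row["loan_id"].strip()        # cleaning: trim stray whitespace
        if loan_id in seen:                     # deduplication on the business key
            continue
        seen.add(loan_id)
        cleaned.append({
            "loan_id": loan_id,
            # type conversion: "1,250.50" -> 1250.5, "2024-06-30" -> date object
            "balance": float(row["balance"].replace(",", "")),
            "as_of": datetime.strptime(row["as_of"], "%Y-%m-%d").date(),
        })
    return cleaned

prepared = prepare(raw_rows)
```

In pandas the same three steps collapse to `str.strip`, `drop_duplicates`, and `astype`/`to_datetime`, but the logic is identical.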
Posted 2 months ago
5.0 years
6 - 9 Lacs
Gurgaon
On-site
Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title and Summary Senior Analyst, Big Data Analytics & Engineering Overview: Job Title: Sr. Analyst, Data Engineering, Value Quantification Team (Based in Pune, India) About Mastercard: Mastercard is a global technology leader in the payments industry, committed to powering an inclusive, digital economy that benefits everyone, everywhere. By leveraging secure data, cutting-edge technology, and innovative solutions, we empower individuals, financial institutions, governments, and businesses to achieve their potential. Our culture is driven by our Decency Quotient (DQ), ensuring inclusivity, respect, and integrity guide everything we do. Operating across 210+ countries and territories, Mastercard is dedicated to building a sustainable world with priceless opportunities for all. Position Overview: This is a techno-functional position that combines strong technical skills with a deep understanding of business needs and requirements, with 5-7 years of experience. The role focuses on developing and maintaining advanced data engineering solutions for pre-sales value quantification within the Services business unit. As a Sr. Analyst, you will be responsible for creating and optimizing data pipelines, managing large datasets, and ensuring the integrity and accessibility of data to support Mastercard’s internal teams in quantifying the value of services, enhancing customer engagement, and driving business outcomes. 
The role requires close collaboration across teams to ensure data solutions meet business needs and deliver measurable impact. Role Responsibilities: Data Engineering & Pipeline Development: Develop and maintain robust data pipelines to support the value quantification process. Utilize tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx to ensure efficient data integration and transformation. Data Management and Analysis: Manage and analyze large datasets using SQL, Hadoop, and other database management systems. Perform data extraction, transformation, and loading (ETL) to support value quantification efforts. Advanced Analytics Integration: Use advanced analytics techniques, including machine learning algorithms, to enhance data processing and generate actionable insights. Leverage programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development. Business Intelligence and Reporting: Utilize business intelligence platforms such as Tableau and Power BI to create insightful dashboards and reports that communicate the value of services. Generate actionable insights from data to inform strategic decisions and provide clear, data-backed recommendations. Cross-Functional Collaboration & Stakeholder Engagement: Collaborate with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment. Communicate insights and data value through compelling presentations and dashboards to senior leadership and internal teams, ensuring tool adoption and usage. All About You: Data Engineering Expertise: Proficiency in data engineering tools and techniques to develop and maintain data pipelines. Experience with data integration tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx. Advanced SQL Skills: Strong skills in SQL for querying and managing large datasets. 
Experience with database management systems and data warehousing solutions.
Programming Proficiency: Knowledge of programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development.
Business Intelligence and Reporting: Experience in creating insightful dashboards and reports using business intelligence platforms such as Tableau and Power BI.
Statistical Analysis: Ability to perform statistical analysis to identify trends, correlations, and insights that support strategic decision-making.
Cross-Functional Collaboration: Strong collaboration skills to work effectively with Sales, Marketing, Consulting, Product, and other internal teams to understand business needs and ensure successful data solution development and deployment.
Communication and Presentation: Excellent communication skills to convey insights and data value through compelling presentations and dashboards to senior leadership and internal teams.
Execution Focus: A results-driven mindset with the ability to balance strategic vision with tactical execution, ensuring that data solutions are delivered on time and create measurable business value.
Education: Bachelor’s degree in Data Science, Computer Science, Business Analytics, Economics, Finance, or a related field. Advanced degrees or certifications in analytics, data science, AI/ML, or an MBA are preferred.

Why Us?
At Mastercard, you’ll have the opportunity to shape the future of internal operations by leading the development of data engineering solutions that empower teams across the organization. Join us to make a meaningful impact, drive business outcomes, and help Mastercard’s internal teams create better customer engagement strategies through innovative value-based ROI narratives.
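The pipeline work the role describes (extract, cleanse, aggregate) can be sketched with pandas. This is a minimal illustration only, not Mastercard's actual tooling; the columns, service names, and the per-customer value metric are all invented for the example.

```python
import pandas as pd

# Illustrative service-value records; in practice these would be extracted
# from a warehouse table or a feed file rather than built inline.
raw = pd.DataFrame({
    "customer":  ["A", "A", "B", "B", "B", None],
    "service":   ["fraud", "fraud", "fraud", "loyalty", "loyalty", "fraud"],
    "value_usd": [120.0, 80.0, 200.0, 50.0, None, 75.0],
})

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Cleanse, then aggregate delivered value per customer and service."""
    # Data-quality gate: drop records missing a key or a measure.
    cleaned = df.dropna(subset=["customer", "value_usd"])
    return (cleaned.groupby(["customer", "service"], as_index=False)["value_usd"]
                   .sum()
                   .sort_values(["customer", "service"], ignore_index=True))

summary = transform(raw)
print(summary)
```

A real pipeline would wrap steps like this in an orchestrated job (e.g. Azure Data Factory or NiFi, as listed above) and write the summary back to a reporting layer.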
Location: Gurgaon/Pune, India
Employment Type: Full-Time

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard’s security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard’s guidelines.
Posted 2 months ago
5.0 years
5 - 8 Lacs
Chennai
On-site
Candidates should have 5+ years of experience in Talend and be expert in using all Talend components. Strong, hands-on experience in SQL. Good analytical skills for resolving UAT/PROD issues and providing solutions. Should have experience with various file formats such as .xls, .csv, .xml, JSON, etc. Should have experience in resolving performance-related issues and in exception handling techniques. Should have good PL/SQL skills. Fundamental knowledge of Linux and/or Unix.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state of the art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
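The file-format and exception-handling skills this posting lists can be sketched in plain Python. A Talend job would typically do the same with input components and a reject flow; here the field names and feed contents are invented for illustration.

```python
import csv
import io
import json

# Sample feed content; a real job would read .csv/.json/.xml/.xls files instead.
csv_feed = "id,amount\n1,100.50\n2,not_a_number\n3,75.25\n"
json_feed = '[{"id": 4, "amount": 10.0}, {"id": 5}]'

def load_rows(text, fmt):
    """Parse a feed, routing bad records to a reject list instead of failing the run."""
    good, rejects = [], []
    records = (csv.DictReader(io.StringIO(text)) if fmt == "csv"
               else json.loads(text))
    for row in records:
        try:
            good.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except (KeyError, TypeError, ValueError) as err:
            # Exception handling: capture the bad record and the reason, keep going.
            rejects.append((row, str(err)))
    return good, rejects

ok, bad = load_rows(csv_feed, "csv")      # row 2 has a non-numeric amount
ok2, bad2 = load_rows(json_feed, "json")  # second object is missing "amount"
print(len(ok), len(bad), len(ok2), len(bad2))  # 2 1 1 1
```

The reject list is what would feed a PROD-issue investigation: each entry carries the offending record and the parse error that sidelined it.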
Posted 2 months ago
10.0 years
4 - 5 Lacs
Chennai
On-site
The Testing Sr Analyst is a seasoned professional role. Applies in-depth disciplinary knowledge, contributing to the development of new techniques and the improvement of processes and workflow for the area or function. Integrates subject matter and industry expertise within a defined area. Requires in-depth understanding of how areas collectively integrate within the sub-function as well as coordinate and contribute to the objectives of the function and overall business. Evaluates moderately complex and variable issues with substantial potential impact, where development of an approach/taking of an action involves weighing various alternatives and balancing potentially conflicting situations using multiple sources of information. Requires good analytical skills in order to filter, prioritize and validate potentially complex and dynamic material from multiple sources. Strong communication and diplomacy skills are required. Regularly assumes informal/formal leadership role within teams. Involved in coaching and training of new recruits. Significant impact in terms of project size, geography, etc. by influencing decisions through advice, counsel and/or facilitating services to others in area of specialization. Work and performance of all teams in the area are directly affected by the performance of the individual.

Responsibilities: Supports initiatives related to User Acceptance Testing (UAT) process and product rollout into production. Testing specialists who work with technology project managers, UAT professionals and users to design and implement appropriate scripts/plans for an application testing strategy/approach. Tests and analyzes a broad range of systems and applications to ensure they meet or exceed specified standards and end-user requirements. Work closely with key stakeholders to understand business and functional requirements to develop test plans, test cases and scripts. Works complex testing assignments.
Executes test scripts according to application requirements documentation. Identifies defects and recommends appropriate course of action; performs root cause analyses. Coordinates multiple testers and testing activities within a project. Retests after corrections are made to ensure problems are resolved. Documents, evaluates and researches test results for future replication. Identifies, recommends and implements process improvements to enhance testing strategies. Analyzes requirements and design aspects of projects. Interfaces with client leads and development teams. Exhibits sound understanding of concepts and principles in own technical area and a basic knowledge of these elements in other areas. Requires in-depth understanding of how own area integrates within IT testing and has basic commercial awareness. Makes evaluative judgments based on analysis of factual information in complicated and novel situations. Participates in test strategy meetings. Has direct impact on the team and closely related teams by ensuring the quality of the tasks, services, and information provided by self and others. Requires sound and comprehensive communication and diplomacy skills to exchange complex information. Provides metrics related to the cost, effort, and milestones of Quality activities on a project level. Acts as advisor and mentor for junior members of the team. Regularly assumes informal/formal leadership role within teams. Performs other duties and functions as assigned. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.
Qualifications: 10+ years of Testing Analyst experience. Familiarity with the Software Development Lifecycle (SDLC) and how Quality Assurance methodology fits into the SDLC. Knowledge of relevant operating systems, languages and database tools. Knowledge of defect tracking systems and processes, including change management. Knowledge of automated regression testing tools. Experience of testing trading platforms or similar software. Ability to work under pressure during tight deadlines. Requires a methodical approach to testing and problem solving. Requires theoretical and analytical skills, with demonstrated ability in planning and operations. Excellent communication and stakeholder management skills with a proactive attitude, always seeking opportunities to add value. Specific software languages will be dependent on the area of business.

Education: Bachelor’s/University degree or equivalent experience.

We are seeking a highly skilled ETL Automation Tester with strong expertise in complex SQL, file-to-database validation, and data quality assurance. The ideal candidate will have hands-on experience validating various feed file formats (.csv, .json, .xls), and be comfortable with automation frameworks and tools for enhancing test efficiency. This role involves close collaboration with developers, data engineers, and stakeholders to ensure the integrity, consistency, and quality of data across our systems. Experience in testing reporting systems with Cognos/Tableau is required.

Key Responsibilities: Lead the end-to-end validation of ETL processes, including data extraction, transformation, and loading validation across large volumes of structured and semi-structured data. Drive data quality assurance initiatives by defining test strategy, creating comprehensive test plans, and executing test cases based on data mapping documents and transformation logic. Validate file-based feeds (.csv, .json, .xls, etc.) by ensuring accurate ingestion into target data warehouse environments.
Develop and optimize complex SQL queries to perform deep data audits, aggregation checks, and integrity validations across staging and warehouse layers. Own the defect lifecycle using tools like JIRA, providing high-quality defect reporting and traceability across all testing cycles. Collaborate with business analysts, developers, and data architects to ensure test alignment with business expectations and technical design. Perform report-level validations in tools such as Cognos or Tableau, ensuring consistency between backend data and visual representations. Mentor junior testers, review test artifacts, and guide the team in best practices for ETL testing and documentation. Contribute to QA process improvements, testing templates, and governance initiatives to standardize data testing practices across projects. Required Skills: Strong hands-on experience in ETL and data warehouse testing. Advanced proficiency in SQL and strong experience with RDBMS technologies (Oracle, SQL Server, PostgreSQL, etc.). In-depth experience with file-to-database validation and knowledge of various data formats. Proven track record in test strategy design, test planning, and defect management for large-scale data migration or ETL initiatives. Experience with ETL tools like Talend, or custom data processing scripts (tool-specific expertise not mandatory). Strong understanding of data modeling concepts, referential integrity, and transformation rules. Familiarity with Agile methodologies and experience working in fast-paced environments with iterative delivery. Excellent communication, stakeholder management, and documentation skills. Good to Have: Exposure to BI tools like Cognos, Tableau, etc. for end-user report validation. Prior experience in validating front-end UI connected to data dashboards or reports. 
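The file-to-database validation and SQL aggregation audits described above can be sketched as follows. SQLite stands in for the warehouse, and the feed contents, table names, and columns are all illustrative; a real test would point the same audit query at staging and warehouse schemas.

```python
import csv
import io
import sqlite3

# Feed file content (stand-in for a .csv dropped by an upstream system).
feed = "trade_id,region,notional\nT1,EU,100\nT2,EU,250\nT3,US,400\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_trades (trade_id TEXT, region TEXT, notional REAL)")
rows = [(r["trade_id"], r["region"], float(r["notional"]))
        for r in csv.DictReader(io.StringIO(feed))]
conn.executemany("INSERT INTO stg_trades VALUES (?, ?, ?)", rows)

# Target table as loaded by the ETL under test (one row deliberately wrong).
conn.execute("CREATE TABLE dw_trades (trade_id TEXT, region TEXT, notional REAL)")
conn.executemany("INSERT INTO dw_trades VALUES (?, ?, ?)",
                 [("T1", "EU", 100), ("T2", "EU", 250), ("T3", "US", 9999)])

# Aggregation audit: any region whose source and target totals disagree
# is a defect candidate for the tester to raise.
audit_sql = """
SELECT s.region, SUM(s.notional) AS src_total, SUM(d.notional) AS tgt_total
FROM stg_trades s
JOIN dw_trades d ON d.trade_id = s.trade_id AND d.region = s.region
GROUP BY s.region
HAVING SUM(s.notional) <> SUM(d.notional)
"""
mismatches = conn.execute(audit_sql).fetchall()
print(mismatches)  # [('US', 400.0, 9999.0)]
```

An empty result set means the load reconciled; each returned row localizes a discrepancy to a grouping key, which shortens root cause analysis.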
- Job Family Group: Technology - Job Family: Technology Quality - Time Type: Full time - Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 2 months ago
3.0 years
2 - 5 Lacs
Chennai
On-site
Req ID: 324657 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Analyst to join our team in Chennai, Tamil Nadu (IN-TN), India (IN).

Key Responsibilities: Extract, transform, and load (ETL) data from various sources, ensuring data quality, integrity, and accuracy. Perform data cleansing, validation, and preprocessing to prepare structured and unstructured data for analysis. Develop and execute queries, scripts, and data manipulation tasks using SQL, Python, or other relevant tools. Analyze large datasets to identify trends, patterns, and correlations, drawing meaningful conclusions that inform business decisions. Create clear and concise data visualizations, dashboards, and reports to communicate findings effectively to stakeholders. Collaborate with clients and cross-functional teams to gather and understand data requirements, translating them into actionable insights. Work closely with other departments to support their data needs. Collaborate with Data Scientists and other analysts to support predictive modeling, machine learning, and statistical analysis. Continuously monitor data quality and proactively identify anomalies or discrepancies, recommending corrective actions. Stay up-to-date with industry trends, emerging technologies, and best practices to enhance analytical techniques. Assist in the identification and implementation of process improvements to streamline data workflows and analysis.

Basic Qualifications: 3+ years of proficiency in data analysis tools such as Excel, SQL, R, or Python. 3+ years of experience supporting Software Engineering, Data Engineering, or Data Analytics projects. 2+ years of experience leading a team supporting data-related projects to develop end-to-end technical solutions.
Undergraduate or Graduate degree preferred. Ability to travel at least 25%.

Preferred Skills: Strong proficiency in data analysis tools such as Python, SQL, Talend (any ETL). Experience with data visualization tools like PowerBI. Experience with cloud data platforms. Familiarity with ETL (Extract, Transform, Load) processes and tools. Knowledge of machine learning techniques and tools. Experience in a specific industry (e.g., financial services, healthcare, manufacturing) can be a plus. Understanding of data governance and data privacy regulations. Ability to query and manipulate databases and data warehouses. Excellent analytical and problem-solving skills. Strong communication skills with the ability to explain complex data insights to non-technical stakeholders. Detail-oriented with a commitment to accuracy.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us.
This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 2 months ago
8.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Key Responsibilities ETL & BI Testing: Manage the testing of ETL processes, data pipelines, and BI reports to ensure accuracy and reliability. Develop and execute test strategies, test plans, and test cases for data validation. Perform data reconciliation, transformation validation, and SQL-based testing to ensure data correctness. Validate reports and dashboards built using BI tools (Power BI, Tableau). Automate ETL testing where applicable using Python, Selenium, or other automation tools. Identify and log defects, track issues, and ensure timely resolution. Collaborate with business stakeholders to understand data requirements and reporting needs. Assist in documenting functional and non-functional requirements for data transformation and reporting. Support data mapping, data profiling, and understanding business rules applied to datasets. Participate in requirement-gathering sessions and provide inputs on data validation needs. Required Skills & Experience 6–8 years of experience in ETL, Data Warehouse, and BI testing. Strong experience with SQL, data validation techniques, and database testing. Hands-on experience with ETL tools (Informatica, Talend, SSIS, or similar). Proficiency in BI tools like Power BI, Tableau for report validation. Good knowledge of data modeling, star schema, and OLAP concepts. Mentor a team of ETL/BI testers and provide guidance on testing best practices. Coordinate with developers, BAs, and business users to ensure end-to-end data validation. Define QA processes, best practices, and automation strategies to improve testing efficiency. Experience in data reconciliation, transformation logic validation, and data pipeline testing. Experience in Insurance domain is an added advantage. Automation skills for data and report testing (Python, Selenium, or ETL testing frameworks) are a plus. Experience in understanding and documenting business & data requirements. Ability to work with business users to gather and analyze reporting needs. 
Strong analytical and problem-solving skills. Excellent communication and stakeholder management abilities. Experience with Agile/Scrum methodologies and working in a cross-functional team. Preferred Qualifications Experience in cloud-based data platforms (AWS, Azure, GCP) is a plus. ISTQB or equivalent certification in software testing is preferred.
Posted 2 months ago
0 years
0 Lacs
Andhra Pradesh
On-site
DBT: designing and developing technical architecture, data pipelines, and performance scaling, using tools to integrate Talend data and ensure data quality in a big data environment. Very strong on PL/SQL: queries, procedures, JOINs. Snowflake SQL: writing SQL queries against Snowflake and developing scripts in Unix, Python, etc., to perform Extract, Load, and Transform operations. Good to have Talend knowledge and hands-on experience. Candidates who have worked in PROD support would be preferred. Hands-on experience with Snowflake utilities such as SnowSQL, SnowPipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures. Perform data analysis, troubleshoot data issues, and provide technical support to end-users. Develop and maintain data warehouse and ETL processes, ensuring data quality and integrity. Complex problem-solving capability and continuous improvement approach. Desirable to have Talend / Snowflake certification. Excellent SQL coding skills, excellent communication, and documentation skills. Familiar with Agile delivery process. Must be analytical, creative, and self-motivated. Work effectively within a global team environment. Excellent communication skills.
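Snowflake-specific utilities such as SnowPipe, Streams, and Time Travel only run inside Snowflake, but the incremental load-and-merge pattern they typically feed can be sketched portably. In the hedged example below, SQLite's INSERT ... ON CONFLICT stands in for a Snowflake MERGE, the staging table plays the part of a Stream's change rows, and all table and column names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (cust_id TEXT PRIMARY KEY, name TEXT, updated_at TEXT);
INSERT INTO dim_customer VALUES ('C1', 'Acme', '2024-01-01');
-- Newly landed batch (what a Snowflake Stream would expose as change rows):
CREATE TABLE stg_customer (cust_id TEXT, name TEXT, updated_at TEXT);
INSERT INTO stg_customer VALUES ('C1', 'Acme Corp', '2024-02-01'),
                                ('C2', 'Globex', '2024-02-01');
""")

# Upsert the batch into the dimension: insert new keys, update existing ones.
# (The WHERE true clause is SQLite's way of disambiguating ON CONFLICT
# after an INSERT ... SELECT.)
conn.execute("""
    INSERT INTO dim_customer (cust_id, name, updated_at)
    SELECT cust_id, name, updated_at FROM stg_customer WHERE true
    ON CONFLICT(cust_id) DO UPDATE SET
        name = excluded.name,
        updated_at = excluded.updated_at
""")
rows = conn.execute("SELECT * FROM dim_customer ORDER BY cust_id").fetchall()
print(rows)  # [('C1', 'Acme Corp', '2024-02-01'), ('C2', 'Globex', '2024-02-01')]
```

In Snowflake the same step would be a MERGE driven by a Task consuming the Stream, with Time Travel available to audit or roll back a bad batch.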
Posted 2 months ago
0 years
0 Lacs
Andhra Pradesh
On-site
Data Tester Job Description Highlights: 5+ years of experience in data testing.
ETL Testing: Validating the extraction, transformation, and loading (ETL) of data from various sources. Data Validation: Ensuring the accuracy, completeness, and integrity of data in databases and data warehouses. SQL Proficiency: Writing and executing SQL queries to fetch and analyze data; hands-on Snowflake experience. Data Modeling: Understanding data models, data mappings, and architectural documentation. Test Case Design: Creating test cases, test data, and executing test plans. Troubleshooting: Identifying and resolving data-related issues. Dashboard Testing: Validating dashboards for accuracy, functionality, and user experience. Collaboration: Working with developers and other stakeholders to ensure data quality and functionality.

Primary Responsibilities
Dashboard Testing Components: Functional Testing: Simulating user interactions and clicks to ensure dashboards are functioning correctly. Performance Testing: Evaluating dashboard responsiveness and load times. Data Quality Testing: Verifying that the data displayed on dashboards is accurate, complete, and consistent. Usability Testing: Assessing the ease of use and navigation of dashboards. Data Visualization Testing: Ensuring charts, graphs, and other visualizations are accurate and present data effectively. Security Testing: Verifying that dashboards are secure and protect sensitive data.

Tools and Technologies: SQL: Used for querying and validating data. ETL Tools: Tools like Talend, Informatica, or Azure Data Factory used for data extraction, transformation, and loading. Data Visualization Tools: Tableau, Power BI, or other BI tools used for creating and testing dashboards. Testing Frameworks: Frameworks like Selenium or JUnit used for automating testing tasks. Cloud Platforms: AWS platforms used for data storage and processing. Healthcare domain knowledge is a plus.
Secondary Skills: Automation framework, life science domain experience, UI testing, API testing, any other ETL tools.
Posted 2 months ago
0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Role Description
Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.

Outcomes: Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reusing proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others. Interpret requirements, create optimal architecture, and design solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code using best standards, debug, and test solutions to ensure best-in-class quality. Tune performance of code and align it with the appropriate infrastructure, understanding cost implications of licenses and infrastructure. Create data schemas and models effectively. Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes. Validate results with user representatives, integrating the overall solution. Influence and enhance customer satisfaction and employee engagement within project teams.
Measures of Outcomes: Team's adherence to engineering processes and standards. Team's adherence to schedule/timelines. Team's adherence to SLAs where applicable. Team's number of defects post delivery. Team's number of non-compliance issues. Team's reduction of recurrence of known defects. Team's quick turnaround of production bugs. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements. Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times). Team's average time to detect, respond to, and resolve pipeline failures or data issues. Team's number of data security incidents or compliance breaches.

Outputs Expected:
Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for team and peers.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents including design documents, architecture documents, infra costing, business requirements, source-target mappings, test cases, and results.
Configure: Define and govern the configuration management plan. Ensure compliance from the team.
Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team.
Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.
Manage Project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules.
Manage Defects: Perform defect root cause analysis (RCA) and mitigation.
Identify defect trends and implement proactive measures to improve quality.
Estimate: Create and provide input for effort and size estimation and plan resources for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models.
Interface with Customer: Clarify requirements and provide guidance to the Development Team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs.
Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures.
Certifications: Obtain relevant domain and technology certifications.

Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning. Experience in data warehouse design and cost improvements. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Communicate and explain design/development aspects to customers. Estimate time and resource requirements for developing/debugging features/components. Participate in RFP responses and solutioning. Mentor team members and guide them in relevant upskilling and certification.
Knowledge Examples Knowledge Examples Knowledge of various ETL services used by cloud providers including Apache PySpark AWS Glue GCP DataProc/Dataflow Azure ADF and ADLF. Proficient in SQL for analytics and windowing functions. Understanding of data schemas and models. Familiarity with domain-related data. Knowledge of data warehouse optimization techniques. Understanding of data security concepts. Awareness of patterns frameworks and automation practices. Additional Comments Role Proficiency: This role requires proficiency in developing data pipelines including coding and testing for ingesting wrangling transforming and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica Glue Databricks and DataProc with strong coding skills in Python PySpark and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake BigQuery Lakehouse and Delta Lake is essential including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required. Outcomes: Act creatively to develop pipelines/applications by selecting appropriate technical options optimizing application development maintenance and performance through design patterns and reusing proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others. Interpret requirements create optimal architecture and design solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code using best standards debug and test solutions to ensure best-in-class quality. Tune performance of code and align it with the appropriate infrastructure understanding cost implications of licenses and infrastructure. Create data schemas and models effectively. 
- Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes.
- Validate results with user representatives, integrating the overall solution.
- Influence and enhance customer satisfaction and employee engagement within project teams.

Measures of Outcomes:
- Adherence to engineering processes and standards
- Adherence to schedule/timelines
- Adherence to SLAs where applicable
- Number of defects post delivery
- Number of non-compliance issues
- Reduction of recurrence of known defects
- Quick turnaround of production bugs
- Completion of applicable technical/domain certifications
- Completion of all mandatory training requirements
- Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times)
- Average time to detect, respond to, and resolve pipeline failures or data issues
- Number of data security incidents or compliance breaches

Outputs Expected:
Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for team and peers.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, test cases, and results.
Configure: Define and govern the configuration management plan. Ensure compliance from the team.
Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team.
Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value.
Complete relevant domain certifications.
Manage Project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules.
Manage Defects: Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality.
Estimate: Create and provide input for effort and size estimation and plan resources for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models.
Interface with Customer: Clarify requirements and provide guidance to the Development Team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs.
Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures.
Certifications: Obtain relevant domain and technology certifications.
Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and …
Skills: Scala, Python, PySpark
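The Role Proficiency above calls for coding and testing pipelines that ingest, wrangle, transform, and join data. A minimal, dependency-free Python sketch of those steps follows; the record and field names are hypothetical, and a real implementation would typically use PySpark or one of the ETL tools named in the posting:

```python
# Illustrative ingest -> wrangle -> join -> aggregate steps.

orders = [  # "ingested" rows from a hypothetical source system
    {"order_id": 1, "customer_id": 10, "amount": "250.00"},
    {"order_id": 2, "customer_id": 11, "amount": "99.50"},
    {"order_id": 3, "customer_id": 10, "amount": "bad"},  # malformed row
]
customers = {10: "Asha", 11: "Ravi"}  # reference/dimension data

def wrangle(row):
    """Cast amount to float; reject malformed rows (a basic data-quality gate)."""
    try:
        return {**row, "amount": float(row["amount"])}
    except ValueError:
        return None

clean = [r for r in (wrangle(o) for o in orders) if r is not None]

# Join orders to customer names and aggregate spend per customer.
spend = {}
for row in clean:
    name = customers[row["customer_id"]]
    spend[name] = spend.get(name, 0.0) + row["amount"]

print(spend)  # → {'Asha': 250.0, 'Ravi': 99.5}
```

The same shape (validate, join to a dimension, aggregate) maps directly onto a PySpark DataFrame pipeline; only the API changes.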
Posted 2 months ago
0 years
0 Lacs
Andhra Pradesh, India
On-site
Data Tester Job Description

Highlights:
- 5+ years of experience in data testing.
- ETL Testing: Validating the extraction, transformation, and loading (ETL) of data from various sources.
- Data Validation: Ensuring the accuracy, completeness, and integrity of data in databases and data warehouses.
- SQL Proficiency: Writing and executing SQL queries to fetch and analyze data.
- Data Modeling: Understanding data models, data mappings, and architectural documentation.
- Test Case Design: Creating test cases and test data, and executing test plans.
- Troubleshooting: Identifying and resolving data-related issues.
- Dashboard Testing: Validating dashboards for accuracy, functionality, and user experience.
- Collaboration: Working with developers and other stakeholders to ensure data quality and functionality.

Primary Responsibilities (Dashboard Testing Components):
- Functional Testing: Simulating user interactions and clicks to ensure dashboards function correctly.
- Performance Testing: Evaluating dashboard responsiveness and load times.
- Data Quality Testing: Verifying that the data displayed on dashboards is accurate, complete, and consistent.
- Usability Testing: Assessing the ease of use and navigation of dashboards.
- Data Visualization Testing: Ensuring charts, graphs, and other visualizations are accurate and present data effectively.
- Security Testing: Verifying that dashboards are secure and protect sensitive data.

Tools and Technologies:
- SQL: Used for querying and validating data; hands-on Snowflake experience required.
- ETL Tools: Talend, Informatica, or Azure Data Factory, used for data extraction, transformation, and loading.
- Data Visualization Tools: Tableau, Power BI, or other BI tools used for creating and testing dashboards.
- Testing Frameworks: Frameworks like Selenium or JUnit used for automating testing tasks.
- Cloud Platforms: AWS platforms used for data storage and processing.
- Healthcare domain knowledge is a plus.
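The ETL-testing and data-validation duties above commonly reduce to reconciliation checks between a source and a target table. A hedged sketch using Python's built-in sqlite3 (the tables and columns are hypothetical; on the job this would run against Snowflake or the warehouse under test):

```python
import sqlite3

# Hypothetical source and target tables loaded by the ETL under test.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")
data = [(1, 10.0), (2, 20.5), (3, 30.0)]
conn.executemany("INSERT INTO src VALUES (?, ?)", data)
conn.executemany("INSERT INTO tgt VALUES (?, ?)", data)

src_count, src_sum = conn.execute("SELECT COUNT(*), SUM(amount) FROM src").fetchone()
tgt_count, tgt_sum = conn.execute("SELECT COUNT(*), SUM(amount) FROM tgt").fetchone()

# Typical validation checks: completeness (row counts) and accuracy (checksums).
assert src_count == tgt_count, "row count mismatch between source and target"
assert abs(src_sum - tgt_sum) < 1e-9, "amount checksum mismatch"
print("reconciliation passed:", src_count, "rows")
```

Count and checksum comparisons catch dropped or duplicated rows cheaply; column-level profiling and row-by-row diffs are the usual next layer when these pass.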
Secondary Skills: Automation frameworks, life science domain experience, UI testing, API testing, other ETL tools.
Posted 2 months ago