Get alerts for new jobs matching your selected skills, preferred locations, and experience range.
2.0 - 3.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You'll Do
This role will work primarily as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Changes across the AWS, Azure, and Google Cloud platforms. Core responsibilities include, but are not limited to:
- Managing Teradata's as-a-service offering on public cloud (AWS/Azure/GC), with delivery responsibilities in cloud network administration, security administration, instantiation, provisioning, environment optimization, and third-party software support.
- Supporting the onsite teams with customer migrations from on-premises to cloud.
- Implementing security best practices and analyzing partner compatibility.
- Managing and coordinating all activities necessary to implement Changes in the environment.
- Ensuring Change status, progress, and issues are communicated to the appropriate groups.
- Reviewing and implementing the process lifecycle and reporting to upper management.
- Evaluating performance metrics against critical success factors and driving actions to streamline the process.
- Performing Change-related activities documented in the Change Request to ensure the Change is implemented according to plan.
- Documenting closure activities in the Change record and completing the Change record.
- Escalating any deviations from plan to the appropriate TLs/Managers.
- Providing input for the ongoing improvement of the Change Management process.
- Managing and supporting 24x7 VaaS environments for multiple customers.
- Devising and implementing security and operations best practices.
- Implementing development and production environments for the data warehousing cloud environment.
- Backup, archive, and recovery planning and execution for the cloud-based data warehouses across AWS/Azure/GC resources.
- Ensuring SLAs are met while implementing Changes, and ensuring all scheduled Changes are implemented within the prescribed window.
- Acting as the first level of escalation and first level of help/support for team members.

Who You'll Work With
This role will work primarily as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Cases, Incidents, and Changes across the Azure and Google Cloud platforms. It reports to the Delivery Manager for Change Ops.

What Makes You a Qualified Candidate
- Minimum 2-3 years of IT experience in a Systems Administrator / Engineer role.
- Minimum 1 year of hands-on cloud experience (Azure/AWS/GCP).
- Cloud, ITIL, or other relevant certifications are desirable.
- Day-to-day operations experience with ServiceNow or another ITSM tool.
- Must be willing to provide 24x7 on-call support on a rotational basis with the team.
- Must be willing to travel, both short-term and long-term.

What You'll Bring
- 4-year engineering degree or 3-year Master of Computer Applications.
- Excellent oral and written communication skills in English.
- Teradata/DBMS experience: hands-on Teradata administration and a strong understanding of cloud capabilities and limitations.
- Thorough understanding of cloud computing: virtualization technologies; Infrastructure, Platform, and Software as a Service delivery models; and the current competitive landscape.
- Experience implementing and supporting new and existing customers on VaaS infrastructure.
- Thorough understanding of infrastructure (firewalls, load balancers, hypervisors, storage, monitoring, security, etc.) and experience with orchestration to develop a cloud solution.
- Good knowledge of cloud services for compute, storage, network, and OS for at least one of the following cloud platforms: Azure.
- Experience managing responsibilities as a shift lead.
- Experience with enterprise VPN and Azure virtual LAN with a data center.
- Knowledge of monitoring, logging, and cost management tools.
- Hands-on experience with database architecture/modeling, RDBMS, and NoSQL.
- Good understanding of data archive/restore policies.
- Basic Teradata knowledge; VMware certification is an added advantage.
- Working experience in Linux administration and shell scripting.
- Working experience with any RDBMS such as Oracle, DB2, Netezza, Teradata, SQL Server, or MySQL.

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
Posted 5 days ago
15.0 years
0 Lacs
Bhubaneswar, Odisha, India
On-site
Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Data Architecture Principles
Good-to-have skills: Amazon Web Services (AWS), Teradata Vantage, Data Governance
Minimum experience required: 15 years
Educational Qualification: 15 years of full-time education

Summary: As a Data Platform Architect, you will architect the data platform blueprint and implement the design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.

Roles & Responsibilities:
- Expected to be an SME with deep knowledge and experience.
- Should have influencing and advisory skills.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the design and implementation of the data platform architecture.
- Collaborate with cross-functional teams to ensure data platform alignment with business objectives.
- Provide technical guidance and mentorship to junior team members.

Professional & Technical Skills:
- Must-have skills: proficiency in Data Architecture Principles.
- Good-to-have skills: experience with Amazon Web Services (AWS), Teradata Vantage, Data Governance.
- Strong understanding of data architecture principles and best practices.
- Experience in designing and implementing scalable data solutions.
- Knowledge of cloud platforms and data governance frameworks.

Additional Information:
- The candidate should have a minimum of 15 years of experience in Data Architecture Principles.
- This position is based at our Pune office.
- 15 years of full-time education is required.
Posted 5 days ago
2.0 - 3.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You'll Do
This role will work primarily as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Changes across the AWS, Azure, and Google Cloud platforms. Core responsibilities include, but are not limited to:
- Managing Teradata's as-a-service offering on public cloud (AWS/Azure/GC), with delivery responsibilities in cloud network administration, security administration, instantiation, provisioning, environment optimization, and third-party software support.
- Supporting the onsite teams with customer migrations from on-premises to cloud.
- Implementing security best practices and analyzing partner compatibility.
- Managing and coordinating all activities necessary to implement Changes in the environment.
- Ensuring Change status, progress, and issues are communicated to the appropriate groups.
- Reviewing and implementing the process lifecycle and reporting to upper management.
- Evaluating performance metrics against critical success factors and driving actions to streamline the process.
- Performing Change-related activities documented in the Change Request to ensure the Change is implemented according to plan.
- Documenting closure activities in the Change record and completing the Change record.
- Escalating any deviations from plan to the appropriate TLs/Managers.
- Providing input for the ongoing improvement of the Change Management process.
- Managing and supporting 24x7 VaaS environments for multiple customers.
- Devising and implementing security and operations best practices.
- Implementing development and production environments for the data warehousing cloud environment.
- Backup, archive, and recovery planning and execution for the cloud-based data warehouses across AWS/Azure/GC resources.
- Ensuring SLAs are met while implementing Changes, and ensuring all scheduled Changes are implemented within the prescribed window.
- Acting as the first level of escalation and first level of help/support for team members.

Who You'll Work With
This role will work primarily as part of Change Ops for the Cloud Ops L2 team, which is ultimately responsible for all Cases, Incidents, and Changes across the Azure and Google Cloud platforms. It reports to the Delivery Manager for Change Ops.

What Makes You a Qualified Candidate
- Minimum 2-3 years of IT experience in a Systems Administrator / Engineer role.
- Minimum 1 year of hands-on cloud experience (Azure/AWS/GCP).
- Cloud, ITIL, or other relevant certifications are desirable.
- Day-to-day operations experience with ServiceNow or another ITSM tool.
- Must be willing to provide 24x7 on-call support on a rotational basis with the team.
- Must be willing to travel, both short-term and long-term.

What You'll Bring
- 4-year engineering degree or 3-year Master of Computer Applications.
- Excellent oral and written communication skills in English.
- Teradata/DBMS experience: hands-on Teradata administration and a strong understanding of cloud capabilities and limitations.
- Thorough understanding of cloud computing: virtualization technologies; Infrastructure, Platform, and Software as a Service delivery models; and the current competitive landscape.
- Experience implementing and supporting new and existing customers on VaaS infrastructure.
- Thorough understanding of infrastructure (firewalls, load balancers, hypervisors, storage, monitoring, security, etc.) and experience with orchestration to develop a cloud solution.
- Good knowledge of cloud services for compute, storage, network, and OS for at least one of the following cloud platforms: Azure.
- Experience managing responsibilities as a shift lead.
- Experience with enterprise VPN and Azure virtual LAN with a data center.
- Knowledge of monitoring, logging, and cost management tools.
- Hands-on experience with database architecture/modeling, RDBMS, and NoSQL.
- Good understanding of data archive/restore policies.
- Basic Teradata knowledge; VMware certification is an added advantage.
- Working experience in Linux administration and shell scripting.
- Working experience with any RDBMS such as Oracle, DB2, Netezza, Teradata, SQL Server, or MySQL.

Why We Think You'll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are.

Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer. We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
Posted 5 days ago
5.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people, who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

Job Summary
Responsible for planning and designing new software and web applications. Analyzes, tests and assists with the integration of new applications. Documents all development activity. Assists with training non-technical personnel. Has in-depth experience, knowledge and skills in own discipline. Usually determines own work priorities. Acts as a resource for colleagues with less experience.

Job Description
Core Responsibilities
- Create basic user interfaces using HTML, CSS, and JavaScript frameworks.
- Create detailed deployment and configuration documentation.
- Create documentation for application functionality changes.
- Create UML class and component diagrams.
- Create, document, and execute scenario-based tests for application changes.
- Implement software changes in .NET leveraging C#, HTML, and JavaScript (or other languages when applicable) using Visual Studio.
- Create build and release automation.
- Create data access layers using Entity Framework and C#.
- Create, consume, and test web services using C# or Postman.
- Design, document, and implement simple SQL Server databases using documented best practices.
- Design, document, implement, and test simple ETL processes with SQL Server Integration Services.
- Troubleshoot application bugs using Visual Studio, diagnostic log analysis, and Sysinternals tools.
- Use lambda expressions in C# to simplify application code.
- Deploy applications using Azure DevOps Pipelines.
- Additional desired skills: cloud-native platforms such as Azure and AWS, front-end frameworks including Angular and React, and data platforms such as Teradata and MinIO.

Employees At All Levels Are Expected To
- Understand our Operating Principles; make them the guidelines for how you do your job.
- Own the customer experience: think and act in ways that put our customers first, give them seamless digital options at every touchpoint, and make them promoters of our products and services.
- Know your stuff: be enthusiastic learners, users and advocates of our game-changing technology, products and services, especially our digital tools and experiences.
- Win as a team: make big things happen by working together and being open to new ideas.
- Be an active part of the Net Promoter System, a way of working that brings more employee and customer feedback into the company, by joining huddles, making call-backs and helping us elevate opportunities to do better for our customers.
- Drive results and growth.
- Respect and promote inclusion & diversity.
- Do what's right for each other, our customers, investors and our communities.

Disclaimer
This information has been designed to indicate the general nature and level of work performed by employees in this role. It is not designed to contain or be interpreted as a comprehensive inventory of all duties, responsibilities and qualifications. Comcast is proud to be an equal opportunity workplace.
We will consider all qualified applicants for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, veteran status, genetic information, or any other basis protected by applicable law.

Base pay is one part of the Total Rewards that Comcast provides to compensate and recognize employees for their work. Most sales positions are eligible for a Commission under the terms of an applicable plan, while most non-sales positions are eligible for a Bonus. Additionally, Comcast provides best-in-class Benefits to eligible employees. We believe that benefits should connect you to the support you need when it matters most, and should help you care for those who matter most. That's why we provide an array of options, expert guidance and always-on tools that are personalized to meet the needs of your reality, to help support you physically, financially and emotionally through the big milestones and in your everyday life. Please visit the compensation and benefits summary on our careers site for more details.

Education
Bachelor's Degree
While possessing the stated degree is preferred, Comcast also may consider applicants who hold some combination of coursework and experience, or who have extensive related professional experience.

Relevant Work Experience
5-7 Years
Posted 5 days ago
8.0 - 12.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About the company: Making Trade Happen
Coface is a team of 4,500 people of 78 nationalities across nearly 60 countries, all sharing a corporate culture across the world. Together, we work towards one objective: facilitating trade by helping our 50,000 corporate clients develop their businesses. With 75 years of experience, Coface is a leader in the credit insurance and risk management market. We have also developed a range of other value-added services, including factoring, debt collection, Single Risk insurance, bonding, and information services. As a close-knit, international organisation at the core of the global economy, Coface offers an enriching work experience on several levels: relational, professional, and cultural. Every day, our teams are making trade happen. Join us!

MISSION:
We are seeking an experienced and highly motivated professional to join our team in a role focused on stakeholder management, Power BI dashboard development, and data analysis. The ideal candidate will collaborate with cross-functional teams to address data needs, develop actionable insights through advanced Power BI dashboards, and manage complex data landscapes. Key responsibilities include designing and maintaining Power BI reports, ensuring data accuracy, conducting in-depth analysis to identify trends, and presenting findings to senior stakeholders. The role also requires strong communication skills to translate complex technical concepts into clear, actionable insights for non-technical stakeholders.

MAIN RESPONSIBILITIES:
Key requirements:
- Proficiency in Power BI, data visualization, and coding.
- Strong analytical skills and the ability to synthesize insights from large, complex datasets.
- Experience managing stakeholder expectations and providing training on BI tools.
- Transform complex data into easily understandable insights.
- Create multi-dimensional data models aligned with data warehousing best practices.
- Implement row-level security in Power BI with a sound understanding of the application security layer models.
- Data visualization using best practices with a high end-user focus.

Technical Skills:
- 8-12 years of overall experience in software development, including 8+ years of dedicated experience in Power BI.
- Well versed in all BI and DWH (data warehousing) concepts and architecture.
- Experience working with clients in the APAC region, preferably in the insurance industry.
- Familiarity with the tools and technologies of the Microsoft SQL Server BI stack, including SSRS, T-SQL, Power Query, MDX, Power BI, and DAX.
- Power BI technical skills: Power BI Desktop & Service, data modeling (DAX & relationships), Power Query (M language), data visualization & UI design, paginated reports (Power BI Report Builder), Power Automate integration.
- Data skills: Oracle, SQL, data warehousing, data cleansing & transformation.
- Business & analytical skills: requirement gathering, data storytelling, KPI & metrics development.
- Administration & security: row-level security (RLS), Power BI Service administration.
- Design, build, maintain, and map data models to process raw data from unrelated sources.
- Proficiency in financial reporting through Power BI.
- Strong knowledge of Oracle, SQL, and relational databases; expertise in SQL queries, SSRS, and SQL Server Integration Services (SSIS).
- In-depth understanding of the overall development process for the listed tools: data extraction from various data sources (such as SAP ERP, SAP BW, Oracle, Teradata, Snowflake); knowledge of scripts to import data from databases, flat files, and log files.
- Understanding of general accounting principles and financial reporting.

What can we offer you?
Advantages we can outline:
- Flexible working model: up to 3 days of home office per week after the first month.
- Opportunities to learn: a €450 budget every year for training, a languages platform, an e-learning platform, and a dedicated development program.
- Career opportunities: the opportunity to build your career (both locally and internationally) in a large global company, one of the world leaders in its field.
- Diversity, Equity & Inclusion: Coface aims to be a leader in Diversity, Equity, and Inclusion within the trade credit-insurance industry. We are committed to creating an environment where every employee can thrive, fostering a culture of belonging and fairness. By attracting top talent from diverse backgrounds, we strive to be a model for an inclusive employee experience. As an equal opportunity employer, Coface welcomes all qualified applicants without regard to gender, race, ethnicity, sexual orientation, age, beliefs, disability, or any other legally protected characteristics.
Posted 5 days ago
4.0 years
0 Lacs
Hyderābād
On-site
Do you love understanding every detail of how new technologies work? Join the team that serves as Apple's nerve center, our Information Systems and Technology group. There are countless ways you'll contribute here, whether you're coordinating technology needs for product launches, designing music solutions for retail locations, or ensuring the strength of in-store Wi-Fi connections. From Apple Pay to the Apple website to our data centers around the globe, you'll help design and manage the massive systems that countless employees and customers rely on every day. You'll also build custom tools for employees, empowering them to solve complex problems on their own. Join our team, and together we'll explore all the ways to improve how Apple operates, freeing our employees to do what they do best: craft magical experiences for our customers.

The Global Business Intelligence team provides data services, analytics, reporting, and data science solutions to Apple's business groups, including Retail, iTunes, Marketing, AppleCare, Operations, Finance, and Sales. These solutions are built on top of a great data platform and leverage multiple frameworks. This position is an extraordinary opportunity for a proficient, experienced, and driven data platform engineer to solve database design and optimization problems and provide a scalable, high-performance, and dynamic Enterprise Data Warehouse (EDW) platform.

Description
As a Cloud Data Platform Engineer, you will be responsible for leading all aspects of a database platform. This includes database design, database security, DR strategy, developing standard processes, evaluating new features, and analyzing workloads to identify optimization opportunities at the system and application level. You will drive automation efforts to effectively manage the database platform and build self-service solutions for users.
You will also partner with development teams, product managers, and business users to review the solution design being deployed and provide recommendations to optimize and tune. This role will also address any platform-wide performance and stability issues. We're looking for an individual who loves taking on challenges, approaches problems with imaginative solutions, and works well in collaborative teams to build and support a large Enterprise Data Warehouse.

Minimum Qualifications
- 4+ years of experience with database technologies like Snowflake (preferred), Teradata, BigQuery, or Redshift.
- Demonstrated ability working with advanced SQL.
- Experience handling DBA functions, DR strategy, data security, governance, and the associated automation and tooling for a database platform.

Key Qualifications
- Experience with object-oriented programming in Python or Java.
- Ability to analyze production workloads and develop strategies to run a Snowflake database at scale and with efficiency.
- Experience in performance tuning, capacity planning, and managing cloud spend and utilization.
- Experience with SaaS/PaaS enterprise services on GCP/AWS or Azure is a plus.
- Familiarity with in-memory database platforms like SingleStore is a plus.
- Experience with business intelligence (BI) platforms like Tableau, ThoughtSpot, and Business Objects is a plus.
- Good communication and interpersonal skills: ability to interact and work well with members of other functional groups in a project team, and a strong sense of project ownership.

Education & Experience
Bachelor's degree in Computer Science, Engineering, or IT from a reputed school.
Posted 5 days ago
4.0 years
0 Lacs
Hyderābād
On-site
About this role:
Wells Fargo is seeking a Senior Analytics Consultant. The consultant will help support our existing SharePoint solutions and help create new products that support the Sales Practices & Conduct Management and Loudspeaker organizations.

In this role, you will:
- Consult, review and research moderately complex business, operational, and technical challenges that require an in-depth evaluation of variable data factors
- Perform moderately complex data analysis to support and drive strategic initiatives and business needs
- Develop a deep understanding of technical systems and business processes to extract data-driven insights while identifying opportunities for engineering enhancements
- Lead or participate on large cross-group projects
- Mentor less experienced staff
- Collaborate and consult with peers, colleagues, external contractors, and mid-level managers to resolve issues and achieve goals
- Leverage a solid understanding of compliance and risk management requirements for the supported area

Required Qualifications:
- 4+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- SQL and Teradata experience; testing or quality assurance experience
- SDLC (System Development Life Cycle) experience
- Experience with System Integration Testing (SIT)
- Experience with ETL testing
- Strong analytical skills with high attention to detail and accuracy
- Experience in an onshore/offshore support model
- Strong presentation, communication, writing and interpersonal skills
- Experience in Agile methodology and leveraging Jira tools for workflow and productivity management
- Knowledge of Conduct Management data, such as the Enterprise Allegations Platform (EAP) and/or the Allegation Lifecycle/Methodology
- Experience with JIRA development
- Power Platform Developer Associate Certification

Job Expectations:
- Develop and maintain applications: create, customize, and maintain high-quality PowerApps applications, including both canvas and model-driven apps
- Integration: collaborate with Microsoft services such as SharePoint, Teams, and Power Automate to ensure seamless integration and data flow
- Data management: utilize data modeling techniques and manage data within the Common Data Service (CDS) for optimal application performance
- Low-code/no-code development: apply no-code/low-code development strategies to design and build applications rapidly
- User support and optimization: provide ongoing user support, troubleshoot issues, and optimize application performance to meet user needs and business requirements
- Working from office three days a week (Monday, Tuesday and Wednesday), 9-hour shift

Posting End Date: 19 Jun 2025
*Job posting may come down early due to volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic.

Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants with Disabilities
To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy
Wells Fargo maintains a drug-free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements:
a. Third-party recordings are prohibited unless authorized by Wells Fargo.
b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.
Posted 5 days ago
5.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
Job Description: About us* At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us! Global Business Services Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services. Process Overview* GF (Global Finance) Global Financial Control India (GFCI) is part of the CFO Global Delivery strategy to provide offshore delivery to Line of Business (LOBs) and Enterprise Finance functions. 
The capabilities hosted include General Accounting & Reconciliations, Legal Entity Controllership, Corporate Sustainability Controllership, Corporate Controllership, Management Reporting & Analysis, Finance Systems Support, Operational Risk and Controls, Regulatory Reporting and Strategic initiatives. The Financed Emissions Accounting & Reporting team, a part of the Global Financial Control-Corporate Sustainability Controller organization within the CFO Group, plays a critical role in supporting the calculation of asset level balance sheet Financed Emissions, which are integral to the Bank ’s goal of achieving Net-zero greenhouse gas emissions by 2050. Job Description* The role is responsible for building data sourcing process, data research and analytics using available tools, support model input data monitoring and develop necessary data or reporting frameworks to support our approaches to net zero progress alignment, target setting, client engagement and reputational risk review, empowering banking teams to assist clients on net zero financing strategies and specific commercial opportunities. The role will support and partner with business stakeholders in the Enterprise Climate Program Office, Technology, Climate and Credit Risk, the Global Environment Group, Lines of Business, Legal Entity Controllers and Model Risk Management. Additionally, the role will support data governance, lineage, controls by building, improving and executing data processes. Candidate must be able to communicate across technology partners, climate office and the business lines to execute on viable analytical solutions, with a focus on end-user experience and usability. Candidate must be strong in identifying and explaining data quality issues to help achieve successful and validated data for model execution. 
This individual should feel at ease creating complex SQL queries, extracting large, raw datasets from various sources, merging and transforming raw data into usable data and analytic structures, and benchmarking results against known values. They must feel comfortable automating repeatable processes, generating data insights that are easy for end users to interpret, conducting quantitative analysis, and effectively communicating and disseminating findings and data points to stakeholders. They should also understand greenhouse gas accounting frameworks and financed emissions calculations as applied to different sectors and asset classes. The candidate will have experience representing ERA with critical Climate stakeholders across the firm, and should demonstrate capacity for strategic leadership, exercising significant independent judgment and discretion and working towards strategic goals with limited oversight. Essential Functions: Net zero transition planning and execution: Partners with GEG, the Program Office and Lines of Business in developing and executing an enterprise-wide net zero transition plan and operational roadmap, with a focus on analysis and reporting capabilities, data procurement, and liaising with consultants, external data providers, and the Climate Risk and Technology functions. Data development & Operations: Research data requirements, produce executive-level and detailed-level data summaries, validate the accuracy, completeness, reasonableness, and timeliness of the dataset, and develop desktop procedures for BAU operations. Perform data review and test technology implementation for financed emissions deliverables. Execute BAU processes such as new data cycle creation, data controls, and data quality processes. Produce data summary materials and walk the leadership team through them. Data Analytics & Strategy: Analyze the data and explain how granular data movements across history affect new results. Identify trends of data improvement and areas needing improvement.
Develop automated data analysis results and answer common questions to justify changes in the data. Support ad hoc analytics of bank-wide and client net zero commitment implementation, with an initial focus on automation of financed emissions analysis, reporting against PCAF standards, and net zero transition preparedness analytics and engagement to enhance strategy for meeting emissions goals for target sectors. Requirements* Education* Bachelor’s degree in data management or analytics, engineering, sustainability, finance or other related field OR Master’s degree in data science, earth/climate sciences, engineering, sustainability, natural resource management, environmental economics, finance or other related field Certification (if any): NA Experience Range* Minimum 5+ years in statistical and/or data management, analytics, and visualization Two (2) or more years of experience in Climate, Financed Emissions or financial reporting preferred. Foundational Skills* Deep expertise in SQL, Excel, automation & optimization, and project management Knowledge of data architecture concepts, data models, and ETL processes Deep understanding of how data processes work and the ability to solve dynamically evolving, complex data challenges as part of day-to-day activities. Experience extracting and combining data from multiple sources and aggregating data to support model development. Strong documentation & presentation skills to explain the data analysis in a visual and procedural way tailored to the audience. Excellent interpersonal, management, and teamwork skills. Highly motivated self-starter with excellent time management skills and the ability to effectively manage multiple priorities and timelines. Ability to effectively communicate and resolve conflicts through both oral and written communication with both internal and external clients. Ability to think critically to solve problems with rational solutions.
Ability to react and make decisions quickly under pressure with good judgment. Desired Skills* Advanced knowledge of Finance Advanced knowledge of Climate Risk Demonstrated ability to motivate others in a high-stress environment to achieve goals. Ability to adapt to a dynamic and evolving work environment. Ability to quickly identify risks and determine reasonable solutions. Experience in multiple database environments such as Oracle, Hadoop, and Teradata Knowledge of Alteryx, Tableau, and R (knowledge of NLP, data scraping, and generative AI welcome) Work Timings* Window 12:30 PM to 9:30 PM (9-hour shift; may require a stretch during close periods) Job Location* Mumbai
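The SQL work this role describes, merging loan-level data with reference factors and running completeness controls before a financed-emissions calculation, can be sketched as follows. All table names, columns, and factor values are hypothetical stand-ins, not the bank's actual schema:

```python
import sqlite3

# Hypothetical tables: a loans feed joined to an emissions-factor reference.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE loans (loan_id INTEGER PRIMARY KEY, sector TEXT, outstanding REAL);
    CREATE TABLE emission_factors (sector TEXT PRIMARY KEY, factor REAL);
    INSERT INTO loans VALUES (1, 'power', 100.0), (2, 'steel', 50.0), (3, NULL, 25.0);
    INSERT INTO emission_factors VALUES ('power', 0.8), ('steel', 1.2);
""")

# Completeness control: rows that would silently drop out of the calculation.
missing = conn.execute("""
    SELECT COUNT(*) FROM loans l
    LEFT JOIN emission_factors f ON l.sector = f.sector
    WHERE l.sector IS NULL OR f.sector IS NULL
""").fetchone()[0]

# The merge/transform itself: attributed emissions per sector.
rows = conn.execute("""
    SELECT l.sector, ROUND(SUM(l.outstanding * f.factor), 2)
    FROM loans l JOIN emission_factors f ON l.sector = f.sector
    GROUP BY l.sector ORDER BY l.sector
""").fetchall()

print(missing)  # 1 loan fails the control
print(rows)     # [('power', 80.0), ('steel', 60.0)]
```

Running the control before the aggregate is the point: the inner join would otherwise hide the unclassified loan.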
Posted 5 days ago
0 years
0 Lacs
Trivandrum, Kerala, India
Remote
Description Data Engineer Responsibilities: Deliver end-to-end data and analytics capabilities, including data ingest, data transformation, data science, and data visualization in collaboration with Data and Analytics stakeholder groups Design and deploy databases and data pipelines to support analytics projects Develop scalable and fault-tolerant workflows Clearly document issues, solutions, findings and recommendations to be shared internally & externally Learn and apply tools and technologies proficiently, including: Languages: Python, PySpark, ANSI SQL, Python ML libraries Frameworks/Platform: Spark, Snowflake, Airflow, Hadoop, Kafka Cloud Computing: AWS Tools/Products: PyCharm, Jupyter, Tableau, PowerBI Performance optimization for queries and dashboards Develop and deliver clear, compelling briefings to internal and external stakeholders on findings, recommendations, and solutions Analyze client data & systems to determine whether requirements can be met Test and validate data pipelines, transformations, datasets, reports, and dashboards built by the team Develop and communicate solutions architectures and present solutions to both business and technical stakeholders Provide end user support to other data engineers and analysts Candidate Requirements Expert experience in the following (should have/good to have): SQL, Python, PySpark, Python ML libraries. Other programming languages (R, Scala, SAS, Java, etc.) are a plus Data and analytics technologies including SQL/NoSQL/Graph databases, ETL, and BI Knowledge of CI/CD and related tools such as GitLab, AWS CodeCommit, etc. AWS services including EMR, Glue, Athena, Batch, Lambda, CloudWatch, DynamoDB, EC2, CloudFormation, IAM and EDS Exposure to Snowflake and Airflow. Solid scripting skills (e.g., bash/shell scripts, Python) Proven work experience in the following: Data streaming technologies Big Data technologies including Hadoop, Spark, Hive, Teradata, etc.
Linux command-line operations Networking knowledge (OSI network layers, TCP/IP, virtualization) Candidate should be able to lead the team, communicate with the business, and gather and interpret business requirements Experience with agile delivery methodologies using Jira or similar tools Experience working with remote teams AWS Solutions Architect / Developer / Data Analytics Specialty certifications; Professional-level certification is a plus Bachelor's degree in Computer Science or a relevant field; Master's degree is a plus
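The "scalable and fault-tolerant workflows" responsibility above usually comes down to making individual pipeline steps retryable with a bounded back-off. A minimal Python sketch; the step name and failure mode are simulated, not from any particular framework:

```python
import time

def with_retries(max_attempts=3, delay=0.0):
    """Wrap a flaky pipeline step so transient failures are retried."""
    def decorate(step):
        def run(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return step(*args, **kwargs)
                except Exception:
                    if attempt == max_attempts:
                        raise          # exhausted: surface the real error
                    time.sleep(delay)  # back off before retrying
        return run
    return decorate

calls = {"n": 0}

@with_retries(max_attempts=3)
def load_partition():
    # Hypothetical step: fails twice (e.g. a transient network error), then succeeds.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "loaded"

result = load_partition()
print(result, calls["n"])  # loaded 3
```

Orchestrators such as Airflow provide the same behavior declaratively (per-task `retries`), but the contract is identical: steps must be idempotent so a retry cannot double-load data.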
Posted 5 days ago
0 years
4 - 7 Lacs
Gurgaon
On-site
A Software Engineer is curious and self-driven to build and maintain multi-terabyte operational marketing databases and integrate them with cloud technologies. Our databases typically house millions of individuals and billions of transactions and interact with various web services and cloud-based platforms. Once hired, the qualified candidate will be immersed in the development and maintenance of multiple database solutions to meet global client business objectives. Job Description: Key responsibilities: Has 2–4 years of experience. Will work under close supervision of Tech Leads/Lead Devs. Should be able to understand detailed designs with minimal explanation. Individual contributor; will be able to perform mid- to complex-level tasks with minimal supervision. Senior team members will peer review assigned tasks. Build and configure our Marketing Database/Data environment platform by integrating feeds per the detailed design/transformation logic. Good knowledge of Unix scripting and/or Python. Must have strong knowledge of SQL. Good understanding of ETL tools (Talend, Informatica, DataStage, Ab Initio, etc.) as well as database skills (Oracle, SQL Server, Teradata, Vertica, Redshift, Snowflake, BigQuery, Azure DW, etc.). Fair understanding of relational databases, stored procedures, etc. Experience in cloud computing (one or more of AWS, Azure, GCP) will be a plus. Less supervision and guidance from senior resources will be required. Location: DGS India - Gurugram - Golf View Corporate Towers Brand: Merkle Time Type: Full time Contract Type: Permanent
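Integrating a feed "per the detailed design/transformation logic" typically means parsing records, standardizing fields, and routing bad rows to a reject file. A minimal Python sketch with a hypothetical feed layout (the column names and rules are illustrative, not a real client design):

```python
import csv, io

# Hypothetical inbound feed: standardize names, cast amounts,
# and reject rows that fail a simple type control.
feed = io.StringIO(
    "cust_id,name,amount\n"
    "101, alice ,120.50\n"
    "102,BOB,75\n"
    "103,carol,not_a_number\n"
)

rows, rejects = [], []
for rec in csv.DictReader(feed):
    try:
        rows.append({
            "cust_id": int(rec["cust_id"]),
            "name": rec["name"].strip().title(),  # standardize casing/whitespace
            "amount": float(rec["amount"]),
        })
    except ValueError:
        rejects.append(rec["cust_id"])            # route bad rows to a reject file

print(rows[0])   # {'cust_id': 101, 'name': 'Alice', 'amount': 120.5}
print(rejects)   # ['103']
```

The same parse/standardize/reject shape carries over whether the transformation runs in a Unix shell wrapper, a Talend job, or a stored procedure.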
Posted 5 days ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Req ID: 328454 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Intelligence Senior Specialist to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Mandatory Skills: Oracle SQL, PL/SQL, Informatica, Linux scripting, Tidal Desired Skills: Python scripting, Autosys - Responsible for workload prioritization and management of resources, both on service requests and small projects. Maintain and provide status to management and onshore leads - Expert in architecting, designing, developing, and delivering ETL applications using Informatica, Teradata, Oracle, Tidal, and Linux scripting Hands-on code development, source code control, specification writing, and production implementation. Participate in requirement-gathering sessions and guide design and development by providing insights into data sources and peer reviews. Participate in and guide data integration solutions that are needed to fill data gaps Debug data quality issues by analyzing upstream sources and guide the data integration team toward resolutions Work closely with DBAs to fix performance bottlenecks Participate in technology governance groups that define policies and best practices and make design decisions in ongoing projects Mentor junior developers regarding best practices and technology stacks used to build the application - Work closely with Operations and Teradata administration teams for code migrations and production support Provide resource and effort estimates for EDW - ETL & Extract projects. - Experience working as part of a global development team. Should be able to bring innovation that provides value-add to the customer. About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services.
We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
Posted 5 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description Basic Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field of study 5+ Years - Ability to work effectively across organizations, product teams and business partners. 5+ Years - Knowledge of Agile (Scrum) methodology, experience in writing user stories 5+ Years - Strong understanding of database concepts and experience with multiple database technologies – optimizing query and data processing performance. 5+ Years - Full Stack Data Engineering Competency in a public cloud – Google, MS Azure, AWS Critical thinking skills to propose data solutions, test, and make them a reality. 5+ Years - Highly proficient in SQL, Python, Java, Scala, or Go (or similar) - Experience programming engineering transformation in Python or a similar language. 5+ Years - Demonstrated ability to lead data engineering projects, design sessions and deliverables to successful completion. Cloud-native technologist Deep understanding of data service ecosystems including data warehousing, lakes, metadata, meshes, fabrics and AI/ML use cases. User experience advocacy through empathetic stakeholder relationships. Effective communication both internally (with team members) and externally (with stakeholders) Knowledge of Data Warehouse concepts – experience with Data Warehouse/ETL processes Strong process discipline and thorough understanding of IT processes (ISP, Data Security). Responsibilities: Interact with GDIA product lines and business partners to understand data engineering opportunities, tooling and needs.
Collaborate with Data Engineering and Data Architecture to design and build templates, pipelines and data products including automation, transformation and curation using best practices Develop custom cloud solutions and pipelines with GCP native tools – Data Prep, Data Fusion, Data Flow, DBT and Big Query Operationalize and automate data best practices: quality, auditability, timeliness and completeness Participate in design reviews to accelerate the business and ensure scalability Work with Data Engineering and Architecture and Data Platform Engineering to implement strategic solutions Advise and direct team members and business partners on Ford standards and processes. Qualifications Preferred Qualifications: Excellent communication, collaboration and influence skills; ability to energize a team. Knowledge of data, software and architecture operations, data engineering and data management standards, governance and quality Hands-on experience in Python using libraries like NumPy, Pandas, etc. Extensive knowledge and understanding of GCP offerings, bundled services, especially those associated with data operations Cloud Console, BigQuery, DataFlow, DataFusion, PubSub / Kafka, Looker Studio, VertexAI Experience with Teradata, Hadoop, Hive, Spark and other parts of the legacy data platform Experience with recoding, re-developing and optimizing data operations, data science and analytical workflows and products. Data Governance concepts including GDPR (General Data Protection Regulation), CCPA (California Consumer Privacy Act), PoLP and how these can impact technical architecture
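The Data Warehouse concepts the qualifications call for center on dimensional modeling: a fact table keyed to dimensions, serving slice-and-aggregate queries. A minimal star-schema sketch using SQLite as a stand-in (a production mart would live in BigQuery or Teradata; all names and figures are illustrative):

```python
import sqlite3

# Hypothetical sales mart: one fact table joined to two dimensions.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (
        date_key INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        revenue REAL
    );
    INSERT INTO dim_date VALUES (1, 'Jan'), (2, 'Feb');
    INSERT INTO dim_product VALUES (10, 'widgets'), (20, 'gadgets');
    INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 20, 40.0), (2, 10, 60.0);
""")

# The typical query shape a star schema is designed to serve:
# join out to dimensions, group by their attributes, aggregate the fact.
result = db.execute("""
    SELECT d.month, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month, p.category
""").fetchall()
print(result)
```

Keeping descriptive attributes in the dimensions and only keys plus measures in the fact table is what lets these aggregates stay fast as the fact table grows.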
Posted 5 days ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Position: Database Location: Noida, India www.SEW.ai Who We Are: SEW, with its innovative and industry-leading cloud platforms, delivers the best Digital Customer Experiences (CX) and Workforce Experiences (WX), powered by AI, ML, and IoT Analytics, to global energy, water, and gas providers. At SEW, the vision is to Engage, Empower, and Educate billions of people to save energy and water. We partner with businesses to deliver platforms that are easy to use, integrate seamlessly, and help build a strong technology foundation that allows them to become future-ready. Searching for your dream job? We are a true global company that values building meaningful relationships and maintaining a passionate work environment while fostering innovation and creativity. At SEW, we firmly believe that each individual contributes to our success and, in return, we provide opportunities for them to learn new skills and build a rewarding professional career. A Couple of Pointers: • We are the fastest growing company with over 420+ clients and 1550+ employees. • Our clientele is based out of the USA, Europe, Canada, Australia, Asia Pacific, and the Middle East. • Our platforms engage millions of global users, and we keep adding millions every month. • We have been awarded 150+ accolades to date. Our clients are continually awarded by industry analysts for implementing our award-winning product. • We have been featured by Forbes, the Wall Street Journal, and the LA Times for our continuous innovation and excellence in the industry. Who are we looking for? An ideal candidate must demonstrate in-depth knowledge and understanding of RDBMS concepts and be experienced in writing complex queries and data integration processes in SQL/TSQL and NoSQL. This individual will be responsible for helping with the design, development, and implementation of new and existing applications.
Roles and Responsibilities: • Review the existing database design and data management procedures and provide recommendations for improvement. • Provide subject matter expertise in the design of database schemas and perform data modeling (logical and physical models) for product feature enhancements as well as extending analytical capabilities. • Develop technical documentation as needed. • Architect, develop, validate and communicate Business Intelligence (BI) solutions like dashboards, reports, KPIs, instrumentation, and alert tools. • Define data architecture requirements for cross-product integration within and across cloud-based platforms. • Analyze, architect, develop, validate and support integrating data into the SEW platform from external data sources: files (XML, CSV, XLS, etc.), APIs (REST, SOAP), and RDBMS. • Perform thorough analysis of complex data and recommend actionable strategies. • Effectively translate data modeling and BI requirements into the design process. • Big Data platform design, i.e., tool selection, data integration, and data preparation for predictive modeling. Required Skills: • Minimum of 4-6 years of experience in data modeling (including conceptual, logical and physical data models). • 2-3 years of experience in Extraction, Transformation and Loading (ETL) work using data migration tools like Talend, Informatica, Datastage, etc. • 4-6 years of experience as a database developer in Oracle, MS SQL or another enterprise database with a focus on building data integration processes. • Candidate should have exposure to a NoSQL technology, preferably MongoDB. • Experience in processing large data volumes, indicated by experience with Big Data platforms (Teradata, Netezza, Vertica or Cloudera, Hortonworks, SAP HANA, Cassandra, etc.). • Understanding of data warehousing concepts and decision support systems.
• Ability to deal with sensitive and confidential material and adhere to worldwide data security standards. • Experience writing documentation for design and feature requirements. • Experience developing data-intensive applications on cloud-based architectures and infrastructures such as AWS, Azure, etc. • Excellent communication and collaboration skills.
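Integrating an external XML file source, one of the feed types the responsibilities above list, often reduces to flattening elements into rows for a relational staging table. A sketch with a hypothetical meter-readings payload (the element names are invented for illustration):

```python
import xml.etree.ElementTree as ET

# Hypothetical external XML feed of utility meter readings.
payload = """
<readings>
  <meter id="M-1"><kwh>12.5</kwh></meter>
  <meter id="M-2"><kwh>7.25</kwh></meter>
</readings>
"""

# Flatten each <meter> element into a row ready for a staging table.
rows = [
    {"meter_id": m.get("id"), "kwh": float(m.findtext("kwh"))}
    for m in ET.fromstring(payload).findall("meter")
]
print(rows)
```

From here the rows would go through the same validation and load path as any CSV or API source.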
Posted 5 days ago
8.0 years
0 Lacs
Tamil Nadu, India
On-site
Job Title: Data Engineer About VXI VXI Global Solutions is a BPO leader in customer service, customer experience, and digital solutions. Founded in 1998, the company has 40,000 employees in more than 40 locations in North America, Asia, Europe, and the Caribbean. We deliver omnichannel and multilingual support, software development, quality assurance, CX advisory, and automation & process excellence to the world’s most respected brands. VXI is one of the fastest-growing, privately held business services organizations in the United States and the Philippines, and one of the few US-based customer care organizations in China. VXI is also backed by private equity investor Bain Capital. Our initial partnership ran from 2012 to 2016 and was the beginning of prosperous times for the company. During this period, not only did VXI expand our footprint in the US and Philippines, but we also gained ground in the Chinese and Central American markets. We also acquired Symbio, expanding our global technology services offering and enhancing our competitive position. In 2022, Bain Capital re-invested in the organization after completing a buy-out from Carlyle. This is a rare occurrence in the private equity space and shows the level of performance VXI delivers for our clients, employees, and shareholders. With this recent investment, VXI has started on a transformation to radically improve the CX experience through an industry-leading generative AI product portfolio that spans hiring, training, customer contact, and feedback. Job Description: We are seeking talented and motivated Data Engineers to join our dynamic team and contribute to our mission of harnessing the power of data to drive growth and success. As a Data Engineer at VXI Global Solutions, you will play a critical role in designing, implementing, and maintaining our data infrastructure to support our customer experience and management initiatives.
You will collaborate with cross-functional teams to understand business requirements, architect scalable data solutions, and ensure data quality and integrity. This is an exciting opportunity to work with cutting-edge technologies and shape the future of data-driven decision-making at VXI Global Solutions. Responsibilities: Design, develop, and maintain scalable data pipelines and ETL processes to ingest, transform, and store data from various sources. Collaborate with business stakeholders to understand data requirements and translate them into technical solutions. Implement data models and schemas to support analytics, reporting, and machine learning initiatives. Optimize data processing and storage solutions for performance, scalability, and cost-effectiveness. Ensure data quality and integrity by implementing data validation, monitoring, and error handling mechanisms. Collaborate with data analysts and data scientists to provide them with clean, reliable, and accessible data for analysis and modeling. Stay current with emerging technologies and best practices in data engineering and recommend innovative solutions to enhance our data capabilities. Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. Proven 8+ years' experience as a data engineer or similar role Proficiency in SQL, Python, and/or other programming languages for data processing and manipulation. Experience with relational and NoSQL databases (e.g., SQL Server, MySQL, Postgres, Cassandra, DynamoDB, MongoDB, Oracle), data warehousing (e.g., Vertica, Teradata, Oracle Exadata, SAP Hana), and data modeling concepts. Strong understanding of distributed computing frameworks (e.g., Apache Spark, Apache Flink, Apache Storm) and cloud-based data platforms (e.g., AWS Redshift, Azure, Google BigQuery, Snowflake) Familiarity with data visualization tools (e.g., Tableau, Power BI, Looker, Apache Superset) and data pipeline tools (e.g. 
Airflow, Kafka, Data Flow, Cloud Data Fusion, Airbyte, Informatica, Talend) is a plus. Understanding of data and query optimization, query profiling, and query performance monitoring tools and techniques. Solid understanding of ETL/ELT processes, data validation, and data security best practices. Experience with version control systems (Git) and CI/CD pipelines. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills to work effectively with cross-functional teams. Join VXI Global Solutions and be part of a dynamic team dedicated to driving innovation and delivering exceptional customer experiences. Apply now to embark on a rewarding career in data engineering with us!
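Query profiling and optimization, listed in the requirements above, often starts with comparing execution plans before and after adding an index. A minimal sketch using SQLite's EXPLAIN QUERY PLAN as a stand-in (table and index names are illustrative; production work would use the warehouse's own profiler):

```python
import sqlite3

# Hypothetical events table with many rows per user.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (user_id INTEGER, action TEXT)")
db.executemany("INSERT INTO events VALUES (?, ?)",
               [(i % 50, "click") for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the plan description in column 3.
    return " ".join(r[3] for r in db.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM events WHERE user_id = 7"
before = plan(query)   # expect a full table scan
db.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(query)    # expect an index lookup instead

print(before)
print(after)
```

The same habit, read the plan first, then change the schema or query, applies unchanged to Teradata's EXPLAIN or BigQuery's execution details.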
Posted 5 days ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Role: Data Scientist Location: Noida (Hybrid) Required skills and qualifications Bachelor's/Master's/Ph.D. degree in Data Science, Computer Science, Statistics, Mathematics, or a related field 5+ years prior experience in a data science role or related field is preferred Proficiency in programming languages such as Python or R for data analysis and modeling Proficiency with data mining, mathematics, and statistical analysis using Python and R Strong understanding of machine learning techniques, algorithms, and their applications Advanced experience in pattern recognition and predictive modeling Experience with Excel, PowerPoint, Tableau, SQL Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure) is a plus Great problem-solving skills and the ability to translate business questions into data science tasks Responsibilities Identify relevant data sources and sets to mine for client business needs, and collect large structured and unstructured datasets and variables Collaborate with domain experts, engineers, and stakeholders to understand business requirements and translate them into analytical solutions Communicate complex findings and insights to both technical and non-technical audiences through clear visualizations, presentations, and reports Clean, preprocess, and transform data to ensure its quality and suitability for analysis Perform data and error analysis to improve models using Python and R Conduct exploratory data analysis to identify patterns, trends, and anomalies, and translate them into actionable recommendations with clear objectives in mind Continuously evaluate model performance and make improvements based on real-world outcomes Various classical statistical techniques such as regression, multivariate analysis, etc. Time-series-based forecasting and modeling Experience with SQL and data warehousing (e.g.
GCP/Hadoop/Teradata/Oracle/DB2) Experience using tools in BI, ETL, Reporting/Visualization/Dashboards Programming experience in languages like Python strongly desired Exposure to big-data-based analytical solutions and hands-on experience with data lakes/data cleansing/data management. Ability to derive insights from data, provide visualizations, and tell data stories.
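One of the classical statistical techniques listed above, simple linear regression, reduces to a closed-form computation from sums. A minimal pure-Python sketch on toy data (real work would use statsmodels or scikit-learn):

```python
# Ordinary least squares for a single predictor, from closed-form sums.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]  # toy data lying exactly on y = 2x + 1

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
# slope = covariance(x, y) / variance(x)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

print(slope, intercept)  # 2.0 1.0
```

The fitted line recovers the generating relationship exactly because the toy data is noiseless; with real data the same formulas give the least-squares fit.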
Posted 5 days ago
15.0 years
0 Lacs
India
Remote
Job Title: Data Engineer Lead - AEP Location: Remote Experience Required: 12–15 years overall experience 8+ years in Data Engineering 5+ years leading Data Engineering teams Cloud migration & consulting experience (GCP preferred) Job Summary: We are seeking a highly experienced and strategic Lead Data Engineer with a strong background in leading data engineering teams, modernizing data platforms, and migrating ETL pipelines and data warehouses to Google Cloud Platform (GCP). You will work directly with enterprise clients, architecting scalable data solutions, and ensuring successful delivery in high-impact environments. Key Responsibilities: Lead end-to-end data engineering projects including cloud migration of legacy ETL pipelines and Data Warehouses to GCP (BigQuery). Design and implement modern ELT/ETL architectures using Dataform, Dataplex, and other GCP-native services. Provide strategic consulting to clients on data platform modernization, governance, and data quality frameworks. Collaborate with cross-functional teams including data scientists, analysts, and business stakeholders. Define and enforce data engineering best practices, coding standards, and CI/CD processes. Mentor and manage a team of data engineers; foster a high-performance, collaborative team culture. Monitor project progress, ensure delivery timelines, and manage client expectations. Engage in technical pre-sales and solutioning, driving excellence in consulting delivery. Technical Skills & Tools: Cloud Platforms: Strong experience with Google Cloud Platform (GCP) – particularly BigQuery, Dataform, Dataplex, Cloud Composer, Cloud Storage, Pub/Sub. ETL/ELT Tools: Apache Airflow, Dataform, dbt (if applicable). Languages: Python, SQL, Shell scripting. Data Warehousing: BigQuery, Snowflake (optional), traditional DWs (e.g., Teradata, Oracle). DevOps: Git, CI/CD pipelines, Docker. Data Modeling: Dimensional modeling, Data Vault, star/snowflake schemas.
Data Governance & Lineage: Dataplex, Collibra, or equivalent tools.
Monitoring & Logging: Stackdriver, DataDog, or similar.

Preferred Qualifications:
Proven consulting experience with premium clients or Tier 1 consulting firms.
Hands-on experience leading large-scale cloud migration projects.
GCP certification(s) (e.g., Professional Data Engineer, Cloud Architect).
Strong client communication, stakeholder management, and leadership skills.
Experience with agile methodologies and project management tools like JIRA.
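The ELT pattern this role centers on (landing raw data, then transforming inside the warehouse with SQL) can be sketched without cloud access. The snippet below uses an in-memory SQLite database as a stand-in for BigQuery; the table and column names are invented for illustration, not taken from the posting.

```python
import sqlite3

# In-memory SQLite stands in for BigQuery; schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INTEGER, amount_cents INTEGER, country TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 1250, "IN"), (2, 990, "US"), (3, 4100, "IN")],
)

# ELT: the transformation runs inside the warehouse as SQL,
# rather than in an external ETL tool before loading.
conn.execute("""
    CREATE TABLE orders_by_country AS
    SELECT country,
           COUNT(*)                  AS order_count,
           SUM(amount_cents) / 100.0 AS total_amount
    FROM raw_orders
    GROUP BY country
""")

rows = conn.execute(
    "SELECT country, order_count, total_amount FROM orders_by_country ORDER BY country"
).fetchall()
print(rows)  # [('IN', 2, 53.5), ('US', 1, 9.9)]
```

In a real GCP setup the same SQL would be scheduled by Dataform or Cloud Composer against BigQuery tables; the load-then-transform shape is the point here.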
Posted 6 days ago
0.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Role: Senior Analyst - Data Engineering
Experience: 3 to 6 years
Location: Bengaluru, Karnataka, India (BLR)

Job Description: We are seeking a highly experienced and skilled Senior Data Engineer to join our dynamic team. This role requires hands-on experience with databases such as Snowflake and Teradata, as well as advanced knowledge of various data science and AI techniques. The successful candidate will play a pivotal role in driving data-driven decision-making and innovation within our organization.

Job Responsibilities:
Design, develop, and implement advanced machine learning models to solve complex business problems.
Apply AI techniques and generative AI models to enhance data analysis and predictive capabilities.
Utilize Tableau and other visualization tools to create insightful and actionable dashboards for stakeholders.
Manage and optimize large datasets using Snowflake and Teradata databases.
Collaborate with cross-functional teams to understand business needs and translate them into analytical solutions.
Stay updated with the latest advancements in data science, machine learning, and AI technologies.
Mentor and guide junior data scientists, fostering a culture of continuous learning and development.
Communicate complex analytical concepts and results to non-technical stakeholders effectively.

Key Technologies & Skills:
Machine Learning Models: supervised learning, unsupervised learning, reinforcement learning, deep learning, neural networks, decision trees, random forests, support vector machines (SVM), clustering algorithms, etc.
AI Techniques: natural language processing (NLP), computer vision, generative adversarial networks (GANs), transfer learning, etc.
Visualization Tools: Tableau, Power BI, Matplotlib, Seaborn, Plotly, etc.
Databases: Snowflake, Teradata, SQL, NoSQL databases.
Programming Languages: Python (essential), R, SQL.
Python Libraries: TensorFlow, PyTorch, scikit-learn, pandas, NumPy, Keras, SciPy, etc.
Data Processing: ETL processes, data warehousing, data lakes.
Cloud Platforms: AWS, Azure, Google Cloud Platform.
Big Data Technologies: Apache Spark, Hadoop.

Job Snapshot
Updated Date: 11-06-2025
Job ID: J_3679
Location: Bengaluru, Karnataka, India
Experience: 3 - 6 Years
Employee Type: Permanent
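Of the clustering algorithms this role lists, k-means is compact enough to sketch from scratch. The toy 1-D implementation below is purely illustrative (in practice a library such as scikit-learn would be used); the data is invented.

```python
import random

def kmeans_1d(points, k, iters=20, seed=0):
    """Plain k-means on 1-D data: assign each point to its nearest
    centroid, then move each centroid to the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Keep a centroid in place if its cluster emptied out.
        centroids = [
            sum(c) / len(c) if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return sorted(centroids)

# Two well-separated toy groups around 1 and 10.
data = [1.0, 2.0, 0.0, 9.0, 10.0, 11.0]
print(kmeans_1d(data, k=2))  # [1.0, 10.0]
```

The same assign-then-update loop generalizes to higher dimensions by swapping the absolute difference for a Euclidean distance.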
Posted 6 days ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Join us as a Senior Test Automation Engineer at Barclays, responsible for supporting the successful delivery of location strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence. You will harness cutting-edge technology to revolutionise our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Senior Test Automation Engineer you should have experience with:
Hands-on test automation, with a deep understanding of software/QA methodologies
Understanding requirements and user stories; preparing test scope and test cases and executing them
Executing non-functional tests, including performance, load, stress, scalability, and reliability
Testing/automation tools and frameworks such as Python, Pytest, BDD, TDD, Karate, REST Assured, Performance Center, LoadRunner, etc.
Good understanding of a tech stack spanning AWS, Kafka (messaging queues), MongoDB, SQL, ETL, and APIs
CI/CD integration tools like Jenkins, TeamCity, GitLab, etc.
Close collaboration with Dev/DevOps/BA teams
Unix commands, ETL architecture & data warehouse concepts
Python (for test automation): in-depth understanding of data structures (lists, tuples, sets, dictionaries), OOP concepts, data frames, lambda functions, Boto3, file handling, DB handling, debugging techniques, etc.
Performing complex SQL queries/joins to validate data transformations, migration, and integrity across source and target systems
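A minimal sketch of that source-to-target validation, using two in-memory SQLite databases as stand-ins for the actual source and target systems (the schema and data are invented for illustration):

```python
import sqlite3

# Two in-memory databases stand in for source and target systems.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE accounts (id INTEGER, balance INTEGER)")

src.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100), (2, 250), (3, 75)])
# Simulate a migration that silently dropped one row.
tgt.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100), (2, 250)])

def reconcile(db):
    # A row count plus a column aggregate is a cheap first integrity check;
    # a full check would also compare keys and per-row hashes.
    return db.execute("SELECT COUNT(*), COALESCE(SUM(balance), 0) FROM accounts").fetchone()

src_counts, tgt_counts = reconcile(src), reconcile(tgt)
print(src_counts, tgt_counts)    # (3, 425) (2, 350)
print(src_counts == tgt_counts)  # False -> migration defect to investigate
```

In a Pytest suite each reconciliation would be an assertion, so a dropped or mutated row fails the build rather than reaching production.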
Test and defect management: document test results and defects, and track issues to resolution using tools such as Jira/X-Ray
Experience with at least one relational database: Oracle (GoldenGate services), MySQL, SQL Server, or Teradata
Experience with at least one CI/CD tool for integrating test automation suites: Jenkins or TeamCity

Some other highly valued skills may include:
Functional corporate banking knowledge
Good understanding of Agile methodologies
Hands-on experience with Gen AI models
Good understanding of Snowflake, dbt & PySpark
Experience with BI tools like Tableau/Power BI for visual data validations

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role: To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improve testing processes and methodologies, to ensure software quality and reliability.

Accountabilities:
Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards.
Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues.
Collaboration with cross-functional teams to analyse requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested.
Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution.
Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing. Stay informed of industry technology trends and innovations, and actively contribute to the organization's technology communities to foster a culture of technical excellence and growth.

Assistant Vice President Expectations: To advise and influence decision making, contribute to policy development and take responsibility for operational effectiveness. Collaborate closely with other functions/business divisions. Lead a team performing complex tasks, using well-developed professional knowledge and skills to deliver work that impacts the whole business function. Set objectives and coach employees in pursuit of those objectives, including appraisal of performance relative to objectives and determination of reward outcomes. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L – Listen and be authentic, E – Energise and inspire, A – Align across the enterprise, D – Develop others. OR for an individual contributor, they will lead collaborative assignments and guide team members through structured assignments, identifying the need for the inclusion of other areas of specialisation to complete assignments. They will identify new directions for assignments and/or projects, identifying a combination of cross-functional methodologies or practices to meet required outcomes. Consult on complex issues, providing advice to People Leaders to support the resolution of escalated issues. Identify ways to mitigate risk and develop new policies/procedures in support of the control and governance agenda. Take ownership for managing risk and strengthening controls in relation to the work done.
Perform work that is closely related to that of other areas, which requires understanding of how areas coordinate and contribute to the achievement of the objectives of the organisation sub-function. Collaborate with other areas of work, for business-aligned support areas, to keep up to speed with business activity and the business strategy. Engage in complex analysis of data from multiple sources of information, internal and external (such as procedures and practices in other areas, teams, companies, etc.), to solve problems creatively and effectively. Communicate complex information. 'Complex' information could include sensitive information or information that is difficult to communicate because of its content or its audience. Influence or convince stakeholders to achieve outcomes. All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship – our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset – to Empower, Challenge and Drive – the operating manual for how we behave.
Posted 6 days ago
9.0 - 12.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: About Us At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We're devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us!

Global Business Services: Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services.

Process Overview: As a part of Global Risk Analytics, Enterprise Risk Analytics (ERA) is responsible for the development of cross-business holistic analytical models and tools.
Team responsibilities include:
Financed Emissions: supports the calculation of asset-level balance sheet Financed Emissions, which are integral to the Bank's goal of achieving net-zero greenhouse gas emissions by 2050.
Financial Crimes Modelling & Analytics: responsible for enterprise-wide financial crimes and compliance surveillance model development and ongoing monitoring across all lines of business globally.
Operational Risk: responsible for operational risk loss forecasting and capital model development for CCAR/stress testing and regulatory capital reporting/economic capital measurement purposes.
Business Transformations: a central team of Project Managers and Quantitative S/W engineers partnering with coverage-area ERA teams, with the end goal of onboarding ERA production processes onto GCP/production platforms, as well as identifying risks/gaps in ERA processes which can be fixed with well-designed and controlled S/W solutions.
Trade Surveillance Analytics: responsible for modelling and analytics supporting trade surveillance activities within risk.
Advanced Analytics: responsible for driving research, development, and implementation of new enhanced risk metrics and providing quantitative support for loss forecasting and stress testing requirements, including process improvement and automation.

Job Description: The role will be responsible for independently conducting quantitative analytics and modeling projects.

Responsibilities:
Perform model development proof of concept, research model methodology, explore internal & external data sources, design model development data, and develop preliminary models.
Conduct complex data analytics on modeling data; identify, explain & address data quality issues; apply data exclusions; perform data transformation; and prepare data for model development.
Analyze portfolio definition, define model boundary, analyze model segmentation, develop Financed Emissions models for different asset classes, and analyze and benchmark model results.
Work with Financed Emissions Data Team & Climate Risk Tech on the production process of model development & implementation data, including supporting data sourcing efforts, providing data requirements, performing data acceptance testing, etc.
Work with Financed Emissions Production & Reporting Team on model implementation, model production run analysis, and result analysis & visualization.
Work with ERA Model Implementation team & GCP Tech on model implementation, including opining on implementation design, providing implementation data model & requirements, performing model implementation result testing, etc.
Work with Model Risk Management (MRM) on model reviews and obtain model approvals.
Work with GEG (Global Environmental Group) and FLU (Front Line Unit) on model requirements gathering & analysis, Climate Risk target setting, disclosure, analysis & reporting.

Requirements:
Education: B.E./B.Tech/M.E./M.Tech
Certifications (if any): NA
Experience Range: 9 to 12 years

Foundational Skills:
Advanced knowledge of SQL and Python
Advanced Excel, VSCode, LaTeX, Tableau skills
Experience in multiple data environments such as Oracle, Hadoop, and Teradata
Knowledge of data architecture concepts, data models, and ETL processes
Knowledge of climate risk, financial concepts & products
Experience in extracting and combining data from multiple sources and aggregating data for model development
Experience in conducting quantitative analysis, performing model-driven analytics, and developing models
Experience in documenting business requirements for data, model, implementation, etc.

Desired Skills:
Basics of Finance
Basics of Climate Risk

Work Timings: 11:30 AM to 8:30 PM
Job Location: Hyderabad, Chennai
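As a rough illustration of the Financed Emissions calculation this team supports: under the PCAF attribution approach, a lender's attributed share of a borrower's emissions is its outstanding exposure divided by the company's total value (e.g. EVIC), multiplied by the company's reported emissions. The figures below are hypothetical, not from any actual portfolio.

```python
def financed_emissions(outstanding, company_value, company_emissions):
    """PCAF-style attribution: the bank's share of a borrower's emissions
    is outstanding exposure / company value (e.g. EVIC), times the
    company's reported emissions (tCO2e)."""
    attribution_factor = outstanding / company_value
    return attribution_factor * company_emissions

# Hypothetical two-borrower portfolio:
# (outstanding amount, EVIC, scope 1+2 emissions in tCO2e)
portfolio = [
    (50_000_000, 1_000_000_000, 120_000),  # 5% of EVIC -> 6,000 tCO2e
    (20_000_000,   200_000_000,  80_000),  # 10% of EVIC -> 8,000 tCO2e
]
total = sum(financed_emissions(o, v, e) for o, v, e in portfolio)
print(total)  # total attributed emissions for the portfolio, ~14,000 tCO2e
```

Real asset-class models add data quality scoring, asset-class-specific denominators, and benchmarking, but the attribution factor above is the core of the balance-sheet calculation.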
Posted 6 days ago
0.0 - 2.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Citi, the leading global bank, has approximately 200 million customer accounts and does business in more than 160 countries and jurisdictions. Citi provides consumers, corporations, governments and institutions with a broad range of financial products and services, including consumer banking and credit, corporate and investment banking, securities brokerage, transaction services, and wealth management. Our core activities are safeguarding assets, lending money, making payments and accessing the capital markets on behalf of our clients. Citi's Mission and Value Proposition explains what we do and Citi Leadership Standards explain how we do it. Our mission is to serve as a trusted partner to our clients by responsibly providing financial services that enable growth and economic progress. We strive to earn and maintain our clients' and the public's trust by constantly adhering to the highest ethical standards and making a positive impact on the communities we serve. Our Leadership Standards are a common set of skills and expected behaviors that illustrate how our employees should work every day to be successful, and they strengthen our ability to execute against our strategic priorities. Diversity is a key business imperative and a source of strength at Citi. We serve clients from every walk of life, every background and every origin. Our goal is to have our workforce reflect this same diversity at all levels. Citi has made it a priority to foster a culture where the best people want to work, where individuals are promoted based on merit, where we value and demand respect for others and where opportunities to develop are widely available to all. The Operations MIS team focuses on creating reports, dashboards, and performance metrics to provide actionable insights for various business functions, including USPB WFM & Customer Service and Wealth Ops MIS.
They are responsible for building and maintaining datamarts, migrating legacy BI tools to modern platforms like Tableau, and automating data refreshes for dashboards. Projects include tracking ATM availability and performance, managing service tickets, and upgrading software for uninterrupted service. They aim to empower business stakeholders with accurate and timely information for strategic decision-making. Their work also supports capacity planning and issue remediation efforts. The Data/Information Mgt Analyst - C09 is a developing professional role. Applies specialty area knowledge in monitoring, assessing, analyzing and/or evaluating processes and data. Interprets data and makes recommendations. Researches and interprets information. Identifies inconsistencies in data or results, defines business issues and formulates recommendations on policies, procedures or practices. Integrates established disciplinary knowledge within own specialty area with a basic understanding of related industry practices. Good understanding of how the team interacts with others in accomplishing the objectives of the area. Develops working knowledge of industry practices and standards. Limited but direct impact on the business through the quality of the tasks/services provided. Impact of the job holder is restricted to own team.

In this role, you're expected to:
Gather operational data from various cross-functional stakeholders to examine past business performance
Identify data patterns & trends, and provide insights to enhance business decision-making capability in business planning, process improvement, solution assessment, etc.
Recommend actions for future developments & strategic business opportunities, as well as enhancements to operational policies
Potentially be involved in exploratory data analysis, confirmatory data analysis and/or qualitative analysis
Translate data into consumer or customer behavioral insights to drive targeting and segmentation strategies, and communicate all findings clearly and effectively to business partners and senior leaders
Continuously improve processes and strategies by exploring and evaluating new data sources, tools, and capabilities
Work closely with internal and external business partners in building, implementing, tracking and improving decision strategies
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency

As a successful candidate, you should ideally have the following skills and exposure:
Data Warehousing & BI Tools: Strong understanding of data warehousing concepts, ETL processes, and experience working with Business Intelligence platforms (e.g., Tableau, Power BI).
Reporting & Dashboard Development: Proficiency in developing reports, dashboards, and performance metrics using reporting tools and data visualization techniques. Ability to create clear and concise visualizations of key data.
Data Management & SQL: Expertise in data management principles, SQL programming for data extraction and manipulation, and database management systems.
Communication & Stakeholder Management: Ability to effectively communicate technical information to non-technical stakeholders and collaborate with business partners to understand reporting requirements.
Automation & Scripting: Experience with scripting languages (e.g., Python) and automation tools (e.g., SSIS) for automating report generation, data refreshes, and other routine tasks.
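The automated report-refresh work described above can be sketched end to end with the standard library alone; the ticket data and field names below are invented for illustration, and in practice the rows would come from a warehouse query and a scheduler (cron, SSIS, etc.) would run the script.

```python
import csv
import io

# Toy operational data; in practice this comes from a warehouse query.
tickets = [
    {"queue": "ATM", "status": "open"},
    {"queue": "ATM", "status": "closed"},
    {"queue": "Cards", "status": "open"},
    {"queue": "ATM", "status": "open"},
]

# Aggregate open tickets per queue -- the kind of metric fed to a dashboard.
open_by_queue = {}
for t in tickets:
    if t["status"] == "open":
        open_by_queue[t["queue"]] = open_by_queue.get(t["queue"], 0) + 1

# Emit the refreshed report as CSV (to a string here; a scheduled job
# would write to a file or a reporting table instead).
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["queue", "open_tickets"])
for queue in sorted(open_by_queue):
    writer.writerow([queue, open_by_queue[queue]])
print(buf.getvalue())
```

Swapping the print for an upload to the BI platform's data source is what turns a one-off analysis into an automated refresh.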
Education: Master's degree in Information Technology / Information Systems / Computer Applications / Engineering from a premier institute, or BTech/B.E/MCA in Information Technology / Information Systems / Computer Applications

Experience: Established competency in one or more of the following:
0-2 years (for a master's degree) / 2-4 years (for a 4-year bachelor's degree) of relevant work experience in Data Management / MIS / Reporting / Data Analytics within the Banking / Financial Services / Analytics industry
Programming: SQL
Data manipulation: PySpark, Python, SSIS
Visualization: Tableau / Power BI
Reporting: SSRS
Databases: MS SQL Server, Teradata
Understanding of systems and technology platforms
Strong analytical aptitude and logical reasoning ability
Strong communication skills
Good presentation skills

Working at Citi is far more than just a job. A career with us means joining a family of more than 230,000 dedicated people from around the globe. At Citi, you'll have the opportunity to grow your career, give back to your community and make a real impact.
------------------------------------------------------
Job Family Group: Decision Management
------------------------------------------------------
Job Family: Data/Information Management
------------------------------------------------------
Time Type: Full time
------------------------------------------------------
Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.
Posted 6 days ago
6.0 years
5 - 9 Lacs
Hyderābād
On-site
Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities: Apply business standards, processes and requirements to set up automated processes for data transformation, loading, and normalization Build, monitor and maintain highly secure digital pipelines for the transport of clinical data for DRN projects Validate data relationships, mappings and definitions Develop methodology to analyze and load data and to ensure data quality Participate in internal and external client data implementation calls to discuss file data formats and content Monitor and maintain extraction processes to ensure that data feeds remain current and complete Monitor data flows for loading errors and incomplete or incompatible formats Coordinate with team members and clients to ensure on-time delivery and receipt of data files Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). 
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: College bachelor's degree, or equivalent work experience 6+ years of Scala or Python or Spark or SQL 6+ months of SAS or SQL programming experience Experience with complex SQL statements and familiarity with relational databases Health care claims data experience Unix scripting knowledge General software development knowledge (.NET, Java, Oracle, Teradata, HTML, etc.) Familiarity with processing large data sets AWS or Azure cloud services exposure (AWS Glue, Azure Synapse, Azure Data Factory, Databricks) Proficient with Microsoft operating systems and Internet browsers Proven ability to work independently Proven solid verbal and written communication skills Proven solid ability to multi-task and prioritize multiple projects at any given time Proven solid analytical and problem-solving skills Proven ability to work within a solid team structure Willingness or ability to travel to meet with internal or external customers At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone–of every race, gender, sexuality, age, location and income–deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
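The data-validation and load-monitoring duties this role describes (validating mappings, catching incomplete or incompatible formats before loading) boil down to a data-quality gate on each inbound file. The sketch below is a minimal illustration; the field names are invented, not an actual DRN file layout.

```python
# Minimal data-quality gate for an inbound file: each check returns the
# offending rows so a bad feed can be rejected or quarantined.
rows = [
    {"member_id": "M001", "claim_amount": "125.50", "service_date": "2024-03-01"},
    {"member_id": "",     "claim_amount": "80.00",  "service_date": "2024-03-02"},
    {"member_id": "M003", "claim_amount": "n/a",    "service_date": "2024-03-03"},
]

def missing_member(rows):
    # Required-field check: member_id must be non-blank.
    return [r for r in rows if not r["member_id"].strip()]

def bad_amount(rows):
    # Format check: claim_amount must parse as a number.
    out = []
    for r in rows:
        try:
            float(r["claim_amount"])
        except ValueError:
            out.append(r)
    return out

errors = {"missing_member": missing_member(rows), "bad_amount": bad_amount(rows)}
load_ok = all(not v for v in errors.values())
print(load_ok)  # False: one row lacks a member id, one has a non-numeric amount
```

Running such checks before the load, and alerting on the error buckets, is what keeps a feed "current and complete" rather than silently degraded.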
Posted 6 days ago
2.0 years
1 - 8 Lacs
Hyderābād
On-site
About this role: Wells Fargo is seeking an Analytics Consultant. In this role, you will: Consult with business line and enterprise functions on less complex research Use functional knowledge to assist in non-model quantitative tools that support strategic decision making Perform analysis of findings and trends using statistical analysis and document process Present recommendations to increase revenue, reduce expense, maximize operational efficiency, quality, and compliance Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency Participate in all group technology efforts including design and implementation of database structures, analytics software, storage, and processing Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff Understand compliance and risk management requirements for supported area Ensure adherence to data management or data governance regulations and policies Participate in company initiatives or processes to assist in meeting risk and capital objectives and other strategic goals Collaborate and consult with more experienced consultants and with partners in technology and other business groups Required Qualifications: 2+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education Desired Qualifications: Experience in Analytics, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education. Excellent verbal, written, and interpersonal communication skills. 
Strong knowledge of Enterprise Risk programs and applicability of the risk management framework (3 Lines of Defense) Experience identifying internal and external data sources from multiple sources across the business Experience with SQL, Teradata, or SAS and database management systems like Teradata and MS SQL Server. Experience in risk (includes compliance, financial crimes, operational, audit, legal, credit risk, market risk). Experience in data visualization and business intelligence tools. Advanced Microsoft Office (Word, Excel, Outlook and PowerPoint) skills Demonstrated strong analytical skills with high attention to detail and accuracy. Strong presentation skills and ability to translate and present data in a manner that educates, enhances understanding, and influences decisions, with a bias for simplicity Strong writing skills - proven ability to translate data sets and conclusions drawn from analysis into business/executive format and language Ability to support multiple projects with tight timelines Metadata management, Data Lineage, Data Element Mapping, Data Documentation experience. Experience researching and resolving data problems and working with technology teams on remediation of data issues Hands-on proficiency with Python, Power BI (Power Query, DAX, Power Apps), Tableau, or SAS Knowledge of defect management tools like HP ALM. Knowledge of Data Governance. Job Expectations: Ensure adherence to data management or data governance regulations and policies Extract and analyze data from multiple technology systems/platforms and related data sources to identify factors that pose a risk to the firm.
Consult with business line and enterprise functions on less complex research Understand compliance and risk management requirements for sanctions compliance and data management Perform analysis of findings and trends using statistical analysis and document the process This role requires a solid background in reporting, understanding and utilizing relational databases and data warehouses, and effectiveness in querying and reporting large and complex data sets. You will excel at telling stories with data, presenting information in visually compelling ways that appeal to executive audiences, and will be well versed in the development and delivery of reporting solutions. You will be responsible for building easy-to-use visualizations and performing data analysis to generate meaningful business insights using complex datasets for global stakeholders, and for testing key reports and producing process documentation. Present recommendations to maximize operational efficiency, quality, and compliance Identify and define business requirements and translate data and business needs into research and recommendations to improve efficiency Develop customized reports and ad hoc analyses to make recommendations and provide guidance to less experienced staff Posting End Date: 16 Jun 2025 *Job posting may come down early due to volume of applicants. We Value Equal Opportunity Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company.
They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants with Disabilities To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo . Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more. Wells Fargo Recruitment and Hiring Requirements: a. Third-Party recordings are prohibited unless authorized by Wells Fargo. b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.
Posted 6 days ago
3.0 - 5.0 years
10 - 14 Lacs
Gurugram, Bengaluru, Mumbai (All Areas)
Hybrid
Role & responsibilities:
- Design, develop, and maintain ETL workflows using Ab Initio
- Manage and support critical data pipelines and data sets across complex, high-volume environments
- Perform data analysis and troubleshoot issues across Teradata and Oracle data sources
- Collaborate with DevOps for CI/CD pipeline integration using Jenkins, and manage deployments in Unix/Linux environments
- Participate in Agile ceremonies including stand-ups, sprint planning, and roadmap discussions
- Support cloud migration efforts, including potential adoption of Azure, Databricks, and PySpark-based solutions
- Contribute to project documentation, metadata management (LDM, PDM), onboarding guides, and SOPs

Preferred candidate profile:
- 3 years of experience in data engineering, with proven expertise in ETL development and maintenance
- Proficiency with Ab Initio tools (GDE, EME, Control Center)
- Strong SQL skills, particularly with Oracle or Teradata
- Solid experience with Unix/Linux systems and scripting
- Familiarity with CI/CD pipelines using Jenkins or similar tools
- Strong communication skills and ability to collaborate with cross-functional teams
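The extract-transform-load pattern this role maintains can be sketched in a few lines. Ab Initio graphs themselves are proprietary, so the following is a minimal pure-Python illustration of the same stages (read, reformat/filter, write); the field names `cust_id` and `balance` are assumptions for the sketch, not from any real feed.

```python
import csv
import io

def extract(raw: str) -> list:
    """Parse a delimited feed into rows (the 'read' stage of a graph)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list) -> list:
    """Filter bad records and derive a column, as a reformat step would."""
    out = []
    for r in rows:
        if not r["cust_id"]:          # reject records missing the key
            continue
        r["balance"] = float(r["balance"])
        r["is_high_value"] = r["balance"] > 10_000
        out.append(r)
    return out

def load(rows: list) -> str:
    """Serialize the cleaned rows for the target (here, back to CSV)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["cust_id", "balance", "is_high_value"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

feed = "cust_id,balance\nC001,25000\n,99\nC002,500\n"
cleaned = transform(extract(feed))   # the empty-key record is dropped
```

In a production graph the same three stages would be components wired together in GDE and promoted through EME, with Jenkins driving the deployment to each Unix/Linux environment.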
Posted 6 days ago
2.0 - 4.0 years
8 - 12 Lacs
Mumbai
Work from Office
The SAS to Databricks Migration Developer will be responsible for migrating existing SAS code, data processes, and workflows to the Databricks platform. This role requires expertise in both SAS and Databricks, with a focus on converting SAS logic into scalable PySpark and Python code. The developer will design, implement, and optimize data pipelines, ensuring seamless integration and functionality within the Databricks environment. Collaboration with various teams is essential to understand data requirements and deliver solutions that meet business needs.
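The SAS-to-PySpark conversion described above typically works rule by rule. The sketch below is illustrative only (the SAS step, table names, and the 100,000 threshold are invented for the example): the original SAS logic is shown as a comment, the equivalent Databricks DataFrame calls are noted, and the row-level rule is expressed in plain Python so it can be unit-tested before the port.

```python
from typing import Optional

# A SAS data step like the following (illustrative, not from any real codebase):
#
#   data work.flagged;
#     set work.accounts;
#     if balance < 0 then delete;
#     tier = ifn(balance >= 100000, 1, 2);
#   run;
#
# maps in Databricks to DataFrame operations along the lines of:
#
#   df.filter(F.col("balance") >= 0)
#     .withColumn("tier", F.when(F.col("balance") >= 100000, 1).otherwise(2))
#
# The row-level rule itself can be verified in plain Python first:

def flag_account(balance: float) -> Optional[dict]:
    """Return the migrated row, or None for records the SAS step deletes."""
    if balance < 0:
        return None                     # 'if balance < 0 then delete'
    return {"balance": balance, "tier": 1 if balance >= 100_000 else 2}

rows = [flag_account(b) for b in (250_000.0, -50.0, 4_000.0)]
kept = [r for r in rows if r is not None]
```

Capturing each SAS rule as a small testable function like this gives the migration team a reference implementation to reconcile the PySpark output against.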
Posted 6 days ago
0 years
7 - 9 Lacs
Calcutta
On-site
Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Senior Principal Consultant – Banking SME!

Responsibilities

1. Domain Expertise & Requirements Gathering
- Act as the Subject Matter Expert (SME) for banking products, services, and core operations (e.g., retail banking, lending, cards, payments, risk, compliance)
- Understand current reporting and data usage in Teradata across key functions (Finance, Risk, Compliance, Treasury)
- Collaborate with business teams to capture data requirements, KPI definitions, and use cases for cloud consumption
- Act as the go-to expert on Indian banking processes, products, and regulations
- Lead or contribute to solution design for digital banking platforms, core banking systems, or risk and compliance solutions
- Liaise with product managers, developers, and business teams to ensure functional accuracy and feasibility
- Conduct in-depth gap analysis, process mapping, and define business requirements
- Provide insights on industry trends, regulatory changes, and customer expectations
- Assist in responding to RFPs and proposals with domain-specific content
- Mentor junior business analysts and support training initiatives
- Engage with clients to gather requirements and present domain solutions

2. Source-to-Target Data Mapping
- Define and validate data mappings from Teradata tables to GCP data models (e.g., BigQuery)
- Ensure proper representation of key banking entities such as accounts, transactions, customers, products, GLs
- Support creation and validation of STTM (Source-to-Target Mapping) documents and transformation logic

3. Data Validation & Reconciliation
- Participate in data validation strategy and run-throughs across multiple reconciliation cycles
- Support and perform sample-based and logic-based data validation (e.g., balances, interest accruals, transactional totals)
- Help establish reconciliation rules to compare GCP output against Teradata extracts or reports

4. Testing & Sign-Off
- Define test cases and support User Acceptance Testing (UAT) for migrated data
- Review report outputs and regulatory extracts (e.g., BCBS, IFRS, GL reconciliation) for accuracy post-migration
- Act as business validator during dry runs and final cutovers

5. Stakeholder Engagement
- Liaise between technical migration teams and business stakeholders to clarify business rules and resolve data discrepancies
- Conduct walkthroughs and data quality discussions with line-of-business leads and data governance teams

Qualifications we seek in you!

Minimum Qualifications / Skills
- Bachelor's/Master's degree in Finance, Business, or related field
- Experience in the Indian banking sector, with a mix of operational and digital transformation exposure
- Deep understanding of RBI regulations, KYC/AML, Basel II/III, and banking products (CASA, loans, trade finance, etc.)
- Experience in core banking systems like Finacle, TCS BaNCS, Temenos, or similar
- Familiarity with digital banking platforms, APIs, UPI, and open banking
- Strong analytical, communication, and stakeholder management skills
- Exposure to agile or hybrid project environments preferred

Why join Genpact?
- Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation
- Make an impact – Drive change for global enterprises and solve business challenges that matter
- Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities
- Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day
- Thrive in a values-driven culture – Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.

Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Job: Senior Principal Consultant
Primary Location: India-Kolkata
Schedule: Full-time
Education Level: Master's / Equivalent
Job Posting: Jun 10, 2025, 4:02:24 AM
Unposting Date: Ongoing
Master Skills List: Digital
Job Category: Full Time
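The reconciliation work in section 3 of the posting above (comparing GCP output against Teradata extracts) usually boils down to aggregate-level comparison rules. The sketch below is an assumption-laden illustration, not Genpact's actual method: GL codes, the one-cent tolerance, and the row shape are all invented for the example, and the legacy/migrated inputs stand in for a Teradata extract and a BigQuery export.

```python
from decimal import Decimal

TOLERANCE = Decimal("0.01")   # allow one cent of rounding drift per GL code

def totals_by_gl(rows):
    """Aggregate balances by GL code, as a reconciliation query would."""
    out = {}
    for gl_code, balance in rows:
        out[gl_code] = out.get(gl_code, Decimal("0")) + Decimal(balance)
    return out

def reconcile(source_rows, target_rows):
    """Return the GL codes whose totals differ beyond the tolerance."""
    src, tgt = totals_by_gl(source_rows), totals_by_gl(target_rows)
    breaks = {}
    for gl in set(src) | set(tgt):
        diff = src.get(gl, Decimal("0")) - tgt.get(gl, Decimal("0"))
        if abs(diff) > TOLERANCE:
            breaks[gl] = diff            # signed difference, source minus target
    return breaks

legacy   = [("1001", "500.00"), ("1001", "250.00"), ("2001", "99.99")]
migrated = [("1001", "750.00"), ("2001", "89.99")]
breaks = reconcile(legacy, migrated)     # GL 2001 is short 10.00 in the target
```

Using Decimal rather than float matters here: monetary reconciliation must not introduce its own rounding drift, or the tolerance check becomes meaningless.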
Posted 6 days ago