2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About This Role
Wells Fargo is seeking a Data Management Analyst.

In This Role, You Will
- Participate in less complex analysis to identify and remediate data quality or integrity issues and to identify and remediate process or control gaps
- Adhere to data governance standards and procedures
- Identify data quality metrics and execute data quality audits to benchmark the state of data quality
- Design and monitor data governance, data quality, and metadata policies, standards, tools, processes, or procedures to ensure data control and remediation for companywide data management functions
- Support communications with basic documentation related to requirements, design decisions, issue closure, or remediation updates
- Support issue remediation by performing medium-risk data profiling, data or business analysis, and data mapping as part of root cause or impact analysis
- Provide support to regulatory analysis and reporting requirements
- Recommend plans for the development and implementation of initiatives that assess the quality of new data sources
- Work with business and technology partners or subject matter professionals to document or maintain business or technical metadata about systems, business or data elements, or data-related controls
- Consult with clients to assess the current state of data quality within the area of assigned responsibility

Required Qualifications:
- 2+ years of Data Management, Business Analysis, Analytics, or Project Management experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Experience in large enterprise data initiatives
- Manage data entry, cleansing, and updating processes across core systems
- Identify and resolve data inconsistency or quality issues
- Banking business or technology experience
- Experience using standard BI tools (Tableau, Power BI, MicroStrategy, etc.)
- Must have knowledge of T-SQL, databases, data warehousing, joins, data analysis, and indexing
- Should be aware of ETL concepts, data integration, transformation, and data load techniques
- BI concepts/solutions, analytical dashboards, and different types of reporting
- Cloud concepts, cloud infrastructure, and cloud-centric solutions (data pipelines, BigQuery, etc.)
- Agile principles, Agile ceremonies, Software Development Lifecycle, JIRA, Scrum/Kanban

Skills: SQL - Teradata, Snowflake, Python, Regression & Clustering, Alteryx & LLM

Job Expectations:
- Assist in implementing the data process
- Monitor data flows and perform regular audits to maintain data integrity
- Ensure consistent data definition and usage across systems
- Collaborate with data engineers and architects to optimize data flow and storage
- Work with IT and business teams to implement data cleansing and validation rules
- Identify, investigate, and resolve data quality issues through root cause analysis
- Define and maintain data dictionaries, metadata, and business glossaries

Posting End Date: 30 Jul 2025
Job posting may come down early due to volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture, which firmly establishes those disciplines as critical to the success of our customers and company.
They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples, and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants With Disabilities
To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy
Wells Fargo maintains a drug-free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements
Third-party recordings are prohibited unless authorized by Wells Fargo. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Reference Number R-472013-2
Posted 3 weeks ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About This Role
Wells Fargo is seeking a Data Management Analyst.

In This Role, You Will
- Participate in less complex analysis to identify and remediate data quality or integrity issues and to identify and remediate process or control gaps
- Adhere to data governance standards and procedures
- Identify data quality metrics and execute data quality audits to benchmark the state of data quality
- Design and monitor data governance, data quality, and metadata policies, standards, tools, processes, or procedures to ensure data control and remediation for companywide data management functions
- Support communications with basic documentation related to requirements, design decisions, issue closure, or remediation updates
- Support issue remediation by performing medium-risk data profiling, data or business analysis, and data mapping as part of root cause or impact analysis
- Provide support to regulatory analysis and reporting requirements
- Recommend plans for the development and implementation of initiatives that assess the quality of new data sources
- Work with business and technology partners or subject matter professionals to document or maintain business or technical metadata about systems, business or data elements, or data-related controls
- Consult with clients to assess the current state of data quality within the area of assigned responsibility

Required Qualifications:
- 2+ years of Data Management, Business Analysis, Analytics, or Project Management experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Experience in large enterprise data initiatives
- Manage data entry, cleansing, and updating processes across core systems
- Identify and resolve data inconsistency or quality issues
- Banking business or technology experience
- Experience using standard BI tools (Tableau, Power BI, MicroStrategy, etc.)
- Must have knowledge of T-SQL, databases, data warehousing, joins, data analysis, and indexing
- Should be aware of ETL concepts, data integration, transformation, and data load techniques
- BI concepts/solutions, analytical dashboards, and different types of reporting
- Cloud concepts, cloud infrastructure, and cloud-centric solutions (data pipelines, BigQuery, etc.)
- Agile principles, Agile ceremonies, Software Development Lifecycle, JIRA, Scrum/Kanban

Skills: SQL - Teradata, Snowflake, Python, Regression & Clustering, Alteryx & LLM

Job Expectations:
- Assist in implementing the data process
- Monitor data flows and perform regular audits to maintain data integrity
- Ensure consistent data definition and usage across systems
- Collaborate with data engineers and architects to optimize data flow and storage
- Work with IT and business teams to implement data cleansing and validation rules
- Identify, investigate, and resolve data quality issues through root cause analysis
- Define and maintain data dictionaries, metadata, and business glossaries

Posting End Date: 30 Jul 2025
Job posting may come down early due to volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture, which firmly establishes those disciplines as critical to the success of our customers and company.
They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples, and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants With Disabilities
To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy
Wells Fargo maintains a drug-free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements
Third-party recordings are prohibited unless authorized by Wells Fargo. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Reference Number R-472013-1
Posted 3 weeks ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Lowe’s
Lowe’s is a FORTUNE® 100 home improvement company serving approximately 16 million customer transactions a week in the United States. With total fiscal year 2024 sales of more than $83 billion, Lowe’s operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe’s supports the communities it serves through programs focused on creating safe, affordable housing, improving community spaces, helping to develop the next generation of skilled trade experts, and providing disaster relief to communities in need. For more information, visit Lowes.com.

Lowe’s India, the Global Capability Center of Lowe’s Companies Inc., is a hub for driving our technology, business, analytics, and shared services strategy. Based in Bengaluru with over 4,500 associates, it powers innovations across omnichannel retail, AI/ML, enterprise architecture, supply chain, and customer experience. From supporting and launching homegrown solutions to fostering innovation through its Catalyze platform, Lowe’s India plays a pivotal role in transforming home improvement retail while upholding a strong commitment to social impact and sustainability. For more information, visit Lowes India.

About The Team
The Customer Insights team uses data and analytics capabilities to provide strategic insights around the customer and competitive landscape to help drive effective market strategy for Lowe’s. The team works closely with key functions across the business to provide insights and recommendations across different business areas.

Job Summary
In this role, the candidate will be part of the Consumer Insights team and work on consumer research projects, building actionable insights from data across various sources to support an effective customer strategy for Lowe’s.
The candidate will be responsible for generating regular and ad hoc reports, supporting survey operations, developing scorecards and dashboards, and working on ad hoc consumer research questions to provide insights on a regular basis to stakeholders across the organization.

Roles & Responsibilities
Core Responsibilities:
- Work with the Bangalore and US teams on defining various research needs, applying appropriate methodology, and conducting data analysis to generate insights
- Work on survey programming, leveraging tools like Qualtrics/Decipher to run surveys
- Work to improve process efficiencies in ongoing projects such as the Brand Health Tracker and help streamline workflows to enhance overall experience
- Analyze customer data from multiple sources (surveys, feedback, internal data, etc.) to identify key trends, behaviors, and pain points
- Develop and conduct quality checks to ensure accuracy of third-party data feeding into various reports, trackers, and analyses
- Contribute to the development of new analytical methods or tools that can further enhance the understanding of customer behavior and improve operational processes
- Work with vendor platforms/vendor teams (operations team) to develop an infrastructure and manage the research data for any analytics project
- Work on ad hoc data/analysis/research requests
- Measure the effectiveness of creative campaigns by tracking key performance indicators (KPIs)
- Conduct competitive analysis to benchmark creative performance against industry standards and key competitors
- Present insights and analysis results to key stakeholders

Years of Experience
1-3 years

Education Qualification & Certifications (optional)
Required Minimum Qualifications
- Bachelor’s degree in Business, Operations Management, Marketing, Data Science, Statistics, or a related field
- A master’s degree or relevant certifications is a plus

Skill Set Required
Primary Skills (must have)
- Key skills: Excel, Python, SQL, market research
- Proficiency in data analysis tools and software (Excel, PowerPoint, SPSS, etc.)
- Experience understanding survey requirements and building questionnaires
- Excellent problem-solving skills with the ability to translate survey data into meaningful insights
- Strong written and verbal communication skills, including the ability to present complex information to non-technical audiences
- Ability to work collaboratively across cross-functional teams and third-party vendors to implement and improve operational processes

Secondary Skills (desired)
- Familiarity with business intelligence or data visualization tools like Power BI, Tableau, or similar platforms
- Knowledge of third-party tools like Kantar AI, IPSOS Digital, Ace Metrics
- Data analytics tools and software like SQL, Python, Teradata, and Hadoop are an added advantage
- Experience working on statistical models would be a plus
- Experience working with text data would be an advantage
- 1+ years of experience working in customer service, retail, e-commerce, or other customer-facing industries

Lowe's is an equal opportunity employer and administers all personnel practices without regard to race, color, religious creed, sex, gender, age, ancestry, national origin, mental or physical disability or medical condition, sexual orientation, gender identity or expression, marital status, military or veteran status, genetic information, or any other category protected under federal, state, or local law. Starting rate of pay may vary based on factors including, but not limited to, position offered, location, education, training, and/or experience. For information regarding our benefit programs and eligibility, please visit https://talent.lowes.com/us/en/benefits.
Posted 3 weeks ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Lowe’s
Lowe's Companies, Inc. (NYSE: LOW) is a FORTUNE® 50 home improvement company serving approximately 17 million customer transactions a week in the U.S. With total fiscal year 2022 sales of over $97 billion, approximately $92 billion of sales were generated in the U.S., where Lowe's operates over 1,700 home improvement stores and employs approximately 300,000 associates. Based in Mooresville, N.C., Lowe's supports the communities it serves through programs focused on creating safe, affordable housing and helping to develop the next generation of skilled trade experts.

About Lowe’s India
At Lowe's India, we are the enablers who help create an engaging customer experience for our $97 billion home improvement business at Lowe's. Our 4,000+ associates work across technology, analytics, business operations, finance & accounting, product management, and shared services. We leverage new technologies and find innovative methods to ensure that Lowe's has a competitive edge in the market.

About the Team
The Pricing Analytics team supports pricing managers and merchants in defining and optimizing pricing strategies for various product categories across channels. The team leverages advanced analytics to forecast and measure the impact of pricing actions, develop strategic price zones, recommend price changes, and identify sales/margin opportunities to achieve company targets.

Job Summary:
The primary purpose of this role is to develop and maintain descriptive and predictive analytics models and tools that support Lowe's pricing strategy. Collaborating closely with the Pricing team, the analyst will help translate pricing goals and objectives into data and analytics requirements. Utilizing both open-source and commercial data science tools, the analyst will gather and wrangle data to deliver data-driven insights, surface trends, and identify anomalies.
The analyst will apply the most suitable statistical and machine learning techniques to answer relevant questions and provide retail recommendations. The analyst will actively collaborate with product and business teams, incorporating feedback throughout development to drive continuous improvement and ensure a best-in-class position in the pricing space.

Roles & Responsibilities:
Core Responsibilities:
- Translate pricing strategy and business objectives into analytics requirements
- Develop and implement processes for collecting, exploring, structuring, enhancing, and cleaning large datasets from both internal and external sources
- Conduct data validation, detect outliers, and perform root cause analysis to prepare data for statistical and machine learning models
- Research, design, and implement relevant statistical and machine learning models to solve specific business problems
- Ensure the accuracy of data science and machine learning model results and build trust in their reliability
- Apply machine learning model outcomes to relevant business use cases
- Assist in designing and executing A/B tests, multivariate experiments, and randomized controlled trials (RCTs) to evaluate the effects of price changes
- Perform advanced statistical analyses (e.g., causal inference, Bayesian analysis, regression modeling) to extract actionable insights from experimentation data
- Collaborate with teams such as Pricing Strategy & Execution, Analytics COE, Merchandising, IT, and others to define, prioritize, and develop innovative solutions
- Keep up to date with the latest developments in data science, statistics, and experimentation techniques
- Automate routine manual processes to improve efficiency
Years of Experience: 3-6 years of relevant experience

Education Qualification & Certifications (optional)
Required Minimum Qualifications: Bachelor’s or Master’s in Engineering, Business Analytics, Data Science, Statistics, Economics, or Math

Skill Set Required
Primary Skills (must have)
- 3+ years of experience in advanced quantitative analysis, statistical modeling, and machine learning
- Ability to apply analytical concepts such as regression, sampling techniques, hypothesis testing, segmentation, time series analysis, multivariate statistical analysis, and predictive modeling
- 3+ years’ experience in corporate Data Science, Analytics, Pricing & Promotions, Merchandising, or Revenue Management
- 3+ years’ experience working with common analytics and data science software and technologies such as SQL, Python, R, or SAS
- 3+ years’ experience working with enterprise-level databases (e.g., Hadoop, Teradata, Oracle, DB2)
- 3+ years’ experience using enterprise-grade data visualization tools (e.g., Power BI, Tableau)
- 3+ years’ experience working with cloud platforms (e.g., GCP, Azure, AWS)

Secondary Skills (desired)
- Technical expertise in Alteryx, KNIME
Posted 3 weeks ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Our Company
At Teradata, we believe that people thrive when empowered with better information. That’s why we built the most complete cloud analytics and data platform for AI. By delivering harmonized data, trusted AI, and faster innovation, we uplift and empower our customers, and our customers’ customers, to make better, more confident decisions. The world’s top companies across every major industry trust Teradata to improve business performance, enrich customer experiences, and fully integrate data across the enterprise.

What You Will Do
We are seeking a highly motivated Full Stack Engineer with a solid background in software development. The ideal candidate should be adept at multitasking across various development activities, including coding, system configuration, testing, and research.

Key Responsibilities:
- Collaborate with an integrated development team to deliver high-quality applications
- Develop end-user applications, leveraging research capabilities and SQL knowledge
- Utilize open-source tools and technologies effectively, adapting and extending them as needed to create innovative solutions
- Communicate effectively across teams to ensure alignment and clarity throughout the development process
- Provide post-production support

Who You Will Work With
On our team we collaborate with several cross-functional agile teams that include product owners, other engineering groups, and quality engineers to conceptualize, build, test, and ship software solutions for the next generation of enterprise applications. You will report directly to the Manager of the Applications team.

What Makes You a Qualified Candidate
- 4+ years of relevant experience, preferably in R&D-based teams
- Strong programming experience with JavaScript frameworks such as Angular, React, or Node.js, or experience writing Python-based microservices
- Experience driving cloud-native service development with a focus on DevOps principles (CI/CD, TDD, automation)
- Hands-on experience with Java, JSP, and related areas
- Proficiency in Docker and Unix or Linux platforms
- Experience with Spring Framework or Spring Boot a plus
- Expertise in designing and deploying scalable solutions in public cloud environments
- A passion for innovation and continuous learning, with the ability to quickly adapt to new technologies
- Familiarity with software configuration management tools, defect tracking tools, and peer review tools
- Excellent debugging skills to troubleshoot and resolve issues effectively
- Familiarity with relational database management systems (RDBMS) such as PostgreSQL, MySQL, etc.
- Strong oral and written communication skills, with the ability to produce runbooks and both technical and non-technical documentation

What You Will Bring
- Master’s or bachelor’s degree in computer science or a related discipline
- Practical experience in development and support structures
- Knowledge of cloud environments, particularly AWS
- Proficiency in SQL

Why We Think You’ll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer.
We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
Posted 3 weeks ago
3.0 - 8.0 years
13 - 15 Lacs
Bengaluru
Work from Office
We offer a unique opportunity to collaborate with brilliant minds and deliver premier solutions that set a new standard. As a Data Scientist Lead within the Data and Analytics team, you will drive the analytics book of work, transforming Chase's cross-channel, cross-line-of-business acquisition strategy into a hyper-personalized, data-driven, customer-centric model. You will partner strategically across the firm with marketers, channel owners, digital experts, and the broader analytics community to help drive business goals through deep understanding of marketing analytics and optimization.

Job Responsibilities
- Develop the analytics framework and data infrastructure necessary to support the platform team in evaluating value addition
- Define and assess the OKRs and goals related to the platform's performance
- Provide top-tier business intelligence through dashboards and executive reporting to the Acquisitions Center of Excellence and Line of Business leadership
- Construct business cases that drive prioritization and investment in Acquisition & Enablement Platforms
- Communicate effectively with product, technology, data, and design teams to identify and advance a data-driven analytical roadmap
- Serve as the Acquisition Center of Excellence Analytics local site lead, overseeing local operations and contributing to the expansion of the team's presence in India

Required Qualifications, Capabilities, and Skills
- 5+ years leveraging data visualization tools for data exploration and marketing performance evaluation
- Proven ability to lead and manage teams effectively, showcasing strong leadership skills
- Experience in querying big data platforms and SAS/SQL
- Comfort building and managing relationships with both analytics and business stakeholders
- Proven track record of problem-solving using data and building new analytics capabilities
- Talent for translating numbers into an actionable story for business leaders
- Experience with best-in-class web analytics tools (Google Analytics, Adobe/Omniture Insight/Visual Sciences, Webtrends, CoreMetrics, etc.)
- Superior written and oral communication and presentation skills, with experience communicating concisely and effectively with all levels of management and partners
- Bachelor's degree is required, in data science, mathematics, statistics, econometrics, engineering, MIS, finance, or a related field

Preferred Qualifications, Capabilities, and Skills
- Financial services experience preferred
- Tableau experience preferred
- Familiarity with Teradata, AWS, & Snowflake preferred
Posted 3 weeks ago
3.0 - 5.0 years
0 Lacs
Navi Mumbai, Maharashtra, India
On-site
Our Company
Teradata is the connected multi-cloud data platform for enterprise analytics. Our enterprise analytics solve business challenges from start to scale. Only Teradata gives you the flexibility to handle the massive and mixed data workloads of the future, today. The Teradata Vantage architecture is cloud native, delivered as-a-service, and built on an open ecosystem. These design features make Vantage the ideal platform to optimize price performance in a multi-cloud environment.

What You’ll Do
This role will mainly be working as part of Change Ops for the Cloud Ops L2 team, which is responsible for all Changes across the AWS, Azure, and Google Cloud platforms. Core responsibilities, though not limited to these, are as below.

- The Cloud Ops Administrator is responsible for managing Teradata’s as-a-Service offering on public cloud (AWS/Azure/GCP)
- Delivery responsibilities in the areas of cloud network administration, security administration, instantiation, provisioning, optimizing the environment, and third-party software support
- Supporting the onsite teams with migration from on-premises to cloud for customers
- Implementing security best practices and analyzing partner compatibility
- Manages and coordinates all activities necessary to implement Changes in the environment
- Ensures Change status, progress, and issues are communicated to the appropriate groups
- Views and implements the process lifecycle and reports to upper management
- Evaluates performance metrics against the critical success factors and assures actions to streamline the process
- Perform Change-related activities documented in the Change Request to ensure the Change is implemented according to plan
- Document closure activities in the Change record and complete the Change record
- Escalate any deviations from plans to appropriate TLs/Managers
- Provide input for the ongoing improvement of the Change Management process
- Manage and support 24x7 VaaS environments for multiple customers
- Devising and implementing security and operations best practices
- Implementing development and production environments for the cloud data warehousing environment
- Backup, archive, and recovery planning and execution for cloud-based data warehouses across AWS/Azure/GCP
- Ensuring SLAs are met when implementing changes
- Ensuring all scheduled changes are implemented within the prescribed window
- Acting as the first level of escalation and support for team members

Who You’ll Work With
This role is part of Change Ops within the Cloud Ops L2 team, which is ultimately responsible for all Cases, Incidents, and Changes across the Azure and Google Cloud platforms. It reports to the Delivery Manager for Change Ops, Saurabh Mathur.

What Makes You a Qualified Candidate
- Minimum 3-5 years of IT experience in a Systems Administrator/Engineer role
- Minimum 1 year of hands-on cloud experience (Azure/AWS/GCP)
- Cloud certification; ITIL or other relevant certifications are desirable
- Day-to-day operations experience with ServiceNow or another ITSM tool
- Must be willing to provide 24x7 on-call support on a rotational basis with the team
- Must be willing to travel, both short-term and long-term

What You’ll Bring
- 4-year engineering degree or 3-year Master of Computer Applications
- Excellent oral and written communication skills in English
- Teradata/DBMS experience: hands-on Teradata administration and a strong understanding of cloud capabilities and limitations
- Thorough understanding of cloud computing: virtualization technologies; Infrastructure, Platform, and Software as a Service; cloud delivery models; and the current competitive landscape
- Experience implementing and supporting new and existing customers on VaaS infrastructure
- Thorough understanding of infrastructure (firewalls, load balancers, hypervisors, storage, monitoring, security, etc.) and experience with orchestration to develop a cloud solution
- Good knowledge of cloud services for compute, storage, network, and OS on at least one of the following cloud platforms: Azure
- Experience managing responsibilities as a shift lead
- Experience with enterprise VPN and Azure virtual LAN with a data center
- Knowledge of monitoring, logging, and cost-management tools
- Hands-on experience with database architecture/modeling, RDBMS, and NoSQL
- Good understanding of data archive/restore policies
- Basic Teradata knowledge; VMware certification is an added advantage
- Working experience in Linux administration and shell scripting
- Working experience with any of the RDBMSs such as Oracle, DB2, Netezza, Teradata, SQL Server, or MySQL

Why We Think You’ll Love Teradata
We prioritize a people-first culture because we know our people are at the very heart of our success. We embrace a flexible work model because we trust our people to make decisions about how, when, and where they work. We focus on well-being because we care about our people and their ability to thrive both personally and professionally. We are an anti-racist company because our dedication to Diversity, Equity, and Inclusion is more than a statement. It is a deep commitment to doing the work to foster an equitable environment that celebrates people for all of who they are. Teradata invites all identities and backgrounds in the workplace. We work with deliberation and intent to ensure we are cultivating collaboration and inclusivity across our global organization. We are proud to be an equal opportunity and affirmative action employer.
We do not discriminate based upon race, color, ancestry, religion, creed, sex (including pregnancy, childbirth, breastfeeding, or related conditions), national origin, sexual orientation, age, citizenship, marital status, disability, medical condition, genetic information, gender identity or expression, military and veteran status, or any other legally protected status.
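The change-management responsibilities above include ensuring that scheduled changes land within their prescribed window. A minimal Python sketch of that check (field names and dates are invented for illustration; this is not Teradata tooling):

```python
from datetime import datetime

def within_change_window(executed_at: datetime,
                         start: datetime,
                         end: datetime) -> bool:
    """Return True if a change was executed inside its approved window."""
    return start <= executed_at <= end

# Hypothetical maintenance window for a change record
window_start = datetime(2025, 7, 12, 1, 0)
window_end = datetime(2025, 7, 12, 3, 0)

in_window = within_change_window(datetime(2025, 7, 12, 2, 15),
                                 window_start, window_end)
late = within_change_window(datetime(2025, 7, 12, 4, 0),
                            window_start, window_end)
```

A deviation (`late` here) would be escalated to the appropriate team lead or manager per the process described above.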
Posted 3 weeks ago
4.0 years
5 - 6 Lacs
Hyderābād
On-site
About this role: Wells Fargo is seeking a Senior Data Management Analyst.

In this role, you will:
- Lead or participate in moderately complex programs and initiatives for data quality, governance, and metadata activities
- Design and conduct moderately complex analysis to identify and remediate data quality, data integrity, process, and control gaps
- Analyze, assess, and test data controls and data systems to ensure quality and risk compliance standards are met, and adhere to data governance standards and procedures
- Identify data quality metrics and execute data quality audits to benchmark the state of data quality
- Develop recommendations for optimal approaches to resolve data quality issues and implement plans for assessing the quality of new data sources, leveraging domain expertise and data, business, or process analysis to inform and support solution design
- Lead project teams and mentor less experienced staff members
- Drive planning and coordination of moderately complex remediation efforts, acting as the central point of contact
- Consult with clients to assess the current state of data and metadata quality within the area of assigned responsibility
- Participate in cross-functional groups to develop companywide data governance strategies
- Provide input into communication routines with stakeholders, business partners, and experienced leaders

Required Qualifications:
- 4+ years of Data Management, Business Analysis, Analytics, or Project Management experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Experience in large enterprise data initiatives
- Solid understanding of master data management (MDM), metadata management, and data lineage
- Experience using standard BI tools (Tableau, Power BI, MicroStrategy, etc.)
- Knowledge of T-SQL, databases, data warehousing, joins, data analysis, and indexing
- Awareness of ETL concepts: data integration, transformation, and data load techniques
- BI concepts and solutions, analytical dashboards, and different types of reporting
- Cloud concepts, cloud infrastructure, and cloud-centric solutions (data pipelines, BigQuery, etc.)
- Agile principles, Agile ceremonies, the Software Development Lifecycle, JIRA, and Scrum/Kanban

Key Skills: SQL (Teradata, Snowflake), Python, Regression & Clustering, Alteryx, LLMs

Job Expectations:
- Oversee the design and execution of data pipelines and ETL processes
- Ensure consistent data definition and usage across systems
- Collaborate with data engineers and architects to optimize data flow and storage
- Work with IT and business teams to implement data cleansing and validation rules
- Identify, investigate, and resolve data quality issues through root cause analysis
- Define and maintain data dictionaries, metadata, and business glossaries

Posting End Date: 30 Jul 2025 *Job posting may come down early due to volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions.
There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants with Disabilities To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo . Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more. Wells Fargo Recruitment and Hiring Requirements: a. Third-Party recordings are prohibited unless authorized by Wells Fargo. b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.
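The posting above calls for executing data quality audits to benchmark the state of data quality. As a rough illustration only (not Wells Fargo tooling; record shapes are invented), such an audit might compute completeness and duplicate metrics over a batch of records:

```python
def audit(records, required_fields):
    """Return completeness and duplicate-row metrics for dict records."""
    total = len(records)
    cells = total * len(required_fields)
    # Completeness: share of required cells that are populated
    missing = sum(1 for r in records
                  for f in required_fields if r.get(f) in (None, ""))
    # Uniqueness: count rows whose key fields repeat an earlier row
    seen, dupes = set(), 0
    for r in records:
        key = tuple(r.get(f) for f in required_fields)
        if key in seen:
            dupes += 1
        seen.add(key)
    return {"completeness": 1 - missing / cells if cells else 1.0,
            "duplicate_rows": dupes}

rows = [
    {"id": 1, "name": "Asha"},
    {"id": 2, "name": ""},      # missing name
    {"id": 1, "name": "Asha"},  # duplicate of the first row
]
metrics = audit(rows, ["id", "name"])
```

Metrics like these can then be tracked over time to benchmark data quality against agreed thresholds.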
Posted 3 weeks ago
2.0 years
5 - 6 Lacs
Hyderābād
On-site
About this role: Wells Fargo is seeking a Data Management Analyst.

In this role, you will:
- Participate in less complex analysis to identify and remediate data quality or integrity issues and to identify and remediate process or control gaps
- Adhere to data governance standards and procedures
- Identify data quality metrics and execute data quality audits to benchmark the state of data quality
- Design and monitor data governance, data quality, and metadata policies, standards, tools, processes, or procedures to ensure data control and remediation for companywide data management functions
- Support communications with basic documentation related to requirements, design decisions, issue closure, or remediation updates
- Support issue remediation by performing medium-risk data profiling, data or business analysis, and data mapping as part of root cause or impact analysis
- Provide support for regulatory analysis and reporting requirements
- Recommend plans for the development and implementation of initiatives that assess the quality of new data sources
- Work with business and technology partners or subject matter professionals to document or maintain business or technical metadata about systems, business or data elements, or data-related controls
- Consult with clients to assess the current state of data quality within the area of assigned responsibility

Required Qualifications:
- 2+ years of Data Management, Business Analysis, Analytics, or Project Management experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Experience in large enterprise data initiatives
- Experience managing data entry, cleansing, and updating processes across core systems
- Ability to identify and resolve data inconsistency or quality issues
- Banking business or technology experience
- Experience using standard BI tools (Tableau, Power BI, MicroStrategy, etc.)
- Knowledge of T-SQL, databases, data warehousing, joins, data analysis, and indexing
- Awareness of ETL concepts: data integration, transformation, and data load techniques
- BI concepts and solutions, analytical dashboards, and different types of reporting
- Cloud concepts, cloud infrastructure, and cloud-centric solutions (data pipelines, BigQuery, etc.)
- Agile principles, Agile ceremonies, the Software Development Lifecycle, JIRA, and Scrum/Kanban

Skills: SQL (Teradata, Snowflake), Python, Regression & Clustering, Alteryx, LLMs

Job Expectations:
- Assist in implementing data processes
- Monitor data flows and perform regular audits to maintain data integrity
- Ensure consistent data definition and usage across systems
- Collaborate with data engineers and architects to optimize data flow and storage
- Work with IT and business teams to implement data cleansing and validation rules
- Identify, investigate, and resolve data quality issues through root cause analysis
- Define and maintain data dictionaries, metadata, and business glossaries

Posting End Date: 30 Jul 2025 *Job posting may come down early due to volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company.
They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants with Disabilities To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo . Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more. Wells Fargo Recruitment and Hiring Requirements: a. Third-Party recordings are prohibited unless authorized by Wells Fargo. b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.
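One responsibility above is medium-risk data profiling as part of root cause analysis. A hedged sketch of the idea, in plain Python (the domain rule and column values below are invented, not a Wells Fargo standard):

```python
def profile_column(values, is_valid):
    """Split a column into invalid values and report the failure rate."""
    invalid = [v for v in values if not is_valid(v)]
    rate = len(invalid) / len(values) if values else 0.0
    return {"invalid": invalid, "failure_rate": rate}

# Example rule: account status must come from a fixed code set.
VALID_STATUSES = {"ACTIVE", "CLOSED", "DORMANT"}
report = profile_column(
    ["ACTIVE", "closed", "DORMANT", "UNKNOWN"],
    lambda v: v in VALID_STATUSES,
)
```

The list of failing values ("closed" suggests a casing defect upstream; "UNKNOWN" suggests a missing mapping) is what feeds the root cause or impact analysis.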
Posted 3 weeks ago
4.0 years
5 - 6 Lacs
Bengaluru
On-site
About this role: Wells Fargo is seeking a Senior Data Management Analyst.

In this role, you will:
- Lead or participate in moderately complex programs and initiatives for data quality, governance, and metadata activities
- Design and conduct moderately complex analysis to identify and remediate data quality, data integrity, process, and control gaps
- Analyze, assess, and test data controls and data systems to ensure quality and risk compliance standards are met, and adhere to data governance standards and procedures
- Identify data quality metrics and execute data quality audits to benchmark the state of data quality
- Develop recommendations for optimal approaches to resolve data quality issues and implement plans for assessing the quality of new data sources, leveraging domain expertise and data, business, or process analysis to inform and support solution design
- Lead project teams and mentor less experienced staff members
- Drive planning and coordination of moderately complex remediation efforts, acting as the central point of contact
- Consult with clients to assess the current state of data and metadata quality within the area of assigned responsibility
- Participate in cross-functional groups to develop companywide data governance strategies
- Provide input into communication routines with stakeholders, business partners, and experienced leaders

Required Qualifications:
- 4+ years of Data Management, Business Analysis, Analytics, or Project Management experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Experience in large enterprise data initiatives
- Solid understanding of master data management (MDM), metadata management, and data lineage
- Experience using standard BI tools (Tableau, Power BI, MicroStrategy, etc.)
- Knowledge of T-SQL, databases, data warehousing, joins, data analysis, and indexing
- Awareness of ETL concepts: data integration, transformation, and data load techniques
- BI concepts and solutions, analytical dashboards, and different types of reporting
- Cloud concepts, cloud infrastructure, and cloud-centric solutions (data pipelines, BigQuery, etc.)
- Agile principles, Agile ceremonies, the Software Development Lifecycle, JIRA, and Scrum/Kanban

Key Skills: SQL (Teradata, Snowflake), Python, Regression & Clustering, Alteryx, LLMs

Job Expectations:
- Oversee the design and execution of data pipelines and ETL processes
- Ensure consistent data definition and usage across systems
- Collaborate with data engineers and architects to optimize data flow and storage
- Work with IT and business teams to implement data cleansing and validation rules
- Identify, investigate, and resolve data quality issues through root cause analysis
- Define and maintain data dictionaries, metadata, and business glossaries

Posting End Date: 30 Jul 2025 *Job posting may come down early due to volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions.
There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants with Disabilities To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo . Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more. Wells Fargo Recruitment and Hiring Requirements: a. Third-Party recordings are prohibited unless authorized by Wells Fargo. b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.
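Among the skills listed above is regression. As a small, self-contained refresher of what that skill covers (plain-Python ordinary least squares for one feature; production work would use a library such as statsmodels or scikit-learn):

```python
def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error for y ~ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)            # variance term
    sxy = sum((x - mx) * (y - my)                   # covariance term
              for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Points lying exactly on y = 2x + 1
slope, intercept = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```

The closed-form estimates recover the generating line exactly here; on noisy data they give the least-squares fit.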
Posted 3 weeks ago
2.0 years
5 - 6 Lacs
Bengaluru
On-site
About this role: Wells Fargo is seeking a Data Management Analyst.

In this role, you will:
- Participate in less complex analysis to identify and remediate data quality or integrity issues and to identify and remediate process or control gaps
- Adhere to data governance standards and procedures
- Identify data quality metrics and execute data quality audits to benchmark the state of data quality
- Design and monitor data governance, data quality, and metadata policies, standards, tools, processes, or procedures to ensure data control and remediation for companywide data management functions
- Support communications with basic documentation related to requirements, design decisions, issue closure, or remediation updates
- Support issue remediation by performing medium-risk data profiling, data or business analysis, and data mapping as part of root cause or impact analysis
- Provide support for regulatory analysis and reporting requirements
- Recommend plans for the development and implementation of initiatives that assess the quality of new data sources
- Work with business and technology partners or subject matter professionals to document or maintain business or technical metadata about systems, business or data elements, or data-related controls
- Consult with clients to assess the current state of data quality within the area of assigned responsibility

Required Qualifications:
- 2+ years of Data Management, Business Analysis, Analytics, or Project Management experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education

Desired Qualifications:
- Experience in large enterprise data initiatives
- Experience managing data entry, cleansing, and updating processes across core systems
- Ability to identify and resolve data inconsistency or quality issues
- Banking business or technology experience
- Experience using standard BI tools (Tableau, Power BI, MicroStrategy, etc.)
- Knowledge of T-SQL, databases, data warehousing, joins, data analysis, and indexing
- Awareness of ETL concepts: data integration, transformation, and data load techniques
- BI concepts and solutions, analytical dashboards, and different types of reporting
- Cloud concepts, cloud infrastructure, and cloud-centric solutions (data pipelines, BigQuery, etc.)
- Agile principles, Agile ceremonies, the Software Development Lifecycle, JIRA, and Scrum/Kanban

Skills: SQL (Teradata, Snowflake), Python, Regression & Clustering, Alteryx, LLMs

Job Expectations:
- Assist in implementing data processes
- Monitor data flows and perform regular audits to maintain data integrity
- Ensure consistent data definition and usage across systems
- Collaborate with data engineers and architects to optimize data flow and storage
- Work with IT and business teams to implement data cleansing and validation rules
- Identify, investigate, and resolve data quality issues through root cause analysis
- Define and maintain data dictionaries, metadata, and business glossaries

Posting End Date: 30 Jul 2025 *Job posting may come down early due to volume of applicants.

We Value Equal Opportunity
Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company.
They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements. Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process. Applicants with Disabilities To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo . Drug and Alcohol Policy Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more. Wells Fargo Recruitment and Hiring Requirements: a. Third-Party recordings are prohibited unless authorized by Wells Fargo. b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.
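The expectations above include implementing data cleansing and validation rules with IT and business teams. A minimal sketch of the pattern, with invented rule names and fields (not Wells Fargo's actual rules):

```python
def cleanse(record):
    """Normalize whitespace and casing on string fields."""
    return {k: v.strip().upper() if isinstance(v, str) else v
            for k, v in record.items()}

def validate(record, rules):
    """Return the names of rules the record violates."""
    return [name for name, check in rules.items() if not check(record)]

# Hypothetical rule set agreed with the business
rules = {
    "id_present": lambda r: r.get("id") is not None,
    "country_code_len": lambda r: len(r.get("country", "")) == 2,
}

clean = cleanse({"id": 7, "country": " in "})
violations = validate(clean, rules)
```

Cleansing first, then validating, means validation failures reflect genuine data issues rather than formatting noise.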
Posted 3 weeks ago
4.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Introduction
Joining the IBM Technology Expert Labs teams means you will have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you will bring together all the necessary technology and services to help customers adopt IBM software to solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best — running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators — always willing to help and be helped — as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
The candidate is responsible for:
- Deploying DB2 databases as containers within Red Hat OpenShift clusters
- Configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability
- Migrating other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2)
- Creating high-level and detailed designs, and maintaining product roadmaps that cover both modernization and leveraging cloud solutions
- Designing scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML
- Establishing best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse

Required Technical And Professional Expertise
- 4-6 years of experience in data warehousing, data engineering, or a similar role, with hands-on experience in cloud data platforms
- Strong proficiency in DB2, SQL, and Python
- Strong understanding of: database design and modelling (dimensional, normalized, and NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms; big data technologies (e.g., Hadoop, Spark)
- Database migration project experience from one database to another (with Db2 as the target)
- Experience in the installation and configuration of DB2 databases
- Excellent communication, collaboration, and problem-solving skills

Preferred Technical And Professional Experience
- Experience with machine learning environments and LLMs
- Certification in IBM watsonx.data or related IBM data and AI technologies
- Hands-on experience with a lakehouse platform (e.g., Databricks, Snowflake)
- Exposure to implementing, or an understanding of, DB replication processes
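One concrete step in the Teradata-to-Db2 migrations mentioned above is translating source column types when regenerating DDL for the target. A hedged, partial sketch (the mapping below covers only a few types and is illustrative; a real engagement would rely on IBM's conversion tooling):

```python
# Partial, illustrative Teradata -> Db2 type mapping
TYPE_MAP = {
    "BYTEINT": "SMALLINT",     # Db2 has no 1-byte integer type
    "VARCHAR": "VARCHAR",
    "TIMESTAMP": "TIMESTAMP",
}

def convert_column(name, td_type):
    """Render one column of Db2 DDL from a Teradata column definition."""
    db2_type = TYPE_MAP.get(td_type.upper())
    if db2_type is None:
        raise ValueError(f"no mapping for Teradata type {td_type}")
    return f"{name} {db2_type}"

ddl = ", ".join(convert_column(n, t)
                for n, t in [("id", "BYTEINT"), ("name", "VARCHAR")])
```

Unmapped types fail loudly rather than silently, so gaps in the mapping surface during design review instead of at load time.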
Posted 3 weeks ago
4.0 - 6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Introduction
Joining the IBM Technology Expert Labs teams means you will have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you will bring together all the necessary technology and services to help customers adopt IBM software to solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best — running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators — always willing to help and be helped — as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
The candidate is responsible for:
- Deploying DB2 databases as containers within Red Hat OpenShift clusters
- Configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability
- Migrating other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2)
- Creating high-level and detailed designs, and maintaining product roadmaps that cover both modernization and leveraging cloud solutions
- Designing scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML
- Establishing best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse

Required Technical And Professional Expertise
- 4-6 years of experience in data warehousing, data engineering, or a similar role, with hands-on experience in cloud data platforms
- Strong proficiency in DB2, SQL, and Python
- Strong understanding of: database design and modelling (dimensional, normalized, and NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms; big data technologies (e.g., Hadoop, Spark)
- Database migration project experience from one database to another (with Db2 as the target)
- Experience in the installation and configuration of DB2 databases
- Excellent communication, collaboration, and problem-solving skills

Preferred Technical And Professional Experience
- Experience with machine learning environments and LLMs
- Certification in IBM watsonx.data or related IBM data and AI technologies
- Hands-on experience with a lakehouse platform (e.g., Databricks, Snowflake)
- Exposure to implementing, or an understanding of, DB replication processes
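After a migration to Db2 like the ones described above, a common acceptance step is reconciling source and target. A minimal sketch of row-count reconciliation (table names and counts are invented; in practice the counts come from queries against both systems):

```python
def reconcile(source_counts, target_counts):
    """Return tables whose row counts differ or are missing after migration."""
    mismatches = {}
    for table, n in source_counts.items():
        if target_counts.get(table) != n:
            mismatches[table] = (n, target_counts.get(table))
    return mismatches

diff = reconcile({"orders": 100, "customers": 42},
                 {"orders": 100, "customers": 41})
```

An empty result means counts agree everywhere; anything else pinpoints which tables need investigation (checksums or column-level comparison would be the next, stricter check).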
Posted 3 weeks ago
4.0 - 6.0 years
0 Lacs
Kochi, Kerala, India
On-site
Introduction
Joining the IBM Technology Expert Labs teams means you will have a career delivering world-class services for our clients. As the ultimate expert in IBM products, you will bring together all the necessary technology and services to help customers adopt IBM software to solve their most challenging problems. Working in IBM Technology Expert Labs means accelerating the time to value confidently and ensuring speed and insight while our clients focus on what they do best — running and growing their business. Excellent onboarding and an industry-leading learning culture will set you up for a positive impact, while advancing your career. Our culture is collaborative and experiential. As part of a team, you will be surrounded by bright minds and keen co-creators — always willing to help and be helped — as you apply passion to work that will positively impact the world around us.

Your Role And Responsibilities
The candidate is responsible for:
- Deploying DB2 databases as containers within Red Hat OpenShift clusters
- Configuring containerized database instances, persistent storage, and network settings to optimize performance and reliability
- Migrating other databases to Db2 (e.g., Teradata, Snowflake, SAP, or Cloudera to Db2)
- Creating high-level and detailed designs, and maintaining product roadmaps that cover both modernization and leveraging cloud solutions
- Designing scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML
- Establishing best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse

Required Technical And Professional Expertise
- 4-6 years of experience in data warehousing, data engineering, or a similar role, with hands-on experience in cloud data platforms
- Strong proficiency in DB2, SQL, and Python
- Strong understanding of: database design and modelling (dimensional, normalized, and NoSQL schemas); normalization and indexing; data warehousing and ETL processes; cloud platforms; big data technologies (e.g., Hadoop, Spark)
- Database migration project experience from one database to another (with Db2 as the target)
- Experience in the installation and configuration of DB2 databases
- Excellent communication, collaboration, and problem-solving skills

Preferred Technical And Professional Experience
- Experience with machine learning environments and LLMs
- Certification in IBM watsonx.data or related IBM data and AI technologies
- Hands-on experience with a lakehouse platform (e.g., Databricks, Snowflake)
- Exposure to implementing, or an understanding of, DB replication processes
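The dimensional-modelling expertise listed above can be pictured with a toy star schema: fact rows carry foreign keys into dimension tables, and reporting queries join them back into wide rows. A small illustrative sketch (table contents are invented):

```python
# Toy star schema: two dimensions and one fact table
dim_product = {1: {"name": "Widget", "category": "Hardware"}}
dim_date = {20250701: {"year": 2025, "month": 7}}
fact_sales = [{"product_id": 1, "date_id": 20250701, "amount": 99.5}]

def denormalize(fact, products, dates):
    """Join one fact row to its dimensions, as a reporting query would."""
    row = dict(fact)
    row.update(products[fact["product_id"]])
    row.update(dates[fact["date_id"]])
    return row

wide = denormalize(fact_sales[0], dim_product, dim_date)
```

Keeping descriptive attributes in the dimensions (rather than repeating them on every fact row) is the normalization trade-off that dimensional design deliberately relaxes for query convenience.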
Posted 3 weeks ago
5.0 - 7.0 years
20 - 27 Lacs
Bengaluru
Work from Office
MicroStrategy Developer to design and develop reports, dashboards, and analytical solutions. Responsibilities include collaborating with stakeholders, data modeling, writing SQL, performance tuning, and providing technical support within the MicroStrategy platform.
Posted 3 weeks ago
3.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Req ID: 328437 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Modeller to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Job Description Mandatory Skills: ER Studio, Oracle, Teradata Creating and updating data models and defining information requirements for small to medium size projects Creating ETL specifications / source-to-target mappings based on project requirements Generating the data definition language (DDL) used to create database schemas and tables Creating optimal database views aligned with business and technical needs Working with assigned technical teams to ensure correct deployment of DDL Synchronizing models to ensure that database structures match models Conducting business and data analysis Working independently on projects with guidance from a Project Leader Domain proficiency with Healthcare Plan/Payer Qualifications: Previous experience working with business and technical teams, compiling business definitions for enterprise data model attributes. 3-5 years' experience in a high-tech environment in technical or business application analysis, or an equivalent combination of experience and education. Physical data modeling, business and data analysis, technical specification development, and enterprise data mapping. Experience in relational databases; Teradata preferred. Understanding of data warehousing, ETL technology, and business intelligence reporting. Understanding of enterprise logical models (EDM) containing all entities and their relationships, with a complete set of documentation using industry-standard tools and techniques. Bachelor's degree in a related technical field of study that includes basic data modeling. Required Skills Excellent written and verbal communication skills. Candidate must be able to interact effectively with both technical and business users.
Physical data modeling, business and data analysis, technical specification development, and data mapping Experience using PowerDesigner or another data modeling tool such as Erwin Advanced SQL expertise Linux operating system Nice to have: Experience with the Healthcare Payer domain About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .
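A source-to-target mapping of the kind this role specifies can be represented directly as data. The sketch below is hypothetical (the field names and transforms are invented) and only shows the idea of a mapping-driven transform, where the ETL spec itself drives the code.

```python
# Hypothetical source rows, e.g. from a healthcare-payer extract.
source_rows = [
    {"MBR_ID": "001", "DOB": "1990-01-15", "PLAN_CD": "hmo"},
    {"MBR_ID": "002", "DOB": "1985-06-30", "PLAN_CD": "ppo"},
]

# The mapping spec: target column -> (source field, transform).
# In practice this would come from the documented source-to-target mapping.
mapping = {
    "member_id":  ("MBR_ID",  str),
    "birth_date": ("DOB",     str),
    "plan_code":  ("PLAN_CD", str.upper),  # standardize codes on load
}

def apply_mapping(row, mapping):
    """Build one target row from a source row using the mapping spec."""
    return {tgt: fn(row[src]) for tgt, (src, fn) in mapping.items()}

target_rows = [apply_mapping(r, mapping) for r in source_rows]
print(target_rows[0]["plan_code"])  # HMO
```

Keeping the mapping as a data structure, rather than hard-coding each column, makes the spec reviewable and keeps the transform logic in one place.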
Posted 3 weeks ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Req ID: 328482 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an ETL Informatica/IICS Developer to join our team in Noida, Uttar Pradesh (IN-UP), India (IN). Work as an ETL developer using Oracle SQL, PL/SQL, Informatica, Linux scripting, and Tidal: ETL code development, unit testing, source code control, technical specification writing, and production implementation. Develop and deliver ETL applications using Informatica, Teradata, Oracle, Tidal, and Linux. Interact with BAs and other teams for requirement clarification, query resolution, testing, and sign-offs. Develop software that conforms to a design model within its constraints. Prepare documentation for design, source code, and unit test plans. Ability to work as part of a global development team. Should have good knowledge of the healthcare domain and data warehousing concepts.
Posted 3 weeks ago
2.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Requirement: Strong knowledge of PySpark and SQL 2+ years of experience in PySpark. 4+ years of experience in DataStage and Teradata. 6+ years of experience in Python. Experience in GCP cloud services and Delta Lake Keywords (minimum 3 matching): PySpark, GCP, Google Cloud, Python, DataStage, Teradata Good to have: Google certification
Posted 4 weeks ago
6.0 - 11.0 years
4 - 7 Lacs
Pune, Maharashtra, India
On-site
This individual contributor is responsible for the design, configuration, build, implementation, maintenance, and performance of critical database systems, to ensure the availability and consistent performance of our corporate applications. This role will specialize in DB2 UDB as its primary platform of accountability. In addition, they are responsible for ensuring adherence to IT policies and standards, including security practices. This is a hands-on position requiring solid technical skills, as well as excellent interpersonal and communication skills. This position requires the individual to be able to handle large databases with a high degree of complexity and non-uniformity at an enterprise corporate level. Essential Responsibilities: Performs physical database management system design and build to support application development teams with application database needs in their development lifecycle. This person will translate logical data model designs into physical designs and assist with deployment and security configuration. This individual will be the primary development database point of contact. They will serve as a liaison to database administration production support teams and serve as an escalation point for incident management for assigned databases. Completes work assignments and supports business-specific projects by applying expertise in subject area; supporting the development of work plans to meet business priorities and deadlines; ensuring team follows all procedures and policies; coordinating and assigning resources to accomplish priorities and deadlines; collaborating cross-functionally to make effective business decisions; solving complex problems; escalating high priority issues or risks, as appropriate; and recognizing and capitalizing on improvement opportunities.
Practices self-development and promotes learning in others by proactively providing information, resources, advice, and expertise with coworkers and customers; building relationships with cross-functional stakeholders; influencing others through technical explanations and examples; adapting to competing demands and new responsibilities; listening and responding to, seeking, and addressing performance feedback; providing feedback to others and managers; creating and executing plans to capitalize on strengths and develop weaknesses; supporting team collaboration; and adapting to and learning from change, difficulties, and feedback. As part of the IT Engineering job family, this position is responsible for leveraging DEVOPS, and both Waterfall and Agile practices, to design, develop, and deliver resilient, secure, multi-channel, high-volume, high-transaction, on/off-premise, cloud-based solutions. Supports the review of team deliverables. Provides some recommendations and input on options, risks, costs, and benefits for systems designs. Collaborates with team members to develop project support plans, schedules, and assignments. Translates business and functional requirements into technical specifications that support integrated and sustainable designs for designated infrastructure systems by partnering with Business Analysts to understand business needs and functional specifications. Serves as a liaison with business partners, Solutions, and enterprise architects to define and understand target strategies. Collaborates with counterparts in various IT Teams (e.g., database, operations, technical support) throughout system development and implementation. Develops and modifies solutions by identifying technical solutions to business problems Provides consultation and technical advice on IT infrastructure planning, engineering, and architecture for assigned systems by assessing the implications of IT strategies on infrastructure capabilities. 
Reviews and makes changes to technical specifications and documentation. Collaborates with IT teams and key business partners to troubleshoot complex systems and provides solutions, as appropriate. Evaluates existing systems to make recommendations on resources required to maintain service levels. Evaluates new service options, identifies issues and impacts, and provides recommendations on feasibility and ROI. Collaborates with architects and software engineers to ensure functional specifications are converted into flexible, scalable, and maintainable designs. Verifies system designs adhere to company architecture standards. Drives physical database architecture design for new initiatives. Leads the implementation of assigned enterprise infrastructure systems to ensure successful deployment and operation by developing and documenting detailed standards (e.g., guidelines, processes, procedures) for the introduction and maintenance of services. Job Qualifications: Bachelor's degree in Computer Science, CIS, or related field OR Minimum Six (6) years experience in an IT operations environment with technical experience in distributed technologies, systems development, and/or networking. Six (6) years of overall IT experience in managing various database technologies. Four (4) years of experience with DB2 UDB database, DB2 UDB High Availability and Disaster Recovery (HA/DR) solutions such as Clustering, Log shipping and Replication, etc. Two (2) years experience with database performance tuning and troubleshooting Three (3) years experience using SQL or similar query language. Two (2) years experience with Database configurations for high performance (database and query tuning) Two (2) years experience with database monitoring solutions like Foglight or other monitoring tools Two (2) years experience in supporting complex Development/projects Two (2) years experience in storage engineering and/or data backup engineering, including experience with data replication. 
Three (3) years experience in database administration. Preferred Qualifications Two (2) years experience working with an IT Infrastructure Library (ITIL) framework Two (2) years experience with other database technology such as DB2 LUW, Sybase, Teradata, Datacom, Oracle, Postgres Two (2) years experience using PostgreSQL (versions 9.x through current), High Availability, Clustering and Replication features - Streaming Replication, Patroni, pgBouncer, pgPool Two (2) years experience working with cloud environments (AWS, Azure, Google Cloud) and database services. Two (2) years scripting experience using UNIX/Linux scripting languages. Two (2) years scripting experience using PowerShell. Two (2) years experience writing technical documentation in an infrastructure development environment. Two (2) years coding experience with one or more programming languages. Two (2) years of work experience in a role requiring interaction with senior leadership (e.g., Director level and above) Three (3) years experience with database design, engineering, implementation, and operations Three (3) years experience with job design, control, scheduling, and automation Two (2) years experience with IT Service Management processes (Incident, Change Management, etc.) Two (2) years experience with Microsoft Office tools Two (2) years experience working with operating system and client/server utilities. Two (2) years experience working with configuration management software. Two (2) years experience in the design and implementation of complex middleware infrastructure solutions. Two (2) years experience working with load balancing technologies. Two (2) years of experience building technology solutions to meet corporate or industry IT policies and regulatory requirements. Two (2) years experience in the design and implementation of complex data infrastructure solutions.
Two (2) years experience working on cross-functional project teams Three (3) years experience working in a large matrixed organization.
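As a rough illustration of the query-tuning skills this posting asks for, the sketch below uses Python's built-in sqlite3 to show how adding an index changes a query's access path from a full scan to an index search. The same workflow applies to DB2 with RUNSTATS and EXPLAIN, though the tooling and plan format differ; the table here is invented.

```python
import sqlite3

# Query-tuning sketch: inspect the access path before and after indexing.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER)")
cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [(i, i % 100) for i in range(1000)])

def access_path(sql):
    """Return the plan detail for a statement (last column of EXPLAIN QUERY PLAN)."""
    return cur.execute("EXPLAIN QUERY PLAN " + sql).fetchone()[3]

before = access_path("SELECT * FROM orders WHERE customer_id = 7")
cur.execute("CREATE INDEX idx_orders_cust ON orders(customer_id)")
after = access_path("SELECT * FROM orders WHERE customer_id = 7")

print(before)  # a full-table SCAN of orders
print(after)   # a SEARCH ... USING INDEX idx_orders_cust
```

Reading the plan before and after each change is the habit that matters; the specific commands vary by platform.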
Posted 4 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description Cliff IT Solutions is a leading provider of Identity Management and Security governance solutions for onsite and cloud applications. Our services help organizations reduce risk and maximize profitability through end-to-end integration of enterprise Identity access management products and cloud-based applications. We focus on delivering high-quality products from initiation through execution, supported by our expertise in cutting-edge technologies. At Cliff IT, teamwork and collaboration are at the heart of our operations, fostering strong partnerships with stakeholders and clients. We are dedicated to helping our clients improve information systems and application security while staying connected to the ever-changing market landscape. Role Description This is a full-time on-site role for a Sr. Azure Data Engineer with Informatica and Teradata expertise, located in the Greater Minneapolis-St. Paul Area. The Sr. Azure Data Engineer will be responsible for designing and implementing data engineering solutions, including data modeling, ETL processes, and data warehousing. Day-to-day tasks include building and maintaining data pipelines, performing data analytics, and collaborating with cross-functional teams to support business intelligence activities. Qualifications Strong data engineering and data modeling skills Proficiency in Extract Transform Load (ETL) processes and data warehousing Experience in data analytics and generating actionable insights Familiarity with cloud-based solutions, particularly Azure Excellent problem-solving abilities and attention to detail Bachelor's or Master's degree in Computer Science, Information Technology, or a related field Experience with Informatica and Teradata is highly desirable Strong collaboration and communication skills 4+ years of hands-on experience as a Teradata Developer with strong SQL proficiency.
4+ years of extensive experience with Informatica PowerCenter (versions 9.x/10.x), including Designer, Workflow Manager, Workflow Monitor, and Repository Manager. Solid understanding of data warehousing concepts, Kimball methodology, and dimensional modeling (Star Schema, Snowflake Schema). Expertise in writing and optimizing complex SQL queries, stored procedures, functions, and views in Teradata. Experience with Teradata utilities such as BTEQ, FastLoad, MultiLoad, TPT (Teradata Parallel Transporter), and FastExport. Proven ability to design and implement efficient and scalable ETL solutions. Strong analytical and problem-solving skills with attention to detail. Excellent communication (written and verbal) and interpersonal skills. Ability to work independently and collaboratively in a fast-paced environment. Knowledge of scripting languages (e.g., Shell scripting, Python) for automation. Experience with the Azure cloud platform and data warehousing in the cloud. Familiarity with data visualization tools (e.g., Tableau, Power BI) and reporting concepts. Experience in an Agile/Scrum development environment.
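One example of the "complex SQL" this role calls for is the latest-row-per-key pattern, which Teradata developers often write with QUALIFY ROW_NUMBER(). The sketch below emulates it with a portable correlated subquery in Python's built-in sqlite3 so it runs anywhere; table and column names are invented for illustration.

```python
import sqlite3

# Latest-row-per-key dedup: keep only the most recently loaded row per member.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE member_stage (member_id TEXT, status TEXT, load_ts TEXT);
INSERT INTO member_stage VALUES
  ('A', 'active',   '2024-01-01'),
  ('A', 'inactive', '2024-03-01'),
  ('B', 'active',   '2024-02-01');
""")
rows = cur.execute("""
    SELECT member_id, status
    FROM member_stage m
    WHERE load_ts = (SELECT MAX(load_ts)
                     FROM member_stage s
                     WHERE s.member_id = m.member_id)
    ORDER BY member_id
""").fetchall()
print(rows)  # [('A', 'inactive'), ('B', 'active')]
```

In Teradata itself the idiomatic form is `QUALIFY ROW_NUMBER() OVER (PARTITION BY member_id ORDER BY load_ts DESC) = 1`, which also breaks ties deterministically.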
Posted 4 weeks ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Company Description Cliff IT Solutions is a provider of Identity Management and Security governance solutions for onsite and cloud applications, helping organizations enforce security, reduce risk, and maximize profitability. We deliver end-to-end integration of industry-leading identity access management products and cloud-based applications. Cliff IT Solutions transforms ideas into quality products using cutting-edge technologies and domain expertise. We emphasize teamwork and collaboration, developing strong partnerships with stakeholders and clients. Our customer-focused approach ensures that clients stay updated with market conditions and make informed decisions. Role Description This is a full-time on-site role for a Sr. Data Modeler & Data Analyst located in Hyderabad. The Sr. Data Modeler & Data Analyst will be responsible for designing and implementing data models, ensuring data quality, and establishing robust data governance frameworks. Day-to-day tasks include creating data architecture, managing Extract Transform Load (ETL) processes, and collaborating with various stakeholders to enhance data systems and processes. The role requires a strong understanding of data management principles and the ability to improve information systems effectively. Qualifications Data governance and data quality skills Data modeling and data architecture skills Experience with Extract Transform Load (ETL) processes Excellent analytical and problem-solving skills Strong communication and teamwork abilities Relevant degrees such as Computer Science, Information Technology, or related fields Experience in the Identity Management and Security Governance domain is a plus Experience with Informatica, Teradata, Axiom, SQL, and Databricks
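A data-quality check of the kind this role owns can start as simple column profiling. The sketch below is a minimal, hypothetical example (field names invented) that computes per-column missing-value rates, the sort of metric a data-quality audit benchmarks over time.

```python
# Minimal data-quality profiling sketch: per-column missing-value rate.
def null_rates(rows, columns):
    """Fraction of missing (None or empty-string) values per column."""
    total = len(rows)
    return {
        col: sum(1 for r in rows if r.get(col) in (None, "")) / total
        for col in columns
    }

# Hypothetical sample records with some gaps.
rows = [
    {"id": 1, "email": "a@x.com", "phone": None},
    {"id": 2, "email": "",        "phone": "555-0100"},
    {"id": 3, "email": "c@x.com", "phone": None},
]
rates = null_rates(rows, ["id", "email", "phone"])
print(rates)  # id: 0.0, email: ~0.33, phone: ~0.67
```

Tracked per load, rates like these turn into thresholds and alerts rather than one-off spot checks.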
Posted 4 weeks ago
4.0 - 8.0 years
0 Lacs
Panchkula, Haryana, India
On-site
Position Title Lead/Sr. ETL Engineer Location Panchkula, India Date Posted July 4, 2025 Description We are looking for a skilled and experienced Lead/Senior ETL Engineer with 4-8 years of experience to join our data engineering team. In this role, you will be responsible for designing and developing high-performing ETL solutions, managing data pipelines, and ensuring seamless integration across systems. You'll also contribute to architectural decisions, lead delivery planning, and provide mentorship to team members. Your hands-on expertise in ETL tools, cloud platforms, and scripting will be key to building efficient, scalable, and reliable data solutions for enterprise-level implementations. Skills Key Skills Strong hands-on experience with ETL tools like SSIS, DataStage, Informatica, or Talend. Deep understanding of data warehousing concepts, including data marts, Star/Snowflake schemas, and fact and dimension tables. Proficient in working with relational databases: SQL Server, Oracle, Teradata, DB2, or MySQL. Solid scripting/programming skills in Python. Hands-on experience with cloud platforms such as AWS or Azure. Knowledge of middleware architecture and enterprise data integration strategies. Familiarity with reporting/BI tools like Tableau and Power BI. Strong grasp of data modeling principles and performance optimization. Ability to write and review high- and low-level design documents. Strong communication skills and experience working with cross-cultural, distributed teams. Responsibilities Roles and Responsibilities Design and develop ETL workflows and data integration strategies. Create and review high- and low-level designs adhering to best practices. Collaborate with cross-functional teams to deliver enterprise-grade middleware solutions. Coach and mentor junior engineers to support skill development and performance. Ensure timely delivery, escalate issues proactively, and manage QA and validation processes.
Participate in planning, estimations, and recruitment activities. Work on multiple projects simultaneously, ensuring quality and consistency in delivery. Experience in Sales and Marketing data domains. Exposure to reporting and analytics projects. Strong problem-solving abilities with a data-driven mindset. Ability to work independently and collaboratively in a fast-paced environment. Prior experience in global implementations and managing multi-location teams is a plus. Contacts Email: careers@grazitti.com Address: HSIIDC Technology Park, Plot No 19, Sector 22, 134104, Panchkula, Haryana, India
Posted 4 weeks ago
0 years
2 - 9 Lacs
Pune
On-site
About VOIS: VOIS (Vodafone Intelligent Solutions) is a strategic arm of Vodafone Group Plc, creating value and enhancing quality and efficiency across 28 countries, and operating from 7 locations: Albania, Egypt, Hungary, India, Romania, Spain and the UK. Over 29,000 highly skilled individuals are dedicated to being Vodafone Group's partner of choice for talent, technology, and transformation. We deliver the best services across IT, Business Intelligence Services, Customer Operations, Business Operations, HR, Finance, Supply Chain, HR Operations, and many more. Established in 2006, VOIS has evolved into a global, multi-functional organisation, a Centre of Excellence for Intelligent Solutions focused on adding value and delivering business outcomes for Vodafone. About VOIS India: In 2009, VOIS started operating in India and now has established global delivery centres in Pune, Bangalore and Ahmedabad. With more than 14,500 employees, VOIS India supports global markets and group functions of Vodafone, and delivers best-in-class customer experience through multi-functional services in the areas of Information Technology, Networks, Business Intelligence and Analytics, Digital Business Solutions (Robotics & AI), Commercial Operations (Consumer & Business), Intelligent Operations, Finance Operations, Supply Chain Operations and HR Operations and more. Job Description Role purpose: A DevOps engineer is responsible for ensuring the smooth health of the daily ETL batch based on Ab Initio/Teradata Vantage cloud on GCP (L2 support) and for deployment support. The role blends technical and leadership expertise with operational efficiency to support business-critical applications.
Troubleshoot any issues in daily DWH production ETL jobs based on Ab Initio and Teradata Vantage Cloud on GCP Provide deployment support for new and enhanced applications Coordinate with project managers, development, testing, and other teams Update Remedy tickets based on incident/service request status Provide on-call support for daily batch jobs during nights and weekends Additional Skills: Automation and innovation on existing processes Performance improvement of jobs Good knowledge of SQL and Unix scripting Knowledge of the telecom domain and Google Cloud Work on Google Cloud POCs; manage IAM, virtual machines, Kubernetes Strong leadership Essential 8 years' experience in Ab Initio support Knowledge of Teradata Vantage Good hands-on Unix/SQL skills Experience handling a cloud platform on GCP Strong leadership Desired ITIL skills Unix scripting VOIS Equal Opportunity Employer Commitment India: VOIS is proud to be an Equal Employment Opportunity Employer. We celebrate differences and we welcome and value diverse people and insights. We believe that being authentically human and inclusive powers our employees' growth and enables them to create a positive impact on themselves and society. We do not discriminate based on age, colour, gender (including pregnancy, childbirth, or related medical conditions), gender identity, gender expression, national origin, race, religion, sexual orientation, status as an individual with a disability, or other applicable legally protected characteristics. As a result of living and breathing our commitment, our employees have helped us get certified as a Great Place to Work in India for four years running. We have been also highlighted among the Top 10 Best Workplaces for Millennials, Equity, and Inclusion, Top 50 Best Workplaces for Women, Top 25 Best Workplaces in IT & IT-BPM and 10th Overall Best Workplaces in India by the Great Place to Work Institute in 2024.
These achievements position us among a select group of trustworthy and high-performing companies which put their employees at the heart of everything they do. By joining us, you are part of our commitment. We look forward to welcoming you into our family which represents a variety of cultures, backgrounds, perspectives, and skills! Apply now, and we’ll be in touch!
Posted 4 weeks ago
8.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Spaulding Ridge is an advisory and IT implementation firm. We help global organizations get financial clarity into the complex daily sales and operational decisions that impact profitable revenue generation, efficient operational performance, and reliable financial management. At Spaulding Ridge, we believe all business is personal. Core to our values are our relationships with our clients, our business partners, our team, and the global community. Our employees dedicate their time to helping our clients transform their business, from strategy through implementation and business transformation. What You Will Do And Learn As a Snowflake Architect/Manager in Data Solutions, you'll be responsible for designing, implementing, and testing proposed modern analytic solutions. Working closely with our client partners and architects, you'll develop relationships with key technical resources while delivering tangible business outcomes. Manage the data engineering lifecycle, including research, proofs of concept, architecture, design, development, test, deployment, and maintenance Collaborate with team members to design and implement technology that aligns with client business objectives Build proofs of concept for a modern analytics stack supporting a variety of cloud-based business systems for potential clients Team management experience and the ability to manage, mentor, and develop the talent of assigned junior resources Create actionable recommendations based on identified platform, structural, and/or logic problems Communicate and demonstrate a clear understanding of client business needs, goals, and objectives Collaborate with other architects on solution designs and recommendations.
Qualifications: 8+ years' experience developing industry-leading business intelligence and analytic solutions Must have thorough knowledge of data warehouse concepts and dimensional modelling Must have experience in writing advanced SQL Must have at least 5+ years of hands-on experience on DBT (Data Build Tool), including recent hands-on DBT experience Must have experience working with DBT on one or more modern databases such as Snowflake / Amazon Redshift / BigQuery / Databricks / etc. Hands-on experience with Snowflake would carry higher weightage Snowflake SnowPro Core certification would carry higher weightage Experience working in AWS, Azure, GCP or a similar cloud data platform would be an added advantage Hands-on experience on Azure would carry higher weightage Must have experience in setting up DBT projects Must have experience in understanding / creating / modifying and optimizing YML files within DBT Must have experience in implementing and managing data models using DBT, ensuring efficient and scalable data transformations Must have experience with various materialization techniques within DBT Must have experience in writing and executing DBT test cases Must have experience in setting up DBT environments Must have experience in setting up DBT jobs Must have experience with writing DBT Jinja and macros Must have experience in creating DBT snapshots Must have experience in creating and managing incremental models using DBT Must have experience with DBT Docs Should have a good understanding of DBT seeds Must have experience with DBT deployment Must have experience architecting data pipelines using DBT, utilizing advanced DBT features Proficiency in version control systems and CI/CD Must have hands-on experience configuring DBT with one or more version control systems like Azure DevOps / GitHub / GitLab / etc.
- Must have experience with PR approval workflows
- Participate in code reviews and apply best practices for SQL and DBT development
- Experience working with visualization tools such as Tableau, Power BI, Looker, and similar analytic tools is an added advantage
- 2+ years of Business Data Analyst experience
- 2+ years of experience writing business requirements, use cases, and/or user stories for data warehouse or data mart initiatives
- Understanding of and experience with ETL/ELT is an added advantage
- 2+ years of consulting experience working on project-based delivery using the Software Development Life Cycle (SDLC)
- 2+ years of experience with relational databases (Postgres, MySQL, SQL Server, Oracle, Teradata, etc.)
- 2+ years of experience creating functional test cases and supporting user acceptance testing
- 2+ years of experience in Agile/Kanban/DevOps delivery
- Outstanding analytical, communication, and interpersonal skills
- Ability to manage projects and teams against planned work
- Responsible for managing the day-to-day client relationship on projects

Spaulding Ridge's Commitment to an Inclusive Workplace

When we engage the expertise, insights, and creativity of people from all walks of life, we become a better organization, we deliver superior services to clients, and we transform our communities and world for the better. At Spaulding Ridge, we believe our team should reflect the rich diversity of society, and we take seriously the responsibility to cultivate a workplace where every bandmate feels accepted, respected, and valued for who they are. We do this by creating a culture of trust and belonging, through practices and policies that support inclusion, and through our employee-led Employee Resource Groups (ERGs): CRE (Cultural Race and Ethnicity), Women Elevate, PROUD, and the Mental Wellness Alliance. The company is committed to offering Equal Employment Opportunity and to providing reasonable accommodation to applicants with physical and/or mental disabilities.
If you are interested in applying for employment with Spaulding Ridge and are in need of accommodation or special assistance to navigate our website or to complete your application, please send an e-mail with your request to our VP of Human Resources, Cara Halladay (challaday@spauldingridge.com). Requests for reasonable accommodation will be considered on a case-by-case basis. Qualified applicants will receive consideration for employment without regard to their age, race, religion, national origin, gender, sexual orientation, gender identity, protected veteran status or disability.