4 - 7 years
8 - 12 Lacs
Bengaluru
Work from Office
Responsibilities:
- Strong experience with SQL and PySpark for data validation and reconciliation.
- Experience working with Teradata, AWS Databricks, and mainframe systems.
- Hands-on experience with data quality frameworks and automation tools.
- Ability to perform detailed root cause analysis and resolve data discrepancies.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills with technical and non-technical stakeholders.
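The validation-and-reconciliation work described above boils down to comparing a source system against a target. A minimal sketch of that logic, using SQLite as a stand-in for the Teradata/Databricks pair the posting actually names (the table and column names are hypothetical):

```python
import sqlite3

def reconcile(conn):
    """Compare src vs tgt: row-count delta, value mismatches, missing rows."""
    src_n = conn.execute("SELECT COUNT(*) FROM src").fetchone()[0]
    tgt_n = conn.execute("SELECT COUNT(*) FROM tgt").fetchone()[0]
    # Rows present in both systems but with disagreeing values
    mismatched = [r[0] for r in conn.execute(
        "SELECT s.id FROM src s JOIN tgt t ON s.id = t.id "
        "WHERE s.amount <> t.amount")]
    # Rows that never arrived in the target
    missing = [r[0] for r in conn.execute(
        "SELECT s.id FROM src s LEFT JOIN tgt t ON s.id = t.id "
        "WHERE t.id IS NULL")]
    return {"count_delta": src_n - tgt_n,
            "mismatched_ids": mismatched,
            "missing_ids": missing}

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER PRIMARY KEY, amount REAL);
    CREATE TABLE tgt (id INTEGER PRIMARY KEY, amount REAL);
    INSERT INTO src VALUES (1, 10.0), (2, 20.0), (3, 30.0);
    INSERT INTO tgt VALUES (1, 10.0), (2, 25.0);
""")
report = reconcile(conn)  # id 2 disagrees, id 3 never arrived
```

In PySpark the same checks would typically be joins or `exceptAll` between two DataFrames, but the reconciliation questions (counts, mismatches, gaps) are identical.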
Posted 2 months ago
4 - 6 years
6 - 8 Lacs
Hyderabad
Work from Office
BSA - Marketing Data | Hyderabad, India | Business Analysis | Job ID R17361

Who We Are - MassMutual India: At MassMutual, our vision is to help people secure their future and protect the ones they love. As a Business Systems Analyst, you'll help deliver high-quality data assets that enable the success of MassMutual's marketing teams. This role sits within the Marketing Data team, which is responsible for the end-to-end delivery and management of trusted, relevant, and governed data assets that drive and measure marketing activity. Among other things, our team builds ETL pipelines that gather data and integrate it into both our data warehouse and external destinations. Your day-to-day work will help produce the clear, rationalized information and specifications needed to deliver our data pipelines, which integrate information from multiple sources. These deliverables include scope definitions, business requirements, specifications of data relationships, and source-to-target (S2T) mappings.

Primary Responsibilities: The person we hire for this role will support work in the following areas:
- Scope: Help identify the technical scope of work.
- Requirements: Independently gather prioritized requirements and decompose them into well-written user stories with acceptance criteria.
- Data subject matter expertise: Serve as an initial point of contact to determine the availability and fitness of our data to support business questions and analyses.
- Specifications: Produce the clear information needed to deliver our data pipelines, e.g. data and process flows, data relationships, and source-to-target (S2T) mappings.
- Coordination: Liaise with business stakeholders and technical resources during development, and help align technical execution with our peer teams.
- Analysis and validation: Help profile and analyze multiple data sources to support pipeline development, and coordinate user acceptance testing.

Requirements:
- Bachelor's degree, preferably in Business or an analytical field such as Economics, Mathematics, Engineering, or Computer Science.
- Business / data analysis: 4-6 years of experience as a business analyst, business systems analyst, or data analyst.
- SQL: 4-6 years of experience using SQL.
- Data warehousing and ETL: Knowledge of database and data warehousing solutions (e.g. Vertica, Postgres, Oracle DB, Teradata, Redshift), along with ETL/ELT pipelines.
- File formats: Experience working with several file format sources (JSON, Avro, Parquet, CSV).
- Requirements-gathering: Ability to isolate and elicit expected vs. actual experience and recommend the best action plan / solution, with little to no guidance.
- Communication: Strong interpersonal communication, coordination, requirement-gathering, and planning skills with cross-functional teams.

Nice to have:
- Agile methodology: Familiarity with the Agile delivery process and related ceremonies.
- Marketing business knowledge: Experience or familiarity with marketing processes.
- Testing: Familiarity with manual and automated testing methods.
- API querying: Familiarity / experience with querying REST APIs, either manually (Postman) or with libraries (Python requests library).
- Scripting: Exposure to one or more scripting languages, such as Python or shell scripting.

Why Join Us: Does this sound like a great fit? Apply today!
Posted 2 months ago
4 - 6 years
25 - 30 Lacs
Pune
Work from Office
As part of our team, you'll be tasked with handling substantial datasets to develop machine learning models for our enterprise clients. Your role will also involve contributing to the development of foundational models for our product. To excel in this position, you should possess strong analytical aptitude, with a deep understanding of data analysis, mathematics, and statistics. Critical thinking and problem-solving abilities are imperative for the interpretation of data. Furthermore, we value a genuine enthusiasm for machine learning and a commitment to research.

Responsibilities:
- Develop and deploy predictive models in production and conduct advanced analytics, data mining, and data visualization to influence strategic decisions.
- Architect and build data models to transform data into insights at scale.
- Evaluate model performance and conduct iterative model training to maximize predictive and forecast accuracy on an ongoing basis.

Requirements:
- Bachelor's degree or above in Computer Science, Applied Mathematics, Statistics, Econometrics, or a related field.
- 4-6 years of work experience, depending on educational level and relevance.
- Sharp analytical abilities and problem-solving skills.
- Working knowledge of statistics, programming, and predictive modeling.
- Expert knowledge of Python or a related scripting language; familiarity with Python packages such as pandas, NumPy, SciPy, and NLTK is a must. Databricks and Spark are good to have.
- Comfort manipulating and analyzing complex, high-volume, high-dimensionality data from various data sources.
- Experience working with relational or NoSQL databases such as MySQL, MongoDB, SQL Server, Oracle, or Teradata.
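The iterative evaluate-and-retrain loop mentioned above comes down to scoring each model version against held-out actuals and keeping the best. A toy sketch of that comparison; the figures and version names are invented, and production work would use pandas/NumPy over much larger data:

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error, a common forecast-accuracy metric."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
                     / len(y_true))

# Hypothetical actuals and predictions from three successive model iterations
actual = [100, 110, 120, 130]
iterations = {
    "v1": [90, 100, 140, 150],
    "v2": [98, 108, 125, 133],
    "v3": [100, 109, 121, 129],
}

# Score every iteration and pick the one with the lowest error
scores = {name: rmse(actual, preds) for name, preds in iterations.items()}
best = min(scores, key=scores.get)
```

Each retraining round produces a new entry in `scores`; promotion to production happens only when the new version beats the incumbent on the agreed metric.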
Posted 2 months ago
6 - 10 years
11 - 15 Lacs
Jaipur
Work from Office
As a Senior Consultant, Data Engineer at Hakkoda, you are more than just a builder: you are a trusted advisor, leading technical teams in designing and developing cutting-edge cloud data solutions, including Snowflake. You will partner with clients to architect and optimize data pipelines, ensuring scalable, secure, and high-performing data environments. Your expertise in data migration, governance, and architecture will drive meaningful transformation for data-driven organizations. In this role, you'll thrive in a collaborative and fast-paced environment where curiosity, innovation, and leadership are valued.

What We Are Looking For: We are currently hiring for the position of Sr. Consultant, Data Engineer to join our expanding team of experts. In this role, you will be instrumental in designing and developing solutions within the Snowflake Data Cloud environment. Responsibilities encompass data ingestion pipelines, data architecture, data governance, and security. The ideal candidate thrives on optimizing data systems and enjoys the challenge of building them from the ground up.

Qualifications:
- Location: Jaipur, Rajasthan (Work from Office). Looking for candidates who can join within a month.
- Bachelor's degree in engineering, computer science, or an equivalent field.
- 6-10 years of experience in related technical roles, encompassing data management, database development, ETL, data warehouses, and pipelines.
- At least 3 years of experience within the Snowflake Data Cloud environment.
- Proven experience in designing and developing data warehouses using platforms such as Teradata, Oracle Exadata, Netezza, SQL Server, and Spark.
- Proficiency in building ETL/ELT ingestion pipelines with tools like DataStage, Informatica, and Matillion.
- Strong SQL scripting skills.
- Cloud experience, particularly on AWS (experience with Azure and GCP is advantageous).
- Proficiency in Python scripting; Scala expertise is also required.
- Ability to prepare comprehensive reports and present them to internal and customer stakeholders.
- Demonstrated problem-solving skills and an action-oriented mindset.
- Strong interpersonal skills, including assertiveness and the ability to build robust client relationships.
- Comfortable working in Agile teams.
- Previous experience in hiring, developing, and managing a technical team.
- Advanced proficiency in English.
Posted 2 months ago
5 - 10 years
10 - 11 Lacs
Kochi
Work from Office
Altivate is a digital transformation enabler on a mission to help businesses find smarter and more innovative ways of doing business. We combine different knowledge and technologies to offer our clients tailored solutions and services that address their unique needs. Altivate provides end-to-end services and solutions based on industry best practices. Our technology competencies are vast and unique; they include SAP, AWS, Microsoft Azure, Microsoft Power BI, Google Cloud Platform, Teradata, Tableau, MicroStrategy, etc. We work with our clients on unravelling new business opportunities presented by new technologies. We help our clients become more resilient, sustainable, and profitable, efficiently improving their performance and bottom line.

Altivate is a proud:
- SAP Gold Partner
- SAP Certified Partner Center of Expertise
- AWS Select Partner
- Azure Partner
- GCP Partner

Job Summary: We are looking for an SAP Ariba Consultant who will work remotely from India and is expected to deliver quality-assured projects for our clients. The ideal candidate must make use of SAP Best Practices, impart impeccable solutions to clients, and contribute towards building a Centre of Excellence within the organisation.

Duties and Responsibilities:
- Provide high-quality, value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development, and deployment.
- Review the proposals prepared by consultants, provide guidance, and analyze the solutions defined for the client's business problems to identify any potential risks and issues.
- Coach and mentor the team.
- Document and implement SAP Ariba best practices and promote these practices to team members.
- Provide high-quality, value-adding consulting solutions to customers, adhering to the guidelines and processes of the organization.
- Support Ariba upgrades, including test plan development and user acceptance test coordination.

Skills, knowledge, capabilities, and experience required:
- Minimum 5 years of extensive functional knowledge of SAP Ariba Procurement, Contract Management, and Supplier Lifecycle Management.
- Experience in application design, development, and configuration of Ariba solutions.
- Solid understanding and experience of working on integration with SAP S/4HANA (both Ariba to S/4 and vice versa).
- Ability to understand complex business processes and identify solutions and process improvement opportunities.
- Experience in two or more end-to-end implementations.
- A structured approach to managing client change using multiple communication mechanisms.
- Capable of communicating effectively with project team members at different technical levels.

*Please make sure you don't leave the application question(s) unanswered, as candidates who fail to genuinely respond to our questions at this stage won't have a high chance of being shortlisted by our talent acquisition team.
Posted 2 months ago
4 - 7 years
7 - 11 Lacs
Pune
Work from Office
The Snowflake Migration Specialist will play a key role in the migration process, ensuring the successful transition of data from legacy systems to Snowflake. This position requires a deep understanding of Snowflake, data warehousing concepts, and cloud-based data architectures. The ideal candidate will have experience in managing data migrations, troubleshooting, and optimizing the performance of data platforms.

Key Responsibilities:
- Lead the end-to-end migration of data and workloads from on-premises data platforms (e.g., Oracle, SQL Server, Teradata) or cloud-based systems to Snowflake.
- Migrate legacy/on-prem ETL/ELT workloads to modern ETL/ELT tools that work better with Snowflake (Matillion, Fivetran, dbt, etc.).
- Analyze current data architectures, workloads, and business requirements to design optimal solutions for migration to Snowflake.
- Work closely with stakeholders, including data engineers, business analysts, and IT teams, to understand requirements and provide guidance on Snowflake best practices.
- Develop and execute migration strategies for moving large volumes of data to Snowflake while minimizing downtime and ensuring data integrity.
- Optimize Snowflake workloads, including query performance tuning, data model optimization, and managing storage costs.
- Troubleshoot issues during the migration process and provide timely resolutions.
- Automate migration tasks and processes using scripts, tools, and APIs to streamline the process and increase efficiency.
- Collaborate with security teams to ensure compliance with data security and governance requirements.
- Provide documentation and knowledge transfer for best practices, migration processes, and post-migration support.

Required Skills and Experience:
- Proven experience with the Snowflake data platform, including migration, implementation, and optimization.
- Strong knowledge of data warehousing concepts and experience with cloud data platforms (Azure, AWS).
- Hands-on experience migrating data from at least one traditional database (e.g., Oracle, SQL Server, Teradata) to Snowflake.
- Proficiency in SQL and experience with ETL processes, data integration, and data pipelines.
- Experience optimizing data processing workloads and query performance within Snowflake.
- Familiarity with Snowflake data security and governance features, including role-based access control and data encryption.
- Knowledge of cloud infrastructure (AWS, Azure) and associated services (e.g., S3, Azure Blob Storage) used with Snowflake.
- Familiarity with data pipeline orchestration tools such as Apache Airflow or similar.
- Strong troubleshooting and problem-solving skills.
- Excellent communication skills and the ability to work effectively with cross-functional teams.
- Ability to work independently, manage multiple priorities, and meet deadlines.

Preferred Qualifications:
- Snowflake certification (e.g., SnowPro Core, SnowPro Advanced) is highly desirable.
- Experience with one or more ETL/ELT tools such as SSIS, Talend, Informatica, Matillion, or the like.
- Knowledge of data modeling and schema design in Snowflake.
- Experience with Python or other programming languages for automation and migration tasks.
- Knowledge of cloud platform services (AWS, Azure) that are used with Snowflake.
- Familiarity with CI/CD pipelines for data engineering.
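The "ensuring data integrity" part of a migration like the one above is usually verified by comparing cheap per-table fingerprints (row counts plus column aggregates) between the legacy system and Snowflake after each load. A minimal sketch of that check, with SQLite standing in for both sides and an invented `orders` table; a real Snowflake-side check would typically also use checksum aggregates such as Snowflake's HASH_AGG:

```python
import sqlite3

def table_fingerprint(conn, table, numeric_col):
    """COUNT(*) plus the SUM of a numeric column: a cheap integrity check.
    TOTAL() is SQLite's null-safe SUM; Teradata/Snowflake would use SUM."""
    return conn.execute(
        f"SELECT COUNT(*), TOTAL({numeric_col}) FROM {table}").fetchone()

legacy = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")   # stand-in for the Snowflake side
rows = [(1, 9.5), (2, 40.0)]
for db in (legacy, target):
    db.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    db.executemany("INSERT INTO orders VALUES (?, ?)", rows)

# The migration is accepted for this table only if both sides agree
match = (table_fingerprint(legacy, "orders", "amount")
         == table_fingerprint(target, "orders", "amount"))
```

Fingerprints are intentionally coarse: they run fast over large volumes, and any disagreement triggers the slower row-by-row reconciliation.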
Posted 2 months ago
6 - 7 years
13 - 15 Lacs
Bengaluru
Work from Office
Level: Senior Data Engineer
Experience: 7-10 years
Location: Any (Bangalore preferred; Chennai, Hyderabad, or Noida)
Website: https://www.qualitestgroup.com/

About Us: Qualitest is the world's leading managed services provider of AI-led quality engineering solutions. It helps brands transition through the digital assurance journey and make the move from conventional functional testing to innovations such as automation, AI, blockchain, and XR. Qualitest's core mission is to mitigate the business risks associated with digital adoption. It fulfills this through customized quality engineering solutions that leverage Qualitest's deep, industry-specific knowledge for various sectors, including technology, telecommunications, finance, healthcare, media, utilities, retail, manufacturing, and defense. These scalable solutions protect brands through end-to-end value demonstration with a focus on customer experience and release velocity. Qualitest has offices in the United States, United Kingdom, Germany, Israel, Romania, India, Mexico, Portugal, Switzerland, and Argentina. It employs more than 7,000 engineers who serve over 400 customers worldwide. A pioneer and innovator in its industry, Qualitest is the only services provider positioned by Everest Group as a Leader in both the Next-generation Quality Engineering (QE) Services PEAK Matrix Assessment 2023 and the Quality Engineering (QE) Specialist Services PEAK Matrix Assessment 2023.

Responsibilities:
- Design and execute scalable data migration pipelines from relational databases (such as Teradata) to AWS Databricks.
- Utilize tools like IBM DataStage, Fivetran, Airflow, and Databricks Workflows to support data migration tasks.
- Optimize ETL processes to enhance performance and streamline data transformations with Spark and SQL.
- Develop and automate data ingestion, validation, and quality checks within a cloud environment to ensure reliable data flows.
- Work closely with data architects and business stakeholders to ensure a seamless, efficient, and secure migration process.
Posted 2 months ago
8 - 10 years
13 - 18 Lacs
Bengaluru
Work from Office
Level: Architect
Experience: 10+ years
Location: Any (Bangalore preferred; Chennai, Hyderabad, or Noida)
Website: https://www.qualitestgroup.com/

About Us: Qualitest is the world's leading managed services provider of AI-led quality engineering solutions (see the full company overview in the preceding Qualitest listing).

Responsibilities (8 to 10 years of experience):
- Lead large-scale migrations from Teradata to AWS Databricks.
- Expertise in the Databricks tech stack, including Delta Lake, Spark, Unity Catalog, Databricks Workflows, and Lakehouse architecture.
- Skilled in stored procedures, PL/SQL, and SQL across relational databases such as Oracle, SQL Server, and DB2.
- Hands-on experience with IBM DataStage and Airflow for building ETL/ELT jobs.
- Proven leadership: managing technical teams, collaborating with stakeholders, and driving automation for smooth migration processes.
- Focus on performance, security, scalability, and cost efficiency at every stage of the migration.

Key Skills: ETL, SQL, AWS
Posted 2 months ago
2 - 4 years
7 - 11 Lacs
Chennai, Pune, Delhi
Work from Office
Data Engineer - Maharashtra, India (On-site)

About the Role: We're looking for a talented Data Engineer to join our dynamic team. You'll work on high-impact projects like foundational data tables, ETL frameworks, and data lineage, collaborating with cross-functional teams to drive data quality, accessibility, and innovation.

Key Responsibilities:
- Develop and maintain foundational data tables: Design and implement foundational data tables to support critical business objectives and new features.
- Build and enhance self-serve ETL frameworks: Create and optimize ETL frameworks for both streaming and batch processes to facilitate accessible, high-quality data.
- Migrate and validate test tables: Collaborate with analytics partners to migrate test tables into the data foundation, ensuring accuracy, quality, and optimization.
- Integrate schema registry and establish data lineage: Integrate schema registries into ETL workflows and set up data lineage across various data domains, ensuring consistency and clarity in data management.

Qualifications:
- Experience: 5+ years of experience building production-grade data pipelines and transforming data into governed, actionable datasets.
- Technical skills: Proficiency in SQL, Spark, AWS Glue, and Python, with hands-on experience with MPP databases like Snowflake, AWS Redshift, or Teradata.
- Collaboration skills: Proven experience working effectively with analytics, data science, and DevOps teams, ensuring alignment across data initiatives.
- Data governance knowledge: Strong understanding of data governance principles, pipeline metrics, and building solutions for data visibility and monitoring.

Why Join Us:
- Innovative projects: Work on high-impact projects in a fast-paced, collaborative environment where your contributions will directly support key business decisions.
- Professional growth: Opportunity to hone your technical skills and grow your career in a supportive, innovative team setting.
- Dynamic team environment: Join a team of data professionals dedicated to driving data quality and accessibility through cutting-edge solutions.

If you're ready to make a difference in data engineering, apply today to join our team and help us build data solutions that matter!
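The schema-registry integration described in this listing amounts to checking each batch record against a declared schema before it is loaded, so bad records are quarantined instead of corrupting the data foundation. A minimal sketch of that gate; the field names and record shapes are hypothetical, and a production framework would pull the schema from a registry service rather than hard-code it:

```python
# Declared schema a registry might serve for this (hypothetical) event feed
SCHEMA = {"user_id": int, "event": str, "amount": float}

def validate(record, schema=SCHEMA):
    """Return a list of violations; an empty list means the record conforms."""
    problems = []
    for field, expected in schema.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            problems.append(f"{field}: expected {expected.__name__}")
    return problems

batch = [
    {"user_id": 1, "event": "click", "amount": 0.0},
    {"user_id": "2", "event": "view", "amount": 1.5},  # wrong user_id type
]
# Conforming records go to the load path, the rest to a quarantine table
good = [r for r in batch if not validate(r)]
quarantined = [r for r in batch if validate(r)]
```

Keeping validation as a separate, reusable step is what makes an ETL framework "self-serve": pipeline owners declare a schema and inherit the quality gate for free.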
Posted 2 months ago
7 - 12 years
20 - 27 Lacs
Chennai, Pune, Delhi
Work from Office
We are looking for an experienced Lead Data Architect to join our team and drive the architecture, design, and implementation of our data platforms. The ideal candidate will have extensive experience with data warehouse technologies, including Teradata, Oracle, Databricks, and Snowflake, as well as expertise in big data platforms like Hadoop. You will be responsible for leading the design and development of scalable data solutions that support our organization's strategic goals.

Key Responsibilities (Data Architecture and Strategy):
- Lead the development and implementation of data architecture strategies that align with business objectives.
- Design and oversee the architecture of enterprise data warehouses, data lakes, and big data platforms.
- Establish best practices and standards for data modeling, integration, and management.

Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

Experience:
- 15+ years of experience in data architecture, data warehousing, and big data solutions.
- Extensive experience with data warehouse platforms such as Teradata and Oracle.
- Deep understanding and hands-on experience with big data technologies like Hadoop, HDFS, Hive, and Spark.
- Proven track record of designing and implementing large-scale data architectures in complex environments.

Skills:
- Strong expertise in data modeling, data integration (ETL/ELT), and database design.
- Proficiency in SQL, PL/SQL, and performance tuning in Teradata, Oracle, and other databases.
- Familiarity with cloud data platforms and services is a plus (e.g., AWS Redshift, Google BigQuery, Azure Synapse).
- Experience with data governance, security, and compliance best practices.
- Excellent problem-solving, analytical, and critical-thinking skills.
- Strong leadership, communication, and collaboration abilities.

Preferred Qualifications:
- Experience with additional big data and NoSQL technologies (e.g., Cassandra, MongoDB).
- Familiarity with data visualization and BI tools (e.g., Tableau, Power BI).
- Experience with cloud-based data architectures and hybrid data environments.
- Certifications in data architecture, data warehousing, or related areas.

The selected applicant will be subject to a background investigation, which will be conducted, and the results of which will be used, in compliance with applicable law.
Posted 2 months ago
5 - 10 years
7 - 17 Lacs
Bengaluru
Work from Office
About this role: Wells Fargo is seeking a Lead Analytics Consultant.

In this role, you will:
- Advise line of business and companywide functions on business strategies based on research of performance metrics, trends in population distributions, and other complex data analysis, to maximize profits and asset growth and minimize operating losses within risk and other operating standards.
- Provide influence and leadership in the identification of new tools and methods to analyze data.
- Ensure adherence to compliance and legal regulations and policies on all projects managed.
- Provide updates on project logs, monthly budget forecasts, monthly newsletters, and operations reviews.
- Assist managers in building quarterly and annual plans and forecast future market research needs for business partners supported.
- Strategically collaborate and consult with peers, colleagues, and mid-level to senior managers to resolve issues and achieve goals.
- Lead projects and teams, or serve as a peer mentor to staff, interns, and external contractors.

Required Qualifications:
- 5+ years of analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:
- Good experience and technical knowledge of SAS, SQL, and Teradata.
- Good knowledge of the banking domain.
Posted 2 months ago
5 - 10 years
17 - 30 Lacs
Chennai, Pune, Greater Noida
Hybrid
JD for Ab Initio Developers:
- Good knowledge of the Ab Initio Co>Operating System and the Graphical Development Environment (GDE); Unix/Linux experience.
- Experience in designing graphs and parameterizing them using psets.
- Scheduling jobs through Autosys; writing Unix shell scripts.
- Good knowledge and understanding of Plans / Conduct>It.
- Experience with repository (EME) commands and knowledge of code versioning.
- Basic Ab Initio debugging skills.
- Knowledge of databases such as Teradata and/or Oracle.
- Knowledge of writing medium to complex SQL queries, and the ability to debug them.
- Knowledge of performance tuning of the ETL process by writing optimal SQL code.
- Knowledge of writing stored procedures and functions in the database.
- Knowledge of working in Agile projects.
- Good communication and interpersonal skills.
- Experience with Continuous Flows.
- Lead day-to-day design and ETL build; daily hands-on development of ETL to ingest and load data into databases (Teradata/Hadoop).
- Conduct code reviews with team members and set a high technical standard.
- Drive agile development and a DevOps approach, including continuous build and continuous testing.
- As a senior developer, drive the framework and components.
- Partner closely with the business on technology requirements and strategy, create solutions, and execute projects end to end from inception to deployment.
- Understand the high-level design and create low-level design documents.

Interested candidates can drop their CV at sharada@wrootsglobal.in
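The SQL performance-tuning bullet above usually starts with reading the query plan before and after a change. SQLite's EXPLAIN QUERY PLAN makes the idea easy to demonstrate in a few lines (Teradata's EXPLAIN and Oracle's EXPLAIN PLAN serve the same purpose); the table and data here are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (acct INTEGER, amount REAL)")
conn.executemany("INSERT INTO txn VALUES (?, ?)",
                 [(i % 100, float(i)) for i in range(1000)])

def plan(conn, sql):
    """Concatenate the detail column of EXPLAIN QUERY PLAN output."""
    return " ".join(row[-1] for row in
                    conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM txn WHERE acct = 42"
before = plan(conn, query)   # full scan of txn: every row examined
conn.execute("CREATE INDEX idx_txn_acct ON txn (acct)")
after = plan(conn, query)    # index search: only matching rows touched
```

The same discipline applies on Teradata: compare the optimizer's plan before and after adding statistics or an index, rather than guessing at why a query is slow.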
Posted 2 months ago
4 - 9 years
7 - 17 Lacs
Hyderabad
Work from Office
About this role: Wells Fargo is seeking a Senior Quantitative Model Solutions Specialist.

In this role, you will:
- Lead or participate in moderately complex model maintenance and optimization initiatives related to operating processes, controls, reporting, testing, implementation, and documentation.
- Review and analyze moderately complex data sets, quantitative models, and model outputs to validate model efficiency and results in support of business initiatives.
- Advise and guide the team on moderately complex model optimization and process strategies.
- Independently resolve moderately complex issues and lead the team to meet project deliverables while leveraging a solid understanding of policies and compliance requirements.
- Collaborate and consult with peers, colleagues, and managers to resolve issues and achieve goals.

Required Qualifications:
- 4+ years of quantitative solutions engineering, model solutions, or quantitative model operations experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:
- Overall experience of around 4 to 10 years in a similar role.
- Bachelor's degree or higher in a quantitative field such as computer science, applied mathematics, engineering, statistics, finance, or econometrics from a top-tier institute.
- Strong problem-solving skills.
- 4+ years of experience in credit risk analytics with exposure to statistical and machine learning model development, implementation, or MLOps.
- 4+ years of data engineering experience: Oracle, Teradata, Hadoop, SQL.
- 2+ years of advanced programming expertise in SAS.
- 4+ years of advanced programming and debugging skills in Python: OOP, packaging, build and deployment, data structures and algorithms, decorators, logging, exception handling, JIT compilers.
- 2+ years of experience in high-performance computing, big data, and real-time solutions: PySpark, MapR streaming, parallel processing, real-time optimization.
- 2+ years of experience in unit testing, UAT testing, regression testing, and code review.
- Comfortable with Git, GitHub, CI/CD pipelines, and UNIX commands.
- Excellent verbal, written, and interpersonal communication skills.
- Strong ability to develop partnerships and collaborate with other business and functional areas.
- Knowledge and understanding of issue or change management processes.
- Experience determining root cause analysis.
- Detail-oriented and results-driven, with the ability to navigate a quickly changing, high-demand environment while balancing multiple priorities.
- Understanding of bank regulatory data sets and other industry data sources.
- Ability to research and report on a variety of issues using problem-solving skills.
- Exposure to the banking domain in the credit risk area on retail/commercial portfolios.

Job Expectations:
- Perform various complex activities related to predictive modeling process enhancements and Python conversions.
- Provide engineering and analytical solutions across model development, implementation, monitoring, and production in a big data environment.
- Support implementation of Python-based solutions for real-time and/or batch-based machine learning scorecard models for consumer and commercial banking.
- Identify opportunities for and deliver process improvements, standardization, rationalization, and automation.
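The Python skills this posting singles out (decorators, logging, exception handling) commonly appear together as an instrumentation wrapper around model-scoring code. A brief, self-contained sketch of that pattern; the `score` function and its expected-loss arithmetic are purely illustrative, not a real Wells Fargo model:

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("model_ops")

def logged(func):
    """Log entry, result, and any exception raised by the wrapped function."""
    @functools.wraps(func)  # preserve __name__/__doc__ for debugging tools
    def wrapper(*args, **kwargs):
        log.info("calling %s", func.__name__)
        try:
            result = func(*args, **kwargs)
        except Exception:
            log.exception("%s failed", func.__name__)
            raise  # re-raise so callers still see the failure
        log.info("%s returned %r", func.__name__, result)
        return result
    return wrapper

@logged
def score(pd_value, exposure):
    """Hypothetical expected-loss calculation, for illustration only."""
    return round(pd_value * exposure, 2)

expected_loss = score(0.02, 150_000.0)
```

Because the decorator re-raises after logging, production monitoring captures every failure without changing how errors propagate to the pipeline.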
Posted 3 months ago
5 - 9 years
7 - 17 Lacs
Bengaluru
Work from Office
About this role: Wells Fargo is seeking a Lead Compliance Officer ( Lead Data Analytics Officer). In this role, you will: Provide oversight and monitoring of business group risk based compliance programs Maintain compliance risk expertise and consulting for projects and initiatives with moderate to high risk, over multiple business lines Establish, implement and maintain risk standards and programs to drive compliance with federal, state, agency, legal and regulatory and Corporate Policy requirements Oversee the Front Line's execution and challenges appropriately on compliance related decisions Develop, oversee, and provide independent credible challenge to standards with subject matter expertise Provide direction to the business on developing corrective action plans and effectively managing regulatory change Report findings and make recommendations to management and appropriate committees Identify and recommend opportunities for process improvement and risk control development Receive direction from leaders and exercise independent judgment while developing the knowledge to understand function, policies, procedures, and compliance requirements Monitor the reporting, escalation, and timely remediation of issues, deficiencies or regulatory matters regarding compliance risk management Oversee the Front Line's execution and challenges appropriately on compliance related decisions Make decisions and resolve issues to meet business objectives Interpret policies, procedures, and compliance requirements Collaborate and consult with peers, colleagues and managers to resolve issues and achieve goals Work with complex business units, rules and regulations on moderate to high risk compliance matters Interface with Audit, Legal, external agencies, and regulatory bodies on risk related topics Required Qualifications: 5+ years of Compliance experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, 
education Desired Qualifications: Demonstrated/hands-on experience executing analytics or creating visualizations, using any combination of Business Intelligence (BI) and analytic tools including Tableau, Power BI, SAS Viya, Alteryx, or SQL. Exposure to Teradata. 5+ years of experience analyzing, interpreting and formatting data using scripts or other analytic tools 4+ years of Compliance and/or Risk experience Ability to evaluate risks and the impact of decisions on an overall organization Ability to identify and evaluate exposures and potential risks Knowledge and understanding of SharePoint and Power Apps Excellent verbal, written, and interpersonal communication skills Strong analytical skills with high attention to detail and accuracy Ability to exercise independent judgment and creative problem-solving techniques Ability to perform autonomously to prioritize workload, meet deadlines, and drive optimized solutions. Ability to effectively assess stakeholder, partner, or client needs while consulting, building solutions, and developing processes Ability to translate complex technical needs into straightforward requests for information and collaboration, working with a wide range of people Familiarity with formal project and change management processes, particularly Agile and similar methodologies Professional presentation skills Job Expectations: Develop visualizations and data extracts using diverse data sets to support compliance insights with increased scope, precision, and efficiency Support the end-to-end life-cycle of visualization and analytic projects including tool selection, sourcing of data, ideation, documentation, peer reviews, UAT, production deployment and support. 
Create interactive dashboards Train and support other team members on data visualization tools and techniques Collaborate with cross-functional teams to understand business requirements and develop data visualization solutions Perform data analysis to identify trends and insights Create and maintain documentation of data visualization processes and best practices Continuously improve data visualization processes and tools to enhance efficiency Actively manage relationships with US stakeholders to stay plugged into the evolution of the data analytics program & priorities Attend strategic meetings as representative for the data analytics priority execution (I&P) team Direct reporting and scheduling of monitoring work in tandem with manager Provide ad hoc reporting, analytics, or projects as needed Act as backup for stateside leaders if they are OOO or unavailable
Posted 3 months ago
5 - 9 years
7 - 17 Lacs
Bengaluru
Work from Office
In this role, you will: Advise line of business and companywide functions on business strategies based on research of performance metrics, trends in population distributions, and other complex data analysis to maximize profits and asset growth, and minimize operating losses within risk and other operating standards Provide influence and leadership in the identification of new tools and methods to analyze data Ensure adherence to compliance and legal regulations and policies on all projects managed Provide updates on project logs, monthly budget forecasts, monthly newsletters, and operations reviews Assist managers in building quarterly and annual plans and forecast future market research needs for business partners supported Strategically collaborate and consult with peers, colleagues, and mid-level to senior managers to resolve issues and achieve goals Lead projects, teams, or serve as a peer mentor to staff, interns and external contractors Required Qualifications: 5+ years of Analytics experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education Desired Qualifications: Good experience and technical knowledge of SAS, SQL, and Teradata required Good knowledge of the banking domain
Posted 3 months ago
2 - 7 years
18 - 20 Lacs
Gurgaon
Work from Office
Providing Analytical & Decision Support across GS through advanced analytics (from sourcing to staging data, generating insights to exposing them for consumption via reporting platforms/strategy implementation) Enabling business user self-service through creation of MIS capabilities Systematically identify out-of-pattern activities in a timely manner and address information gaps by providing insightful analytics Working independently, assuming responsibility for the development, validation and implementation of projects Participate on global teams evaluating processes and making suggestions for process and system improvements Interacting with all levels of the organization across multiple time zones. Critical Factors to Success: Ensure timely and accurate MIS based on customer requirements Centrally manage MIS and key operational metrics and address functional data needs across operations and support teams Provide analytical and decision support framework and address information gaps through insightful analytics and developing lead indicators Build collaborative relationships across GS groups and participate on global teams evaluating processes and making suggestions for process and system improvements Put enterprise thinking first, connect the role's agenda to enterprise priorities and balance the needs of customers, partners, colleagues & shareholders Past Experience: Preferably a minimum of 2 years' experience with at least 1 year in Quantitative Business Analysis/Data Science with experience in handling large data sets Academic Background: Bachelor's Degree or equivalent, preferably in a quantitative field Post-graduate degree in a quantitative field will be an added advantage Functional Skills/Capabilities: Must possess strong quantitative and analytical skills and be a conceptual and innovative thinker Project management skills and ability to identify and translate business information needs into insights and information cubes for ease of consumption in reporting 
and analytics Proven thought leadership, strong communication, relationship management skills Ability to work on multiple projects simultaneously, flexibility and adaptability to work within tight deadlines and changing priorities Data presentation & visualization skills Technical Skills/Capabilities: Excellent programming skills in Hive/SAS/SQL/Teradata are essential, with good understanding of Big Data ecosystems Exposure to visualization using Business Intelligence software like Tableau or QlikView will be an added advantage Knowledge of Platforms: Advanced knowledge of Microsoft Excel and PowerPoint, Word, Access and Project Behavioral Skills/Capabilities: Set The Agenda: Define What Winning Looks Like, Put Enterprise Thinking First, Lead with an External Perspective Bring Others With You: Build the Best Team, Seek & Provide Coaching Feedback, Make Collaboration Essential Do It The Right Way: Communicate Frequently, Candidly & Clearly, Make Decisions Quickly & Effectively, Live the Blue Box Values, Great Leadership Demands Courage Benefits include: Competitive base salaries Bonus incentives Support for financial well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities
Posted 3 months ago
4 - 6 years
5 - 8 Lacs
Bengaluru
Work from Office
The HiLabs Story HiLabs is a leading provider of AI-powered solutions to clean dirty data, unlocking its hidden potential for healthcare transformation. HiLabs is committed to transforming the healthcare industry through innovation, collaboration, and a relentless focus on improving patient outcomes. HiLabs Team Multidisciplinary industry leaders Healthcare domain experts AI/ML and data science experts Professionals hailing from the world's best universities, business schools, and engineering institutes including Harvard, Yale, Carnegie Mellon, Duke, Georgia Tech, Indian Institute of Management (IIM), and Indian Institute of Technology (IIT). Job Title: Senior Quality Assurance Engineer Job Location: Bangalore, India Experience Level: 4-6 years Job Summary: We are looking for an experienced Senior Quality Assurance Engineer to lead our QA efforts. This is a client-driven role where the focus will be on ensuring the quality and accuracy of data across multiple systems. Candidates should have experience working with tools like SQL, Snowflake, Teradata, ICDQ, and Solr, and should be proficient in both functional and non-functional testing practices. You will play a pivotal role in shaping the testing strategy and mentoring other team members. Responsibilities: Lead Testing Strategy and Planning: Define and implement comprehensive testing strategies for both functional and non-functional testing to ensure product quality and compliance with client expectations. Manual Testing: Execute manual tests, including data validation and integrity checks, ensuring that systems are functioning as expected. Python-Based Automation (optional): Develop and maintain automation scripts using Python to streamline repetitive testing tasks, enhance test coverage, and improve overall efficiency in the testing process. Cross-functional Collaboration: Work closely with development, data, and business teams to define testing requirements and ensure alignment across the board. 
Mentorship and Leadership: Mentor junior QA engineers and provide guidance on testing best practices, tools, and methodologies. Foster a culture of continuous improvement within the QA team. Data Quality Assurance: Leverage your expertise with SQL, Snowflake, Teradata, ICDQ, and Solr for day-to-day data testing activities, ensuring accurate data processing, migration, and integration. Data Validation: Create and execute data validation tests across large datasets using data analysis libraries like Pandas and NumPy to ensure data accuracy and integrity. Defect Management: Identify, document, and manage defects through the defect life cycle, ensuring timely resolution by coordinating with development teams. Collaborate with cross-functional teams to ensure alignment on testing requirements and outcomes. Performance and Scalability Testing: Conduct non-functional testing related to system performance, scalability, reliability, and security, ensuring the system meets quality standards under various conditions. Client Interaction: Serve as a key point of contact for QA-related matters with clients, presenting quality metrics, test strategies, and defect trends as needed. Continuous Improvement: Identify opportunities to improve testing processes, tools, and methodologies. Ensure the team is staying updated with the latest testing techniques and tools. Reporting: Generate and present regular quality assurance reports, including test coverage, defect trends, and testing progress, to stakeholders and clients. Required Skills and Qualifications: Extensive experience in manual testing, with a focus on data testing and validation. Deep understanding of SQL, Snowflake, Teradata, ICDQ, and Solr, with proven experience using these tools in a testing capacity. Expertise in both functional and non-functional testing, including performance, scalability, and security. Proven leadership skills with the ability to mentor and guide junior team members. 
Strong knowledge of test management tools, defect tracking systems, and QA methodologies. Experience with complex SQL queries and data validation in multiple databases. Exceptional communication and collaboration skills, with experience working closely with clients. Strong problem-solving skills and attention to detail. Preferred Qualifications: Previous experience in a senior or lead QA role in a client-facing environment. Familiarity with automation tools and frameworks (a plus, but not mandatory). Experience with testing in cloud-based environments, particularly using data warehouses like Snowflake. HiLabs is an equal opportunity employer (EOE). No job applicant or employee shall receive less favorable treatment or be disadvantaged because of their gender, marital or family status, color, race, ethnic origin, religion, disability, or age; nor be subject to less favorable treatment or be disadvantaged on any other basis prohibited by applicable law. HiLabs is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse and inclusive workforce to support individual growth and superior business results. HiLabs Total Rewards Competitive Salary, Accelerated Incentive Policies, H1B sponsorship, Comprehensive benefits package that includes ESOPs, financial contribution for your ongoing professional and personal development, medical coverage for you and your loved ones, 401k, PTOs, a collaborative working environment, smart mentorship, and highly qualified multidisciplinary, incredibly talented professionals from highly renowned and accredited medical schools, business schools, and engineering institutes.
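The data-validation duties described in this listing (checks across large datasets with Pandas/NumPy) commonly reduce to a handful of integrity rules. A minimal sketch, assuming a hypothetical claims schema (`member_id`, `claim_date`, `amount` are illustrative names, not from the posting):

```python
import pandas as pd

def validate_claims(df: pd.DataFrame) -> dict:
    """Run basic integrity checks and return a dict of issue counts."""
    issues = {}
    # Null check on a required column (hypothetical schema)
    issues["null_member_id"] = int(df["member_id"].isna().sum())
    # Duplicate-key check on the natural key
    issues["duplicate_keys"] = int(df.duplicated(subset=["member_id", "claim_date"]).sum())
    # Range check: claim amounts must be non-negative
    issues["negative_amounts"] = int((df["amount"] < 0).sum())
    return issues

sample = pd.DataFrame({
    "member_id": ["A1", "A2", None, "A2"],
    "claim_date": ["2024-01-01", "2024-01-02", "2024-01-03", "2024-01-02"],
    "amount": [100.0, -5.0, 30.0, 20.0],
})
print(validate_claims(sample))
# {'null_member_id': 1, 'duplicate_keys': 1, 'negative_amounts': 1}
```

In practice such checks are parameterized per table and the issue counts fed into defect reports or a quality dashboard.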
Posted 3 months ago
2 - 6 years
7 - 11 Lacs
Bengaluru
Work from Office
Core Expertise: At least 2+ years of production support experience (experience in telecom systems will be an added advantage) L2 application support / installation / configuration experience for the IBM DataStage product Ability to troubleshoot production/product-related issues. Must have knowledge of IBM DataStage administration and data integration. Lead the end-to-end design, implementation, and maintenance of the EDH, ensuring scalability, security, and high availability. Good knowledge of Python/Shell scripting Monitor data ingestion, processing, and quality assurance processes to ensure a reliable data pipeline. Technical Skills: Experience in Linux/Unix environments for system monitoring and script execution. Hands-on experience with Oracle/Teradata Operational Skills: Monitor and manage daily DataStage jobs to meet operational KPIs Address alerts and incidents related to the data platform. Manage application administration and operation to meet operational KPIs. Collaborate with IT teams and other stakeholders to address issues Perform root cause analysis and implement permanent fixes for recurring issues. Generate regular reports on data platform performance, processed data, and error trends. Maintain up-to-date documentation of processes, configurations, and troubleshooting procedures. Use monitoring tools to track system health and data integrity. Strong troubleshooting skills with a focus on minimizing downtime and operational disruptions. Ability to analyze data and identify discrepancies or anomalies. Should be aware of ITSM processes Soft Skills: Willingness to work in a 24x7 shift-based support environment Good verbal and written communication skills, with the ability to communicate effectively with cross-functional teams, stakeholders, and vendors. Strong analytical and problem-solving skills. Effective communication and collaboration abilities with cross-functional teams. 
High attention to detail and the ability to work under pressure in a fast-paced environment. Good to have Skills: Any relevant certification in administration related to Data Integration Products Minimum Work Experience: 3-6 years of experience in IBM DataStage
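The monitoring and error-trend reporting duties above are often automated with small scripts. A minimal Python sketch that tallies severity keywords from job log lines; the log format and job names are hypothetical, not taken from any DataStage release:

```python
import re
from collections import Counter

# Severity keywords to track (hypothetical log convention)
SEVERITY_PATTERN = re.compile(r"\b(FATAL|ERROR|WARNING)\b")

def summarize_log(lines):
    """Count FATAL/ERROR/WARNING occurrences in job log lines."""
    counts = Counter()
    for line in lines:
        match = SEVERITY_PATTERN.search(line)
        if match:
            counts[match.group(1)] += 1
    return counts

log = [
    "10:01 INFO  job ds_load_customers started",
    "10:02 WARNING null values in column email",
    "10:03 ERROR  link db_write rejected 42 rows",
    "10:04 FATAL  job aborted",
]
print(summarize_log(log))
# Counter({'WARNING': 1, 'ERROR': 1, 'FATAL': 1})
```

A script like this would typically run on a schedule, with the counts pushed into the regular error-trend report or an alerting tool.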
Posted 3 months ago
5 - 8 years
13 - 15 Lacs
Chennai
Work from Office
Design, develop, and maintain complex Ab Initio ETL processes for high-volume data warehousing and data integration projects. - Analyze business requirements and translate them into efficient Ab Initio graph designs. - Develop and optimize Ab Initio graphs, including data validation, transformation, and loading processes. - Proficiency in databases / SQL (Teradata) to write efficient SQL queries for data extraction, manipulation, and validation within Ab Initio. - Shell scripting: strong experience with Unix shell scripting for automating tasks and managing Ab Initio jobs. - Excellent problem-solving and debugging skills within the Ab Initio environment. - Experience with Databricks and/or other Big Data technologies handling Parquet file formats using Ab Initio ETL is highly preferred. - Strong communication skills and ability to collaborate effectively within a team. - Maintain a high level of availability and responsiveness during on-call shifts.
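The validation queries this role calls for often boil down to source-to-target reconciliation. A minimal sketch using Python's built-in sqlite3 as a stand-in for Teradata (the table and column names are hypothetical; against Teradata you would run the same SQL through a Teradata driver instead):

```python
import sqlite3

def reconcile_counts(conn, source_table, target_table, key="id"):
    """Compare row counts and distinct-key counts between two tables."""
    cur = conn.cursor()
    stats = {}
    for table in (source_table, target_table):
        cur.execute(f"SELECT COUNT(*), COUNT(DISTINCT {key}) FROM {table}")
        stats[table] = cur.fetchone()
    return stats[source_table] == stats[target_table], stats

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, val TEXT);
    CREATE TABLE tgt (id INTEGER, val TEXT);
    INSERT INTO src VALUES (1, 'a'), (2, 'b'), (3, 'c');
    INSERT INTO tgt VALUES (1, 'a'), (2, 'b');  -- one row missing: load discrepancy
""")
ok, detail = reconcile_counts(conn, "src", "tgt")
print(ok, detail)
# False {'src': (3, 3), 'tgt': (2, 2)}
```

Count and distinct-key comparisons catch dropped rows and key duplication; fuller reconciliation would also compare column-level checksums or hash totals.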
Posted 3 months ago
8 - 13 years
25 - 27 Lacs
Bengaluru, Mumbai (All Areas)
Hybrid
Urgent Hiring: We are hiring for a top MNC firm with immediate offers // Immediate joiners. #Teradata #SQL #DataWarehousing #Unix (Mandatory) Location: Bangalore / Mumbai Mode: Hybrid Experience: 8+ years Mandatory Skills: Teradata Demand: i) Overall 8+ years of experience ii) Good SQL skills with at least 2 years of hands-on Teradata experience; of the total 8 years, at least 5 years of general SQL is expected, and 1 to 2 years of Teradata-specific usage is fine iii) Good Unix skills for running SQL through Unix wrapper scripts iv) Good problem-solving skills and excellent communication skills If anyone is interested, please share your CV at shivam@thehrsolutions.in OR deepali.jain@thehrsolutions.in // 9667114931.
Posted 3 months ago
5 - 8 years
7 - 10 Lacs
Hyderabad
Work from Office
• Expert-level knowledge of Informatica PowerCenter (ETL framework) and Teradata with extensive hands-on development experience - Must have • Experience in design (data analysis/ETL logic), deployment and support of data warehouse solutions using Informatica PowerCenter, Teradata, Oracle, Unix - Must have • Strong SQL development and Teradata database knowledge. • Should have Finance data analysis experience (Booking, Revenue) – Will be an advantage • Control-M (scheduling) – Good to have • Git (version control) – Good to have • Experience in Jira tool usage – Good to have • Exceptional communication skills to drive the customer and stakeholders, representing the team • Excellent analytical and logical thinking ability to deliver optimal solutions • Good to have: Knowledge of Snowflake and DBT
Posted 3 months ago
10 - 17 years
25 - 37 Lacs
Chennai, Pune, Bengaluru
Hybrid
- 10+ yrs experience in a Data Architect role - Strong experience in a Snowflake architect role - Minimum 5+ yrs in a Snowflake development role - Experience in presales/RFPs for Snowflake projects is a plus - Experience in IICS and Teradata is a must - 2-11 pm work time, hybrid mode
Posted 3 months ago
3 - 6 years
3 - 7 Lacs
Chennai
Work from Office
Develop and set up the transformation of data from sources to enable analysis and decision making. Maintain data flow from source to the designated target without affecting the crucial data flow, and play a critical part in the data supply chain by ensuring stakeholders can access and manipulate data for routine and ad hoc analysis. Implement projects focused on collecting, aggregating, storing, reconciling, and making data accessible from disparate sources. Provide support during the full lifecycle of data from ingestion through analytics to action. Analyze and organize raw data. Evaluate business needs and objectives. Interpret trends and patterns. Conduct complex data analysis and report on results. Coordinate with source teams and end users and develop solutions. Implement data governance policies and support data-versioning processes. Maintain security and data privacy. Requirements Must Have: Proven hands-on experience in building complex analytical queries in Teradata. 4+ years of extensive programming experience in Teradata Tools and Utilities. Hands-on experience in Teradata utilities such as FastLoad, MultiLoad, BTEQ, and TPT. Experience in data quality management and best practices across data solution implementations. Experience in development, testing, and deployment, coding standards, and best practices. Experience in preparing technical design documentation. Strong team collaboration and experience working with remote teams. Knowledge in data modelling and database management, such as performance tuning of the Enterprise Data Warehouse, Data Mart, and Business Intelligence Reporting environments, and support for the integration of those systems with other applications. Good to have: Should be good in Unix shell scripting. Experience in data transformation using ETL/ELT tools. Experience in different relational databases (e.g. Teradata, Oracle, PostgreSQL). Experience with CI/CD development and deployment tools (e.g. Maven, Jenkins, Git, Kubernetes).
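Teradata utilities such as BTEQ execute scripted SQL batches, and teams often generate those scripts programmatically from a scheduler or wrapper. A hedged Python sketch assembling a simple BTEQ validation script; the logon placeholders, table name, and threshold are hypothetical, and the `.IF ACTIVITYCOUNT` branch follows a common BTEQ pattern rather than any specific production standard:

```python
def build_bteq_script(table: str, min_rows: int) -> str:
    """Assemble a BTEQ script that quits with a nonzero code if a table is underpopulated."""
    return "\n".join([
        # Credentials are placeholders, resolved by the wrapper at runtime
        ".LOGON ${TDP_HOST}/${TD_USER},${TD_PASS};",
        f"SELECT COUNT(*) FROM {table}",
        f"HAVING COUNT(*) >= {min_rows};",
        # If the HAVING filter yields no rows, ACTIVITYCOUNT is 0 and we abort
        ".IF ACTIVITYCOUNT = 0 THEN .QUIT 8;",
        ".LOGOFF;",
        ".QUIT 0;",
    ])

script = build_bteq_script("edw.daily_sales", 1000)
print(script)
```

A Unix wrapper would write this string to a file, pipe it into `bteq`, and branch on the exit code, which is how BTEQ jobs are typically wired into batch schedulers.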
Posted 3 months ago
6 - 11 years
17 - 30 Lacs
Delhi NCR, Bengaluru, Hyderabad
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI. Inviting applications for the role of Lead Consultant - Microsoft ETL Lead Engineer (Database Design | Agile Development Process | Release Management)! Position Overview: We are currently seeking a highly experienced ETL engineer with hands-on experience in Microsoft ETL (Extract, Transform, Load) technologies. The ideal candidate will have a deep understanding of ETL processes, data warehousing, and data integration, with a proven track record of leading successful ETL implementations. As a principal ETL engineer, you will play a pivotal role in architecting, designing, and implementing ETL solutions to meet our organization's data needs. Key Responsibilities: • Lead the design and development of ETL processes using Microsoft ETL technologies, such as SSIS (SQL Server Integration Services) • Mandatory hands-on experience (70% development, 30% leadership) • Collaborate with stakeholders to gather and analyze requirements for data integration and transformation. • Design and implement data quality checks and error handling mechanisms within ETL processes. • Lead a team of ETL developers, providing technical guidance, mentorship, and oversight. • Perform code reviews and ensure adherence to best practices and coding standards. • Troubleshoot and resolve issues related to data integration, ETL performance, and data quality. 
• Work closely with database administrators, data architects, and business analysts to ensure alignment of ETL solutions with business requirements. • Stay up-to-date with the latest trends and advancements in ETL technologies and best practices • Identify and resolve performance bottlenecks. Implement best practices for database performance tuning and optimization. • Ensure data integrity, security, and availability • Create and maintain documentation for database designs, configurations, and procedures • Ensure compliance with data privacy and security regulations Qualifications: Education: • Bachelor's degree in Computer Science, Information Technology, or a related field. Experience: • Proven experience in designing, developing, and implementing ETL solutions using Microsoft ETL technologies, particularly SSIS. • Strong understanding of data warehousing concepts, dimensional modeling, and ETL design patterns. • Proficiency in SQL and experience working with relational databases, preferably Microsoft SQL Server. • Experience leading ETL development teams and managing end-to-end ETL projects. • Proven track record of delivering high-quality ETL solutions on time and within budget. 
• Experience with other Microsoft data platform technologies (e.g., SSAS, SSRS) is a plus • Familiarity with version control systems (e.g., Git) • Knowledge of containerization and orchestration (e.g., Docker, Kubernetes) is a plus Soft Skills: o Strong analytical and problem-solving skills o Excellent communication and collaboration abilities o Ability to work independently and as part of a team Preferred Qualifications o Experience with cloud-based database services (e.g., AWS RDS, Google Cloud SQL) o Knowledge of other database systems (e.g., PGSQL, Oracle) o Familiarity with Agile development methodologies Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com . Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
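The data quality checks and error-handling mechanisms mentioned in this listing are often built in SSIS as a conditional split that routes failing rows to an error output. A language-agnostic Python analogue of that pattern; the rule names and row schema are hypothetical illustrations, not part of the SSIS API:

```python
def conditional_split(rows, rules):
    """Route each row to 'clean' or 'error', mimicking an SSIS conditional split.

    rules: list of (name, predicate) pairs; the first failing rule tags the row.
    """
    clean, errors = [], []
    for row in rows:
        failed = next((name for name, pred in rules if not pred(row)), None)
        if failed is None:
            clean.append(row)
        else:
            # Carry the failing rule name along, like an SSIS error column
            errors.append({**row, "_error": failed})
    return clean, errors

rules = [
    ("missing_id", lambda r: r.get("id") is not None),
    ("bad_amount", lambda r: isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0),
]
rows = [
    {"id": 1, "amount": 10},
    {"id": None, "amount": 5},
    {"id": 3, "amount": -2},
]
clean, errors = conditional_split(rows, rules)
print(len(clean), [e["_error"] for e in errors])
# 1 ['missing_id', 'bad_amount']
```

In SSIS the same split is configured graphically, with the error output typically landing in a quarantine table for later review.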
Posted 3 months ago
Teradata is a popular data warehousing platform that is widely used by businesses in India. As a result, there is a growing demand for skilled professionals who can work with Teradata effectively. Job seekers in India who have expertise in Teradata have a wide range of opportunities available to them across different industries.
Cities such as Bengaluru, Hyderabad, Chennai, Pune, and Mumbai are known for their thriving tech industries and have a high demand for Teradata professionals.
The average salary range for Teradata professionals in India varies based on experience levels. Entry-level roles can expect to earn around INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.
In the field of Teradata, a typical career path may involve progressing from roles such as Junior Developer to Senior Developer, and eventually to a Tech Lead position. With experience and skill development, professionals can take on more challenging and higher-paying roles in the industry.
In addition to Teradata expertise, professionals in this field are often expected to have knowledge of SQL, data modeling, ETL tools, and data warehousing concepts. Strong analytical and problem-solving skills are also essential for success in Teradata roles.
As you prepare for interviews and explore job opportunities in Teradata, remember to showcase your skills and experience confidently. With the right preparation and determination, you can land a rewarding role in the dynamic field of Teradata in India. Good luck!