
272 Netezza Jobs - Page 11

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

6.0 - 12.0 years

0 Lacs

Delhi, India

On-site

TCS has been a pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena, and there's nothing that can stop us from growing together.

What we are looking for
Role: SQL DBA
Experience Range: 6-12 years
Location: New Delhi / Gurugram
Interview Mode: Saturday virtual drive

Must Have:
1. MS SQL Server
2. Azure SQL Server
3. Certification in SQL Server / Azure SQL

Good to Have:
1. DB2
2. Netezza
3. PowerShell
4. Azure PostgreSQL

Essential: Administer and maintain database systems, with a focus on MS SQL Server along with Azure, PostgreSQL, and DB2. Support SQL Server in the Azure environment as IaaS, SQL Managed Instance, and PaaS services. Manage Azure SQL databases, SQL Managed Instances, and Azure VMs in the Azure Portal. Monitor database performance and proactively address issues to ensure optimal functionality. Collaborate with project teams to understand database requirements and provide efficient solutions. Participate in the design, implementation, and maintenance of database structures for different applications. Work independently to troubleshoot and resolve database-related issues promptly. Implement best practices to enhance database performance and security. Manage databases on Azure Cloud, ensuring seamless integration and optimization for cloud-based solutions. Utilize SQL Server tools and other relevant technologies for effective database administration. Stay updated on the latest advancements in database tools and incorporate them into daily practice. Collaborate with cross-functional teams, including developers and system administrators, to achieve project goals, and provide guidance and support to team members on database-related issues.

Minimum Qualification:
• 15 years of full-time education
• Minimum of 50% in 10th, 12th, UG & PG (if applicable)
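The performance-monitoring duty in this posting typically starts with SQL Server's dynamic management views. Below is a minimal, hedged sketch; the DMVs are standard SQL Server system views, while the filtered wait types and aliases are illustrative choices, not part of the posting.

```sql
-- Top waits since the last restart: a common first stop when an
-- instance slows down. sys.dm_os_wait_stats is a standard DMV.
SELECT TOP 10
       wait_type,
       wait_time_ms / 1000.0 AS wait_seconds,   -- illustrative alias
       waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ('SLEEP_TASK', 'BROKER_TASK_STOP')  -- skip benign waits
ORDER BY wait_time_ms DESC;

-- Currently executing requests, their blockers, and their SQL text.
SELECT r.session_id, r.status, r.blocking_session_id, r.wait_type, t.text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID;
```

The same queries work against Azure SQL Database and SQL Managed Instance, which is why DMV fluency carries over from on-premises administration to the Azure services the role lists.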

Posted 3 months ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Are you insatiably curious, deeply passionate about the realm of databases and analytics, and ready to tackle complex challenges in a dynamic environment in the era of AI? If so, we invite you to join our team as a Cloud & AI Solution Engineer in Innovative Data Platform for commercial customers at Microsoft. Here, you'll be at the forefront of innovation, working on cutting-edge projects that leverage the latest technologies to drive meaningful impact. Join us and be part of a team that thrives on collaboration, creativity, and continuous learning.

Databases & Analytics is a growth opportunity for Microsoft Azure, as well as its partners and customers. It includes a rich portfolio of products, including IaaS and PaaS services on the Azure platform in the age of AI. These technologies empower customers to build, deploy, and manage database and analytics applications in a cloud-native way.

As an Innovative Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack across every stage of deployment. You'll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like proofs of concept, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you'll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you'll help customers modernize their data platform and realize the full value of Microsoft's platform, all while enjoying flexible work opportunities.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities
Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments.
Lead hands-on engagements—hackathons and architecture workshops—to accelerate adoption of Microsoft's cloud platforms.
Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions.
Resolve technical blockers and objections, collaborating with engineering to share insights and improve products.
Maintain deep expertise in the analytics portfolio—Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance—and Azure databases: SQL DB, Cosmos DB, PostgreSQL.
Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop, and BI solutions.
Represent Microsoft through thought leadership in cloud database & analytics communities and customer forums.

Qualifications
5+ years technical pre-sales or technical consulting experience; OR Bachelor's degree in Computer Science, Information Technology, or a related field AND 4+ years technical pre-sales or technical consulting experience; OR Master's degree in Computer Science, Information Technology, or a related field AND 3+ years technical pre-sales or technical consulting experience; OR equivalent experience.
Expert on Azure databases (SQL DB, Cosmos DB, PostgreSQL), from migration and modernization to creating new AI apps.
Expert on Azure analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated data security and governance.
Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes.

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 3 months ago

Apply

5.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Are you insatiably curious, deeply passionate about the realm of databases and analytics, and ready to tackle complex challenges in a dynamic environment in the era of AI? If so, we invite you to join our team as a Cloud & AI Solution Engineer in Innovative Data Platform for commercial customers at Microsoft. Here, you'll be at the forefront of innovation, working on cutting-edge projects that leverage the latest technologies to drive meaningful impact. Join us and be part of a team that thrives on collaboration, creativity, and continuous learning.

Databases & Analytics is a growth opportunity for Microsoft Azure, as well as its partners and customers. It includes a rich portfolio of products, including IaaS and PaaS services on the Azure platform in the age of AI. These technologies empower customers to build, deploy, and manage database and analytics applications in a cloud-native way.

As an Innovative Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack across every stage of deployment. You'll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like proofs of concept, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you'll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you'll help customers modernize their data platform and realize the full value of Microsoft's platform, all while enjoying flexible work opportunities.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities
Drive technical sales with decision makers using demos and PoCs to influence solution design and enable production deployments.
Lead hands-on engagements—hackathons and architecture workshops—to accelerate adoption of Microsoft's cloud platforms.
Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions.
Resolve technical blockers and objections, collaborating with engineering to share insights and improve products.
Maintain deep expertise in the analytics portfolio—Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance—and Azure databases: SQL DB, Cosmos DB, PostgreSQL.
Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop, and BI solutions.
Represent Microsoft through thought leadership in cloud database & analytics communities and customer forums.

Qualifications
5+ years technical pre-sales or technical consulting experience; OR Bachelor's degree in Computer Science, Information Technology, or a related field AND 4+ years technical pre-sales or technical consulting experience; OR Master's degree in Computer Science, Information Technology, or a related field AND 3+ years technical pre-sales or technical consulting experience; OR equivalent experience.
Expert on Azure databases (SQL DB, Cosmos DB, PostgreSQL), from migration and modernization to creating new AI apps.
Expert on Azure analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated data security and governance.
Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes.

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 3 months ago

Apply

5.0 - 10.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Build the future of the AI Data Cloud. Join the Snowflake team.

The Technical Instructor for the Snowflake Customer Education and Training Team will be responsible for creating and delivering compelling education content and training sets that make complex concepts come alive in instructor-led classroom venues. The senior instructor will be seen as a subject matter expert and leader in transferring knowledge of Snowflake to customers, partners, and internal teams, and in accelerating their technical on-boarding journey. This role will also be responsible for cross-training efforts and program management, and will help strategically ramp multiple resources within our external stakeholders. This role is a unique opportunity to contribute in a meaningful way to high-value, high-impact delivery at a very exciting time for the company.

Snowflake is an innovative, high-growth, customer-focused company in a large and growing market. If you are an energetic, self-managed professional with experience teaching data courses to customers and possess excellent presentation and communication skills, we'd love to hear from you.

AS A TECHNICAL INSTRUCTOR AT SNOWFLAKE, YOU WILL:
Teach a breadth of technical courses to onboard customers and partners to Snowflake, the data warehouse built for the cloud.
Cross-train a breadth of technical courses to qualified individuals and resources.
Cover course concepts ranging from foundational to advanced, including Snowflake data warehousing concepts, novel SQL capabilities (see the sketch after this posting), data consumption and connectivity interfaces, data integration and ingestion capabilities, database security features, database performance topics, cloud ecosystem topics, and more.
Apply database and data warehousing industry/domain/technology expertise during training sessions to help customers and partners ease their organizations into the Snowflake data warehouse from prior database environments.
Deliver content and cross-train on delivery best practices using a variety of presentation formats, including engaging lectures, live demonstration, and technical labs.
Work with customers and partners investing in the train-the-trainer program to certify their selected trainers, ensuring they are well prepared and qualified to deliver the course at their organization.
Bring a strong eye for design, making complex training concepts come alive in a blended educational delivery model.
Work with the education content developers to help prioritize, create, integrate, and publish training materials and hands-on exercises to Snowflake end users; drive continuous improvement of training performance.
Work with additional Snowflake subject-matter experts to create new education materials and updates that keep pace with Snowflake product updates.

OUR IDEAL TECHNICAL INSTRUCTOR WILL HAVE:
A strong data warehouse and data-serving platform background.
Recent experience using SQL, potentially including complex workloads.
5-10 years of experience in technical content training development and delivery.
A strong desire and ability to teach and train.
Prior experience with other databases (e.g. Oracle, IBM Netezza, Teradata, …).
Excellent written and verbal communication skills.
An innovative and assertive mindset, with the ability to pick up new technologies.
Presence: enthusiastic and high energy, but also poised, confident, and extremely professional.
A track record of delivering results in a dynamic start-up environment.
Experience working cross-functionally, ideally with solution architects, technical writers, and support.
A strong sense of ownership and high attention to detail.
A degree in a field such as Computer Science or Management Information Systems.
Comfort with travel up to 75% of the time.

BONUS POINTS FOR EXPERIENCE WITH THE FOLLOWING:
Creating and delivering training programs to mass audiences.
Other databases (e.g. Teradata, Netezza, Oracle, Redshift, …).
Non-relational platforms and tools for large-scale data processing (e.g. Hadoop, HBase, …).
Common BI and data exploration tools (e.g. MicroStrategy, Business Objects, Tableau, …).
Large-scale infrastructure-as-a-service platforms (e.g. Amazon AWS, Microsoft Azure, …).
ETL pipeline tools.
AWS and Microsoft services.
Train-the-trainer programs.
Proven success at enterprise software startups.

Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
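For a feel of the "novel SQL capabilities" this curriculum covers, here is a minimal, hedged sketch of two well-known Snowflake features, zero-copy cloning and Time Travel; the table name is illustrative, the syntax is standard Snowflake SQL.

```sql
-- Zero-copy clone: an instant, storage-efficient copy of a table,
-- handy for a training lab or a what-if experiment.
CREATE TABLE sales_dev CLONE sales;

-- Time Travel: query the table as it looked 30 minutes ago
-- (within the account's configured retention period).
SELECT COUNT(*)
FROM sales AT (OFFSET => -1800);

-- Recover an accidentally dropped table from Time Travel history.
UNDROP TABLE sales;
```

Features like these, with no direct equivalent in the legacy warehouses named above, are exactly what an instructor would demonstrate live when onboarding teams coming from Oracle, Netezza, or Teradata.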

Posted 3 months ago

Apply

6.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Diverse Lynx is looking for a Datastage Developer to join our dynamic team and embark on a rewarding career journey:
Analyzing business requirements and translating them into technical specifications
Designing and implementing data integration solutions using Datastage
Extracting, transforming, and loading data from various sources into target systems
Developing and testing complex data integration workflows, including the use of parallel processing and data quality checks
Collaborating with database administrators, data architects, and stakeholders to ensure the accuracy and consistency of data
Monitoring performance and optimizing Datastage jobs to ensure they run efficiently and meet SLAs
Troubleshooting issues and resolving problems related to data integration
Knowledge of data warehousing, data integration, and data processing concepts
Strong problem-solving skills and the ability to think creatively and critically
Excellent communication and collaboration skills, with the ability to work effectively with technical and non-technical stakeholders

Posted 3 months ago

Apply

8 - 10 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients to more effectively run their business and understand what business questions can be answered and how to unlock the answers.

Senior Data Engineer - Google Cloud
7+ years' direct experience working in enterprise data warehouse technologies.
7+ years in a customer-facing role working with enterprise clients.
Experience with architecting, implementing and/or maintaining technical solutions in virtualized environments.
Experience in design, architecture and implementation of data warehouses, data pipelines and flows.
Experience with developing software code in one or more languages such as Java, Python and SQL.
Experience designing and deploying large-scale distributed data processing systems with technologies such as Oracle, MS SQL Server, MySQL, PostgreSQL, MongoDB, Cassandra, Redis, Hadoop, Spark, HBase, Vertica, Netezza, Teradata, Tableau, Qlik or MicroStrategy.
Customer-facing migration experience, including service discovery, assessment, planning, execution, and operations.
Demonstrated excellent communication, presentation, and problem-solving skills.
Experience in project and enterprise governance.

Mandatory certifications required: Google Cloud Professional Cloud Architect; Google Cloud Professional Data Engineer
Mandatory skill sets: GCP Architecture/Data Engineering, SQL, Python
Preferred skill sets: GCP Architecture/Data Engineering, SQL, Python
Years of experience required: 8-10 years
Qualifications: B.E / B.Tech / MBA / MCA
Required skills: Python (Programming Language)
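As context for the GCP data engineering skills this role demands, here is a minimal, hedged sketch of a partitioned and clustered BigQuery table, a staple of large-scale warehouse design on Google Cloud; the dataset, table, and column names are illustrative.

```sql
-- Partitioning by date and clustering by a high-cardinality key keeps
-- scans (and cost) proportional to the slice of data actually queried.
CREATE TABLE analytics.events (
  event_ts   TIMESTAMP,
  user_id    STRING,
  event_name STRING
)
PARTITION BY DATE(event_ts)
CLUSTER BY user_id;

-- Queries that filter on the partition column prune partitions automatically.
SELECT event_name, COUNT(*) AS n
FROM analytics.events
WHERE DATE(event_ts) = '2024-01-15'
GROUP BY event_name;
```

The same partition-pruning instinct transfers directly from the MPP systems listed above (Netezza, Teradata, Vertica), which is presumably why the posting pairs legacy warehouse experience with GCP certification.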

Posted 3 months ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction
The Hybrid Data Management (HDM) team is looking for enthusiastic and talented software developers to join us. Our services include Db2 on Cloud, Db2 Warehouse on Cloud, Netezza on Cloud, and Data Virtualization as a Service. Our services are tightly integrated with IBM Cloud Pak for Data, where customers can access a suite of leading data and AI capabilities in a unified experience.

Your Role and Responsibilities
Software developers at IBM are the backbone of our strategic initiatives to design, code, test, and provide industry-leading solutions that make the world run today - planes and trains take off on time, bank transactions complete in the blink of an eye, and the world remains safe because of the work our software developers do. Whether you are working on projects internally or for a client, software development is critical to the success of IBM and our clients worldwide. At IBM, you will use the latest software development tools, techniques and approaches and work with leading minds in the industry to build solutions you can be proud of.
Design, develop, test, operate and maintain database features in our products, services and tools to provide a secure environment for the product to be used by customers in the cloud.
Evaluate new technologies and processes that enhance our service capabilities.
Document and share your experience; mentor others.

Preferred Education
Bachelor's Degree

Required Technical and Professional Expertise
5+ years of relevant experience in software development
Strong software programming experience and skills in C/C++ or an equivalent programming language
Strong knowledge of data structures, algorithms, object-oriented programming, and test-driven development
Expertise in best practices for design and development
Strong problem determination and resolution skills

Preferred Technical and Professional Experience
Knowledge of Linux/UNIX operating systems
Exposure to best practices in design, development and testing of software
Working experience with SQL databases (Db2, PostgreSQL, MySQL, Oracle, SQL Server, etc.)

Posted 3 months ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group

Job Description: Responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities, building and maintaining effective working relationships with a range of customers, ensuring relevant standards are defined and maintained, and implementing process and system improvements to deliver business value. Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation.

The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, backlog prioritisation and improvements. The role will coach, mentor and support the data engineering squad on the full range of data engineering and solutions development activities, covering requirements gathering and analysis, solutions design, coding and development, testing, implementation and operational support. You will work closely with the Product Owner to understand requirements / user stories and be able to plan and estimate the time taken to deliver them, and proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts, and Visualisation developers to meet the acceptance criteria. You will be very highly skilled and experienced in use of tools and techniques such as AWS Data Lake technologies, Redshift, Glue, Spark SQL and Athena.

Years of Experience: 13-15

Essential domain expertise:
Experience in Big Data technologies - AWS, Redshift, Glue, PySpark
Experience of MPP (Massively Parallel Processing) databases helpful - e.g. Teradata, Netezza
Challenges involved in Big Data - large table sizes (e.g. depth/width), even distribution of data (see the sketch after this posting)
Experience of programming - SQL, Python
Data modelling experience/awareness - Third Normal Form, Dimensional Modelling
Data pipelining skills - data blending, etc.
Visualisation experience - Tableau, PBI, etc.
Data management experience - e.g. data quality, security, etc.
Experience of working in a cloud environment - AWS
Development/delivery methodologies - Agile, SDLC
Experience working in a geographically disparate team

Travel Requirement: Up to 10% travel should be expected with this role
Relocation Assistance: This role is eligible for relocation within country
Remote Type: This position is a hybrid of office/remote working
Skills: Commercial Acumen, Communication, Data Analysis, Data cleansing and transformation, Data domain knowledge, Data Integration, Data Management, Data Manipulation, Data Sourcing, Data strategy and governance, Data Structures and Algorithms, Data visualization and interpretation, Digital Security, Extract, transform and load, Group Problem Solving

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us.

If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
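The "even distribution of data" challenge called out in the expertise list above is exactly what Redshift's distribution styles address. A minimal, hedged sketch follows; table and key names are illustrative.

```sql
-- DISTKEY co-locates rows that join on the same key on the same node,
-- avoiding shuffles; a skewed key would overload one slice, so
-- high-cardinality, evenly spread columns are preferred.
CREATE TABLE sales (
  sale_id     BIGINT,
  customer_id BIGINT,
  sale_date   DATE,
  amount      DECIMAL(12, 2)
)
DISTSTYLE KEY
DISTKEY (customer_id)
SORTKEY (sale_date);

-- For small dimension tables, DISTSTYLE ALL replicates the table to
-- every node so joins never move data across the network.
CREATE TABLE dim_region (
  region_id   INT,
  region_name VARCHAR(64)
)
DISTSTYLE ALL;
```

The same trade-off exists in Teradata (primary index) and Netezza (distribution key), which is why the posting treats prior MPP experience as directly transferable.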

Posted 3 months ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

The Senior Data Engineer will be responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities, building and maintaining effective working relationships with a range of customers, ensuring relevant standards are defined and maintained, and implementing process and system improvements to deliver business value. Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation.

The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, backlog prioritisation and improvements. The role will coach, mentor and support the data engineering squad on the full range of data engineering and solutions development activities, covering requirements gathering and analysis, solutions design, coding and development, testing, implementation and operational support. You will work closely with the Product Owner to understand requirements / user stories and be able to plan and estimate the time taken to deliver them, and proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts, and Visualisation developers to meet the acceptance criteria. You will be very highly skilled and experienced in use of tools and techniques such as AWS Data Lake technologies, Redshift, Glue, Spark SQL and Athena.

Years of Experience: 8-12

Essential domain expertise:
Experience in Big Data technologies - AWS, Redshift, Glue, PySpark
Experience of MPP (Massively Parallel Processing) databases helpful - e.g. Teradata, Netezza
Challenges involved in Big Data - large table sizes (e.g. depth/width), even distribution of data
Experience of programming - SQL, Python
Data modelling experience/awareness - Third Normal Form, Dimensional Modelling
Data pipelining skills - data blending, etc.
Visualisation experience - Tableau, PBI, etc.
Data management experience - e.g. data quality, security, etc.
Experience of working in a cloud environment - AWS
Development/delivery methodologies - Agile, SDLC
Experience working in a geographically disparate team

Posted 4 months ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary
Sr. Software Development Engineer (Hadoop / Python / SQL / Impala Dev)

Overview
Mastercard is a technology company in the global payments industry. We operate the world's fastest payments processing network, connecting consumers, financial institutions, merchants, governments and businesses in more than 210 countries and territories. Mastercard products and solutions make everyday commerce activities - such as shopping, travelling, running a business and managing finances - easier, more secure and more efficient for everyone. Mastercard's Data & Services team is a key differentiator for Mastercard, providing cutting-edge services that help our customers grow. Focused on thinking big and scaling fast around the globe, this dynamic team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business experimentation, and data-driven information and risk management services. We are currently seeking a Software Development Engineer-II for the Location Program within the Data & Services group. You will own end-to-end delivery of engineering projects for some of our analytics and BI solutions that leverage Mastercard datasets combined with proprietary analytics techniques to help businesses around the world solve multi-million-dollar business problems.

Roles and Responsibilities
Work as a member of the support team to resolve product-related issues; this calls for good troubleshooting skills and solid knowledge of support work.
Independently apply problem-solving skills to identify symptoms and root causes of issues.
Make effective and efficient decisions even when data is ambiguous.
Provide technical guidance, support and mentoring to more junior team members.
Make active contributions to improvement decisions and make technology recommendations that balance business needs and technical requirements.
Proactively understand stakeholder needs, goals, expectations and viewpoints to deliver results.
Ensure design thinking accounts for the long-term maintainability of code.
Thrive in a highly collaborative company environment where agility is paramount.
Stay up to date with the latest technologies and technical advancements through self-study, blogs, meetups, conferences, etc.
Perform system maintenance, production incident problem management, identification of root cause and issue remediation.

All About You
Bachelor's degree in Information Technology, Computer Science or Engineering, or equivalent work experience, with a proven track record of successfully delivering on complex technical assignments.
A solid foundation in computer science fundamentals, web applications and microservices-based software architecture.
Full-stack development experience, including databases (Oracle, Netezza, SQL Server), and hands-on experience with Hadoop, Python, Impala, etc. (see the sketch after this posting).
Excellent SQL skills, with experience working with large and complex data sources and the capability to comprehend and write complex queries.
Experience working in Agile teams and conversant with Agile/SAFe tenets and ceremonies.
Strong analytical and problem-solving abilities, with quick adaptation to new technologies, methodologies, and systems.
Excellent English communication skills (both written and verbal) to effectively interact with multiple technical teams and other stakeholders.
High-energy, detail-oriented and proactive, with the ability to function under pressure in an independent environment, along with a high degree of initiative and self-motivation to drive results.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
Abide by Mastercard's security policies and practices;
Ensure the confidentiality and integrity of the information being accessed;
Report any suspected information security violation or breach; and
Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-240980
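For flavor on the Hadoop/Impala side of this role, here is a minimal, hedged sketch of a partitioned Impala table with planner statistics, a common pattern for analytics workloads on HDFS; table and column names are illustrative.

```sql
-- Partitioned Parquet table: Impala prunes partitions on txn_date,
-- so scans touch only the days a query asks for.
CREATE TABLE txn_summary (
  merchant_id BIGINT,
  txn_count   BIGINT,
  txn_amount  DECIMAL(18, 2)
)
PARTITIONED BY (txn_date STRING)
STORED AS PARQUET;

-- Rebuild one day's partition from a staging table.
INSERT OVERWRITE txn_summary PARTITION (txn_date = '2024-01-15')
SELECT merchant_id, COUNT(*), SUM(amount)
FROM staging_txns
WHERE txn_date = '2024-01-15'
GROUP BY merchant_id;

-- COMPUTE STATS is what lets Impala choose good join orders.
COMPUTE STATS txn_summary;
```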

Posted 4 months ago

Apply

15 - 19 years

15 - 30 Lacs

Noida, Chennai, Bengaluru

Hybrid

Job Description
Experience: 12-16 years
Primary Skill: Delivery management with a Data Warehouse background
Notice Period: Immediate to 30 days
Work Location: Chennai, Noida & Bangalore

Required Skills
12+ years of experience in managing delivery of Data Warehouse projects (development and modernization/migration).
Strong delivery background with experience in managing large, complex Data Warehouse engagements.
Good to have experience on Snowflake, Matillion, DBT, Netezza/DataStage and Oracle.
Healthcare payer industry experience.
Extensive experience in program/project management and iterative, waterfall and Agile methodologies.
Ability to track and manage complex program budgets.
Experience in managing the delivery of complex programs to meet the needs and required timelines set for the defined programs, and in communicating program review results to various stakeholders.
Experience in building the team, providing guidance and education as needed to ensure the success of priority programs, and promoting cross-training within the department.
Experience in developing and managing integrated program plans that incorporate both technical and business deliverables.
Verify that critical decision gates are well defined, communicated and monitored for executive approval throughout the program, and that work supports the corporate strategic direction.
Review resulting vendor proposals and estimates to ensure they satisfy both our functional requirements and technology strategies.
Knowledge of project management methodologies, processes, and tools, and of the project development life cycle.
Ability to establish and maintain strong working relationships with various stakeholders, including team members, IT resources, resources in other areas of the business and upper management.
Strong business acumen and political savvy.
Ability to collaborate while dealing with complex situations.
Ability to think creatively and to drive innovation.
Ability to motivate, lead and inspire a diverse group to a common goal/solution with multiple stakeholders.
Ability to convert business strategy into action-oriented objectives and measurable results.
Strong negotiating, influencing, and consensus-building skills.
Ability to mentor, coach and provide guidance to others.

Responsibilities
Responsible for the end-to-end delivery of the application development and support services for the client.
Coordinate with the Enterprise Program Management Office to execute programs following defined standards and governance structure to ensure alignment to the approved project development life cycle (PDLC).
Interface regularly with key senior business leaders to enable a smooth transition from strategy development to program identification and execution.
Facilitate meetings with task groups or functional areas as required for EPMO-supported initiatives and/or to resolve issues.
Proactively engage other members of the organization with specific subject knowledge to resolve issues or provide assistance.
Lead post-implementation reviews of major initiatives to capture lessons learned and drive continuous improvement.
Develop accurate and timely summary reports for executive management that provide consolidated, clear, and concise assessments of strategic initiatives' implementation status.
Collaborate with business owners to develop divisional business plans that support the overall strategic direction.
Support the budget allocation process through ongoing financial tracking reports.
Develop and maintain service plans considering the customer requirements.
Track and monitor to ensure adherence to SLAs/KPIs.
Identify opportunities for improvement to the service delivery process.
Address service delivery issues/escalations/complaints; act as first point of escalation for customer escalations.
Oversee shift management for various tracks.
Responsible for publishing production support reports and metrics.

Posted 4 months ago

Apply

0.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Basic qualifications:
- 5+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Experience managing a data or BI team
- Experience leading and influencing the data or BI strategy of your team or organization
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience hiring, developing and promoting engineering talent
- Experience communicating to senior management and customers verbally and in writing

We are seeking an ambitious Data Engineering Manager to join our Metrics and Data Platform team. The Metrics and Data Platform team plays a critical role in enabling Amazon Music's business decisions and data-driven software development by collecting and providing behavioral and operational metrics to our internal teams. We maintain a scalable and robust data platform to support Amazon Music's rapid growth, and collaborate closely with data producers and data consumers to accelerate innovation using data.

As a Data Engineering Manager, you will manage a team of talented Data Engineers. Your team collects billions of events a day, manages petabyte-scale datasets on Redshift and S3, and develops data pipelines with Spark, SQL, EMR, and Airflow. You will collaborate with product and technical stakeholders to solve challenging data modeling, data availability, data quality, and data governance problems.

At Amazon Music, engineering managers are the primary drivers of their team's roadmap, priorities, and goals. You will be deeply involved in your team's execution, helping to remove obstacles and accelerate progress. A successful candidate will be customer obsessed, highly analytical and detail oriented, able to work effectively in a data-heavy organization, and adept at leading across multiple complex workstreams at once.

Key job responsibilities
- Hiring, motivating, mentoring, and growing a high-performing engineering team
- Owning and managing big data pipelines, Amazon Music's foundational datasets, and the quality and operational performance of the datasets
- Collaborating with cross-functional teams and customers, including business analysts, marketing, product managers, technical program managers, and software engineers/managers
- Defining and managing your team's roadmap, priorities, and goals in partnership with Product, stakeholders, and leaders
- Ensuring timely execution of team priorities and goals by proactively identifying risks and removing blockers
- Recognizing and recommending process and engineering improvements that reduce failures and improve efficiency
- Clearly communicating business updates, verbally and in writing, to both technical and non-technical stakeholders, peers, and leadership
- Effectively influencing other teams' priorities and managing escalations
- Owning and improving business and operational metrics of your team's software
- Ensuring team compliance with policies (e.g., information security, data handling, service level agreements)
- Identifying ways to leverage GenAI to reduce operational overhead and improve execution velocity
- Introducing ideas to evolve and modernize our data model to address customer pain points and improve query performance

About the team
Amazon Music is an immersive audio entertainment service that deepens connections between fans, artists, and creators. From personalized music playlists to exclusive podcasts, concert livestreams to artist merch, Amazon Music is innovating at some of the most exciting intersections of music and culture. We offer experiences that serve all listeners with our different tiers of service: Prime members get access to all the music in shuffle mode, and top ad-free podcasts, included with their membership; customers can upgrade to Amazon Music Unlimited for unlimited, on-demand access to 100 million songs, including millions in HD, Ultra HD, and spatial audio; and anyone can listen for free by downloading the Amazon Music app or via Alexa-enabled devices. Join us for the opportunity to influence how Amazon Music engages fans, artists, and creators on a global scale. Learn more at https://www.amazon.com/music.

Preferred qualifications:
- Experience with AWS tools and technologies (Redshift, S3, EC2)
- Experience in processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, Spark or a Hadoop-based big data solution)

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you're applying in isn't listed, please contact your Recruiting Partner.

Posted 4 months ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu

Work from Office

Job Description
Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

Job Description - Grade Specific
The role involves leading and managing a team of data engineers, overseeing data engineering projects, ensuring technical excellence, and fostering collaboration with stakeholders. The manager plays a critical role in driving the success of data engineering initiatives and ensuring the delivery of reliable, high-quality data solutions to support the organization's data-driven objectives.

Skills (competencies)
Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS Code Pipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Big Table, GCP BigQuery, GCP Cloud Storage, GCP DataFlow, GCP DataProc, Git, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Shell Script, Snowflake, Spark, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Synapse, Talend, Teradata, Time Management, Ubuntu, Vendor Management

Posted 4 months ago

Apply

5 - 8 years

0 Lacs

Pune, Maharashtra, India

Entity: Technology
Job Family Group: IT&S Group

Job Description: Responsible for delivering business analysis and consulting activities for the defined specialism using sophisticated technical capabilities, building and maintaining effective working relationships with a range of customers, ensuring relevant standards are defined and maintained, and implementing process and system improvements to deliver business value. Specialisms: Business Analysis; Data Management and Data Science; Digital Innovation.

The Senior Data Engineer will work as part of an Agile software delivery team, typically delivering within an Agile Scrum framework. Duties will include attending daily scrums, sprint reviews, retrospectives, backlog prioritisation and improvements. The role will coach, mentor and support the data engineering squad on the full range of data engineering and solutions development activities, covering requirements gathering and analysis, solutions design, coding and development, testing, implementation and operational support. You will work closely with the Product Owner to understand requirements / user stories and be able to plan and estimate the time taken to deliver them, and proactively collaborate with the Product Owner, Data Architects, Data Scientists, Business Analysts, and Visualisation developers to meet the acceptance criteria. You will be very highly skilled and experienced in use of tools and techniques such as AWS Data Lake technologies, Redshift, Glue, Spark SQL and Athena.

Years of Experience: 13-15

Essential domain expertise:
Experience in Big Data technologies - AWS, Redshift, Glue, PySpark
Experience of MPP (Massively Parallel Processing) databases helpful - e.g. Teradata, Netezza
Challenges involved in Big Data - large table sizes (e.g. depth/width), even distribution of data
Experience of programming - SQL, Python
Data modelling experience/awareness - Third Normal Form, Dimensional Modelling
Data pipelining skills - data blending, etc.
Visualisation experience - Tableau, PBI, etc.
Data management experience - e.g. data quality, security, etc.
Experience of working in a cloud environment - AWS
Development/delivery methodologies - Agile, SDLC
Experience working in a geographically disparate team

Travel Requirement: Up to 10% travel should be expected with this role
Relocation Assistance: This role is eligible for relocation within country
Remote Type: This position is a hybrid of office/remote working
Skills: Commercial Acumen, Communication, Data Analysis, Data cleansing and transformation, Data domain knowledge, Data Integration, Data Management, Data Manipulation, Data Sourcing, Data strategy and governance, Data Structures and Algorithms, Data visualization and interpretation, Digital Security, Extract, transform and load, Group Problem Solving

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us.

If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 4 months ago

Apply

8.0 - 13.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Migration of other databases to Db2 (e.g., Teradata, Snowflake, SAP or Cloudera to Db2).
Create high-level and detailed-level designs, and maintain product roadmaps that include both modernization and leveraging cloud solutions.
Design scalable, performant, and cost-effective data architectures within the Lakehouse to support diverse workloads, including reporting, analytics, data science, and AI/ML.
Perform health checks of the databases at the database and system level, make recommendations, and deliver tuning (see the sketch after this list).
Deploy Db2 databases as containers within Red Hat OpenShift clusters; configure containerized database instances, persistent storage, and network settings to optimize performance and reliability.
Lead the architectural design and implementation of solutions on IBM watsonx.data, ensuring alignment with the overall enterprise data strategy and business objectives.
Define and optimize the watsonx.data ecosystem, including integration with other IBM watsonx components (watsonx.ai, watsonx.governance) and existing data infrastructure (Db2, Netezza, cloud data sources).
Establish best practices for data modeling, schema evolution, and data organization within the watsonx.data lakehouse.
Act as a subject matter expert on Lakehouse architecture, providing technical leadership and guidance to data engineering, analytics, and development teams.
Mentor junior architects and engineers, fostering their growth and knowledge in modern data platforms.
Participate in the development of architecture governance processes and promote best practices across the organization.
Communicate complex technical concepts to both technical and non-technical stakeholders.
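The database-level health check mentioned above often begins with the catalog and monitoring views. Below is a minimal, hedged sketch for Db2 LUW; SYSCAT.TABLES and the MON_GET_DATABASE table function are standard Db2 interfaces, while the 30-day staleness threshold is an illustrative assumption.

```sql
-- Tables whose optimizer statistics are missing or stale: a frequent
-- root cause of bad access plans after large loads or migrations.
SELECT TABSCHEMA, TABNAME, CARD, STATS_TIME
FROM SYSCAT.TABLES
WHERE TYPE = 'T'
  AND (STATS_TIME IS NULL
       OR STATS_TIME < CURRENT TIMESTAMP - 30 DAYS)
ORDER BY CARD DESC;

-- Database-level buffer pool read counters via a monitoring table
-- function; comparing logical to physical reads gives the hit ratio.
SELECT POOL_DATA_L_READS, POOL_DATA_P_READS
FROM TABLE(MON_GET_DATABASE(-2));
```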

Posted Date not available

Apply

10.0 - 14.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Must Haves
o Strong experience with SQL development/NZSQL, including stored procedures
o Strong experience with advanced SQL development and SQL optimization
o Must have used external tables and NZLOAD for file loading and unloading (see the sketch after this list)
o Experience with materialized views and CBTs (clustered base tables)
o Worked in AWS Redshift development or Netezza for at least 2-3 years
o Strong on Unix/Linux shell scripting
o Ability to interpret data models (3NF or Kimball) and code SQL accordingly
o Must have used DevOps tooling - Jenkins, Bitbucket/Git/Sourcetree, automated test scripts using Unix, etc.
o Strong analytic and problem-solving skills
o Must have implemented an end-to-end BI DWH application

Good to Have
o Understanding of Control-M, IBM CDC, EC2, etc.
o Understanding of AWS S3 and AWS DMS services
o Understanding of reporting tools like Tableau or Cognos
o Insurance domain knowledge would be an added advantage
o Experience in Agile ways of working
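As context for the external-table and NZLOAD requirement above, here is a minimal, hedged sketch of Netezza's external-table pattern for loading and unloading flat files; paths, table names, and options are illustrative, and NZLOAD itself is a command-line wrapper around the same mechanism.

```sql
-- Load: define an external table over a delimited file, then insert
-- from it into a regular table.
CREATE EXTERNAL TABLE ext_sales (
  sale_id     BIGINT,
  customer_id BIGINT,
  amount      NUMERIC(12, 2)
)
USING (
  DATAOBJECT ('/data/in/sales.csv')
  DELIMITER ','
  SKIPROWS 1          -- skip the header row
);

INSERT INTO sales SELECT * FROM ext_sales;

-- Unload: a transient external table writes query results straight
-- back out to a flat file.
CREATE EXTERNAL TABLE '/data/out/sales_extract.csv'
USING (DELIMITER ',')
AS SELECT * FROM sales WHERE amount > 1000;
```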

Posted Date not available

Apply

3.0 - 7.0 years

6 - 14 Lacs

Hyderabad

Work from Office

Job Title: OBIEE & Tableau Reporting Analyst
Experience: 7+ years preferred

Job Summary:
We are seeking a highly skilled reporting analyst with strong expertise in both OBIEE and Tableau to support our transition from a legacy OBIEE reporting ecosystem to a modern Snowflake + Tableau stack. The ideal candidate will play a pivotal role in understanding and documenting the current OBIEE reporting logic, collaborating with the Data Engineering team, and recreating reports using Tableau on Snowflake.

Key Responsibilities:
Analyze and document the existing OBIEE reporting logic, including the RPD (repository) layer, the semantic layer, and UI-generated reports.
Understand the data flow from Netezza through the semantic layer to the OBIEE UI.
Work closely with the Data Engineering team to share and explain reporting logic for optimization and migration.
Collaborate in the migration of data from Netezza to Snowflake.
Rebuild and validate reports using Tableau based on the logic previously implemented in OBIEE.
Ensure data consistency, business logic integrity, and performance optimization during the migration.
Support testing, validation, and user acceptance of Tableau reports (see the reconciliation sketch after this posting).
Act as a liaison between business users, data engineers, and visualization developers.

Required Skills & Qualifications:
Strong experience with OBIEE (Oracle Business Intelligence Enterprise Edition), including RPD development and troubleshooting, logical and physical data modeling, and dashboard/report development.
Hands-on experience with Tableau for building interactive dashboards and reports.
Working knowledge of data warehousing concepts and ETL processes.
Experience working with Netezza, Snowflake, or similar data platforms.
Strong SQL skills; ability to trace data logic through various layers.
Excellent communication and documentation skills.
Strong analytical and problem-solving mindset.
Ability to work independently and with cross-functional teams.
Comfortable interfacing with both technical and business stakeholders.

Nice to Have:
Experience in data migration projects or legacy-to-modern reporting transformations.
Exposure to BI tool administration (OBIEE or Tableau).
Familiarity with agile methodologies and Jira or similar tracking tools.
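Validating rebuilt reports during a Netezza-to-Snowflake migration like the one above usually comes down to reconciliation queries run on both platforms. A minimal, hedged sketch follows; table and column names are illustrative, and the portable SQL runs unchanged on both engines.

```sql
-- Row-count and aggregate reconciliation: run on Netezza and on
-- Snowflake, then diff the outputs. Mismatched groups point at the
-- reports that need rechecking.
SELECT
    region,
    COUNT(*)                    AS row_count,
    SUM(amount)                 AS total_amount,
    COUNT(DISTINCT customer_id) AS distinct_customers
FROM sales
GROUP BY region
ORDER BY region;
```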

Posted Date not available

Apply

3.0 - 6.0 years

11 - 16 Lacs

Pune

Work from Office

Summary / Overview
Enterprise Data Solution (EDS) is focused on enabling insights into the Mastercard network and helping build data-driven products by curating and preparing data in a secure and reliable manner. Moving to a unified and fault-tolerant architecture for data ingestion and processing is critical to achieving this mission. As a Software Development Engineer (Data Engineering) in Enterprise Data Solution (EDS), you will have the opportunity to build high-performance data pipelines that load into the Mastercard Data Warehouse. Our Data Warehouse provides analytical capabilities to a number of business users who help customers answer their business problems through data. You will play a vital role within a rapidly growing organization, while working closely with experienced and driven engineers to solve challenging problems.

Role
Participate in medium-to-large data engineering projects.
Discover, ingest, and incorporate new sources of real-time, streaming, batch, and API-based data into our platform to enhance the insights we get from running tests and expand the ways and properties on which we can test.
Assist the business in utilizing data-driven insights to drive growth and transformation.
Build and maintain data processing workflows feeding Mastercard analytics domains.
Facilitate reliable integrations with internal systems and third-party APIs as needed.
Support data analysts as needed, advising on data definitions and helping them derive meaning from complex datasets.
Work with cross-functional agile teams to drive projects through the full development cycle.
Help the team improve with the usage of data engineering best practices.
Collaborate with other data engineering teams to improve the data engineering ecosystem and talent within Mastercard.

All About You
At least a Bachelor's degree in Computer Science, Computer Engineering or a technology-related field, or equivalent work experience.
Experience in Data Warehouse projects in a product- or service-based organization.
Expertise in data engineering, with multiple end-to-end DW projects implemented in a Big Data environment.
Experience working with databases like Oracle and Netezza, with strong SQL knowledge.
Additional experience building data pipelines through Spark with Scala/Python/Java on Hadoop is preferred.
Experience working on NiFi will be an added advantage.
Experience working in Agile teams.
Strong analytical skills required for debugging production issues, providing root cause and implementing mitigation plans.
Strong communication skills - both verbal and written - and strong relationship, collaboration and organizational skills.
Ability to be high-energy, detail-oriented and proactive, able to function under pressure in an independent environment, along with a high degree of initiative and self-motivation to drive results.
Ability to quickly learn and implement new technologies, and to perform POCs to explore the best solution for a problem statement.
Flexibility to work as a member of matrix-based, diverse and geographically distributed project teams.

Posted Date not available

Apply

8.0 - 13.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Primary Responsibilities:
Ensures that all the standard requirements have been met and is involved in performing the technical analysis.
Assists the project manager by compiling information from the current systems, analyzing the program requirements, and ensuring that they meet the specified time requirements.
Resolves moderate problems associated with the designed programs and provides technical guidance on complex programming.
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
Graduate degree or equivalent experience.
8+ years of experience in database architecture, engineering, design, optimization, security, and administration, as well as data modeling, big data development, Extract, Transform, and Load (ETL) development, storage engineering, data warehousing, and data provisioning.
Good hands-on experience in Azure, ADF, and Databricks (see the sketch after this listing).
Experience with RDBMSs such as Oracle, SQL Server, and DB2.
Knowledge of Azure, ADF, Databricks, DWH (Netezza), Unix/Linux, Airflow, and AI tool usage in delivery.
Understanding of business needs, with the ability to design programs and systems that match complex business requirements and to record all the specifications involved in the development and coding process.
Understanding of QA and test automation processes.
Good understanding of Extract, Transform, and Load (ETL) architecture, cloud, and Spark.
Good understanding of data architecture.
Proven ability to participate in agile development projects for batch and real-time data ingestion.
Proven ability to work with business and peers to define, estimate, and deliver functionality.
Proven ability to create proper technical documentation in work assignments.

Preferred Qualifications:
Knowledge of Agile, automation, Big Data, DevOps, Python, Scala/PySpark programming, Power BI, and GenAI.
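As referenced above, here is a minimal Databricks-style PySpark sketch for the Azure stack this role names: read raw files from ADLS Gen2 and write a curated Delta table. The storage account, container, paths, column names, and table name are hypothetical placeholders.

```python
# Minimal Databricks-style sketch: land raw CSVs from ADLS Gen2 as a curated
# Delta table. Storage account, paths, columns, and table name are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # supplied automatically inside Databricks

raw = (spark.read.format("csv")
       .option("header", "true")
       .option("inferSchema", "true")
       .load("abfss://raw@mystorageacct.dfs.core.windows.net/claims/2024/"))

# Light standardization on the way to the curated zone: trim the business
# key and stamp the load time for auditability.
curated = (raw.withColumn("claim_id", F.trim(F.col("claim_id")))
              .withColumn("load_ts", F.current_timestamp()))

# Delta provides ACID writes and time travel; a plain overwrite is enough
# for this sketch.
(curated.write.format("delta")
        .mode("overwrite")
        .saveAsTable("curated.claims"))
```

In practice an Azure Data Factory pipeline would typically trigger logic like this as a scheduled Databricks notebook or job activity.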

Posted Date not available

Apply

3.0 - 6.0 years

6 - 10 Lacs

Pune

Work from Office

Data engineers are responsible for building reliable and scalable data infrastructure that enables organizations to derive meaningful insights, make data-driven decisions, and unlock the value of their data assets.

Grade Specific: The role supports the team in building and maintaining data infrastructure and systems within an organization.

Skills (competencies): Ab Initio, Agile (Software Development Framework), Apache Hadoop, AWS Airflow, AWS Athena, AWS CodePipeline, AWS EFS, AWS EMR, AWS Redshift, AWS S3, Azure ADLS Gen2, Azure Data Factory, Azure Data Lake Storage, Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse, Bitbucket, Change Management, Client Centricity, Collaboration, Continuous Integration and Continuous Delivery (CI/CD), Data Architecture Patterns, Data Format Analysis, Data Governance, Data Modeling, Data Validation, Data Vault Modeling, Database Schema Design, Decision-Making, DevOps, Dimensional Modeling, GCP Bigtable, GCP BigQuery, GCP Cloud Storage, GCP Dataflow, GCP Dataproc, Git, Greenplum, HQL, IBM DataStage, IBM DB2, Industry Standard Data Modeling (FSLDM), Industry Standard Data Modeling (IBM FSDM), Influencing, Informatica IICS, Inmon methodology, JavaScript, Jenkins, Kimball, Linux - Red Hat, Negotiation, Netezza, NewSQL, Oracle Exadata, Performance Tuning, Perl, Platform Update Management, Project Management, PySpark, Python, R, RDD Optimization, CentOS, SAS, Scala, Shell Script, Snowflake, Spark, Spark Code Optimization, SQL, Stakeholder Management, Sun Solaris, Talend, Teradata, Time Management, Ubuntu, Vendor Management.

Posted Date not available

Apply

4.0 - 9.0 years

13 - 23 Lacs

Noida, Hyderabad, Bengaluru

Hybrid

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose - the relentless pursuit of a world that works better for people - we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Principal Consultant - Netezza
We are looking for an experienced Netezza Support Engineer to provide production support, troubleshooting, and performance optimization for our Netezza environment. The ideal candidate will possess strong technical expertise in Netezza, excellent problem-solving skills, and the ability to work effectively in a fast-paced environment to ensure system stability and high availability.

Responsibilities:
Provide day-to-day support for Netezza appliances, databases, and processes.
Monitor system health, performance, and resource utilization (see the monitoring sketch after this listing).
Troubleshoot and resolve issues related to Netezza databases, connectivity, and performance.
Perform incident management and root cause analysis, and implement corrective actions.
Manage and schedule backups, restores, and disaster recovery procedures.
Work with development teams to deploy code changes and database updates.
Work with IBM on patching and platform upgrades.
Assist with user management, access controls, and security administration.
Tune queries and workloads to improve performance and reduce resource consumption.
Develop and maintain scripts for automation and monitoring.
Maintain operational documentation and a knowledge base for support processes.
Participate in the on-call rotation for after-hours support as required.

Qualifications we seek in you!
Minimum qualifications:
Bachelor's/Graduation/Equivalent: BE/B.Tech, MCA, MBA
Strong experience in Netezza administration and support
Strong SQL skills and experience with query optimization
Hands-on experience with Netezza utilities such as nzload, nzsql, nzbackup, and nzrestore
Experience in platform patching and upgrades
Knowledge of Netezza architecture and workload management (WLM)
Good understanding of data warehouse concepts
Experience in troubleshooting and resolving performance issues
Familiarity with monitoring tools and alerting mechanisms
Excellent communication and problem-solving skills
Ability to work independently and as part of a team

Preferred qualifications:
Experience migrating workloads from Netezza to cloud data platforms (e.g., Snowflake, Redshift, Synapse)
Knowledge of IBM Cloud Pak for Data
Scripting skills in Shell, Python, or similar languages
Experience supporting large-scale data environments in production
ITIL or similar certification in support operations

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at www.genpact.com and on X, Facebook, LinkedIn, and YouTube.
Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
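Since the listing asks for monitoring and automation scripts around the Netezza utilities, here is a minimal Python sketch of a session-count check driven through nzsql. It assumes nzsql is on the PATH and accepts its psql-style output flags (-A, -t) and a -c command option, and that the _V_SESSION system view is queryable; the host, credentials, status value, and threshold are all assumptions for illustration, not a definitive recipe.

```python
# Minimal sketch of a Netezza monitoring script: poll active sessions by
# shelling out to nzsql and alert past a threshold. Flags, view name, and
# the 'active' status value are assumptions; host/credentials are placeholders.
import subprocess

THRESHOLD = 50  # hypothetical ceiling on concurrent active sessions

def count_active_sessions() -> int:
    result = subprocess.run(
        ["nzsql", "-host", "netezza-host", "-d", "SYSTEM",
         "-u", "monitor_user", "-pw", "***", "-A", "-t",
         "-c", "SELECT COUNT(*) FROM _V_SESSION WHERE STATUS = 'active'"],
        capture_output=True, text=True, check=True)
    return int(result.stdout.strip())

if __name__ == "__main__":
    active = count_active_sessions()
    if active > THRESHOLD:
        # In production this would page the on-call DBA instead of printing.
        print(f"ALERT: {active} active sessions (threshold {THRESHOLD})")
    else:
        print(f"OK: {active} active sessions")
```

A real deployment would run this from cron or a monitoring agent and feed the result into the alerting mechanism the listing mentions.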

Posted Date not available

Apply