Jobs
Interviews

137 Netezza Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

4.0 years

0 Lacs

India

On-site

Mandate: Advanced SQL & ETL experience, 4+ years

KNOWLEDGE AND EXPERIENCE
• 4 to 8 years of relevant work experience in software testing, primarily on Database/ETL, with exposure to Big Data testing
• Hands-on experience testing Big Data applications on Azure and Cloudera
• Understanding of query languages such as Pig and HiveQL
• Excellent skills in writing SQL queries and good knowledge of databases (Oracle/Netezza/SQL)
• Hands-on experience in at least one scripting language: Java, Scala, Python, etc.
• Good to have: experience in Unix shell scripting
• Experience in Agile development and knowledge of Jira
• Good analytical and communication skills
• Prior health care industry experience is a plus
• Flexible and able to adapt quickly to different technologies and tools
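As an illustrative aside: the posting centres on SQL-driven ETL and Big Data testing, and a common task in such roles is source-to-target reconciliation. Below is a minimal sketch of that check, using an in-memory SQLite database as a stand-in for the Oracle/Netezza/Hive systems named above; the table names, columns, and data are invented for the example and are not from the posting.

```python
import sqlite3

# In-memory SQLite stands in for the separate source (Oracle/Netezza) and target
# (Hive/warehouse) systems; table names, columns, and data are invented here.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def reconcile(cur, src, tgt):
    """Compare row counts and amount checksums between source and target tables."""
    src_count, src_sum = cur.execute(f"SELECT COUNT(*), SUM(amount) FROM {src}").fetchone()
    tgt_count, tgt_sum = cur.execute(f"SELECT COUNT(*), SUM(amount) FROM {tgt}").fetchone()
    assert src_count == tgt_count, f"row count mismatch: {src_count} vs {tgt_count}"
    assert src_sum == tgt_sum, f"amount checksum mismatch: {src_sum} vs {tgt_sum}"
    return src_count

print("rows validated:", reconcile(conn.cursor(), "src_orders", "tgt_orders"))
```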

Posted 18 hours ago

Apply

5.0 years

2 - 9 Lacs

Hyderābād

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Role: Partner with stakeholders to understand data requirements. Develop business intelligence solutions using Agile software development methodologies. Develop tools and models such as segmentation, dashboards, data visualizations, decision aids and business case analysis to support the organization and leadership. Produce and manage the delivery of activity and value analytics to external stakeholders and clients. Team members will typically use business intelligence, data visualization, query, analytic and statistical software to build solutions, perform analysis and interpret data. Work on predominantly descriptive and regression-based analytics, leveraging subject matter expert views in the design of analytics and algorithms.

Primary Responsibilities: Management and manipulation of mostly structured data. Creating business intelligence solutions using Agile methodologies to support analysis of data and subsequent decision making. Conducting analysis, performing normalization operations and assuring data quality. Creating specifications to bring data into a common structure. Creating product specifications and reporting models. Analytical: synthesizes complex or diverse information, collects and researches data, uses intuition and experience to complement data, and designs workflows and procedures. Establish, refine and integrate development and test environment tools and software as needed. Identify production and non-production application issues. Provide technical support and consultation on database and infrastructure questions. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications: Bachelor's Degree in MIS, Statistics, Mathematics, Computer Science, Business or a related field. 5+ years of experience in data preparation, data management, data visualization and data analysis. 5+ years of experience in the DWBI domain and a solid understanding of DWBI concepts. 5+ years of experience with Power BI. 2+ years of experience in Python. 2+ years of experience working on machine learning algorithms and frameworks. 2+ years of experience in natural language processing. 2+ years of experience applying AI to practical and comprehensive technology solutions. Hands-on experience in the DWBI domain. Hands-on experience in Python. Hands-on experience dealing with global customers to gather requirements and translate them into solutions. Extensive on-the-job experience with relational database management systems (such as Microsoft SQL Server and Oracle). Experience writing T-SQL statements, stored procedures and views using industry best practices for security and efficiency. Experience creating data reports, dashboards, scorecards and the metrics that matter most, with extensive use of Power BI. Experience performing data analysis and interpreting data/business insights. Experience developing actionable insights and presenting recommendations to internal/external stakeholders and leadership. Experience in ETL and hands-on experience with ETL tools like SSIS. Experience and understanding of database performance tuning, database management, requirements analysis and software development fundamentals. Familiarity with the Software Development Life Cycle. Proven problem-solving, documentation, and written and verbal communication skills. Proven data maintenance and process improvement skills. Proven well-developed written and verbal communication and presentation skills. Proven openness to both development and support work. Proven ability to write user and technical documentation. Proven ability to conduct analysis to ensure quality of data at various stages of its evolution. Proven ability to solve complex problems and develop innovative solutions. Proven ability to complete projects that require data mining, analysis, and presentation. Proven ability to create, test and execute SQL code. Proven ability to work independently with minimal support/mentoring. Proven ability to learn new technology.

Preferred Qualifications: Experience in the health care industry. Hands-on experience with ETL processes using SSIS, Netezza, Teradata, etc. Experience with ML, deep learning, TensorFlow and NLP. Experience with Snowflake. Experience developing and maintaining data preparation and validation routines to support data mining and creating complex data mining algorithms.

General: This is a high-visibility role requiring high energy due to constant interaction with customers. It is good to have a candidate with the following abilities: self-driven, with the ability to work independently and facilitate change; able to work in a fast-paced, technical, cross-functional environment; an excellent visual design sense for clear and accurate presentation of data; able to work on projects from inception to completion; excellent critical thinking and analytical skills; able to maintain an understanding of numerous client processes; excellent time management and prioritization skills in order to meet multiple deadlines; comfortable working in a high-paced, high-production area; the desire and ability to learn new skills, systems and processes; and the ability to anticipate customer needs and proactively develop solutions to meet them.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Careers with Optum. Here's the idea. We built an entire organization around one giant objective: make the health system work better for everyone. So when it comes to how we use the world's large accumulation of health-related information, or guide health and lifestyle choices or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve. Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. For you, that means working on high-performance teams against sophisticated challenges that matter. Optum: incredible ideas in one incredible company and a singular opportunity to do your life's best work.(SM)

Diversity creates a healthier atmosphere: UnitedHealth Group is an Equal Employment Opportunity/Affirmative Action employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law. UnitedHealth Group is a drug-free workplace. Candidates are required to pass a drug test before beginning employment.
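As a small illustration of the data preparation and analysis side of this role, here is a minimal pandas sketch that normalizes a column and summarizes values by group; the column names and figures are invented for the example and are not taken from the posting.

```python
import pandas as pd

# Column names and figures are invented for the example, not taken from the posting.
claims = pd.DataFrame({
    "member_id": [101, 102, 101, 103],
    "plan": ["HMO", "ppo", "HMO", "PPO"],
    "paid_amount": [250.0, 90.5, 40.0, 610.0],
})

claims["plan"] = claims["plan"].str.upper()  # simple normalization step

# Summarize paid amounts by plan, the kind of grouped metric a dashboard would show.
summary = (claims.groupby("plan")["paid_amount"]
                 .agg(["count", "sum", "mean"])
                 .reset_index())
print(summary)
```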

Posted 1 day ago

Apply

2.0 - 6.0 years

0 Lacs

Pune, Maharashtra

On-site

You should have at least 2 years of professional work experience implementing data pipelines using Databricks and a data lake. A minimum of 3 years of hands-on programming experience in Python within a cloud environment (preferably AWS) is necessary for this role. Two years of professional work experience with real-time streaming systems such as Event Grid and Event topics would be highly advantageous. You must possess expert-level knowledge of SQL to write complex, highly optimized queries for processing large volumes of data effectively. Experience developing conceptual, logical, and/or physical database designs using tools like ErWin, Visio, or Enterprise Architect is expected. A minimum of 2 years of hands-on experience working with databases like Snowflake, Redshift, Synapse, Oracle, SQL Server, Teradata, Netezza, Hadoop, MongoDB, or Cassandra is required. Knowledge of or experience with architectural best practices for building data lakes is a must for this position. Strong problem-solving and troubleshooting skills are necessary, along with the ability to make sound judgments independently. You should be capable of working independently and providing guidance to junior data engineers. If you meet the above requirements and are ready to take on this challenging role, we look forward to your application.

Warm regards,
Rinka Bose
Talent Acquisition Executive, Nivasoft India Pvt. Ltd.
Mobile: +91-9632249758 (India) | 732-334-3491 (U.S.A.)
Email: rinka.bose@nivasoft.com | Web: https://nivasoft.com/
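As a rough illustration of the Databricks/data lake pipeline work described above, here is a minimal PySpark sketch that reads raw files, applies a simple transformation, and writes a partitioned curated table; the bucket paths and column names are invented for the example and are not from the posting.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal data lake pipeline sketch: read raw JSON events, apply a simple
# transformation, and write a partitioned, curated Parquet table.
# Bucket paths and column names are invented for the example.
spark = SparkSession.builder.appName("datalake-pipeline-sketch").getOrCreate()

raw = spark.read.json("s3a://example-bucket/raw/events/")      # hypothetical landing zone

cleaned = (
    raw.filter(F.col("event_type").isNotNull())                # drop malformed rows
       .withColumn("event_date", F.to_date("event_ts"))        # derive a partition column
)

(cleaned.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3a://example-bucket/curated/events/"))      # hypothetical curated zone
```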

Posted 1 day ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Role: Partner with stakeholders to understand data requirements. Develop business intelligence solutions using Agile software development methodologies. Develop tools and models such as segmentation, dashboards, data visualizations, decision aids and business case analysis to support the organization and leadership. Produce and manage the delivery of activity and value analytics to external stakeholders and clients. Team members will typically use business intelligence, data visualization, query, analytic and statistical software to build solutions, perform analysis and interpret data. Work on predominantly descriptive and regression-based analytics, leveraging subject matter expert views in the design of analytics and algorithms.

Primary Responsibilities: Management and manipulation of mostly structured data. Creating business intelligence solutions using Agile methodologies to support analysis of data and subsequent decision making. Conducting analysis, performing normalization operations and assuring data quality. Creating specifications to bring data into a common structure. Creating product specifications and reporting models. Analytical: synthesizes complex or diverse information, collects and researches data, uses intuition and experience to complement data, and designs workflows and procedures. Establish, refine and integrate development and test environment tools and software as needed. Identify production and non-production application issues. Provide technical support and consultation on database and infrastructure questions. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications: Bachelor's Degree in MIS, Statistics, Mathematics, Computer Science, Business or a related field. 5+ years of experience in data preparation, data management, data visualization and data analysis. 5+ years of experience in the DWBI domain and a solid understanding of DWBI concepts. 5+ years of experience with Power BI. 2+ years of experience in Python. 2+ years of experience working on machine learning algorithms and frameworks. 2+ years of experience in natural language processing. 2+ years of experience applying AI to practical and comprehensive technology solutions. Hands-on experience in the DWBI domain. Hands-on experience in Python. Hands-on experience dealing with global customers to gather requirements and translate them into solutions. Extensive on-the-job experience with relational database management systems (such as Microsoft SQL Server and Oracle). Experience writing T-SQL statements, stored procedures and views using industry best practices for security and efficiency. Experience creating data reports, dashboards, scorecards and the metrics that matter most, with extensive use of Power BI. Experience performing data analysis and interpreting data/business insights. Experience developing actionable insights and presenting recommendations to internal/external stakeholders and leadership. Experience in ETL and hands-on experience with ETL tools like SSIS. Experience and understanding of database performance tuning, database management, requirements analysis and software development fundamentals. Familiarity with the Software Development Life Cycle. Proven problem-solving, documentation, and written and verbal communication skills. Proven data maintenance and process improvement skills. Proven well-developed written and verbal communication and presentation skills. Proven openness to both development and support work. Proven ability to write user and technical documentation. Proven ability to conduct analysis to ensure quality of data at various stages of its evolution. Proven ability to solve complex problems and develop innovative solutions. Proven ability to complete projects that require data mining, analysis, and presentation. Proven ability to create, test and execute SQL code. Proven ability to work independently with minimal support/mentoring. Proven ability to learn new technology.

Preferred Qualifications: Experience in the health care industry. Hands-on experience with ETL processes using SSIS, Netezza, Teradata, etc. Experience with ML, deep learning, TensorFlow and NLP. Experience with Snowflake. Experience developing and maintaining data preparation and validation routines to support data mining and creating complex data mining algorithms.

General: This is a high-visibility role requiring high energy due to constant interaction with customers. It is good to have a candidate with the following abilities: self-driven, with the ability to work independently and facilitate change; able to work in a fast-paced, technical, cross-functional environment; an excellent visual design sense for clear and accurate presentation of data; able to work on projects from inception to completion; excellent critical thinking and analytical skills; able to maintain an understanding of numerous client processes; excellent time management and prioritization skills in order to meet multiple deadlines; comfortable working in a high-paced, high-production area; the desire and ability to learn new skills, systems and processes; and the ability to anticipate customer needs and proactively develop solutions to meet them.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Careers with Optum. Here's the idea. We built an entire organization around one giant objective: make the health system work better for everyone. So when it comes to how we use the world's large accumulation of health-related information, or guide health and lifestyle choices or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve. Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. For you, that means working on high-performance teams against sophisticated challenges that matter. Optum: incredible ideas in one incredible company and a singular opportunity to do your life's best work.(SM)

Diversity creates a healthier atmosphere: UnitedHealth Group is an Equal Employment Opportunity/Affirmative Action employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law. UnitedHealth Group is a drug-free workplace. Candidates are required to pass a drug test before beginning employment.

Posted 2 days ago

Apply

6.0 years

4 - 6 Lacs

Hyderābād

On-site

Senior Data Modernization Expert

Overview: We are building a high-impact Data Modernization Center of Excellence (COE) to help clients modernize their data platforms by migrating legacy data warehouses and ETL ecosystems to Snowflake. We are looking for an experienced and highly motivated Data Modernization Architect with deep expertise in Snowflake, Talend, and Informatica. This role is ideal for someone who thrives at the intersection of data engineering, architecture, and business strategy, and can translate legacy complexity into modern, scalable, cloud-native solutions.

Key Responsibilities

Modernization & Migration: Lead end-to-end migration of legacy data warehouses (e.g., Teradata, Netezza, Oracle, SQL Server) to Snowflake. Reverse-engineer complex ETL pipelines built in Talend or Informatica, documenting the logic and rebuilding it using modern frameworks (e.g., dbt, Snowflake Tasks, Streams, Snowpark). Build scalable ELT pipelines using Snowflake-native patterns, improving cost, performance, and maintainability. Design and validate data mapping and transformation logic, and ensure parity between source and target systems. Implement automation wherever possible (e.g., code converters, metadata extractors, migration playbooks).

Architecture & Cloud Integration: Architect modern data platforms leveraging Snowflake's full capabilities: Snowpipe, Streams, Tasks, Materialized Views, Snowpark, and Cortex AI. Integrate with cloud platforms (AWS, Azure, GCP) and orchestrate data workflows with Airflow, Cloud Functions, or Snowflake Tasks. Implement secure, compliant architectures with proper use of RBAC, masking, Unity Catalog, SSO, and external integrations.

Communication & Leadership: Act as a trusted advisor to internal teams and client stakeholders. Present modernization plans, risks, and ROI to both executive and technical audiences. Collaborate with delivery teams, pre-sales teams, and cloud architects to accelerate migration initiatives. Mentor junior engineers and promote standardization, reuse, and COE asset development.

Required Experience: 6+ years in data engineering or BI/DW architecture. 3+ years of deep, hands-on Snowflake implementation experience. 2+ years of migration experience from Talend and/or Informatica to Snowflake. Strong command of SQL, data modeling, ELT pipeline design, and performance tuning. Practical knowledge of modern orchestration tools (e.g., Airflow, dbt Cloud, Snowflake Tasks). Familiarity with legacy metadata parsing, parameterized job execution, and parallel processing logic in ETL tools. Good knowledge of cloud data security, data governance, and compliance standards. Strong written and verbal communication skills; capable of explaining technical concepts to CXOs and developers alike.

Bonus / Preferred: Snowflake certifications (SnowPro Advanced Architect, SnowPro Core). Experience building custom migration tools or accelerators. Hands-on experience with LLM-assisted code conversion tools. Experience in key verticals like retail, healthcare, or manufacturing.

Why Join This Team? Opportunity to be part of a founding core team defining modernization standards. Exposure to cutting-edge Snowflake features and migration accelerators. High-impact role with visibility across sales, delivery, and leadership. Career acceleration through complex problem-solving and ownership.
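To illustrate the Snowflake-native ELT pattern the posting mentions (Streams plus scheduled Tasks), here is a minimal sketch using the Snowflake Python connector; the account settings, warehouse, and table/stream/task names are placeholder assumptions, not details from the posting.

```python
import snowflake.connector  # official Snowflake Python connector

# Connection parameters are placeholders; table, stream, and task names are
# invented to illustrate the Streams + Tasks ELT pattern mentioned in the posting.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# 1. Capture changes on the landing table with a stream.
cur.execute("CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE RAW_ORDERS")

# 2. Schedule a task that loads new rows only when the stream has data.
cur.execute("""
    CREATE OR REPLACE TASK LOAD_ORDERS
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO CURATED_ORDERS
      SELECT ORDER_ID, AMOUNT, CURRENT_TIMESTAMP() AS LOADED_AT
      FROM ORDERS_STREAM
""")

# 3. Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK LOAD_ORDERS RESUME")
```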

Posted 2 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Purpose: Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Senior Software Developer - Java and React/Angular, SQL, API

Mastercard is a global technology company behind the world's fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities.

Your primary responsibilities would include designing, developing, and maintaining software applications using Java and related technologies. In addition to your Java development skills, expertise in React or Angular would be beneficial for building modern and dynamic user interfaces for web applications. This role requires strong skills in HTML, CSS, and JavaScript, as well as experience working with libraries and frameworks like Angular or React. Other important skills for an SSE with full-stack development experience include: knowledge of software design patterns and best practices; experience working in a Unix environment; proficiency in database technologies such as SQL and NoSQL; experience working with databases like Oracle and Netezza, with strong SQL knowledge; and experience with RESTful web services and API design. Experience in full-stack Java development, along with proficiency in Angular or React, would be a valuable asset to this team. Knowledge of Redis will be an added advantage. Experience working with NiFi will be an added advantage. Experience working with APIs will be an added advantage. Experience working in Agile teams. Experience in Data Engineering and implementing multiple end-to-end DW projects in a Big Data environment will be an added advantage. Strong analytical skills are required for debugging production issues, providing root cause analysis and implementing mitigation plans. Strong communication skills, both verbal and written, along with strong relationship, collaboration and organizational skills. Ability to multi-task across multiple projects, interface with external/internal resources and provide technical leadership to junior team members. Ability to be high-energy, detail-oriented and proactive, and to function under pressure in an independent environment, with a high degree of initiative and self-motivation to drive results. Ability to quickly learn and implement new technologies, and to perform POCs to explore the best solution for a problem statement. Flexibility to work as a member of matrix-based, diverse and geographically distributed project teams.

Corporate Security Responsibility: All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: abide by Mastercard's security policies and practices; ensure the confidentiality and integrity of the information being accessed; report any suspected information security violation or breach; and complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our Purpose: Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential.

Title and Summary: Senior Software Engineer

Overview: Mastercard is a global technology company behind the world's fastest payments processing network. We are a vehicle for commerce, a connection to financial systems for the previously excluded, a technology innovation lab, and the home of Priceless®. We ensure every employee has the opportunity to be a part of something bigger and to change lives. We believe as our company grows, so should you. We believe in connecting everyone to endless, priceless possibilities. As a Data Engineer in Data Platform and Engineering Services, you will have the opportunity to build high-performance data pipelines that load into the Mastercard Data Warehouse. Our Data Warehouse provides analytical capabilities to a number of business users who help different customers answer their business problems through data. You will play a vital role within a rapidly growing organization, while working closely with experienced and driven engineers to solve challenging problems.

Role: Develop high-quality, secure and scalable data pipelines using Spark and Java/Scala on object storage and Hadoop. Follow Mastercard Quality Assurance and Quality Control processes. Leverage new technologies and approaches to innovate with increasingly large data sets. Work with the project team to meet scheduled due dates, while identifying emerging issues and recommending solutions for problems. Perform assigned tasks and handle production incidents independently. Contribute ideas to help ensure that required standards and processes are in place, and actively look for opportunities to enhance standards and improve process efficiency.

All About You: 3 to 5 years of experience in Data Engineering and implementing multiple end-to-end DW projects in a Big Data environment. Experience building data pipelines with Spark and Java/Scala on Hadoop or object storage. Experience working with databases like Oracle and Netezza, with strong SQL knowledge. Experience working with NiFi will be an added advantage. Experience working with APIs will be an added advantage. Experience working in Agile teams. Strong analytical skills are required for debugging production issues, providing root cause analysis and implementing mitigation plans. Strong communication skills, both verbal and written, along with strong relationship, collaboration and organizational skills. Ability to multi-task across multiple projects, interface with external/internal resources and provide technical leadership to junior team members. Ability to be high-energy, detail-oriented and proactive, and to function under pressure in an independent environment, with a high degree of initiative and self-motivation to drive results. Ability to quickly learn and implement new technologies, and to perform POCs to explore the best solution for a problem statement. Flexibility to work as a member of matrix-based, diverse and geographically distributed project teams.

Corporate Security Responsibility: All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization and, therefore, it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must: abide by Mastercard's security policies and practices; ensure the confidentiality and integrity of the information being accessed; report any suspected information security violation or breach; and complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.
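The role describes Spark pipelines that load Oracle/Netezza data into a warehouse. Purely for illustration, here is a PySpark sketch of that read-aggregate-load pattern (the posting names Java/Scala; Python is used here only as a sketch). JDBC URLs, credentials, and table names are placeholders, and the relevant JDBC driver jars are assumed to be on the Spark classpath.

```python
from pyspark.sql import SparkSession, functions as F

# PySpark sketch of the read-aggregate-load pattern (the posting names Java/Scala;
# Python is used here purely for illustration). JDBC URLs, credentials, and table
# names are placeholders, and the Oracle/Netezza JDBC driver jars are assumed to
# be on the Spark classpath.
spark = SparkSession.builder.appName("dw-load-sketch").getOrCreate()

source = (spark.read.format("jdbc")
          .option("url", "jdbc:oracle:thin:@//src-host:1521/ORCL")
          .option("dbtable", "SALES.TRANSACTIONS")
          .option("user", "etl_user").option("password", "***")
          .load())

# Aggregate transactions into a daily summary before loading the warehouse mart.
daily = source.groupBy("TXN_DATE").agg(F.sum("AMOUNT").alias("TOTAL_AMOUNT"))

(daily.write.format("jdbc")
      .option("url", "jdbc:netezza://dw-host:5480/ANALYTICS")
      .option("dbtable", "MART.DAILY_SALES")
      .option("user", "etl_user").option("password", "***")
      .mode("append")
      .save())
```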

Posted 2 days ago

Apply

5.0 - 9.0 years

0 Lacs

Kochi, Kerala

On-site

As a Data Architect at Beinex, located in Kochi, Kerala, you will be responsible for collaborating with the Sales team on RFPs, pre-sales activities, project delivery, and support. Your role will involve delivering on-site technical engagements with customers, participating in pre-sales visits, understanding customer requirements, defining project timelines, and implementing solutions. Additionally, you will work on both on-site and off-site projects to assist customers in migrating from their existing data warehouses to Snowflake and other databases. You should have at least 8 years of experience in IT platform implementation, development, database administration, and data migration in relational database management systems (RDBMS). Furthermore, you should possess 5+ years of hands-on experience in implementing and performance-tuning MPP databases. Proficiency in Snowflake, Redshift, Databricks, or Azure Synapse is essential, along with the ability to prioritize projects effectively. Experience in analyzing data warehouses such as Teradata, Netezza, Oracle, and SAP will be valuable in this role. Your responsibilities will also include designing database environments, analyzing production deployments, optimizing performance, writing SQL and stored procedures, conducting data validation and data quality tests, and planning migrations to Snowflake. You will be expected to possess strong communication skills, problem-solving abilities, and the capacity to work effectively both independently and as part of a team. At Beinex, you will have access to various perks including comprehensive health plans, learning and development opportunities, workation and outdoor training, a hybrid working environment, and on-site travel opportunities. Join us to be a part of a dynamic team and advance your career in a supportive and engaging work environment.

Posted 3 days ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

SQL Lead - COE
Location: Pune
Experience: 6-10 years

We are looking to hire a Data Engineer with strong hands-on experience in SQL and PL/SQL.

Required Skills & Abilities: 6+ years of experience with databases (MS SQL/Oracle/Teradata/Netezza). 4+ years of experience managing teams and client calls. Strong expertise in writing complex SQL queries, joins, subqueries, and analytical functions. Hands-on experience with stored procedures, functions, triggers, packages, and cursors. Understanding of database design principles, normalization, and partitioning. Knowledge of ETL processes, data migration, and data transformation. Experience working with Oracle SQL Developer or other database tools. Ability to analyze requirements and translate them into efficient database solutions. Familiarity with UNIX/Linux shell scripting for automation (preferred). Strong problem-solving and debugging skills. Good communication skills and the ability to work in a collaborative environment.

Key Responsibilities: Develop, optimize, and maintain PL/SQL stored procedures, functions, triggers, and packages. Write complex SQL queries, views, and indexes for data manipulation and reporting. Optimize SQL queries and database performance using indexing, partitioning, and query-tuning techniques. Ensure data integrity and security by implementing constraints, validations, and best practices. Work with cross-functional teams to understand business requirements and design efficient database solutions. Troubleshoot database issues, debug PL/SQL code, and improve query performance. Implement ETL processes using SQL and PL/SQL. Perform database schema design, normalization, and optimization. Collaborate with DBA teams on database backup, recovery, and maintenance. Develop and maintain database documentation, coding standards, and best practices.

Preferred Qualifications: Experience with cloud databases (GCP or any other cloud) is a plus. Exposure to big data technologies like Hadoop and Spark (optional).
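As a small illustration of the analytical-function SQL this role asks for, the sketch below runs a window-function query against an in-memory SQLite database as a stand-in for MS SQL/Oracle/Teradata/Netezza; the schema and data are invented for the example, and SQLite 3.25+ is needed for window functions.

```python
import sqlite3

# In-memory SQLite stands in for MS SQL/Oracle/Teradata/Netezza; schema and data
# are invented for the example (SQLite 3.25+ is required for window functions).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, rep TEXT, amount REAL);
    INSERT INTO sales VALUES
      ('NORTH', 'A', 120), ('NORTH', 'B', 200),
      ('SOUTH', 'C', 90),  ('SOUTH', 'D', 310);
""")

# Rank reps within each region by revenue: a typical analytical-function pattern.
query = """
    SELECT region, rep, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS region_rank
    FROM sales
    ORDER BY region, region_rank
"""
for row in conn.execute(query):
    print(row)
```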

Posted 1 week ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka

On-site

- 2+ years of experience processing data with a massively parallel technology (such as Redshift, Teradata, Netezza, Spark or a Hadoop-based big data solution)
- 2+ years of experience with relational database technology (such as Redshift, Oracle, MySQL or MS SQL)
- 2+ years of experience developing and operating large-scale data structures for business intelligence analytics (using ETL/ELT processes)
- 5+ years of data engineering experience
- Experience managing a data or BI team
- Experience communicating with senior management and customers verbally and in writing
- Experience leading and influencing the data or BI strategy of your team or organization
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS

As a Data Engineering Manager, you will lead a team of data engineers, front-end engineers and business intelligence engineers. You will own our internal data products (Yoda), transform to AI, build agents and scale them for IN and emerging stores. You will provide technical leadership, drive application and data engineering initiatives and build end-to-end data solutions that are highly available, scalable, stable, secure, and cost-effective. You strive for simplicity and demonstrate creativity with sound judgement. You deliver data and reporting solutions that are customer focused, easy to consume and create business impact. You are passionate about working with huge datasets and have experience with the organization and curation of data for analytics. You have a strategic and long-term view on the architecture of advanced data ecosystems. You are experienced in building efficient and scalable data services and have the ability to integrate data systems with AWS tools and services to support a variety of customer use cases and applications.

Key job responsibilities
• Lead a team of data engineers, front-end engineers and business intelligence engineers to deliver cross-functional data and application engineering projects for Databases, Analytics and AI/ML services
• Establish and clearly communicate organizational vision, goals and success measures
• Collaborate with business stakeholders to develop the roadmap and product requirements
• Build, own, prioritize, lead and deliver a roadmap of large and complex multi-functional projects and programs
• Manage AWS infrastructure, IMR cost and RDS/DynamoDB instances
• Interface with other technology teams to extract, transform, and load data from a wide variety of data sources
• Own the design, development, and maintenance of metrics, reports, dashboards, etc. to drive key business decisions

About the team
CoBRA is the central BI Reporting and Analytics org for IN stores and the AI partner for international emerging stores. CoBRA's mission is to empower Category and Seller orgs, including Brand, Account, marketing and product/program teams, with self-service products using AI (Yoda and Bedrock agents), build actionable insights (QuickSight Q, custom agents, Q Business) and help them make faster and smarter decisions using science solutions across the Amazon flywheel on all inputs (Selection, Pricing and Speed).

- Experience with big data technologies such as Hadoop, Hive, Spark and EMR
- Experience with AWS tools and technologies (Redshift, S3, EC2)
- Knowledge of building AI tools, AWS Bedrock agents, and LLM/foundation models
- Experience in supporting ML models for data needs
- Exposure to prompt engineering and upcoming AI technologies and their landscape

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.
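Since the role centres on Redshift-backed BI on AWS, here is a minimal sketch (an assumption for illustration, not part of the posting) of running a reporting query through the Redshift Data API with boto3; the cluster, database, user, table, and region names are placeholders, and appropriate IAM permissions are assumed.

```python
import time

import boto3

# Minimal sketch of running a reporting query through the Amazon Redshift Data API.
# Cluster, database, user, table, and region names are placeholders, and IAM
# permissions for the redshift-data actions are assumed.
client = boto3.client("redshift-data", region_name="ap-south-1")

resp = client.execute_statement(
    ClusterIdentifier="reporting-cluster",          # placeholder cluster
    Database="analytics",
    DbUser="bi_user",
    Sql="SELECT order_date, SUM(amount) AS revenue FROM sales GROUP BY order_date",
)

# The Data API is asynchronous, so poll the statement until it finishes.
status = client.describe_statement(Id=resp["Id"])
while status["Status"] in ("SUBMITTED", "PICKED", "STARTED"):
    time.sleep(1)
    status = client.describe_statement(Id=resp["Id"])

if status["Status"] == "FINISHED":
    for record in client.get_statement_result(Id=resp["Id"])["Records"]:
        print(record)
else:
    print("query did not finish cleanly:", status["Status"])
```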

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Introduction
At IBM, work is more than a job - it's a calling: to build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.

Your Role and Responsibilities
The Data & AI, India Software Labs is looking for enthusiastic and talented software developers to join us. Our product portfolio includes several competitive offerings in the market, such as watsonx, Cloud Pak for Data / Integration, Db2, OpenPages, SPSS Analytics, Information Governance, and Netezza, to name a few. Software developers at IBM are the backbone of our strategic initiatives to design, code, test, and provide industry-leading solutions that make the world run today: planes and trains take off on time, bank transactions complete in the blink of an eye and the world remains safe because of the work our software developers do. Whether you are working on projects internally or for a client, software development is critical to the success of IBM and our clients worldwide. At IBM, you will use the latest software development tools, techniques and approaches and work with leading minds in the industry to build solutions you can be proud of. You will design, develop, test, operate and maintain database features in our products, services and tools to provide a secure environment for the product to be used by customers in the cloud; evaluate new technologies and processes that enhance our service capabilities; and document and share your experience while mentoring others.

Required Technical and Professional Expertise
Programming proficiency: a strong grasp of at least one or two programming languages such as Python, Java or GoLang, and the ability to write clean, efficient, and logical code. Example: solving basic coding problems (e.g., array manipulation, string processing) on platforms like LeetCode or HackerRank. Data structures and algorithms (DSA): understanding of fundamental DSA concepts (arrays, linked lists, stacks, queues, trees, graphs, sorting, searching, etc.) and their practical applications. Database knowledge: familiarity with relational databases (e.g., MySQL, PostgreSQL) and basic SQL queries; knowledge of NoSQL databases (e.g., MongoDB) is a plus. Web development (for relevant roles): basic understanding of HTML, CSS, JavaScript, and frameworks like React or Node.js for front-end/back-end roles. Software development basics: familiarity with version control (e.g., Git), the Software Development Life Cycle (SDLC), and basic debugging skills. Exposure to emerging technologies: awareness of trending areas like cloud computing (AWS, Azure), AI/ML basics, or DevOps tools (Docker, Kubernetes) is a bonus.

Preferred Technical and Professional Experience
Any scripting language: Bash, Perl, Python or Ruby.
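The expertise list above points to basic array/string coding exercises; here is a tiny illustrative Python example of that kind of problem (invented for illustration, not taken from the posting).

```python
from collections import Counter

# A small example of the array/string exercises referenced above: find the first
# non-repeating character in a string (illustrative only, not from the posting).
def first_unique_char(s: str) -> str | None:
    counts = Counter(s)                                    # character frequencies
    return next((ch for ch in s if counts[ch] == 1), None)

print(first_unique_char("netezza"))  # -> 'n'
```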

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Senior Software Developer specializing in Java, React/Angular, SQL, and APIs, your main role will involve the design, development, and maintenance of software applications utilizing Java and its related technologies. Your proficiency in React or Angular will be advantageous for creating modern and dynamic user interfaces for web applications. Alongside your Java skills, it is essential to possess a strong understanding of HTML, CSS, and JavaScript, as well as experience with frameworks like Angular or React. For this position, it is crucial to have knowledge in areas such as software design patterns, Unix environments, database technologies including SQL and NoSQL, and working with databases like Oracle and Netezza. Experience in RESTful web services, API design, and full-stack Java development, along with familiarity with Angular or React, would be highly beneficial. Additional assets include knowledge of Redis and experience with NiFi and APIs. As part of an Agile team, you will contribute your expertise in data engineering and in implementing end-to-end DW projects in a Big Data environment. Strong analytical abilities are required for debugging production issues, offering root cause analysis, and implementing mitigation plans. Effective communication, both verbal and written, is essential, along with excellent relationship-building, collaboration, and organizational skills. In this role, you will need to multitask across various projects, interact with internal and external resources, and provide technical guidance to junior team members. Your high-energy, detail-oriented, and proactive approach, combined with the ability to work under pressure independently, will be invaluable. Initiative and self-motivation are key qualities for driving results. You should also be quick to learn and apply new technologies, conducting POCs to identify optimal solutions for problem statements. The flexibility to collaborate in diverse and geographically distributed project teams within a matrix-based environment is also essential for this role.

Posted 1 week ago

Apply

6.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

As a Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack across every stage of deployment. You'll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like proofs of concept, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, hone your technical skills, and become adept at solution design and deployment. As a trusted technical advisor, you'll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you'll help customers modernize their data platform and realize the full value of Microsoft's platform, all while enjoying flexible work opportunities.

Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.

Responsibilities: Drive technical conversations with decision makers using demos and PoCs to influence solution design and enable production deployments. Lead hands-on engagements, such as hackathons and architecture workshops, to accelerate adoption of Microsoft's cloud platforms. Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions. Resolve technical blockers and objections, collaborating with engineering to share insights and improve products. Maintain deep expertise in the Analytics portfolio, Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure Databases (SQL DB, Cosmos DB, PostgreSQL). Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop and BI solutions. Represent Microsoft through thought leadership in cloud Database & Analytics communities and customer forums.

Qualifications (preferred): 6+ years of technical pre-sales, technical consulting, technology delivery, or related experience, or equivalent experience. 4+ years of experience with cloud and hybrid or on-premises infrastructure, architecture design, migrations, industry standards, and/or technology management. Proficiency in data warehouse and big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks) and Azure Synapse Gen2. Or: 5+ years of technical pre-sales or technical consulting experience; or a Bachelor's Degree in Computer Science, Information Technology, or a related field and 4+ years of technical pre-sales or technical consulting experience; or a Master's Degree in Computer Science, Information Technology, or a related field and 3+ years of technical pre-sales or technical consulting experience; or equivalent experience. Expert on Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration and modernization to creating new AI apps. Expert on Azure Analytics (Fabric, Azure Databricks, Purview) and other cloud products (BigQuery, Redshift, Snowflake) for data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated Data Security & Governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes.

Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 week ago

Apply

0 years

0 Lacs

Cochin

On-site

Job Req ID: 47667
Location: Ernakulam, IN
Function: Other

About: Vodafone Idea Limited is an Aditya Birla Group and Vodafone Group partnership. It is India's leading telecom service provider. The Company provides pan-India voice and data services across 2G, 3G and 4G platforms. With a large spectrum portfolio to support the growing demand for data and voice, the company is committed to delivering delightful customer experiences and contributing towards creating a truly 'Digital India' by enabling millions of citizens to connect and build a better tomorrow. The Company is developing infrastructure to introduce newer and smarter technologies, making both retail and enterprise customers future-ready with innovative offerings, conveniently accessible through an ecosystem of digital channels as well as extensive on-ground presence. The Company is listed on the National Stock Exchange (NSE) and Bombay Stock Exchange (BSE) in India.

We're proud to be an equal opportunity employer. At VIL, we know that diversity makes us stronger. We are committed to a collaborative, inclusive environment that encourages authenticity and fosters a sense of belonging. We strive for everyone to feel valued, connected and empowered to reach their potential and contribute their best. VIL's goal is to build and maintain a workforce that is diverse in experience and background but uniform in reflecting our values of Passion, Boldness, Trust, Speed and Digital. Consequently, our recruiting efforts are directed towards attracting and retaining the best and brightest talent. Our endeavour is to be the first choice for prospective employees. VIL ensures equal employment opportunity without discrimination or harassment based on race, colour, religion, creed, age, sex, sex stereotype, gender, gender identity or expression, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy, veteran or military service status, genetic information, or any other characteristic protected by law. VIL is an equal opportunity employer committed to diversifying its workforce.

Role: Collections & Retentions BO
Job Level / Designation: M2
Function / Department: Customer Service
Location: Ernakulam

Job Purpose: Be the backbone for the overall Retail Retention and Collection team and provide the best possible support and data to drive front-end operations to deliver the company's critical KPIs and performance. Responsible for driving all post-paid mobility retention aspects by bringing in analytics on churn. The job involves managing back-office operations and coordination with all the critical departments (SSC, Corporate, CS, VIBS, Network, IT) to ensure smooth functioning of overall operations at all levels.

Key Result Areas / Accountabilities: Provide input, through simulations, into monthly target setting and incentive roll-out for all KPIs with the management team. Allocate voluntary and involuntary churn requests to the respective teams. Manage the cost and payout commission process at the circle, including creation of GRNs and POs. Address queries from agencies and get them closed by SSC. Manage the receipt book process and governance to ensure there is no fraudulent activity in the collection process. Execute circle-level deviation activities, i.e. Red together O/S transfer, NCP waiver deactivation, etc. Drive non-physical collection follow-up through the circle, i.e. SMS, e-mail, OBD, etc. Support collection agencies and other departments with data/operational MIS on provision, collection, churn, waiver, cost, etc. Coordinate with various departments for complaint closure and other activities.

Core Competencies, Knowledge, Experience: Strong data and analytical skills. Hands-on experience with MS Access, Excel and presentation tools. Exposure to BI, Netezza and Cognos. Sound accounting/process/systems knowledge and experience. Capability to influence cross-functionally under a matrix structure. Good communication and presentation skills.

Must Have Technical / Professional Qualifications: Graduate / PG in any field. Communication skills. MS Office and presentation skills.

Vodafone Idea Limited (formerly Idea Cellular Limited), an Aditya Birla Group & Vodafone partnership

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Punjab

On-site

As an ETL-DW Test Manager in the BFSI domain in Australia, you will be responsible for the successful delivery of multiple testing projects within approved scope, budget, schedule, and quality expectations. You will work closely with external testing service providers to ensure test functions are delivered efficiently. Your role will involve conducting assurance activities across all test deliverables, including reviewing test estimates, strategies, schedules, plans, and completion reports. You will be tasked with managing and influencing key testing stakeholders and project sponsors, promoting test process excellence, efficiency, and cost reduction in project deployment. Additionally, you will define the test strategy for BI projects and take on a combined lead and test execution role for large project work as assigned by the Solution Design and Test Governance Manager. Your key tasks will include project planning, resourcing, and scheduling for assigned engagements, providing leadership to internal and external teams, and being the primary point of contact for staff and customer delivery issues. You will ensure that services and product delivery meet client expectations and adhere to customer delivery requirements. Quality assurance for engagement deliverables and risk management in the delivery of engagements will also be part of your responsibilities. To be successful in this role, you should have at least 10 years of experience in governing and managing IT testing functions, preferably within a financial services organization. Experience in leading large teams and programs to deliver technical outcomes, along with expertise in reporting, documentation, Agile delivery, CI/CD, and continuous testing, is essential. You should also have a strong background in managing outcomes within large enterprise environments and in risk assessment. Desired skills for this role include compliance/regulatory experience, a banking/finance background, and familiarity with software products from Netezza, AWS, SAS, Cognos, and Tableau. Experience working in the Business Intelligence domain, as well as with Jira and Confluence, will be beneficial. In summary, the ETL-DW Test Manager plays a crucial role in ensuring the successful delivery of testing projects, managing stakeholders, promoting process excellence, and driving efficiency and cost reduction in project deployment. Your expertise in test management, the software development lifecycle, test processes, and risk management will be key to delivering successful outcomes in a complex environment.

Posted 1 week ago

Apply

0 years

0 Lacs

Ernakulam, Kerala, India

On-site

Vodafone Idea Limited is an Aditya Birla Group and Vodafone Group partnership. It is India's leading telecom service provider. The Company provides pan-India voice and data services across 2G, 3G and 4G platforms. With a large spectrum portfolio to support the growing demand for data and voice, the company is committed to delivering delightful customer experiences and contributing towards creating a truly 'Digital India' by enabling millions of citizens to connect and build a better tomorrow. The Company is developing infrastructure to introduce newer and smarter technologies, making both retail and enterprise customers future-ready with innovative offerings, conveniently accessible through an ecosystem of digital channels as well as extensive on-ground presence. The Company is listed on the National Stock Exchange (NSE) and Bombay Stock Exchange (BSE) in India.

We're proud to be an equal opportunity employer. At VIL, we know that diversity makes us stronger. We are committed to a collaborative, inclusive environment that encourages authenticity and fosters a sense of belonging. We strive for everyone to feel valued, connected and empowered to reach their potential and contribute their best. VIL's goal is to build and maintain a workforce that is diverse in experience and background but uniform in reflecting our values of Passion, Boldness, Trust, Speed and Digital. Consequently, our recruiting efforts are directed towards attracting and retaining the best and brightest talent. Our endeavour is to be the first choice for prospective employees. VIL ensures equal employment opportunity without discrimination or harassment based on race, colour, religion, creed, age, sex, sex stereotype, gender, gender identity or expression, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy, veteran or military service status, genetic information, or any other characteristic protected by law. VIL is an equal opportunity employer committed to diversifying its workforce.

Role: Collections & Retentions BO
Job Level / Designation: M2
Function / Department: Customer Service
Location: Ernakulam

Job Purpose: Be the backbone for the overall Retail Retention and Collection team and provide the best possible support and data to drive front-end operations to deliver the company's critical KPIs and performance. Responsible for driving all post-paid mobility retention aspects by bringing in analytics on churn. The job involves managing back-office operations and coordination with all the critical departments (SSC, Corporate, CS, VIBS, Network, IT) to ensure smooth functioning of overall operations at all levels.

Key Result Areas / Accountabilities: Provide input, through simulations, into monthly target setting and incentive roll-out for all KPIs with the management team. Allocate voluntary and involuntary churn requests to the respective teams. Manage the cost and payout commission process at the circle, including creation of GRNs and POs. Address queries from agencies and get them closed by SSC. Manage the receipt book process and governance to ensure there is no fraudulent activity in the collection process. Execute circle-level deviation activities, i.e. Red together O/S transfer, NCP waiver deactivation, etc. Drive non-physical collection follow-up through the circle, i.e. SMS, e-mail, OBD, etc. Support collection agencies and other departments with data/operational MIS on provision, collection, churn, waiver, cost, etc. Coordinate with various departments for complaint closure and other activities.

Core Competencies, Knowledge, Experience: Strong data and analytical skills. Hands-on experience with MS Access, Excel and presentation tools. Exposure to BI, Netezza and Cognos. Sound accounting/process/systems knowledge and experience. Capability to influence cross-functionally under a matrix structure. Good communication and presentation skills.

Must Have Technical / Professional Qualifications: Graduate / PG in any field. Communication skills. MS Office and presentation skills.

Vodafone Idea Limited (formerly Idea Cellular Limited), an Aditya Birla Group & Vodafone partnership

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As an experienced data engineering professional, you will demonstrate strong proficiency in Netezza and Oracle, adeptly crafting complex SQL queries. Your expertise in both platforms will be instrumental in upholding data integrity, security, and quality. A good understanding of current trends and best practices in data engineering and ETL tools is essential. Your knowledge of stored procedures and data modeling will be crucial to optimizing database performance and efficiency. In addition to technical skills, excellent interpersonal and communication skills are required to collaborate effectively with team members and stakeholders. Your ability to convey complex technical concepts clearly and concisely will be key to your success in this role.
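The posting does not include sample work, but as a rough illustration of the kind of "complex SQL" it refers to, the sketch below runs a windowed ranking query over a joined fact/dimension pair. It uses SQLite purely as a self-contained stand-in (it needs SQLite 3.25+ for window functions); the table and column names are hypothetical, and on Netezza or Oracle the same statement would be submitted through the appropriate ODBC/JDBC driver instead.

```python
import sqlite3

# Self-contained stand-in: an in-memory SQLite database with toy data.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'North'), (2, 'North'), (3, 'South');
INSERT INTO orders VALUES (10, 1, 120.0), (11, 1, 80.0), (12, 2, 200.0), (13, 3, 50.0);
""")

# Analytic query: per-customer totals plus a spend rank within each region.
sql = """
SELECT c.region,
       c.customer_id,
       SUM(o.amount)                                   AS total_spend,
       RANK() OVER (PARTITION BY c.region
                    ORDER BY SUM(o.amount) DESC)       AS spend_rank
FROM customers c
JOIN orders   o ON o.customer_id = c.customer_id
GROUP BY c.region, c.customer_id
ORDER BY c.region, spend_rank
"""
for row in conn.execute(sql):
    print(row)
```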

Posted 1 week ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Minimum qualifications:
Bachelor's degree in Computer Science or a related technical field, or equivalent practical experience.
6 years of experience as a technical sales engineer in a cloud computing environment or in a customer-facing role.
Experience with Apache Spark and analytic warehouse solutions (e.g., Teradata, Netezza, Vertica, SQL Server, and Big Data technologies).
Experience implementing analytics systems architecture.

Preferred qualifications:
Master's degree in Computer Science or a related technical field.
Experience with technical sales or professional consulting in cloud computing, data, information life-cycle management and Big Data.
Experience in data warehousing, data lakes, batch/real-time processing and Extract, Transform, and Load (ETL) workflows, including architecture design, implementation, tuning and schema design.
Experience with coding languages such as Python, JavaScript, C++, Scala, R, or Go.
Knowledge of Linux, Web 2.0 development platforms, solutions, and related technologies such as HTTP, Basic/NTLM, sessions, XML/XSLT/XHTML/HTML.
Understanding of DNS, TCP, firewalls, proxy servers, DMZ, load balancing, VPN and VPC.

About The Job
The Google Cloud Platform team helps customers transform and build what's next for their business, all with technology built in the cloud. Our products are developed for security, reliability and scalability, running the full stack from infrastructure to applications to devices and hardware. Our teams are dedicated to helping our customers (developers, small and large businesses, educational institutions and government agencies) see the benefits of our technology come to life. As part of an entrepreneurial team in this rapidly growing business, you will play a key role in understanding the needs of our customers and help shape how businesses of all sizes use technology to connect with customers, employees and partners.
Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Responsibilities
Support local sales teams in pursuing business opportunities by engaging customers to address data life-cycle aspects.
Collaborate with business teams to identify business and technical requirements, conduct full technical discovery and architect client solutions.
Lead technical projects, including technology advocacy, bid response support, product briefings, proof-of-concept work and coordinating technical resources.
Leverage Google Cloud Platform products to demonstrate and prototype integrations in customer/partner environments. Travel for meetings, technical reviews and on-site delivery activities as needed.
Deliver compelling product messaging to highlight the Google Cloud Platform value proposition through whiteboard and slide presentations, product demonstrations, white papers and Request for Information (RFI) response documents.

Google is proud to be an equal opportunity workplace and is an affirmative action employer. We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status.
We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. See also Google's EEO Policy and EEO is the Law. If you have a disability or special need that requires accommodation, please let us know by completing our Accommodations for Applicants form.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka

Remote

Sr. Digital Cloud Solution Architect
Bangalore, Karnataka, India
Date posted: Jul 18, 2025
Job number: 1841686
Work site: Up to 50% work from home
Travel: None
Role type: Individual Contributor
Profession: Digital Sales and Solutions
Discipline: Digital Cloud Solution Architecture
Employment type: Full-Time

Overview
Are you excited about Microsoft Azure? We work with customers to help them achieve their business priorities and guide them on their Cloud & AI transformation journey. We also support customers in evaluating their applications and business requirements, recommend solutions that meet their requirements, and demonstrate these solutions to win technical decisions. Microsoft believes that Digital Native Startups and Unicorns have the potential to change the world. These companies are transforming e-commerce, fintech, social media, gaming, and more. They have given us cutting-edge applications that are changing the way we live and work. With Microsoft Cloud, we aspire to empower every Startup and Unicorn to innovate and scale on our platform, and to grow through our regional ecosystem of customers, developers, partners and investors. You will also have an opportunity to work cross-collaboratively while living our shared IPS Culture priorities: Diversity and Inclusivity, Wellbeing, Sustainability, and Learning. If you have been described as customer obsessed and have a passion for digital-first solutions, we invite you to learn more about our organization and the value we deliver to our customers, partners, and one another, every day.
We are looking for a Digital Cloud Solution Architect (D-CSA) who is passionate about driving our customers' application innovation and AI transformation on the Microsoft platform. In this role, you will be responsible for technical customer engagements, working on the most challenging and exciting projects within the Microsoft Azure and GitHub customer base. This customer-facing position is a hands-on technical role spanning design, build, and operations, with a focus on issue resolution to remove customers' technical obstacles and adoption challenges. You will work with customers to lead deep technical architecture discussions and engage with senior customer executives, enterprise architects, platform engineers and developers. You will collaborate with a variety of internal and external teams to develop pilots and oversee implementation projects, ensuring technical blockers to adoption are removed. This role is flexible in that you can work up to 50% from home.
Microsoft's mission is to empower every person and every organization on the planet to achieve more. As employees, we come together with a growth mindset, innovate to empower others, and collaborate to realize our shared goals. Each day we build on our values of respect, integrity, and accountability to create a culture of inclusion where everyone can thrive at work and beyond.
Qualifications
Required Qualifications (RQs)
Bachelor's Degree in Computer Science, Information Technology, Engineering, Business, or a related field AND 4+ years of experience in cloud/infrastructure technologies, information technology (IT) consulting/support, systems administration, network operations, software development/support, technology solutions, practice development, architecture, and/or consulting, OR equivalent experience.

Preferred Qualifications (PQs)
Bachelor's Degree in Computer Science, Information Technology, Engineering, Business, or a related field AND 8+ years of experience in cloud/infrastructure technologies, information technology (IT) consulting/support, systems administration, network operations, software development/support, technology solutions, practice development, architecture, and/or consulting, OR Master's Degree in Computer Science, Information Technology, Engineering, Business, or a related field AND 6+ years of experience in cloud/infrastructure technologies, technology solutions, practice development, architecture, and/or consulting, OR equivalent experience.
4+ years of experience working in a customer-facing role (internal and/or external).
4+ years of experience working on technical projects.
Enterprise-scale technical experience and knowledge of cloud and hybrid infrastructures, architecture designs, migrations, and technology management.
Proficiency in .NET, Java, JavaScript/Node.js or Python and related frameworks.
Technical certification in cloud (e.g., Azure, Amazon Web Services, Google, security certifications).
Data Azure certifications: DP-600 Fabric Analytics Engineer Associate, DP-700 Fabric Data Engineer Associate, DP-420 Azure Cosmos DB Developer, DP-203 Azure Data Engineer Associate, or DP-300 Azure Database Administrator Associate.
Apps Azure certifications: AI-102 Azure AI Engineer Associate, AZ-204 Azure Developer Associate, or AZ-400 DevOps Engineer Expert.
Azure certifications: AZ-305 Azure Solutions Architect Expert, AZ-104 Azure Administrator Associate, AZ-700 Azure Network Engineer Associate, AZ-500 Azure Security Engineer Associate, or AZ-800 Administering Windows Server Hybrid Core Infrastructure.
#SMC26

Responsibilities
Key responsibilities include:
Gather customer insights to map solutions and services to customer business outcomes.
Identify opportunities to improve customer solutions and position services that help customers achieve their objectives.
Help accelerate solution delivery and adoption through Value Based Deliveries and repeatable intellectual property (IP).
Support customer skilling by delivering technical discussions, workshops, etc. that enable operational health and cloud readiness.
Contribute to customer satisfaction by providing a positive customer experience.
Identify opportunities to drive consumption and grow business with existing customers by initiating conversations, providing demos or quotes, and collaborating with partners or internal teams (e.g., Technical Sales Professionals, Global Black Belts).
For licensing transactions and project engagements, ensure a rapid and robust deployment plan at the point of sale that is validated by services and partners.
Identify opportunities to expand and accelerate cloud consumption, drive business results, and help customers get value from their Microsoft investments, in alignment with the Customer Success Account Management team or other account team members.
Business Value
Ability to utilize tools such as the Azure Pricing Calculator, Azure ROI Tool, and Azure TCO Tool to generate consumption project cost estimates and demonstrate cloud economic value to customers is preferable.
Share ideas, insights, and strategic technical input with technical teams and internal communities. Participate in external technical community events and generate new ideas for changes and improvements to existing intellectual property (IP), technologies, and processes. Drive opportunities for IP reuse, best-practice sharing, and consumption acceleration, and obtain relevant accreditations and certifications.
The candidate will have depth of knowledge in one of the areas below while having breadth of knowledge across the other two:
Apps and AI: Critical to maintain and grow expertise in AI Foundry and app architecture (agentic AI frameworks, TensorFlow, PyTorch, Responsible AI) and app architecture / cloud-native development (APIs, containerization, microservices, event-driven). Important to maintain and grow expertise in AI management and security (Gen AI Ops, Sentinel, orchestrator, monitoring). Learn new technologies or services aligned to customer needs and common patterns seen in cloud application development; stay current with the latest Azure, AI and DevOps/GitHub capabilities, practices and cloud application patterns; and be a practitioner of one or more enterprise languages (.NET, Java, JavaScript/Node.js, Python, etc.) and related frameworks. Identify issues and advise customers on operating and optimizing performance in accordance with Microsoft best practices; resolve customer blockers to accelerate consumption by leveraging technical expertise and knowledge of Microsoft solutions, escalating to support and engineering as appropriate.
Data: Critical to maintain and grow expertise in the Fabric data platform (DW, real-time intelligence, BI, Purview) and Azure databases (SQL DB, Cosmos DB, PostgreSQL). Important to maintain and grow expertise in on-prem EDW appliances (Teradata, Netezza, Exadata), Hadoop and BI migration, and Azure Databricks.
Infra: Build trusted-advisor relationships with customers' technical decision makers (TDMs and TDM-1) and proactively find and understand customers' pain points; work with customers to co-develop secure and resilient solution architectures for production-scale delivery using Cloud Adoption Framework (CAF) best practices, including the Unified Support and Cloud Migration Factory (CMF) for open-source workloads offer, for every opportunity. Overcome competitive and technical objections and manage customer escalations with the Global Black Belt and engineering teams to share insights and best practices for product improvements. Critical to maintain and grow expertise in cloud migration (Linux, PostgreSQL, app workloads, resiliency, security, compliance).
Benefits/perks listed below may vary depending on the nature of your employment with Microsoft and the country where you work.
Industry leading healthcare
Educational resources
Discounts on products and services
Savings and investments
Maternity and paternity leave
Generous time away
Giving programs
Opportunities to network and connect
Microsoft is an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Pune

Work from Office

Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must have skills: IBM Netezza
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and deliver effective solutions that meet business needs, while also troubleshooting any issues that arise in the data flow and processing stages.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve data processes to enhance efficiency.

Professional & Technical Skills:
- Must Have Skills: Proficiency in IBM Netezza.
- Good To Have Skills: Experience with data warehousing concepts and practices.
- Strong understanding of ETL processes and data integration techniques.
- Familiarity with data modeling and database design principles.
- Experience with performance tuning and optimization of data queries.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in IBM Netezza.
- This position is based in Pune.
- A 15 years full time education is required.

Qualification: 15 years full time education
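Purely as an illustrative sketch of the extract-transform-load responsibility described above (not part of the posting), the snippet below shows a minimal ETL pass in Python using only the standard library. The CSV source, the staging table, and the SQLite target are hypothetical stand-ins for the Netezza pipelines the role actually involves.

```python
import csv
import sqlite3
from pathlib import Path

SOURCE_CSV = Path("daily_orders.csv")   # hypothetical extract source
TARGET_DB = "warehouse.db"              # stand-in for the warehouse target

def extract(path: Path) -> list[dict]:
    """Read raw rows from the source file."""
    with path.open(newline="") as fh:
        return list(csv.DictReader(fh))

def transform(rows: list[dict]) -> list[tuple]:
    """Apply basic quality checks and type coercion before load."""
    cleaned = []
    for row in rows:
        if not row.get("order_id"):          # reject rows missing the business key
            continue
        cleaned.append((int(row["order_id"]),
                        row["customer_id"].strip(),
                        float(row.get("amount", 0) or 0)))
    return cleaned

def load(records: list[tuple]) -> None:
    """Idempotent load into a staging table."""
    with sqlite3.connect(TARGET_DB) as conn:
        conn.execute("""CREATE TABLE IF NOT EXISTS stg_orders
                        (order_id INTEGER PRIMARY KEY, customer_id TEXT, amount REAL)""")
        conn.executemany("INSERT OR REPLACE INTO stg_orders VALUES (?, ?, ?)", records)

if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)))
```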

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Pune

Work from Office

As a Site Reliability Engineer, you will work in an agile, collaborative environment to build, deploy, configure, and maintain systems for the IBM client business. In this role, you will lead the problem resolution process for our clients, from analysis and troubleshooting to deploying the latest software updates and fixes.

Your primary responsibilities include:
24x7 Observability: Be part of a worldwide team that monitors the health of production systems and services around the clock, ensuring continuous reliability and an optimal customer experience.
Cross-Functional Troubleshooting: Collaborate with engineering teams to provide initial assessments and possible workarounds for production issues. Troubleshoot and resolve production issues effectively.
Deployment and Configuration: Leverage Continuous Delivery (CI/CD) tools to deploy services and configuration changes at enterprise scale.
Security and Compliance Implementation: Implement security measures that meet or exceed industry standards for regulations such as GDPR, SOC 2, ISO 27001, PCI, HIPAA, and FBA.
Maintenance and Support: Apply Couchbase security patches and upgrades, support Cassandra and MongoDB on the pager-duty rotation, and collaborate with Couchbase product support for issue resolution.

Required education: Bachelor's Degree
Preferred education: Bachelor's Degree

Required technical and professional expertise
Bachelor's degree in Computer Science, IT, or equivalent.
5+ years of experience with at least one database such as Netezza, Db2 or MSSQL.
5+ years of experience in DevOps, CloudOps, or SRE roles.
Foundational experience with Linux/Unix systems.
Hands-on exposure to cloud platforms (IKS, AWS, or Azure).
Understanding of networking and databases.
Strong troubleshooting and problem-solving skills.

Preferred technical and professional experience
Databases: Strongly preferred experience with Netezza/Db2 database administration; monitor and optimize database performance and reliability; configure and troubleshoot database issues.
Kubernetes/OpenShift: Strongly preferred experience working with production Kubernetes/OpenShift environments.
Automation/Scripting: In-depth experience with Ansible, Python, Terraform, and CI/CD tools such as Jenkins, IBM Continuous Delivery, and ArgoCD.
Monitoring/Observability: Hands-on experience crafting alerts and dashboards using tools such as Instana, New Relic, and Grafana/Prometheus.
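As a loose illustration of the observability duties described above (not taken from the posting), here is a minimal sketch of a database health probe that could feed an alerting pipeline. The DSN string, the latency threshold, the Nagios-style exit codes, and the choice of pyodbc are all assumptions for the sake of the example.

```python
import sys
import time

import pyodbc  # assumes an ODBC driver/DSN for the target database is configured

DSN = "DSN=WAREHOUSE;UID=monitor;PWD=secret"   # hypothetical connection string
LATENCY_BUDGET_SECONDS = 2.0                   # hypothetical alert threshold

def probe(dsn: str) -> float:
    """Run a trivial query and return its round-trip latency in seconds."""
    start = time.monotonic()
    with pyodbc.connect(dsn, timeout=5) as conn:
        conn.cursor().execute("SELECT 1").fetchone()
    return time.monotonic() - start

if __name__ == "__main__":
    try:
        latency = probe(DSN)
    except pyodbc.Error as exc:
        print(f"CRITICAL: database unreachable: {exc}")
        sys.exit(2)
    if latency > LATENCY_BUDGET_SECONDS:
        print(f"WARNING: slow response ({latency:.2f}s)")
        sys.exit(1)
    print(f"OK: responded in {latency:.2f}s")
```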

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Are you insatiably curious, deeply passionate about the realm of databases and analytics, and ready to tackle complex challenges in a dynamic environment in the era of AI? If so, we invite you to join our team as a Cloud & AI Solution Engineer in Innovative Data Platform for commercial customers at Microsoft. Here, you'll be at the forefront of innovation, working on cutting-edge projects that leverage the latest technologies to drive meaningful impact. Join us and be part of a team that thrives on collaboration, creativity, and continuous learning.
Databases & Analytics is a growth opportunity for Microsoft Azure, as well as its partners and customers. It includes a rich portfolio of products, including IaaS and PaaS services on the Azure platform in the age of AI. These technologies empower customers to build, deploy, and manage database and analytics applications in a cloud-native way.
As an Innovative Data Platform Solution Engineer (SE), you will play a pivotal role in helping enterprises unlock the full potential of Microsoft's cloud database and analytics stack across every stage of deployment. You'll collaborate closely with engineering leaders and platform teams to accelerate the Fabric Data Platform, including Azure Databases and Analytics, through hands-on engagements like Proof of Concepts, hackathons, and architecture workshops. This opportunity will allow you to accelerate your career growth, develop deep business acumen, and hone your technical skills, becoming adept at solution design and deployment. As a trusted technical advisor, you'll guide customers through secure, scalable solution design, influence technical decisions, and accelerate database and analytics migration into their deployment workflows. In summary, you'll help customers modernize their data platform and realize the full value of Microsoft's platform, all while enjoying flexible work opportunities.

Responsibilities
Drive technical sales with decision makers, using demos and PoCs to influence solution design and enable production deployments.
Lead hands-on engagements (hackathons and architecture workshops) to accelerate adoption of Microsoft's cloud platforms.
Build trusted relationships with platform leads, co-designing secure, scalable architectures and solutions.
Resolve technical blockers and objections, collaborating with engineering to share insights and improve products.
Maintain deep expertise in the analytics portfolio: Microsoft Fabric (OneLake, DW, real-time intelligence, BI, Copilot), Azure Databricks, Purview Data Governance, and Azure databases: SQL DB, Cosmos DB, PostgreSQL.
Maintain and grow expertise in on-prem EDW (Teradata, Netezza, Exadata), Hadoop and BI solutions.
Represent Microsoft through thought leadership in cloud database and analytics communities and customer forums.

Qualifications
10+ years of technical pre-sales or technical consulting experience, OR Bachelor's Degree in Computer Science, Information Technology, or a related field AND 4+ years of technical pre-sales or technical consulting experience, OR Master's Degree in Computer Science, Information Technology, or a related field AND 3+ years of technical pre-sales or technical consulting experience, OR equivalent experience.
Expert on Azure databases (SQL DB, Cosmos DB, PostgreSQL), spanning migration and modernization as well as creation of new AI apps.
Expert on Azure analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) across data warehouse, data lake, big data, analytics, real-time intelligence, and reporting using integrated data security and governance.
Proven ability to lead technical engagements (e.g., hackathons, PoCs, MVPs) that drive production-scale outcomes.
6+ years of technical pre-sales, technical consulting, technology delivery, or related experience, OR equivalent experience.
4+ years of experience with cloud and hybrid or on-premises infrastructure, architecture designs, migrations, industry standards, and/or technology management.
Proficient in data warehouse and big data migration, including on-prem appliances (Teradata, Netezza, Oracle), Hadoop (Cloudera, Hortonworks) and Azure Synapse Gen2.
Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

P2-C3-TSTS
Primary Skills
Domain knowledge, especially in the risk or finance technology area
Strong SQL knowledge, including complex joins and analytical functions
Good understanding of data flow, data models and database applications
Working knowledge of databases such as Oracle and Netezza

Secondary Skills
Conceptual knowledge of ETL and data warehousing; working knowledge is an added advantage
Basic knowledge of Java is an added advantage

Job Description
We are seeking professionals who can perform thorough analysis and articulation of risk or finance technology model data requirements, and who can identify and understand specific data quality issues to ensure effective delivery of data to users using standard tools. The role involves analysing internal and external regulations (credit risk) and preparing functional documentation on that basis, and requires working with large amounts of data: facts, figures, and number crunching.

Responsibilities
Perform thorough analysis and articulation of risk or finance technology model data requirements; identify and understand specific data quality issues to ensure effective delivery of data to users using standard tools.
Provide analysis of internal and external regulations (credit risk) and prepare functional documentation on external regulations (credit risk).
Work with large amounts of data: facts, figures, and number crunching.
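The posting stresses spotting data quality issues with SQL. As a rough, hypothetical illustration (the table and column names are invented, not part of the listing), the sketch below runs a few typical profiling checks, null business keys, duplicate keys, and negative exposures, against a DB-API connection; SQLite is used only so the example is self-contained.

```python
import sqlite3  # stand-in; a DB-API connection to Oracle or Netezza would be used the same way

# Hypothetical checks: each query returns a count that should be zero on clean data.
CHECKS = {
    "null_customer_ids": "SELECT COUNT(*) FROM exposures WHERE customer_id IS NULL",
    "duplicate_deal_keys": """
        SELECT COUNT(*) FROM (
            SELECT deal_id FROM exposures GROUP BY deal_id HAVING COUNT(*) > 1
        )
    """,
    "negative_exposures": "SELECT COUNT(*) FROM exposures WHERE exposure_amount < 0",
}

def run_checks(conn) -> dict[str, int]:
    """Execute each profiling query and collect the offending-row counts."""
    return {name: conn.execute(sql).fetchone()[0] for name, sql in CHECKS.items()}

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE exposures (deal_id INTEGER, customer_id TEXT, exposure_amount REAL)")
    conn.executemany("INSERT INTO exposures VALUES (?, ?, ?)",
                     [(1, "C1", 100.0), (1, "C2", 50.0), (2, None, -5.0)])
    for name, bad_rows in run_checks(conn).items():
        status = "OK" if bad_rows == 0 else "ISSUE"
        print(f"{status:5s} {name}: {bad_rows} offending rows")
```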

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Delhi, India

On-site

Where Data Does More. Join the Snowflake team. We are looking for a Solutions Architect to be part of our Professional Services team to deploy cloud products and services for our customers' Global Competency Centers located in India. This person must be a hands-on, self-starter who loves solving innovative problems in a fast-paced, agile environment. The ideal candidate will have the insight to connect a specific business problem and Snowflake’s solution and communicate that connection and vision to various technical and executive audiences. This person will have a broad range of skills and experience ranging from data architecture to ETL, security, performance analysis, analytics, etc. He/she will have the insight to make the connection between a customer’s specific business problems and Snowflake’s solution, the customer-facing skills to communicate that connection and vision to a wide variety of technical and executive audiences, and the technical skills to be able to not only build demos and execute proof-of-concepts but also to provide consultative assistance on architecture and implementation. The person we’re looking for shares our passion for reinventing the data platform and thrives in a dynamic environment. That means having the flexibility and willingness to jump in and get it done to make Snowflake and our customers successful. It means keeping up to date on the ever-evolving data and analytics technologies, and working collaboratively with a broad range of people inside and outside the company to be an authoritative resource for Snowflake and its customers. AS A SOLUTIONS ARCHITECT AT SNOWFLAKE YOU WILL: Be a technical expert on all aspects of Snowflake Present Snowflake technology and vision to executives and technical contributors to customers. Position yourself as a Trusted Advisor to key customer stakeholders with a focus on achieving their desired Business Outcomes. Drive project teams towards common goals of accelerating the adoption of Snowflake solutions. Demonstrate and communicate the value of Snowflake technology throughout the engagement, from demo to proof of concept to running workshops, design sessions and implementation with customers and stakeholders. Create repeatable processes and documentation as a result of customer engagement. Collaborate on and create Industry based solutions that are relevant to other customers in order to drive more value out of Snowflake. Deploy Snowflake following best practices, including ensuring knowledge transfer so that customers are correctly enabled and can extend the capabilities of Snowflake on their own. Maintain a deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them. 
Work with System Integrator consultants at a deep technical level to successfully position and deploy Snowflake in customer environments.
Be able to position and sell the value of Snowflake professional services for ongoing delivery.

OUR IDEAL SOLUTIONS ARCHITECT WILL HAVE:
Minimum 10 years of experience working with customers in a pre-sales or post-sales technical role.
University degree in computer science, engineering, mathematics or related fields, or equivalent experience.
Outstanding skills presenting to both technical and executive audiences, whether impromptu on a whiteboard or using presentations and demos.
Understanding of the complete data analytics stack and workflow, from ETL to data platform design to BI and analytics tools.
Strong skills in databases, data warehouses, and data processing.
Extensive hands-on expertise with SQL and SQL analytics.
Proficiency in implementing data security measures, access controls, and design within the Snowflake platform.
Extensive knowledge of and experience with large-scale database technology (e.g., Netezza, Exadata, Teradata, Greenplum).
Software development experience with Python, Java, Spark and other scripting languages.
Internal and/or external consulting experience.
Deep collaboration with Account Executives and Sales Engineers on account strategy.

BONUS POINTS FOR EXPERIENCE WITH THE FOLLOWING:
1+ years of practical Snowflake experience.
Experience with non-relational platforms and tools for large-scale data processing (e.g., Hadoop, HBase).
Familiarity and experience with common BI and data exploration tools (e.g., MicroStrategy, Looker, Tableau, Power BI).
OLAP data modeling and data architecture experience.
Experience and understanding of large-scale infrastructure-as-a-service platforms (e.g., Amazon AWS, Microsoft Azure, GCP).
Experience using AWS services such as S3, Kinesis, Elastic MapReduce, and Data Pipeline.
Experience delivering data migration projects.
Expertise in a core vertical such as Financial Services, Retail, Media & Entertainment, Healthcare, or Life Sciences.
Hands-on experience with Python, Java or Scala.

WHY JOIN OUR PROFESSIONAL SERVICES TEAM AT SNOWFLAKE:
Unique opportunity to work on a truly disruptive software product.
Get unique, hands-on experience with bleeding-edge data warehouse technology.
Develop, lead and execute an industry-changing initiative.
Learn from the best! Join a dedicated, experienced team of professionals.

Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com
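Not part of the listing, but as a small sketch of the hands-on side of the role, the example below connects to a Snowflake account with the snowflake-connector-python package and runs a simple query, the kind of row-count reconciliation often done after migrating a legacy warehouse. The account locator, credentials, warehouse, database, and table name are all placeholders, and error handling is kept minimal.

```python
import snowflake.connector  # assumes the snowflake-connector-python package is installed

# Placeholder credentials; in practice these would come from a secrets manager.
conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",   # hypothetical account locator
    user="SA_DEMO",
    password="********",
    warehouse="ANALYTICS_WH",
    database="DEMO_DB",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Hypothetical table; a typical first validation step after a migration
    # is to reconcile row counts against the legacy source (e.g., Netezza).
    cur.execute("SELECT COUNT(*) FROM ORDERS")
    print("row count:", cur.fetchone()[0])
finally:
    conn.close()
```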

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 15 Lacs

Hyderabad

Remote

Position Summary
The Business Intelligence Developer is responsible for ensuring that reports, analyses, and dashboards are shared with the respective stakeholders, using business intelligence tools such as IBM Cognos, Tableau or Power BI, along with SQL and PL/SQL. The BI developer must be able to operate at both a strategic and a tactical level, touching all aspects of the analytics process, including the analytics discovery phase, tagging/tracking, optimization, and analysis. The candidate must have strong experience with Cognos report development and supporting ad-hoc user requests.

Key Accountabilities:
Design and develop BI solutions, including complex reports, visualizations, cubes, and metadata models, using Cognos Analytics.
Build analytical reports and dashboards using Tableau, Power BI and/or Cognos based on business requirements.
Troubleshoot and recommend optimized design techniques for building reports/dashboards using Cognos 11, Cognos Framework Manager, Unix scripts and the Cognos SDK.
Develop complex analyses and present them in a concise, impactful way to influence senior leadership.
Develop Business Intelligence (BI) reports from discovery through deployment.
Advanced SQL skills, including stored procedures.
Create reporting test scripts in Redshift and Netezza to test reports.
Identify issues and optimize report performance using Power BI and IBM Cognos.
Support ad-hoc requests and data analysis.

Education and Experience:
5+ years of IT experience.
3+ years of experience with the Cognos suite, TY/LY trend development, and marketing strategy.
Additional experience with Tableau or Power BI is a plus.
Bachelor's degree is required, preferably in Computer Science or a related field.
Retail domain knowledge is a plus.

Skills and Behaviors
Must be detail-oriented with strong mathematical and written abilities.
Ability to communicate effectively with department teams, cross-functional partners, and upper management.
Exceptional planning skills with the ability to adapt to a rapidly changing environment.
Must be able to prioritize work and manage time effectively.
Understands business operations and the principles of retail business management.
Must convey information clearly and concisely.
Must be able to work effectively in teams and make meaningful and relevant contributions.
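As a purely illustrative take on the "reporting test scripts" mentioned above (not from the posting; the table and column names are invented), this sketch compares an aggregate between a detail table and the summary table feeding a report, the kind of check that could run against Redshift or Netezza through any DB-API driver. SQLite is used here only so the example runs on its own.

```python
import sqlite3  # stand-in connection; a Redshift or Netezza DB-API driver would be used in practice

# Hypothetical reconciliation: the report's summary table should match the detail table.
SOURCE_SQL = "SELECT ROUND(SUM(amount), 2) FROM sales_detail"
REPORT_SQL = "SELECT ROUND(SUM(total_amount), 2) FROM rpt_sales_summary"

def reconcile(conn) -> bool:
    """Return True when the report aggregate matches the source aggregate."""
    source_total = conn.execute(SOURCE_SQL).fetchone()[0]
    report_total = conn.execute(REPORT_SQL).fetchone()[0]
    print(f"source={source_total} report={report_total}")
    return source_total == report_total

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE sales_detail (amount REAL);
        CREATE TABLE rpt_sales_summary (total_amount REAL);
        INSERT INTO sales_detail VALUES (10.5), (4.5);
        INSERT INTO rpt_sales_summary VALUES (15.0);
    """)
    print("PASS" if reconcile(conn) else "FAIL")
```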

Posted 2 weeks ago

Apply