Jobs
Interviews

17230 Spark Jobs - Page 9

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the employer's job portal.

2.0 - 4.0 years

6 - 7 Lacs

Chennai

On-site

At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo.

About the role: As a Data Services Analyst II, you will report to a Data Services Manager and be responsible for analyzing business information from the perspective of marketing and sales professionals to ensure that ZoomInfo continues to deliver the highest-quality data products to our customers. You will demonstrate the value of ZoomInfo data through problem-solving, knowledge sharing and data deliverables. We are looking for a data whizz who can communicate effectively, solve complex data problems, and bring a strong understanding of the value of our data.

What You'll Do:
Data Analysis
- Apply quantitative analysis and data visualization to tell the story behind the numbers, all while supporting data-driven decision making
- Use technical skills, problem solving and business knowledge to deliver custom datasets to clients that meet or exceed their expectations
- Implement proactive improvements to processes and methods for gathering and aggregating data; find creative solutions to problems when limited information is available
Business Operations
- Understand all aspects of ZoomInfo data, including all of our applications and tools
- Create and maintain documentation on internal and client-facing business processes
- Drive internal process improvement to better serve client needs
- Identify opportunities to reduce manual tasks through automation and create operational efficiencies
Client Management
- Define business requirements and document rules and logic for use in client implementations
- Understand and solve qualitative problems and present or explain solutions to an audience using top-quality, audience-appropriate communication
- Enable clients to maximize the benefits of their ZoomInfo partnership through best practices, innovative thinking and process improvement

What You Bring:
- Experience: The ideal candidate will have 2-4 years of experience in a technology setting
- Education: A Bachelor's in a quantitative/analytical field (Mathematics, Statistics, Engineering, Computer Science, Economics)
- Shift: Night shift (5 PM IST to 2 AM IST / 7 PM IST to 4 AM IST)
- Mandatory skills: Expert in SQL, Python, Microsoft Excel (formulas, pivot tables) and data analysis/visualization tools
- Preferred: Tableau, Spark, Snowflake or similar technologies and tools
- A proven track record in technology delivery, process improvement, data governance and/or client services
- Proven ability to work and interact in a fast-paced environment, with strong multitasking, organizational and time-management skills
- Highly resourceful, with a go-getter attitude
- Highly organized, with careful attention to detail
- Excellent communication skills

About us: ZoomInfo (NASDAQ: GTM) is the Go-To-Market Intelligence Platform that empowers businesses to grow faster with AI-ready insights, trusted data, and advanced automation. Its solutions provide more than 35,000 companies worldwide with a complete view of their customers, making every seller their best seller. ZoomInfo may use a software-based assessment as part of the recruitment process. More information about this tool, including the results of the most recent bias audit, is available here. ZoomInfo is proud to be an equal opportunity employer, hiring based on qualifications, merit, and business needs, and does not discriminate based on protected status.
We welcome all applicants and are committed to providing equal employment opportunities regardless of sex, race, age, color, national origin, sexual orientation, gender identity, marital status, disability status, religion, protected military or veteran status, medical condition, or any other characteristic protected by applicable law. We also consider qualified candidates with criminal histories in accordance with legal requirements. For Massachusetts Applicants: It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability. ZoomInfo does not administer lie detector tests to applicants in any location.
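The ZoomInfo listing above names expert SQL and pivot tables among its mandatory skills. As a minimal, self-contained illustration of that kind of work, the sketch below runs a pivot-table-style rollup in SQLite; the table, column names, and sample company rows are all hypothetical, not ZoomInfo data.

```python
import sqlite3

# Hypothetical sample of company records, standing in for real B2B data.
rows = [
    ("Acme Corp", "Software", 250),
    ("Globex", "Software", 1200),
    ("Initech", "Finance", 90),
    ("Umbrella", "Finance", 4000),
]

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE companies (name TEXT, industry TEXT, employees INTEGER)")
con.executemany("INSERT INTO companies VALUES (?, ?, ?)", rows)

# A pivot-table-style rollup: company count and average headcount per industry.
summary = con.execute(
    """
    SELECT industry, COUNT(*) AS n, AVG(employees) AS avg_employees
    FROM companies
    GROUP BY industry
    ORDER BY industry
    """
).fetchall()

for industry, n, avg_employees in summary:
    print(industry, n, avg_employees)
```

The same GROUP BY rollup is what an Excel pivot table computes when you drag `industry` to rows and `employees` to values.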

Posted 1 day ago

Apply

6.0 years

0 Lacs

India

Remote

Experience: 6.00+ years | Salary: Confidential (based on experience) | Shift: (GMT+05:30) Asia/Kolkata (IST) | Opportunity Type: Remote | Placement Type: Full-time Permanent Position (*Note: This is a requirement for one of Uplers' clients - Netskope)

What do you need for this opportunity? Must-have skills: Airflow, LLMs, MLOps, Generative AI, Python

About The Role: Please note, this team is hiring across all levels; candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Engineering team builds and optimizes systems spanning data ingestion, processing, storage optimization and more. We work closely with engineers and the product team to build highly scalable systems that tackle real-world data problems, and we support various OLTP and analytics environments, including our Advanced Analytics and Digital Experience Management products. Our customers depend on us to provide accurate, real-time, fault-tolerant solutions to their ever-growing data needs. We are looking for skilled engineers experienced with building and optimizing cloud-scale distributed systems to develop our next-generation ingestion, processing and storage solutions. This is a hands-on, impactful role that will help lead development, validation, publishing and maintenance of logical and physical data models that support various OLTP and analytics environments.
What's In It For You:
- You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics
- Your contributions will have a major impact on our global customer base and across the industry through our market-leading products
- You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills

What You Will Be Doing:
- Lead the design, development, and deployment of AI/ML models for threat detection, anomaly detection, and predictive analytics in cloud and network security
- Architect and implement scalable data pipelines for processing large-scale datasets from logs, network traffic, and cloud environments
- Apply MLOps best practices to deploy and monitor machine learning models in production
- Collaborate with cloud architects and security analysts to develop cloud-native security solutions leveraging platforms like AWS, Azure, or GCP
- Build and optimize Retrieval-Augmented Generation (RAG) systems by integrating large language models (LLMs) with vector databases for real-time, context-aware applications
- Analyze network traffic, log data, and other telemetry to identify and mitigate cybersecurity threats
- Ensure data quality, integrity, and compliance with GDPR, HIPAA, or SOC 2 standards
- Drive innovation by integrating the latest AI/ML techniques into security products and services
- Mentor junior engineers and provide technical leadership across projects

Required Skills And Experience:
AI/ML Expertise
- Proficiency in advanced machine learning techniques, including neural networks (e.g., CNNs, Transformers) and anomaly detection
- Experience with AI frameworks like TensorFlow, PyTorch, and Scikit-learn
- Strong understanding of MLOps practices and tools (e.g., MLflow, Kubeflow)
- Experience building and deploying Retrieval-Augmented Generation (RAG) systems, including integration with LLMs and vector databases
Data Engineering
- Expertise in designing and optimizing ETL/ELT pipelines for large-scale data processing
- Hands-on experience with big data technologies (e.g., Apache Spark, Kafka, Flink)
- Proficiency in working with relational and non-relational databases, including ClickHouse and BigQuery
- Familiarity with vector databases such as Pinecone and PGVector and their application in RAG systems
- Experience with cloud-native data tools like AWS Glue, BigQuery, or Snowflake
Cloud and Security Knowledge
- Strong understanding of cloud platforms (AWS, Azure, GCP) and their services
- Experience with network security concepts, extended detection and response, and threat modeling
Software Engineering
- Proficiency in Python, Java, or Scala for data and ML solution development
- Expertise in scalable system design and performance optimization for high-throughput applications
Leadership and Collaboration
- Proven ability to lead cross-functional teams and mentor engineers
- Strong communication skills to present complex technical concepts to stakeholders
Education
- BSCS or equivalent required; MSCS or equivalent strongly preferred

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances to get shortlisted and meet the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
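The Netskope role above centers on RAG: retrieving the most relevant stored documents for a query and feeding them to an LLM as context. The sketch below is a toy, dependency-free illustration of the retrieval step only, assuming a made-up three-document corpus and using bag-of-words term counts as a stand-in for LLM embeddings; a production system would use real embeddings in a vector database such as the Pinecone or PGVector named in the listing.

```python
import math
from collections import Counter

# Hypothetical corpus standing in for indexed documents.
docs = {
    "doc1": "spark processes large scale log data in the cloud",
    "doc2": "network traffic anomaly detection for threat hunting",
    "doc3": "airflow schedules etl pipelines for data ingestion",
}

def embed(text):
    # Stand-in "embedding": a bag-of-words term-frequency vector.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=1):
    # Rank documents by similarity to the query; return the top k ids.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(docs[d])), reverse=True)
    return ranked[:k]

# The retrieved passages would then be spliced into the LLM prompt as context.
print(retrieve("detect anomalies in network traffic"))
```

Swapping `embed` for a real embedding model and `docs` for a vector-store query is the essential change needed to turn this shape into an actual RAG retriever.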

Posted 1 day ago

Apply

15.0 years

0 Lacs

Indore

On-site

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Apache Spark
Good-to-have skills: MySQL, Python (Programming Language), Google BigQuery
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to ensure the applications function as intended, while continuously seeking ways to enhance application efficiency and user experience.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation/contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application specifications and user guides.
- Collaborate with cross-functional teams to gather requirements and provide technical support.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Apache Spark.
- Good-To-Have Skills: Experience with MySQL, Python (Programming Language), Google BigQuery.
- Strong understanding of data processing frameworks and distributed computing.
- Experience in developing and deploying applications in cloud environments.
- Familiarity with data integration and ETL processes.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Apache Spark.
- This position is based at our Indore office.
- A 15 years full-time education is required.
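The listing above requires proficiency in Apache Spark's data-processing model. As a small illustration of that model without assuming a Spark cluster is available, the sketch below runs the classic RDD word count in plain Python, with the corresponding real PySpark calls (`flatMap`, `map`, `reduceByKey`) noted in comments; the input lines are made up.

```python
from collections import defaultdict

# Hypothetical input, standing in for sc.textFile(...) on a cluster.
lines = ["spark makes distributed data easy", "spark jobs run on clusters"]

# Equivalent to rdd.flatMap(lambda line: line.split())
words = [w for line in lines for w in line.split()]

# Equivalent to .map(lambda w: (w, 1))
pairs = [(w, 1) for w in words]

# Equivalent to .reduceByKey(lambda a, b: a + b); Spark would do this
# per-partition first, then shuffle and merge by key.
counts = defaultdict(int)
for w, n in pairs:
    counts[w] += n

print(dict(counts))
```

The point of the Spark version is that each of these three stages is distributed and lazy; the logic per record is identical.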

Posted 1 day ago

Apply

1.0 - 3.0 years

2 - 4 Lacs

India

On-site

Job Summary: We are seeking a Junior Faculty and Researcher - Data Science to support our ongoing research and consulting projects in AI, Data Science, and related fields. This is an excellent opportunity for recent graduates or early-career professionals who want to build a strong foundation in applied research and data-driven development.

Key Responsibilities: The primary role breaks down into the following three activities: Consulting: 30-40%; Academics: 40-60%; Research: 10-20%.

As a Consultant (30-40%), you'll collaborate with business groups, managing and mentoring data scientists to create cutting-edge data-driven systems. Your role involves constructing advanced analytics solutions, including prediction and recommendation systems, knowledge mining, and business automation. Effective communication of analytical results to various business disciplines is crucial.

As a Researcher (10-20%), you'll identify problems for the betterment of society, leading data scientists in broad research directions using AI/ML and constructing primary data collection strategies aligned with fields such as Business Application of AI, Environment, Public Good, Economics of Data Science and AI, and Ethics and Law in AI and Data Science.

As an Academician (40-60%), you'll teach structured classes, contribute to cutting-edge academic curriculum development, and play a central role in academic operations and learning activities.

Qualities We Value: We seek candidates with a strong intuition for data, solid Data Science fundamentals, and an ability to engage with the external ecosystem. Problem-solving skills, adaptability, self-learning ability, and innovative thinking are essential. Excellent analytical skills, creativity, proactiveness, and effective communication are highly valued. You should be highly driven, flexible, resourceful, and a team player with strong influencing skills.
Must-Have Requirements:
- Education in Data Science, Machine Learning, and Artificial Intelligence (degree or certification). Minimum education: Bachelor's degree in STEM. Preferred education: Master's degree.
- Applied Data Science experience: 1 to 3 years.
- Hands-on experience in creating analytics solutions, including EDA and dashboarding.
- Experience in building data analytics models using various technologies and/or frameworks (Python, R, H2O, Keras, TensorFlow, Spark ML).

Good-to-Have Requirements:
- Understanding of Computer Vision and Natural Language Processing techniques.
- Experience in public speaking/teaching/coaching in Data Science/Technology.
- Project management experience.
- Recommendation system experience.
- Knowledge of reinforcement learning system design.

Job Type: Full-time
Pay: ₹240,000.00 - ₹420,000.00 per year
Benefits: Paid sick time, paid time off
Ability to commute/relocate: Vijay Nagar, Indore, Madhya Pradesh: Reliably commute or plan to relocate before starting work (Preferred)
Experience: Data science: 1 year (Required)
Language: English (Required), Hindi (Required)
Work Location: In person
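The faculty role above lists hands-on EDA as a must-have. As a minimal, dependency-free sketch of the first pass of an EDA, the snippet below computes summary statistics for a made-up sample of exam scores using only the standard library; the numbers are illustrative, not real data.

```python
import statistics

# Hypothetical exam-score sample, the kind of column a quick EDA pass
# would summarize before any dashboarding or modeling work.
scores = [62, 71, 71, 84, 90, 58, 77]

summary = {
    "n": len(scores),
    "mean": round(statistics.mean(scores), 2),       # central tendency
    "median": statistics.median(scores),              # robust to outliers
    "stdev": round(statistics.stdev(scores), 2),      # sample spread
    "mode": statistics.mode(scores),                  # most frequent value
}
print(summary)
```

In practice the same summaries come from `pandas.DataFrame.describe()`, but the statistics themselves are what this sketch shows.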

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description
Common skills: SQL, GCP BigQuery, ETL pipelines using Python/Airflow, experience with Spark/Hive/HDFS, data modeling for data conversion. Resources: 4. Prior experience working on a conversion/migration HR project is an additional skill needed alongside the skills above.
Data Engineer - HR domain knowledge required; all other requirements for the functional area are provided by the client.

Posted 1 day ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

As an Analyst in the Air Category Team, you will play a critical role in monitoring and formulating pricing on the Cleartrip platform, analyzing data, benchmarking against key competition, driving data-driven optimizations and insights on a day-to-day basis, and working closely with multiple stakeholders across the Bizfin, Category and Supplier Relations teams.

Key Responsibilities:
- Support business decisions by formulating hypotheses, structuring the problem, analyzing relevant data to verify hypotheses, and preparing documentation to communicate the results with insights
- Monitor key business, product and tech metrics such as airline mix, segment/booking volumes, net revenue, revenue per segment (RPS), and marketing/discounts per segment across various parameters, and take or recommend corrective actions
- Product: Monitor trends in consumer behavior of VAS (insurance, seat, meal, baggage) across stages of the product funnel; prepare and document analysis and insights
- Operations: Own offer creation, monitoring and merchandising along with competition insights end-to-end, and back the end results with data-backed analyses
- Automation: Create various dashboards, alerts and datasets to visualize and drive actionable insights quickly
- Reporting: Performance reporting to internal and external stakeholders

To succeed in this role, you should have the following:
- Bachelor's in Engineering, Computer Science, Math, Statistics, or a related discipline from a reputed institute, or an MBA from a reputed institute
- 1-3 years of experience in a relevant role
- Ability to translate structured and unstructured problems into an analytical framework
- Ability to experiment with alternate analytical techniques to solve a problem
- Exceptional written and verbal communication skills

Technical capabilities:
- SQL, Excel, other scripting languages (R, Python, etc.)
- Basic understanding of statistical modelling and statistical tools such as R, Python, Spark, SAS or SPSS
- Good to have: working experience with BI tools (Power BI, Tableau, QlikView, Data Studio, etc.)
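The analyst role above tracks revenue per segment (RPS) as a key metric. As a minimal sketch of how that metric is computed, assuming hypothetical booking records (airline code, flight segments, net revenue in INR) invented for illustration:

```python
# Hypothetical booking records: (airline, segments, net_revenue_inr).
bookings = [
    ("AI", 2, 900.0),
    ("6E", 1, 350.0),
    ("AI", 1, 500.0),
    ("6E", 3, 1050.0),
]

def revenue_per_segment(rows):
    """Net revenue divided by segment count, per airline and overall."""
    per_airline = {}
    for airline, segments, revenue in rows:
        seg, rev = per_airline.get(airline, (0, 0.0))
        per_airline[airline] = (seg + segments, rev + revenue)
    total_seg = sum(s for s, _ in per_airline.values())
    total_rev = sum(r for _, r in per_airline.values())
    rps = {a: round(r / s, 2) for a, (s, r) in per_airline.items()}
    return rps, round(total_rev / total_seg, 2)

per_airline_rps, overall_rps = revenue_per_segment(bookings)
print(per_airline_rps, overall_rps)
```

The same per-airline grouping is what a dashboard version of this metric would run as a SQL GROUP BY over the bookings table.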

Posted 1 day ago

Apply

0 years

0 Lacs

Dehradun, Uttarakhand, India

On-site

Company Description: At Dumpum, we create games that bring families closer together by fostering stronger connections and building essential life skills. Our innovative games are designed to encourage meaningful family interactions, address modern challenges like screen time and busy schedules, and support children's growth through fun, educational play. From card games to board games and small, no-prop activities, our products spark creativity, strengthen relationships, and make parenting moments memorable. Visit us at Dumpum for more information.

What You'll Do: As a Creative Intern, you'll work closely with the founder to:
- Brainstorm and ideate new physical games (board, card, paper-based) for kids aged 4–12
- Create story-driven game narratives and fun, age-appropriate mechanics
- Assist in designing engaging game formats, including activity kits, flashcards, role-play games, and challenge cards
- Collaborate on game playtesting and iterate based on feedback
- Support the development of game documentation, packaging concepts, and visual storytelling
- Research trends in educational toys, Indian mythology, parenting, and kid behavior

What We're Looking For:
- A super creative mind with a love for games, storytelling, or toys
- Passion for working on children's products, especially the 4–12 age group
- A curious learner who can think from both a kid's and a parent's perspective
- Comfortable with pen-and-paper prototyping or digital tools like Canva, Figma, or PowerPoint
- Bonus: Knowledge of Indian mythology, childhood psychology, design thinking, or teaching

Why Join Us?
- Be part of a purpose-driven startup impacting families across India
- Get complete freedom to express ideas and test new formats
- Receive direct mentorship and get credited in game products
- Opportunity to extend to a full-time creative/product role post internship

Posted 1 day ago

Apply

6.0 years

0 Lacs

Ghaziabad, Uttar Pradesh, India

Remote

Remote, full-time permanent position for Uplers' client Netskope (experience 6.00+ years; salary confidential; shift (GMT+05:30) Asia/Kolkata (IST); must-have skills: Airflow, LLMs, MLOps, Generative AI, Python). The role description is identical to the Netskope (via Uplers) listing above.

Posted 1 day ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote

Remote, full-time permanent position for Uplers' client Netskope (experience 6.00+ years; salary confidential; shift (GMT+05:30) Asia/Kolkata (IST); must-have skills: Airflow, LLMs, MLOps, Generative AI, Python). The role description is identical to the Netskope (via Uplers) listing above.

Posted 1 day ago

Apply

6.0 years

0 Lacs

Agra, Uttar Pradesh, India

Remote

Experience : 6.00 + years Salary : Confidential (based on experience) Shift : (GMT+05:30) Asia/Kolkata (IST) Opportunity Type : Remote Placement Type : Full time Permanent Position (*Note: This is a requirement for one of Uplers' client - Netskope) What do you need for this opportunity? Must have skills required: Airflow, LLMs, MLOps, Generative AI, Python Netskope is Looking for: About The Role Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience. The Data Engineering team builds and optimizes systems spanning data ingestion, processing, storage optimization and more. We work closely with engineers and the product team to build highly scalable systems that tackle real-world data problems and provide our customers with accurate, real-time, fault tolerant solutions to their ever-growing data needs. We support various OLTP and analytics environments, including our Advanced Analytics and Digital Experience Management products. We are looking for skilled engineers experienced with building and optimizing cloud-scale distributed systems to develop our next-generation ingestion, processing and storage solutions. You will work closely with other engineers and the product team to build highly scalable systems that tackle real-world data problems. Our customers depend on us to provide accurate, real-time and fault tolerant solutions to their ever growing data needs. This is a hands-on, impactful role that will help lead development, validation, publishing and maintenance of logical and physical data models that support various OLTP and analytics environments. 
What's In It For You You will be part of a growing team of renowned industry experts in the exciting space of Data and Cloud Analytics Your contributions will have a major impact on our global customer-base and across the industry through our market-leading products You will solve complex, interesting challenges, and improve the depth and breadth of your technical and business skills. What You Will Be Doing Lead the design, development, and deployment of AI/ML models for threat detection, anomaly detection, and predictive analytics in cloud and network security. Architect and implement scalable data pipelines for processing large-scale datasets from logs, network traffic, and cloud environments. Apply MLOps best practices to deploy and monitor machine learning models in production. Collaborate with cloud architects and security analysts to develop cloud-native security solutions leveraging platforms like AWS, Azure, or GCP. Build and optimize Retrieval-Augmented Generation (RAG) systems by integrating large language models (LLMs) with vector databases for real-time, context-aware applications. Analyze network traffic, log data, and other telemetry to identify and mitigate cybersecurity threats. Ensure data quality, integrity, and compliance with GDPR, HIPAA, or SOC 2 standards. Drive innovation by integrating the latest AI/ML techniques into security products and services. Mentor junior engineers and provide technical leadership across projects. Required Skills And Experience AI/ML Expertise Proficiency in advanced machine learning techniques, including neural networks (e.g., CNNs, Transformers) and anomaly detection. Experience with AI frameworks like TensorFlow, PyTorch, and Scikit-learn. Strong understanding of MLOps practices and tools (e.g., MLflow, Kubeflow). Experience building and deploying Retrieval-Augmented Generation (RAG) systems, including integration with LLMs and vector databases. 
Data Engineering Expertise designing and optimizing ETL/ELT pipelines for large-scale data processing. Hands-on experience with big data technologies (e.g., Apache Spark, Kafka, Flink). Proficiency in working with relational and non-relational databases, including ClickHouse and BigQuery. Familiarity with vector databases such as Pinecone and PGVector and their application in RAG systems. Experience with cloud-native data tools like AWS Glue, BigQuery, or Snowflake. Cloud and Security Knowledge Strong understanding of cloud platforms (AWS, Azure, GCP) and their services. Experience with network security concepts, extended detection and response, and threat modeling. Software Engineering Proficiency in Python, Java, or Scala for data and ML solution development. Expertise in scalable system design and performance optimization for high-throughput applications. Leadership and Collaboration Proven ability to lead cross-functional teams and mentor engineers. Strong communication skills to present complex technical concepts to stakeholders. Education BSCS Or Equivalent Required, MSCS Or Equivalent Strongly Preferred How to apply for this opportunity? Step 1: Click On Apply! And Register or Login on our portal. Step 2: Complete the Screening Form & Upload updated Resume Step 3: Increase your chances to get shortlisted & meet the client for the Interview! About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their career. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities apart from this on the portal. Depending on the assessments you clear, you can apply for them as well). So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
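The RAG integration called out in these requirements follows a simple pattern: embed the query, retrieve the nearest documents from a vector store, and prepend them to the LLM prompt. A minimal sketch in plain Python (the three-dimensional "embeddings" and toy documents are invented stand-ins for a real embedding model and a vector database such as Pinecone or PGVector):

```python
import math

# Toy in-memory "vector database": (document, embedding) pairs.
# Real systems would use learned embeddings and a dedicated vector store.
DOCS = [
    ("Reset your password from the account settings page.", [0.9, 0.1, 0.0]),
    ("Network anomalies are flagged by the detection model.", [0.1, 0.9, 0.2]),
    ("Invoices are emailed on the first of each month.",      [0.0, 0.2, 0.9]),
]

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_embedding, k=1):
    """Return the k documents most similar to the query embedding."""
    ranked = sorted(DOCS, key=lambda d: cosine(query_embedding, d[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question, query_embedding):
    """Augment the LLM prompt with retrieved context (the 'RAG' step)."""
    context = "\n".join(retrieve(query_embedding, k=1))
    return f"Context:\n{context}\n\nQuestion: {question}"
```

In production the retrieval step is an approximate nearest-neighbour search over millions of vectors, but the control flow is the same.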

Posted 1 day ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Position Overview: As a Data Architect, you are responsible for designing and managing scalable, secure, and high-performance data architectures that support GEDU and customer needs. This role ensures that the GEDU’s data assets are structured and managed in a way that enables the business to generate insights, make data-driven decisions, and maintain data integrity across the GEDU and Customers. The Data Architect will work closely with business leaders, data engineers, data scientists, and IT teams to align the data architecture with the GEDU’s strategic goals. Key Responsibilities: Data Architecture Design: Design, develop, and maintain the enterprise data architecture, including data models, database schemas, and data flow diagrams. Develop a data strategy and roadmap that aligns with GEDU business objectives and ensures the scalability of data systems. Architect both transactional (OLTP) and analytical (OLAP) databases, ensuring optimal performance and data consistency. Data Integration & Management: Oversee the integration of disparate data sources into a unified data platform, leveraging ETL/ELT processes and data integration tools. Design and implement data warehousing solutions, data lakes, and/or data marts that enable efficient storage and retrieval of large datasets. Ensure proper data governance, including the definition of data ownership, security, and privacy controls in accordance with compliance standards (GDPR, HIPAA, etc.). Collaboration with Stakeholders: Work closely with business stakeholders, including analysts, developers, and executives, to understand data requirements and ensure that the architecture supports analytics and reporting needs. Collaborate with DevOps and engineering teams to optimize database performance and support large-scale data processing pipelines. 
Technology Leadership: Guide the selection of data technologies, including databases (SQL/NoSQL), data processing frameworks (Hadoop, Spark), cloud platforms (Azure is a must), and analytics tools. Stay updated on emerging data management technologies, trends, and best practices, and assess their potential application within the organization. Data Quality & Security: Define data quality standards and implement processes to ensure the accuracy, completeness, and consistency of data across all systems. Establish protocols for data security, encryption, and backup/recovery to protect data assets and ensure business continuity. Mentorship & Leadership: Lead and mentor data engineers, data modelers, and other technical staff in best practices for data architecture and management. Provide strategic guidance on data-related projects and initiatives, ensuring that all efforts are aligned with the enterprise data strategy. Extensive Data Architecture Expertise: Over 7 years of experience in data architecture, data modeling, and database management. Proficiency in designing and implementing relational (SQL) and non-relational (NoSQL) database solutions. Strong experience with data integration tools (Azure Tools are a must + any other third-party tools), ETL/ELT processes, and data pipelines. Advanced Knowledge of Data Platforms: Expertise in Azure cloud data platform is a must. Other platforms such as AWS (Redshift, S3), Azure (Data Lake, Synapse), and/or Google Cloud Platform (BigQuery, Dataproc) is a bonus. Experience with big data technologies (Hadoop, Spark) and distributed systems for large-scale data processing. Hands-on experience with data warehousing solutions and BI tools (e.g., Power BI, Tableau, Looker). Data Governance & Compliance: Strong understanding of data governance principles, data lineage, and data stewardship. 
Knowledge of industry standards and compliance requirements (e.g., GDPR, HIPAA, SOX) and the ability to architect solutions that meet these standards. Technical Leadership: Proven ability to lead data-driven projects, manage stakeholders, and drive data strategies across the enterprise. Strong programming skills in languages such as Python, SQL, R, or Scala. Pre-Sales Responsibilities: Stakeholder Engagement: Work with product stakeholders to analyze functional and non-functional requirements, ensuring alignment with business objectives. Solution Development: Develop end-to-end solutions involving multiple products, ensuring security and performance benchmarks are established, achieved, and maintained. Proof of Concepts (POCs): Develop POCs to demonstrate the feasibility and benefits of proposed solutions. Client Communication: Communicate system requirements and solution architecture to clients and stakeholders, providing technical assistance and guidance throughout the pre-sales process. Technical Presentations: Prepare and deliver technical presentations to prospective clients, demonstrating how proposed solutions meet their needs and requirements. To know our privacy policy, please click the link below: https://gedu.global/wp-content/uploads/2023/09/GEDU-Privacy-Policy-22092023-V2.0-1.pdf
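As an illustration of the OLAP-side modeling this role covers, a classic warehouse pattern is the star schema: a central fact table joined to surrounding dimension tables. A minimal sketch using SQLite as a stand-in warehouse (table names and data are invented for the example):

```python
import sqlite3

# Minimal star schema: one fact table joined to two dimensions.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (date_id INTEGER, product_id INTEGER, amount REAL);

    INSERT INTO dim_date    VALUES (1, 2024), (2, 2025);
    INSERT INTO dim_product VALUES (10, 'course'), (11, 'exam');
    INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 75.0);
""")

# Typical OLAP query: aggregate the fact table across dimension attributes.
rows = con.execute("""
    SELECT d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_id = f.date_id
    GROUP BY d.year
    ORDER BY d.year
""").fetchall()
```

The same shape scales up directly to Synapse or BigQuery; only the storage engine and partitioning strategy change.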

Posted 1 day ago

Apply

12.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About The Role OSTTRA India The Role: Technical Architect The Team: The OSTTRA Technology team is composed of Capital Markets Technology professionals, who build, support and protect the applications that operate our network. The technology landscape includes high-performance, high-volume applications as well as compute-intensive applications, leveraging contemporary microservices and cloud-based architectures. The Impact: Together, we build, support, protect and manage high-performance, resilient platforms that process more than 100 million messages a day. Our services are vital to automated trade processing around the globe, managing peak volumes and working with our customers and regulators to ensure the efficient settlement of trades and effective operation of global capital markets. What’s in it for you: The current objective is to identify individuals with 12+ years of experience and deep expertise to join an existing team of experts who are spread across the world. This is your opportunity to start at the beginning and get the advantages of rapid early growth. This role is based in Gurgaon and involves working with different teams and colleagues across the globe. Responsibilities The role shall be responsible for establishing, maintaining, socialising, and realising the target state of Product Architecture for the post-trade businesses of OSTTRA. This shall encompass all services that OSTTRA offers for these businesses and all the systems which enable those services. We are looking for a person who is high on energy and motivation and who feels challenged by difficult problems. The role shall partner with portfolio delivery leads, programme managers, portfolio business leads and horizontal technical architects to frame the strategy, to provide solutions for planned programmes and to guide the roadmaps.
He/she shall be able to build high-level designs and low-level technical solutions, considering factors such as scalability, performance, security, maintainability and cost-effectiveness. The role shall own the technical and architectural decisions for the projects and products. He/she shall review the designs and own the design quality, and will ensure that there is a robust code/implementation review practice in the product. Likewise, they shall be responsible for the robust CI/CD and DevSecOps engineering pipelines being used in the projects. He/she shall provide ongoing support on design and architecture problems to the delivery teams. The role shall manage the tech-debt log and plan for its remediation across deliveries and roadmaps. The role shall maintain the living Architecture Reference Documents for the products. They shall actively partner with horizontal technical architects to factor technical constructs within their portfolios and to ensure vibrant feedback into the technical strategies. They shall be responsible for guiding the L3/L2 teams when needed in the resolution of production situations and incidents. They shall be responsible for defining guidelines and system designs for DR strategies and BCP plans for the products. They shall be responsible for architecting key mission-critical system components, reviewing designs and helping uplift them, and should perform critical technical review of changes to applications or infrastructure. The role shall enable an ecosystem such that the functional API, message, data and flow models within the products of the portfolio are well documented.
And shall also provide strong governance/oversight of the same. What We’re Looking For Rich domain experience in the financial services industry, preferably with financial markets within pre/post-trade life cycles or large-scale buy-side/sell-side/brokerage organisations. Should have experience of architecture design for multiple products and of large-scale change programmes. Should be adept with application development and engineering methods and tools. Should have robust experience with microservices application and services development and integration, and be adept with development tools, contemporary runtimes, and observability stacks for microservices. Should have experience of modelling for APIs, messages and possibly data. Should have experience of complex migrations, including data migration. Should have experience in architecture and design of highly resilient, high-availability, high-volume applications, and should be able to initiate or contribute to initiatives around application reliability and resilience. Rich experience of architectural patterns like MVC-based front-end applications, API- and event-driven architectures, event streaming, message processing/orchestration, CQRS and possibly event sourcing. Experience of protocols or integration technologies like HTTP, MQ, FTP, REST and possibly FIX/SWIFT. Experience of messaging formats and paradigms like XSD, XML, XSLT, JSON, REST and possibly gRPC, GraphQL. Experience of technologies like Kafka, Spark Streaming, Kubernetes/EKS, API gateways, web and application servers, message queuing infrastructure, and data transformation/ETL tools. Experience of languages like Java and Python; application development frameworks like the Spring Boot family and Apache family; and commonplace AWS or other cloud-provider services. Experience of engineering methods like CI/CD, build/deploy automation, infrastructure as code, and unit/integration testing methods and tools.
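Of the architectural patterns listed above, CQRS with event sourcing is perhaps the least familiar: an append-only event log is the source of truth on the write side, while reads are served from a projection that can always be rebuilt by replaying events. A minimal sketch (the trade-ledger domain and event names are invented for illustration):

```python
# Minimal event-sourcing sketch with a CQRS-style read/write split.
class TradeLedger:
    def __init__(self):
        self.events = []          # write side: append-only event log
        self.positions = {}       # read side: projection updated per event

    def record(self, event_type, instrument, qty):
        # Commands append events; state is never mutated directly.
        event = {"type": event_type, "instrument": instrument, "qty": qty}
        self.events.append(event)
        self._project(event)

    def _project(self, event):
        # Fold one event into the read-side projection.
        sign = 1 if event["type"] == "BUY" else -1
        self.positions[event["instrument"]] = (
            self.positions.get(event["instrument"], 0) + sign * event["qty"]
        )

    def replay(self):
        """Rebuild the projection from scratch: the event log is the truth."""
        self.positions = {}
        for event in self.events:
            self._project(event)
        return self.positions

ledger = TradeLedger()
ledger.record("BUY", "XYZ", 100)
ledger.record("SELL", "XYZ", 30)
```

In a real post-trade system the log would live in a durable store such as Kafka, and projections would be maintained asynchronously, but the invariant is the same: any read model is reproducible from the event history.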
Should have an appetite to review and write code for complex problems, and should find interest and energy in design discussions and reviews. Experience of development with NoSQL and relational databases is required. Should have active or prior experience with MVC web development or with contemporary React/Angular frameworks. Should have experience of migrating monolithic applications to cloud-based solutions, with an understanding of defining domain-based service responsibilities. Should have rich experience of designing cloud-native architectures, including microservices, serverless computing, containerization (Docker, Kubernetes) on relevant platforms (GCP/AWS), and monitoring aspects. The Location: Gurgaon, India About Company Statement OSTTRA is a market leader in derivatives post-trade processing, bringing innovation, expertise, processes and networks together to solve the post-trade challenges of global financial markets. OSTTRA operates cross-asset post-trade processing networks, providing a proven suite of Credit Risk, Trade Workflow and Optimisation services. Together these solutions streamline post-trade workflows, enabling firms to connect to counterparties and utilities, manage credit risk, reduce operational risk and optimise processing to drive post-trade efficiencies. OSTTRA was formed in 2021 through the combination of four businesses that have been at the heart of post-trade evolution and innovation for the last 20+ years: MarkitServ, Traiana, TriOptima and Reset. These businesses have an exemplary track record of developing and supporting critical market infrastructure and bring together an established community of market participants comprising all trading relationships and paradigms, connected using powerful integration and transformation capabilities. About OSTTRA Candidates should note that OSTTRA is an independent firm, jointly owned by S&P Global and CME Group.
As part of the joint venture, S&P Global provides recruitment services to OSTTRA - however, successful candidates will be interviewed and directly employed by OSTTRA, joining our global team of more than 1,200 post trade experts. OSTTRA was formed in 2021 through the combination of four businesses that have been at the heart of post trade evolution and innovation for the last 20+ years: MarkitServ, Traiana, TriOptima and Reset. OSTTRA is a joint venture, owned 50/50 by S&P Global and CME Group. With an outstanding track record of developing and supporting critical market infrastructure, our combined network connects thousands of market participants to streamline end to end workflows - from trade capture at the point of execution, through portfolio optimization, to clearing and settlement. Joining the OSTTRA team is a unique opportunity to help build a bold new business with an outstanding heritage in financial technology, playing a central role in supporting global financial markets. Learn more at www.osttra.com. What’s In It For You? Benefits We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our Benefits Include Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. 
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf 20 - Professional (EEO-2 Job Categories-United States of America), BSMGMT203 - Entry Professional (EEO Job Group) Job ID: 315820 Posted On: 2025-07-10 Location: Gurgaon, Haryana, India

Posted 1 day ago

Apply

3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Who We Are Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive. What You'll Do Our Global HR Shared Services Center (HRSSC), located across three global hubs (India, Costa Rica, and Portugal), delivers centralized and efficient support for HR processes worldwide. By working here, you’ll be part of our team that’s transforming how we deliver world-class HR services to our employees, globally. We support the full employee lifecycle with precision, enable efficiency gains through smart systems and collaboration, whilst delivering measurable outcomes that enhance every employee’s journey at BCG. You will be a key member of our Global HR Shared Services Center (HRSSC), supporting regional and local HR teams and employees worldwide with administrative HR processes. You’ll collaborate with colleagues across multiple geographies and time zones, forming part of a close-knit global HR network that values teamwork, ownership, and continuous learning. Key Responsibilities Include Communicate and coordinate with internal stakeholders and external immigration vendors to manage case details, timelines, documentation, and employee communications.
Support employees on immigration matters, addressing routine queries and escalating complex issues as needed. Support Transfer-In/Out processes by coordinating immigration documentation and preparing visa invitation/support letters in line with policy requirements. Maintain immigration trackers, logs, and documentation archives. Deliver reporting (e.g., track vendor performance for service quality and turnaround time; escalate issues as needed). Collaborate with the Local HR team to manage employee documentation and personnel files in compliance with legal requirements and internal standards. What You'll Bring A bachelor's degree. 1–3+ years of experience in an immigration role, with exposure to India inbound/outbound processes including visas, work permits, FRRO compliance, and letter preparation. Proficiency in Microsoft Office (Excel, PowerPoint, Outlook, Word, Visio). Experience working in a professional services or multinational environment. Fluent verbal and written English language skills. Who You'll Work With Be part of a respected global brand that invests in its people. Exposure to world-class HR systems, like Workday. Work in a culture that prioritizes learning, diversity, and inclusion. Join a growing team where your work directly drives global impact. Additional info You’re Good At Thriving under pressure with exceptional attention to detail. Staying flexible and reliable in a dynamic and changing environment. Managing multiple tasks with structure and discipline. Handling sensitive data with confidentiality and professionalism. Communicating clearly and professionally, both in writing and speech. Creating meaningful experiences for every customer through exceptional service. Collaborating across cultures and time zones. Boston Consulting Group is an Equal Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E - Verify Employer. Click here for more information on E-Verify.

Posted 1 day ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Established in 2004, OLIVER is the world’s first and only specialist in designing, building, and running bespoke in-house agencies and marketing ecosystems for brands. We partner with over 300 clients in 40+ countries and counting. Our unique model drives creativity and efficiency, allowing us to deliver tailored solutions that resonate deeply with audiences. As a part of The Brandtech Group, we're at the forefront of leveraging cutting-edge AI technology to revolutionise how we create and deliver work. Our AI solutions enhance efficiency, spark creativity, and drive insightful decision-making, empowering our teams to produce innovative and impactful results. Role: GenAI Creative Optimisation Analyst (Senior Data Analyst) Location: Mumbai, India About the role: As the only company to exclusively design, build and run in-house marketing agencies, OLIVER (part of the Inside Ideas Group) holds a unique position in the market. Join us at U-Studio, where we're pioneering the future of generative AI with our brand-new global AI studio function. As a Content Optimisation Analyst, you'll be at the forefront of this innovation, uncovering insights to drive peak performance of our content. Join our dynamic team and be part of shaping the next generation of AI-powered marketing for one of the world's most renowned FMCG brands. What you will be doing: Defining, Measuring & Evaluating Success Set, track, and evaluate KPIs / effectiveness of cross-channel gen AI content (pilots, campaigns, and overall programmes of work). Craft test, learn, and optimise plans to drive continuous content improvements and provide robust learning outcomes. Combine diverse data sources (including generation of metadata) to paint an insightful, actionable picture of content performance, empowering data-driven decision-making. Utilise various measurement techniques to evaluate effectiveness.
Comfortable with various techniques (from last-click attribution to brand lift studies and MMM). Hacking Insight & Performance Analyse performance data and map it to creative recommendations, uncovering insights and opportunities to optimise content. Maximise in-platform audience insights (prompt recommendations), and integrate external sources (e.g. social listening, search, panel data) for improved audience understanding / creative outputs. Design and maintain dashboards for real-time performance tracking, ensuring we're always ahead of the curve. Craft analytics solutions to unlock better creative outputs – including 'out of the box' pre-trained predictive models through to building of bespoke models with data science colleagues as required. Unlocking Potential Explore and assess emerging data analytics capabilities of Gen AI platforms, leading innovation in content analytics / optimisation. Stay at the forefront of AI and data analytics advancements, fuelling our pursuit of excellence and identifying novel approaches to enhance analytics / creative outputs. What you need to be great in this role: AI, Analytics & Strategic Skills Hands on experience with the major LLMs and other providers for analytics, data, and insight work. Possess strong analytics and data-driven experience, driving insights that push boundaries. Demonstrated expertise in setting diverse measurement frameworks for content, working with data across digital media, content, brand, and meta data. Embrace an innovative approach to data and consumer insights, pioneering new paths to success. Leverage interest in applying data to understand audience behaviour. Technical Proficiency Deep understanding of ad account setups, metrics, evaluation and how this drives results. Comfortable aggregating multiple data sources to evaluate effectiveness. Proficiency with tools such as Brandwatch, Global Web Index, Audiense. Excel in data storytelling for non-technical audiences, ensuring insights resonate. 
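Last-click attribution, the simplest of the measurement techniques mentioned above, assigns full conversion credit to the final touchpoint in each user journey. A minimal sketch (channel names and journeys are invented for the example):

```python
# Minimal last-click attribution: the final touchpoint in each converting
# journey receives full credit for the conversion.
def last_click_attribution(journeys):
    """journeys: list of (touchpoint_channels, converted) pairs."""
    credit = {}
    for channels, converted in journeys:
        if converted and channels:
            last = channels[-1]
            credit[last] = credit.get(last, 0) + 1
    return credit

journeys = [
    (["social", "search"], True),   # credit goes to search
    (["search"], True),             # credit goes to search
    (["social", "email"], True),    # credit goes to email
    (["email"], False),             # no conversion, no credit
]
credit = last_click_attribution(journeys)
```

Multi-touch models, brand lift studies and MMM spread credit more fairly across the journey; last-click is the baseline they are compared against.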
Personal Attributes Possess a keen eye for detail and critical thinking, driving impactful decision-making. Comfortable thriving in a fast-paced, changing environment. Has a thirst for continual learning and pushing boundaries, driving innovation at every turn. Embrace a collaborative, open-minded approach, fostering a culture of teamwork and excellence. Req ID: 14180 Our values shape everything we do: Be Ambitious to succeed Be Imaginative to push the boundaries of what’s possible Be Inspirational to do groundbreaking work Be always learning and listening to understand Be Results-focused to exceed expectations Be actively pro-inclusive and anti-racist across our community, clients and creations OLIVER, a part of the Brandtech Group, is an equal opportunity employer committed to creating an inclusive working environment where all employees are encouraged to reach their full potential, and individual differences are valued and respected. All applicants shall be considered for employment without regard to race, ethnicity, religion, gender, sexual orientation, gender identity, age, neurodivergence, disability status, or any other characteristic protected by local laws. OLIVER has set ambitious environmental goals around sustainability, with science-based emissions reduction targets. Collectively, we work towards our mission, embedding sustainability into every department and through every stage of the project lifecycle.

Posted 1 day ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Key Responsibilities Design and develop scalable ETL pipelines using Cloud Functions, Cloud Dataproc (Spark), and BigQuery as the central data warehouse for large-scale batch and transformation workloads. Implement efficient data modeling techniques in BigQuery (including star/snowflake schemas, partitioning, and clustering) to support high-performance analytics and reduce query costs. Build end-to-end ingestion frameworks leveraging Cloud Pub/Sub and Cloud Functions for real-time and event-driven data capture. Use Apache Airflow (Cloud Composer) for orchestration of complex data workflows and dependency management. Apply Cloud Data Fusion and Datastream selectively for integrating specific sources (e.g., databases and legacy systems) into the pipeline. Develop strong backtracking and troubleshooting workflows to quickly identify data issues, job failures, and pipeline bottlenecks, ensuring consistent data delivery and SLA compliance. Integrate robust monitoring, alerting, and logging to ensure data quality, integrity, and observability. Tech stack GCP: BigQuery, Cloud Functions, Cloud Dataproc (Spark), Pub/Sub, Data Fusion, Datastream Orchestration: Apache Airflow (Cloud Composer) Languages: Python, SQL, PySpark Concepts: Data Modeling, ETL/ELT, Streaming & Batch Processing, Schema Management, Monitoring & Logging Some of the most important data sources (ingestion techniques for these must be known): CRM systems (cloud-based and internal), Salesforce, Teradata, MySQL, APIs, and other 3rd-party and internal operational systems. Skills: ETL/ELT, Cloud Data Fusion, schema management, SQL, PySpark, Cloud Dataproc (Spark), monitoring & logging, data modeling, BigQuery, Cloud Pub/Sub, Python, GCP, streaming & batch processing, Datastream, Cloud Functions, Spark, Apache Airflow (Cloud Composer)
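The daily partitioning mentioned above is what lets BigQuery prune scans, and a loader can target a single day's partition with the table$YYYYMMDD partition decorator. A minimal sketch of how a batch loader might group rows by partition (the table name and rows are invented for the example):

```python
from datetime import date

def partition_target(table, event_date):
    """Return the daily partition decorator for a date-partitioned table."""
    return f"{table}${event_date.strftime('%Y%m%d')}"

def route_rows(table, rows):
    """Group rows by their daily partition, as a batch loader might."""
    batches = {}
    for row in rows:
        key = partition_target(table, row["event_date"])
        batches.setdefault(key, []).append(row)
    return batches

rows = [
    {"event_date": date(2024, 1, 15), "user": "a"},
    {"event_date": date(2024, 1, 15), "user": "b"},
    {"event_date": date(2024, 1, 16), "user": "c"},
]
batches = route_rows("analytics.events", rows)
```

Writing to an explicit partition this way keeps backfills idempotent: reloading one day replaces only that day's data.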

Posted 1 day ago

Apply

5.0 years

12 - 25 Lacs

Pune, Maharashtra, India

On-site

Job Title: Sr Software Engineer - Products Location: Pune About Improzo At Improzo (Improve + Zoe; meaning Life in Greek), we believe in improving life by empowering our customers. Founded by seasoned industry leaders, we are laser-focused on delivering quality-led commercial analytical solutions to our clients. Our dedicated team of experts in commercial data, technology, and operations has been evolving and learning together since our inception. Here, you won't find yourself confined to a cubicle; instead, you'll be navigating open waters, collaborating with brilliant minds to shape the future. You will work with leading Life Sciences clients, seasoned leaders and carefully chosen peers like you! People are at the heart of our success, so we have defined our CARE values framework with a lot of effort, and we use it as our guiding light in everything we do. We CARE! Customer-Centric: Client success is our success. Prioritize customer needs and outcomes in every action. Adaptive: Agile and innovative, with a growth mindset. Pursue bold and disruptive avenues that push the boundaries of possibilities. Respect: Deep respect for our clients & colleagues. Foster a culture of collaboration and act with honesty, transparency, and ethical responsibility. Execution: Laser focused on quality-led execution; we deliver! Strive for the highest quality in our services, solutions, and customer experiences. About The Role We are seeking a highly skilled and motivated full-stack Sr. Python Product Engineer to join our team and play a pivotal role in the development of our next-generation Analytics Platform for the Life Sciences industry. This platform, featuring a suite of innovative AI-Apps, helps users solve critical problems across the life sciences value chain, from product launch and brand management to salesforce optimization. As a Senior Engineer, you will be a key contributor, responsible for designing, building, and deploying the core components of the platform.
You will blend your deep expertise in full-stack Python development, data engineering, and AI/ML to create a scalable and impactful product that delivers actionable insights. Key Responsibilities Design and deliver a modern, AI-first analytical applications platform using Python, leveraging frameworks like Django or Flask. Design, develop, test, deploy, and maintain robust, scalable, and efficient software applications using Python. Develop and implement server-side logic, integrating user-facing elements developed by front-end developers. Design and implement data storage solutions, working with various databases (SQL and NoSQL). Develop and integrate APIs (RESTful, GraphQL) and other third-party services. Optimize applications for maximum speed, scalability, and security. Participate in the entire software development life cycle (SDLC), from requirements gathering and analysis to deployment and post-launch support. Conduct code reviews, provide constructive feedback, and mentor junior developers. Troubleshoot, debug, and resolve complex software defects and issues. Build scalable data pipelines and services, integrating technologies like Spark, Kafka, and Databricks/Snowflake, to handle large-scale life sciences datasets from sources like IQVIA and Veeva. Implement and manage CI/CD pipelines using tools like Jenkins or GitLab CI and containerization with Docker and Kubernetes to ensure high-quality and reliable deployments. Collaborate closely with product managers and architects to translate product vision into technical requirements and deliver high-quality, client-centric features. Integrate and operationalize advanced AI/ML models, including generative AI and agents built with CrewAI and LangChain, into the platform to power new applications. Ensure the platform provides robust capabilities for data exploration, analysis, visualization, and reporting, meeting the needs of our users.
Uphold engineering best practices, conduct thorough code reviews, and champion a culture of technical excellence and continuous improvement. Qualifications Bachelor's or Master's degree in Computer Science or a related technical field. 5+ years of hands-on experience in full-stack Python product development, building and scaling complex applications in a product-focused environment. Past experience leveraging Java and .NET is desired. Expert proficiency in Python for backend development, with extensive experience in Django including the ORM, migrations, and the Django REST Framework (DRF). In-depth knowledge and experience with Python core principles, including object-oriented programming (OOP), data structures, and algorithms. Experience with the big-data ecosystem for data processing, analysis, and backend development: e.g., Flask/Django, SQL/NoSQL, Spark, AirByte/Databricks/Snowflake, Kafka, Hadoop, etc. Strong experience with big-data technologies such as Spark, AirByte, Databricks, Snowflake, relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra). Solid experience with front-end technologies like React or Angular. Hands-on experience with cloud-based platforms (AWS preferred), including services for compute, storage, and databases. Proven experience with CI/CD tools (Jenkins, GitLab CI), containerization (Docker, Kubernetes), and logging/monitoring tools (Grafana, Prometheus). Experience with advanced analytics, including integrating AI/ML models into production applications. Experience with testing frameworks (e.g., Pytest, Unittest) and a commitment to writing unit and integration tests. Knowledge of the life sciences and biopharma industry, including commercial datasets and compliance requirements (HIPAA, CCPA), is highly desirable. Excellent problem-solving, communication, and collaboration skills. Attention to detail, with a bias for quality and client centricity.
Ability to work independently and as part of a cross-functional team. Strong leadership, mentoring, and coaching skills. Benefits Competitive salary and benefits package. Opportunity to work on cutting-edge Analytics projects, transforming the life sciences industry Collaborative and supportive work environment. Opportunities for professional development and growth. Skills: sql,restful apis,python,databricks,spark,data engineering,front-end technologies (react, angular),django,product development,kafka,docker,ci/cd (jenkins, gitlab ci),flask,kubernetes,nosql,ai/ml integration,snowflake,aws,graphql
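For candidates gauging the day-to-day work this posting describes, here is a toy sketch of the record validation/normalisation step an ingestion pipeline performs. It is plain Python with entirely hypothetical field names (account_id, product, units); in the role above this logic would run as transformations over Spark DataFrames fed from Kafka, not over Python lists.

```python
import json

# Hypothetical schema: every record must carry these fields to be loaded.
REQUIRED_FIELDS = {"account_id", "product", "units"}

def clean_record(raw):
    """Parse one JSON record; return None if it is malformed or incomplete."""
    try:
        rec = json.loads(raw)
    except json.JSONDecodeError:
        return None
    if not REQUIRED_FIELDS.issubset(rec):
        return None
    rec["units"] = int(rec["units"])  # normalise types at the boundary
    return rec

def run_stage(raw_records):
    """Return (clean records, count of records dropped)."""
    cleaned = [r for r in map(clean_record, raw_records) if r is not None]
    return cleaned, len(raw_records) - len(cleaned)

batch = [
    '{"account_id": "A1", "product": "x", "units": "3"}',
    "not json",
    '{"account_id": "A2", "product": "y"}',  # missing "units"
]
good, dropped = run_stage(batch)
print(len(good), dropped)  # 1 2
```

The design point worth noticing is that validation happens once, at the boundary, so every downstream stage can assume typed, complete records.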

Posted 1 day ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Skills: Database programming - SQL/PL-SQL/T-SQL ETL - Data pipeline, data preparation Analytics - BI Tool Roles & Responsibilities • Implement some of the world's largest big data analytics projects using the Kyvos platform • Preparation of data for BI modeling using Spark, Hive, SQL and other ETL/ELT tools • OLAP data modelling • Tuning of models for fast, sub-second query performance from business intelligence tools • Communicate with customer stakeholders for busin
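The sub-second query performance this posting mentions typically comes from pre-aggregation: rolling a large fact table up into small summary tables once, so dashboards never scan raw rows. A toy illustration using the stdlib sqlite3 module (the schema and values are hypothetical; an OLAP engine like Kyvos applies the same idea across many dimensions at far larger scale):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales_fact (region TEXT, product TEXT, amount REAL);
    INSERT INTO sales_fact VALUES
        ('APAC', 'widget', 10.0),
        ('APAC', 'widget', 15.0),
        ('EMEA', 'gadget', 7.5);

    -- Pre-computed aggregate for region-level dashboards: built once,
    -- queried many times.
    CREATE TABLE sales_by_region AS
        SELECT region, SUM(amount) AS total, COUNT(*) AS n
        FROM sales_fact GROUP BY region;
""")

# BI queries hit the small rollup instead of the fact table.
rows = conn.execute(
    "SELECT region, total FROM sales_by_region ORDER BY region"
).fetchall()
print(rows)  # [('APAC', 25.0), ('EMEA', 7.5)]
```

The tuning work in the role is largely deciding which such rollups to materialise so the common dashboard queries are answered from aggregates.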

Posted 1 day ago

Apply

2.5 years

0 Lacs

Mumbai Metropolitan Region

On-site

JD - Senior Influencer Marketing Executive About Slidein Media We are a leading influencer marketing firm. At our agency, marketing isn't just a job—it's an art form. We’re all about creating next-level campaigns that turn heads, spark conversations, and break through the noise. From partnering with top-tier influencers to collaborating with innovative brands, we’re in the business of building brands that people actually care about. Job Summary The Senior Influencer Marketing role is responsible for planning, implementing, and managing influencer marketing strategies to enhance brand awareness, engage with target audiences, and drive business results. This role involves identifying and building relationships with influencers, creating and executing campaigns, analysing performance metrics, and providing exceptional client servicing. This includes handling client details, briefing clients and influencers on campaign progress, and ensuring client satisfaction. Roles and responsibilities Identify and build relationships with relevant influencers across various niches. Plan, execute, and manage influencer marketing campaigns, ensuring alignment with client goals. Handle client details, providing regular updates and detailed campaign reports. Maintain strong, long-term relationships with clients and influencers. Monitor campaign deliverables, timelines, brand briefs and budgets for successful execution. Negotiate compensation and terms with influencers for cost-effectiveness. Stay informed about industry trends and identify new influencer partnership opportunities. Ensure client satisfaction through exceptional communication and service.
Experience - 2.5+ years Location - Mumbai (Malad West) Interested candidates can share their resume at priyanka.kundaikar@slideinmedia.com / connect@slideinmedia.com If you love turning creative ideas into viral sensations, managing projects with ninja-level precision, and working with a team that’s as passionate as you are about driving results—this is the place for you. We're all about timelines, budgets, and hitting the ground running (but we promise, it never gets boring).

Posted 1 day ago

Apply

2.0 - 4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo. About the role: As a Data Services Analyst II you will report to a Data Services Manager and will be responsible for analyzing business information from the perspective of marketing and sales professionals in order to ensure that ZoomInfo continues to deliver the highest quality data products to our customers. Demonstrate the value of ZoomInfo data through problem-solving, knowledge sharing and data deliverables. We are looking for a data whizz who can effectively communicate, solve complex data problems, and possess a strong understanding of the value of our data. What You'll Do: Data Analysis Apply quantitative analysis and data visualization to tell the story behind the numbers, all while supporting data-driven decision making. Use technical skills, problem solving and business knowledge to deliver custom datasets to clients that meet or exceed their expectations. Implement proactive improvements to processes and methods for gathering and aggregating data. Find creative solutions to problems when limited information is available. Business Operations Understand all aspects of ZoomInfo data including all of our applications and tools. Create and maintain documentation on internal and client-facing business processes. Drive internal process improvement to better serve client needs. Identify opportunities to reduce manual tasks through automation and create operational efficiencies. Client Management Define business requirements and document rules and logic for use in client implementations. Ability to understand and solve qualitative problems and present or explain solutions to an audience using top-quality, audience-appropriate communication.
Enable clients to maximize the benefits of their ZoomInfo partnership through best practices, innovative thinking and process improvement. What You Bring: Experience: The ideal candidate will have 2-4 years of experience in a technology setting. Education: A Bachelor's degree in a quantitative/analytical field (Mathematics, Statistics, Engineering, Computer Science, Economics). Shift: Night Shift (5PM IST to 2AM IST / 7PM IST to 4AM IST). Mandatory skills: Expert in SQL, Python, Microsoft Excel (formulas, pivot tables) and data analysis/visualization tools. Preferred: Tableau, Spark, Snowflake or similar technologies and tools. Must have a proven track record in technology delivery, process improvement, data governance and/or client services. Proven ability to work and interact in a fast-paced environment and strong multitasking, organizational and time management skills. Highly resourceful with a go-getter attitude. Highly organized with careful attention to detail. Excellent communication skills. About us: ZoomInfo (NASDAQ: GTM) is the Go-To-Market Intelligence Platform that empowers businesses to grow faster with AI-ready insights, trusted data, and advanced automation. Its solutions provide more than 35,000 companies worldwide with a complete view of their customers, making every seller their best seller. ZoomInfo may use a software-based assessment as part of the recruitment process. More information about this tool, including the results of the most recent bias audit, is available here. ZoomInfo is proud to be an equal opportunity employer, hiring based on qualifications, merit, and business needs, and does not discriminate based on protected status.
We welcome all applicants and are committed to providing equal employment opportunities regardless of sex, race, age, color, national origin, sexual orientation, gender identity, marital status, disability status, religion, protected military or veteran status, medical condition, or any other characteristic protected by applicable law. We also consider qualified candidates with criminal histories in accordance with legal requirements. For Massachusetts Applicants: It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability. ZoomInfo does not administer lie detector tests to applicants in any location.

Posted 1 day ago

Apply

12.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Visa is a world leader in payments and technology, with over 259 billion payment transactions flowing safely between consumers, merchants, financial institutions, and government entities in more than 200 countries and territories each year. Our mission is to connect the world through the most innovative, convenient, reliable, and secure payments network, enabling individuals, businesses, and economies to thrive while driven by a common purpose – to uplift everyone, everywhere by being the best way to pay and be paid. Make an impact with a purpose-driven industry leader. Join us today and experience Life at Visa. Job Description Team Summary Visa Consulting and Analytics (VCA) drives tangible, impactful and financial results for Visa’s network clients, including both financial services and merchants. Drawing on our expertise in strategy consulting, data analytics, brand management, marketing, operations and macroeconomics, Visa Consulting and Analytics solves the most strategic problems for our clients. The India & South Asia (INSA) Consulting Market team within Visa Consulting & Analytics provides consulting and solution services for Visa’s largest issuers in India, Sri Lanka, Bangladesh, Nepal, Bhutan & Maldives. We apply deep expertise in the payments industry to provide solutions to assist clients with their key business priorities, drive growth and improve profitability. The VCA team provides a comprehensive range of consulting services to deliver solutions that address unique challenges in areas such as improving profitability, strategic growth, customer experience, digital payments and managing risk. The individual will be part of the VCA Data Science geographic team cluster of India and South Asia (INSA) markets and will be responsible for sales and delivery of data science and analytics based solutions to Visa clients.
What the Director Data Science, Visa Consulting & Analytics does at Visa: The Director, Data Science at Visa Consulting & Analytics (VCA) blends technical expertise with business acumen to deliver impactful, data-driven solutions to Visa’s clients, shaping the future of payments through analytics and innovation. This role combines hands-on modeling with strategic leadership, leading the adoption of Generative AI (Gen AI) and Agentic AI into Visa’s offerings. This is an onsite role based out of Mumbai. The role will require travel. Key Responsibilities Commercial Acumen/Business Development Collaborate with internal and external clients to comprehend their strategic business inquiries, leading project scoping and design to effectively address those questions by leveraging Visa's data. Drive revenue outcomes for VCA, particularly focusing on data science offerings such as ML model solutions, data collaboration, and managed service verticals within data science. Technical Leadership Design, develop, and implement advanced analytics and machine learning models to solve complex business challenges for Visa’s clients, leveraging VisaNet data as well as client data. Drive the integration and adoption of Gen AI and Agentic AI technologies within Visa’s data science offerings. Ensure the quality, performance, and scalability of data-driven solutions. Strategic Business Impact Translate client needs and business challenges into actionable data science projects that deliver measurable value. Collaborate with cross-functional teams including Consulting, Sales, Product, and Data Engineering to align analytics solutions with business objectives. Present insights and recommendations to both technical and non-technical stakeholders. Team Leadership & Development Mentor and manage a team of data scientists and analysts, fostering a culture of innovation, collaboration, and continuous learning.
Set priorities, provide technical direction, and oversee the end-to-end delivery of analytics projects. Innovation & Best Practices Stay abreast of emerging trends in AI and data science, particularly in Gen AI and Agentic AI. Champion the adoption of new methodologies and tools to enhance Visa’s analytics capabilities and value to clients. Represent VCA as a thought leader in internal and external forums. This is a hybrid position. Expectation of days in office will be confirmed by your Hiring Manager. Qualifications Basic Qualifications: • Advanced degree (MS/PhD) in Computer Science, Statistics, Mathematics, Engineering, or a related field from a Tier-1 institute, e.g. IIT, ISI, DSE, IISc, etc. • 12+ years of experience in data science, analytics, or related fields, including 3+ years in a leadership/management role. • Proven track record of building and leading high-performing data science teams. • Expertise in statistical analysis, machine learning, data mining, and predictive modeling. • Proficiency in programming languages such as Python, R, or Scala, and experience with ML frameworks (e.g., scikit-learn, TensorFlow, PyTorch). • Excellent communication, presentation, and stakeholder management skills. Preferred Qualifications: • Exposure/prior work experience in the payments and/or banking industry • Experience in the consulting space or a matrix team structure • Familiarity with cloud platforms (AWS, Azure, GCP) and big data technologies (Spark, Hadoop). • Publication or conference experience in the data science/AI community. Additional Information Visa is an EEO Employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status. Visa will also consider for employment qualified applicants with criminal histories in a manner consistent with EEOC guidelines and applicable local law.

Posted 1 day ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Overview We are seeking a Platform Architect with expertise in Informatica PowerCenter and Informatica Intelligent Cloud Services (IICS) to design, implement, and optimize enterprise-level data integration platforms. The ideal candidate will have a strong background in ETL/ELT architecture, cloud data integration, and platform modernization, ensuring scalability, security, and performance across on-prem and cloud environments. Responsibilities Platform Engineering & Administration Oversee installation, configuration, and optimization of PowerCenter and IICS environments. Manage platform scalability, performance tuning, and troubleshooting. Implement data governance, security, and compliance (e.g., role-based access, encryption, data lineage tracking). Optimize connectivity and integrations with various sources (databases, APIs, cloud storage, SaaS apps). Cloud & Modernization Initiatives Architect and implement IICS-based data pipelines for real-time and batch processing. Migrate existing PowerCenter workflows to IICS, leveraging serverless and cloud-native features. Ensure seamless integration with cloud platforms (AWS, Azure, GCP) and modern data lakes/warehouses (Snowflake, Redshift, BigQuery). Qualifications 4 years of experience in data integration and ETL/ELT architecture. Expert-level knowledge of Informatica PowerCenter and IICS (Cloud Data Integration, API & Application Integration, Data Quality). Hands-on experience with cloud platforms (AWS, Azure, GCP) and modern data platforms (Snowflake, Databricks, Redshift, BigQuery). Strong SQL, database tuning, and performance optimization skills. Deep understanding of data governance, security, and compliance best practices. Experience in automation, DevOps (CI/CD), and Infrastructure-as-Code (IaC) tools for data platforms. Excellent communication, leadership, and stakeholder management skills. Preferred Qualifications Informatica certifications (IICS, PowerCenter, Data Governance). 
Proficiency in PowerCenter to IDMC conversions. Understanding of real-time streaming (Kafka, Spark Streaming). Knowledge of API-based integration and event-driven architectures. Familiarity with Machine Learning and AI-driven data processing.

Posted 1 day ago

Apply

0.0 - 1.0 years

0 Lacs

India

On-site

A few weeks ago, an event organizer stumbled upon our Email Campaigns tool. They set up a campaign, hit send… and sold out their event in 48 hours. No support. No onboarding. No playbook. And that’s when we realized something big: 🧠 The product works. Now it’s time to tell its story to the world. 🚀 So now we’re hiring a Product Marketing Intern at AllEvents. And not just any intern — someone crazy enough to turn features into the talk of the town. We’re looking for: Marketing Intern with Tech Knowledge Experience: 0-1 year A storyteller who loves simplifying the complex A growth-hacker mindset who’s hungry to experiment A builder who wants to own features beyond the launch As a Product Marketing Ninja, you’ll: - Craft compelling product narratives that spark action - Design onboarding experiences and in-app communication - Run experiments across landing pages, email flows, and campaigns - Collaborate directly with the product and marketing team to shape how our tools are experienced 🎓 This is a perfect fit for someone who: - Thinks like a builder, writes like a storyteller - Loves working across teams and turning ideas into experiments - Wants to make a real impact — not just learn, but ship

Posted 1 day ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

What you’ll do? Design, develop, and operate high-scale applications across the full engineering stack. Design, develop, test, deploy, maintain, and improve software. Apply modern software development practices (serverless computing, microservices architecture, CI/CD, infrastructure-as-code, etc.). Work across teams to integrate our systems with existing internal systems, Data Fabric, CSA Toolset. Participate in technology roadmap and architecture discussions to turn business requirements and vision into reality. Participate in a tight-knit, globally distributed engineering team. Triage product or system issues and debug/track/resolve by analyzing the sources of issues and the impact on network, or service operations and quality. Research, create, and develop software applications to extend and improve on Equifax Solutions. Manage sole project priorities, deadlines, and deliverables. Collaborate on scalability issues involving access to data and information. Actively participate in Sprint planning, Sprint Retrospectives, and other team activities. What experience you need? Bachelor's degree or equivalent experience 5+ years of software engineering experience 5+ years experience writing, debugging, and troubleshooting code in Java & SQL 2+ years experience with Cloud technology: GCP, AWS, or Azure 2+ years experience designing and developing cloud-native solutions 2+ years experience designing and developing microservices using Java, SpringBoot, GCP SDKs, GKE/Kubernetes 3+ years experience deploying and releasing software using Jenkins CI/CD pipelines, understanding infrastructure-as-code concepts, Helm Charts, and Terraform constructs What could set you apart? Knowledge or experience with Apache Beam for stream and batch data processing. Familiarity with big data tools and technologies like Apache Kafka, Hadoop, or Spark. Experience with containerization and orchestration tools (e.g., Docker, Kubernetes). Exposure to data visualization tools or platforms.
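The Apache Beam experience this posting calls out centers on windowed aggregation over stream or batch input. Here is a toy sketch of fixed-window counting in plain Python, with hypothetical events and a hypothetical 60-second window; in Beam the same grouping would be expressed declaratively (roughly `beam.WindowInto(FixedWindows(60))` over a PCollection) and would also handle late data and triggers, which this sketch ignores.

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # hypothetical fixed window size

def window_counts(events):
    """Count events per (key, fixed 60s window) pair.

    `events` is an iterable of (timestamp_seconds, key) tuples.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Assign each event to the window containing its timestamp.
        window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
        counts[(key, window_start)] += 1
    return dict(counts)

events = [(3, "login"), (45, "login"), (61, "login"), (70, "purchase")]
result = window_counts(events)
print(result)
# {('login', 0): 2, ('login', 60): 1, ('purchase', 60): 1}
```

The same function works whether `events` is a finite list (batch) or a generator over a feed (stream), which is the unification Beam's model formalises.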

Posted 1 day ago

Apply

5.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Apache Spark Good to have skills : NA Minimum 5 Year(s) Of Experience Is Required Educational Qualification : 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities: - Expected to be an SME. - Collaborate and manage the team to perform. - Responsible for team decisions. - Engage with multiple teams and contribute on key decisions. - Provide solutions to problems for their immediate team and across multiple teams. - Mentor junior team members to enhance their skills and knowledge in data engineering. Professional & Technical Skills: - Must To Have Skills: Proficiency in Apache Spark. - Strong understanding of data pipeline architecture and design. - Experience with ETL processes and data integration techniques. - Familiarity with data warehousing concepts and technologies. - Knowledge of data quality frameworks and best practices. Additional Information: - The candidate should have minimum 7.5 years of experience in Apache Spark. - This position is based in Chennai. - A 15 years full time education is required.
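Since the role above is defined around the extract, transform, load cycle, here is a minimal sketch of that cycle in plain Python. The CSV source, column names, and the data-quality rule (drop rows with a missing amount) are all hypothetical; at the scale described in the posting, each step would be an Apache Spark DataFrame operation rather than a list comprehension.

```python
import csv
import io

RAW_CSV = """id,amount,currency
1,100.50,USD
2,,USD
3,80.25,EUR
"""

def extract(text):
    """Extract: read raw rows from the source (here, an in-memory CSV)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: enforce a quality rule and cast to typed records."""
    out = []
    for r in rows:
        if r["amount"]:  # drop rows with a missing amount
            out.append({"id": int(r["id"]),
                        "amount": float(r["amount"]),
                        "currency": r["currency"]})
    return out

def load(rows, target):
    """Load: write clean records to the target (a stand-in for a warehouse)."""
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract(RAW_CSV)), warehouse)
print(loaded)  # 2 rows pass the quality check
```

Keeping the three stages as separate functions mirrors how production pipelines stay testable: the quality rule in `transform` can be unit-tested without touching the source or the warehouse.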

Posted 1 day ago

Apply

175.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you’ll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. Join Team Amex and let’s lead the way together. Business Overview: The Credit and Fraud Risk (CFR) team helps drive profitable business growth by reducing the risk of fraud and maintaining the industry’s lowest credit loss rates. It applies an array of tools and ever-evolving technology to detect and combat fraud, minimize the disruption of good spending and provide a world-class customer experience. The team leads efforts that leverage data and digital advancements to improve risk management as well as enable commerce and bring innovation. A single decision can have many outcomes. And when that decision affects millions of cardmembers and merchants, it needs to be the right one. That’s where the AiDa Product team comes in, which is part of the Credit & Fraud Risk (CFR) Global Data Science (GDS) CoE. The product specializes in powering a seamless unified experience for its AI/ML users and responsibly leveraging enterprise data assets to support critical decisions for the company. As part of this team, you’ll have the opportunity to work with some of the best product owners and managers in the industry. You will solve real-world business problems while getting exposure to the industry’s top leaders in AI/ML product management, decision science and technology.
If you are passionate about getting to know all areas of our business and can translate business needs into remarkable solutions that can impact millions, you should consider a career in Product teams. Job Responsibilities · Contribute to defining and articulating the long-term AI product strategy and roadmap with clearly defined business metrics and outcomes. · Solve complicated business problems through prioritization and ownership of products and solutions to meet business objectives. · Prioritize and manage product backlogs, balancing the requirements of partners and stakeholders. Evaluate prospective features in the AI Products pipeline against changing requirements in the direction of AI adoption. · Contribute to all product lifecycle processes including market (external) research, competitive analysis, planning, positioning, roadmap development, requirements finalization and product development. · Translate the product roadmap into well-defined requirements and acceptance test criteria. · Drive end-to-end ML/AI product development with a team of engineers and designers. Transform MVPs into production-grade capabilities in collaboration with engineering teams. · Contribute to the ideation and launch of innovative ML/AI products and capabilities. Innovate ways to evangelize the product to drive Amex-wide user adoption. · (For Learn): Curate and deliver technical training in AI, Cloud, Hive and Spark for beginner to advanced level users. · Create POCs for best-in-class, innovative AI/ML products with the potential to scale. Qualifications and Skills Required: · Strong quantitative, analytical, and structured problem-solving skills.
· Strong technical background in AI/ML, with experience in Python, SQL, data analytics and data visualization · Familiarity with the ML model development lifecycle (MDLC): feature selection and engineering, different ML model algorithm families (decision trees, boosting algorithms), optimization considerations for ML models, deployment and serving challenges · Knowledge of Google Cloud Platform (GCP), BigQuery, and GCP AI/ML capabilities such as Vertex AI. · Knowledge of big data platforms such as Hadoop and PySpark. · Knowledge of designing and building big data tools and frameworks · Demonstrated creativity and self-sufficiency, along with strong interpersonal/collaborative skills and experience working in global teams · Understanding of the various ML model deployment systems and processes, with basic knowledge of model regulatory and governance policies. · Ability to prioritize well, communicate clearly and compellingly, and understand how to drive a high level of focus and excellence with strong, talented, opinionated engineering, UX and QA teams · Knowledge of notebook-based IDEs for AI/ML tasks such as Jupyter, and of workflow tools such as Airflow. · Familiarity with product management tools such as Rally, JIRA, and Confluence · Excellent verbal and written communication skills · Undergraduate/Master’s degree in Computer Science / Information Technology / Mathematics from institutes of global repute. Primary Job Location: Gurugram Hybrid – depending on business requirements We back you with benefits that support your holistic well-being so you can be and deliver your best.
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 1 day ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies