
9426 Spark Jobs - Page 6

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Req ID: 327296. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP Solution Architect to join our team in Noida, Uttar Pradesh (IN-UP), India (IN).

Job Description:

Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 8+ years
Total experience: 8+ years
Must have GCP Solution Architect Certification & GKE

Mandatory Skills (Technical Qualification/Knowledge): Expertise in assessing, designing and implementing GCP solutions, covering compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc. Must have GCP Solution Architect Certification. Should have prior experience executing large, complex cloud transformation programs, including discovery, assessment, business case creation, design, build, migration planning and migration execution. Should have prior experience using industry-leading or native discovery, assessment and migration tools. Good knowledge of cloud technology, different patterns, deployment methods and application compatibility. Good knowledge of GCP technologies and associated components and variations: Anthos Application Platform; Compute Engine, Compute Engine Managed Instance Groups, Kubernetes; Cloud Storage, Cloud Storage for Firebase, Persistent Disk, Local SSD, Filestore, Transfer Service; Virtual Private Cloud (VPC), Cloud DNS, Cloud Interconnect, Cloud VPN Gateway, Network Load Balancing, global load balancing, firewall rules, Cloud Armor; Cloud IAM, Resource Manager, multi-factor authentication, Cloud KMS; Cloud Billing, Cloud Console, Stackdriver; Cloud SQL, Cloud Spanner, Cloud Bigtable; Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP.

Solid understanding of and experience in cloud-computing services architecture, technical design and implementation, including IaaS, PaaS, and SaaS. Design of clients' cloud environments with a focus mainly on GCP, demonstrating technical cloud architectural knowledge. Play a vital role in the design of production, staging, QA and development cloud infrastructures running in 24x7 environments. Deliver customer cloud strategies aligned with the customer's business objectives, with a focus on cloud migrations and DR strategies. Nurture cloud computing expertise internally and externally to drive cloud adoption. Should have a deep understanding of the IaaS and PaaS services offered on cloud platforms and understand how to use them together to build complex solutions. Ensure that all cloud solutions follow security and compliance controls, including data sovereignty. Deliver cloud platform architecture documents detailing the vision for how GCP infrastructure and platform services support the overall application architecture, interacting with application, database and testing teams to provide a holistic view to the customer.

Collaborate with application architects and DevOps to modernize infrastructure-as-a-service (IaaS) applications to platform as a service (PaaS). Create solutions that support a DevOps approach for delivery and operation of services. Interact with and advise business representatives of the application regarding functional and non-functional requirements. Create proofs of concept to demonstrate the viability of solutions under consideration. Develop enterprise-level conceptual solutions and sponsor consensus/approval for global applications. Have a working knowledge of other architecture disciplines, including application, database, infrastructure, and enterprise architecture. Identify and implement best practices, tools and standards. Provide consultative support to the DevOps team for production incidents. Drive and support system reliability, availability, scale, and performance activities. Evangelize cloud automation and be a thought leader and expert defining standards for building and maintaining cloud platforms. Knowledgeable about configuration management tools such as Chef, Puppet or Ansible. Automation skills using CLI scripting in any language (Bash, Perl, Python, Ruby, etc.). Ability to develop a robust design to meet customer business requirements with scalability, availability, performance and cost-effectiveness using GCP offerings. Ability to identify and gather requirements to define an architectural solution that can be successfully built and operated on GCP. Ability to conclude high-level and low-level design for the GCP platform, which may also include data center design as necessary. Ability to provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project. Understanding of the significance of the different metrics for monitoring and their threshold values, with the ability to take necessary corrective measures based on those thresholds. Knowledge of automation to reduce the number of incidents, or repetitive incidents, is preferred. Good knowledge of cloud center operations, monitoring tools and backup solutions.

GKE: Set up monitoring and logging to troubleshoot a cluster or debug a containerized application. Manage Kubernetes objects using the declarative and imperative paradigms for interacting with the Kubernetes API. Manage confidential settings data using Secrets. Configure load balancing, port forwarding, or firewall and DNS configurations to access applications in a cluster. Configure networking for your cluster. Hands-on experience with Terraform and the ability to write reusable Terraform modules. Hands-on Python and Unix shell scripting is required. Understanding of CI/CD pipelines in a globally distributed environment using Git, Artifactory, Jenkins and a Docker registry. Experience with GCP services and writing Cloud Functions. Hands-on experience deploying and managing Kubernetes infrastructure with Terraform Enterprise. Certified Kubernetes Administrator (CKA) and/or Certified Kubernetes Application Developer (CKAD) is a plus. Experience using Docker within container orchestration platforms such as GKE. Knowledge of setting up Splunk. Knowledge of Spark in GKE.

Certification: GCP Solution Architect & GKE

Process/Quality Knowledge: Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired. Knowledge of quality and security processes.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
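
To illustrate the "Managing Secrets" requirement in the GKE section above, here is a minimal sketch using the official Kubernetes Python client. It assumes kubeconfig access to a cluster; the Secret name and value are hypothetical.

import base64
from kubernetes import client, config

config.load_kube_config()  # assumes a kubeconfig pointing at the target GKE cluster
v1 = client.CoreV1Api()

# Create an Opaque Secret holding a hypothetical database password.
secret = client.V1Secret(
    metadata=client.V1ObjectMeta(name="app-db-credentials"),
    string_data={"DB_PASSWORD": "example-only"},  # placeholder value
    type="Opaque",
)
v1.create_namespaced_secret(namespace="default", body=secret)

# Read it back; the API returns data values base64-encoded.
stored = v1.read_namespaced_secret("app-db-credentials", "default")
print(base64.b64decode(stored.data["DB_PASSWORD"]).decode())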

Posted 13 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


We are hiring for one of the IT Big 4 consulting firms.

Designation: Associate/Associate Consultant
Location: Chennai/Gurgaon/Pune

Skills required:
AWS (big data services): S3, Glue, Athena, EMR
Programming: Python, Spark, SQL, MuleSoft, Talend, dbt
Data warehouse: ETL, Redshift/Snowflake

Key Responsibilities: Work with business stakeholders to understand their business needs. Create data pipelines that extract, transform, and load (ETL) data from various sources into a usable format in a data warehouse. Clean, filter, and validate data to ensure it meets quality and format standards. Develop data model objects (tables, views) to transform the data into a unified format for downstream consumption. Monitor, control, configure, and maintain processes in the cloud data platform. Optimize data pipelines and data storage for performance and efficiency. Participate in code reviews and provide meaningful feedback to other team members. Provide technical support and troubleshoot issues.

Qualifications: Bachelor's degree in computer science, information technology, or a related field, or equivalent work experience. Experience working on the AWS cloud platform. Data engineer with expertise in developing big data and data warehouse platforms. Experience working with structured and semi-structured data. Expertise in developing big data solutions and ETL/ELT pipelines for data ingestion, data transformation, and optimization techniques. Experience working directly with technical and business teams. Able to create technical documentation. Excellent problem-solving and analytical skills. Strong communication and collaboration abilities.

Skillset (good to have): Experience in data modeling. AWS Data Engineer certification. Experience with ITSM processes/tools such as ServiceNow and Jira. Understanding of Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow.
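
To make the pipeline expectations concrete, here is a minimal PySpark ETL sketch of the extract-transform-load flow described above; the S3 paths, column names, and quality rules are hypothetical.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: raw CSV landed in S3 (bucket and paths are hypothetical).
raw = spark.read.option("header", True).csv("s3://example-raw/orders/")

# Transform: type the columns and drop rows failing basic quality rules.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .filter(F.col("order_id").isNotNull() & (F.col("amount") > 0))
)

# Load: write a partitioned Parquet table for Athena or Redshift Spectrum to query.
clean.write.mode("overwrite").partitionBy("order_date").parquet("s3://example-curated/orders/")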

Posted 13 hours ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Proficiency in AI tools used to prepare and automate data pipelines and ingestion:
Apache Spark, especially with MLlib
PySpark and Dask for distributed data processing
Pandas and NumPy for local data wrangling
Apache Airflow to schedule and orchestrate ETL/ELT jobs
Google Cloud (BigQuery, Vertex AI)
Python (the most popular language for AI and data tasks)
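
Since Airflow scheduling and pandas wrangling are both called out, a minimal sketch of an Airflow DAG wrapping a pandas cleaning step may help; the DAG id, file paths, and columns are hypothetical.

from datetime import datetime
import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

def wrangle():
    # Hypothetical local wrangling step: clean a CSV with pandas.
    df = pd.read_csv("/tmp/raw_events.csv")
    df = df.dropna(subset=["user_id"]).drop_duplicates()
    df.to_parquet("/tmp/clean_events.parquet")

with DAG(
    dag_id="daily_ingest",           # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="wrangle_events", python_callable=wrangle)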

Posted 13 hours ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Job Title: SDET
Experience: 4 to 8 years
Location: Bangalore
Skills: big data platform (Spark, Athena, Redshift, EMR), CI/CD pipelines, database testing, Java, QA automation (Selenium), SQL queries, EMR, JMeter, Kafka, Kinesis

Project Highlights: The project involves ensuring the quality and reliability of large-scale data systems through the development and automation of comprehensive test frameworks. You will design, automate, and execute tests for large-scale data applications, leveraging tools like Selenium, Great Expectations, and cloud platforms.

Roles and Responsibilities: Develop and maintain automated test frameworks using Selenium, Java, and data quality tools. Automate testing for data platforms (Spark, Athena, Redshift, EMR). Collaborate with development teams to resolve quality issues. Integrate automated tests into CI/CD pipelines. Conduct code reviews and provide feedback on best practices.

Requirements: 4-8 years in software testing with 2+ years in a senior role. Expertise in Selenium, Java, and data validation tools (Great Expectations). Experience with big data platforms (Spark, Athena, Redshift, AWS). Knowledge of SQL and database testing. Familiarity with CI/CD tools (Jenkins, GitLab CI).

Skills: amazon redshift, ci/cd pipelines, big data platform, java, emr, redshift, sql queries, sql, database testing, kafka, selenium, big data, jmeter, athena, ci/cd, kinesis, qa automation, spark
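
For a flavor of the automated data-quality testing this role centers on, here is a minimal pytest sketch; it uses an in-memory SQLite table as a stand-in for the real warehouse (Athena/Redshift in the stack above), and the table and rules are hypothetical.

import sqlite3
import pytest

@pytest.fixture
def conn():
    # Stand-in for the warehouse connection used by the real test suite.
    c = sqlite3.connect(":memory:")
    c.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
    c.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])
    return c

def test_orders_has_no_null_keys(conn):
    nulls = conn.execute("SELECT COUNT(*) FROM orders WHERE order_id IS NULL").fetchone()[0]
    assert nulls == 0

def test_orders_amounts_positive(conn):
    bad = conn.execute("SELECT COUNT(*) FROM orders WHERE amount <= 0").fetchone()[0]
    assert bad == 0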

Posted 13 hours ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


🚀 Copywriter – Words That Move People (and Metrics)

We’re looking for a copywriter who can do more than just write — someone who can think, create, and convert. If you're obsessed with words, ideas, headlines, scroll-stopping content, and performance-backed storytelling, this one’s for you. As our Copywriter, you won’t just write copy — you’ll build narratives that elevate the brand, spark conversations, and drive results across all things marketing. From punchy social posts to full-funnel ad campaigns — you’ll be the voice behind the voice.

What you’ll do: Develop scroll-stopping, thumb-pausing creative ideas that align with our brand tone (and break the internet occasionally). Write copy that sells, tells, compels — across ad campaigns, social, website, emails, WhatsApp, video scripts, and more. Translate briefs into bold narratives and performance-friendly copy. Dive into audience mindsets, brand insights, and cultural moments to craft messages that actually matter. Collaborate with designers, marketers, and growth teams to bring campaigns to life. Edit, tweak, and polish until every word is tight and every message lands. Keep testing — what works, what doesn’t, and why — and double down on what drives results.

What you bring: A Bachelor’s degree in English, Journalism, Communications, Marketing, or anything that sharpened your pen. 2+ years of experience slinging copy in fast-moving setups (EdTech or FinTech? Big plus!). A portfolio packed with smart, compelling work across digital formats. A sixth sense for audience behavior and how to speak their language. Strong command over tone, structure, and storytelling. Comfortable juggling creativity with conversion goals. Bonus: understanding of SEO, performance marketing, and how to write for both humans and algorithms.

If you think brand-first and write conversion-friendly — let’s talk. Bring your boldest ideas, your sharpest lines, and let’s make some noise.

Posted 14 hours ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles And Responsibilities: Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack. Provide forward-thinking solutions in the data engineering and analytics space. Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix them. Work with the business to understand reporting-layer needs and develop data models to fulfill them. Help junior team members resolve issues and technical challenges. Drive technical discussions with the client architect and team members. Orchestrate the data pipelines in a scheduler via Airflow.

Skills And Qualifications: Bachelor's and/or master's degree in computer science or equivalent experience. Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of Data Management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture. Should have hands-on experience in SQL, Python and Spark (PySpark). Candidates must have experience in the AWS/Azure stack. Desirable to have ETL with batch and streaming (Kinesis). Experience in building ETL/data warehouse transformation processes. Experience with Apache Kafka for streaming/event-based data. Experience with other open-source big data products, e.g. Hadoop (incl. Hive, Pig, Impala). Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j). Experience working with structured and unstructured data, including imaging and geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects. Should have experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with high attention to detail.

Mandatory Skills: Python/PySpark/Spark with Azure/AWS Databricks

Skills: neo4j, pig, mongodb, pl/sql, architect, terraform, hadoop, pyspark, impala, apache kafka, adfs, etl, data warehouse, spark, azure, databricks, rdbms, cassandra, aws, unix shell scripting, circleci, python, azure synapse, hive, git, kinesis, sql
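
As an illustration of the Databricks/Delta Lake work described, here is a minimal PySpark sketch of a bronze-to-silver refinement step; the mount paths and columns are hypothetical, and on Databricks the `spark` session already exists.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks this session is predefined

# Hypothetical bronze-to-silver hop in a Delta Lake medallion layout.
bronze = spark.read.format("delta").load("/mnt/lake/bronze/orders")
silver = (
    bronze.filter(F.col("order_id").isNotNull())
          .withColumn("ingest_date", F.current_date())
)
silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/orders")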

Posted 14 hours ago

Apply

3.0 years

0 Lacs

Greater Kolkata Area

On-site


Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles And Responsibilities: Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack. Provide forward-thinking solutions in the data engineering and analytics space. Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix them. Work with the business to understand reporting-layer needs and develop data models to fulfill them. Help junior team members resolve issues and technical challenges. Drive technical discussions with the client architect and team members. Orchestrate the data pipelines in a scheduler via Airflow.

Skills And Qualifications: Bachelor's and/or master's degree in computer science or equivalent experience. Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of Data Management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture. Should have hands-on experience in SQL, Python and Spark (PySpark). Candidates must have experience in the AWS/Azure stack. Desirable to have ETL with batch and streaming (Kinesis). Experience in building ETL/data warehouse transformation processes. Experience with Apache Kafka for streaming/event-based data. Experience with other open-source big data products, e.g. Hadoop (incl. Hive, Pig, Impala). Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j). Experience working with structured and unstructured data, including imaging and geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects. Should have experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with high attention to detail.

Mandatory Skills: Python/PySpark/Spark with Azure/AWS Databricks

Skills: neo4j, pig, mongodb, pl/sql, architect, terraform, hadoop, pyspark, impala, apache kafka, adfs, etl, data warehouse, spark, azure, databricks, rdbms, cassandra, aws, unix shell scripting, circleci, python, azure synapse, hive, git, kinesis, sql

Posted 14 hours ago

Apply

10.0 years

0 Lacs

Greater Kolkata Area

On-site


Hiring for " Data Governance Lead" in a leading Bank. Location-Kolkata The Data Governance Lead will be responsible for establishing, implementing, and maintaining data governance policies, frameworks, and processes to ensure data quality, compliance, and security within the bank. Require any Bachelor’s/Master’s degree in Computer Science, Information Management, Data Science, Engineering, or a related field with Overall 10+ years of experience in data governance, data management, or related roles within the banking or financial services sector with experience in enterprise data integration and management and working with Data warehouse technologies and Data governance solutions. Having 3+ years of practical experience configuring business glossaries, making dashboards, creating policies etc and Executing at least 2 large Data Governance, Quality and Master Data Management projects from inception to production, working as technology expert. Also with 3+ years’ experience in Data Standardization, Cleanse, transform and parse data. Preferred Certifications-CDMP, DAMA, EDM Council, IQINT Proficiency in Data Governance/Quality tools(e.g., Collibra, Informatica, Alation etc.). Proficient in Data warehousing and pipelining&Good to have Data Engineering experience Experience in working on Python, Big Data, Spark etc. &Cloud platforms (AWS, GCP etc.) Show more Show less

Posted 14 hours ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Who we are and what we do: NPST is a fintech company that has been bridging the banking and fintech worlds with its product suite of technology and payments for over 10 years. We provide software and digital payment solutions to the BFSI industry as a technology service provider. We function as a Technology Service Provider (TSP) and a Third-Party Application Provider (TPAP), catering to stakeholders across the financial value chain, including banks, merchant aggregators, merchants, and consumers. We were listed on the NSE Emerge platform through an SME IPO in August 2021, with a market cap of ₹2,000 Cr (as of Mar '24), and became an NPCI-approved Merchant Payment Service Provider, acquiring merchants and facilitating payments. NPST has a marquee clientele comprising 10 banks and 30+ PAPGs and merchants.

What will you do: We are looking for Java Developers with experience building high-performing, scalable, enterprise-grade applications for the banking/fintech domain, specifically IMPS/UPI. You will be part of a talented software team. Roles and responsibilities include managing Java/Java EE application development while providing expertise in the full software development lifecycle, from concept and design to testing.

Job responsibilities (immediate joiners only): Hands-on experience in designing and defining the architecture of complex web-based applications. Building distributed applications using Core Java, Spring, Spring Boot. Experience in ORM frameworks such as Hibernate/JPA. Working experience with SQL and NoSQL databases such as MySQL, PostgreSQL, Oracle, MongoDB/Cassandra. Hands-on experience with microservices and REST. Continuous integration tools like Jenkins, GitHub. Knowledge of cloud platforms such as Amazon Web Services. Hands-on knowledge of build tools like Maven, Gradle. Experience working with Git, JUnit, Mockito, JIRA and similar tools. Experience with Apache Spark and other machine learning tools will be an added advantage.

What are we looking for: Hands-on experience in Spring, Spring Boot, Hibernate, RDBMS, NoSQL, multithreading, Java, REST, XML parsers, web services, MySQL, Oracle, Cassandra, MongoDB. Entrepreneurial skills and the ability to observe, innovate and own your work. Detail-oriented and organized with strong time management skills. Influencing skills and the ability to create positive working relationships with team members at all levels. Excellent communication and interpersonal skills. A collaborative approach, working with perfection as a group effort to achieve organizational goals.

Education qualification: BTech/BCA/MCA/MTech
Experience: 3-5 years
Industry: IT/Software/BFSI/Banking/Fintech
Work arrangement: 5 days working from office
Location: Noida, Bangalore

What do we offer: An organization where we strongly believe in one organization, one goal. A fun workplace which compels us to challenge ourselves and aim higher. A team that strongly believes in collaboration and celebrating success together. Benefits that resonate 'We Care'.

If this opportunity excites you, we invite you to apply and contribute to our success story. If your resume is shortlisted, you will hear back from us.

Posted 14 hours ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Job Title: Senior Data Engineer (AWS Expert)
Location: Ahmedabad
Experience: 5+ Years
Company: IGNEK
Shift Time: 2 PM - 11 PM IST

About IGNEK: IGNEK is a fast-growing custom software development company with over a decade of industry experience and a passionate team of 25+ experts. We specialize in crafting end-to-end digital solutions that empower businesses to scale efficiently and stay ahead in an ever-evolving digital world. At IGNEK, we believe in quality, innovation, and a people-first approach to solving real-world challenges through technology.

We are looking for a highly skilled and experienced Data Engineer with deep expertise in AWS cloud technologies and strong hands-on experience in backend development, data pipelines, and system design. The ideal candidate will take ownership of delivering robust and scalable solutions while collaborating closely with cross-functional teams and the tech lead.

Key Responsibilities:
● Lead and manage the end-to-end implementation of cloud-native data solutions on AWS.
● Design, build, and maintain scalable data pipelines (PySpark/Spark) and data lake architectures (Delta Lake 3.0 or similar).
● Migrate on-premises systems to modern, scalable AWS-based services.
● Engineer robust relational databases using Postgres or Oracle with a strong understanding of procedural languages.
● Collaborate with the tech lead to understand business requirements and deliver practical, scalable solutions.
● Integrate newly developed features following defined SDLC standards using CI/CD pipelines.
● Develop orchestration and automation workflows using tools like Apache Airflow.
● Ensure all solutions comply with security best practices, performance benchmarks, and cloud architecture standards.
● Monitor, debug, and troubleshoot issues across multiple environments.
● Stay current with new AWS features, services, and trends to drive continuous platform improvement.

Required Skills and Experience:
● 5+ years of professional experience in data engineering and backend development.
● Strong expertise in Python, Scala, and PySpark.
● Deep knowledge of AWS services: EC2, S3, Lambda, RDS, Kinesis, IAM, API Gateway, and others.
● Hands-on experience with Postgres or Oracle, and building relational data stores.
● Experience with Spark clusters, Delta Lake, the Glue Catalog, and large-scale data processing.
● Proven track record of end-to-end project delivery and third-party system integrations.
● Solid understanding of microservices, serverless architectures, and distributed computing.
● Skilled in Java, Bash scripting, and search tools like Elasticsearch.
● Proficient in using CI/CD tools (e.g., GitLab, GitHub, AWS CodePipeline).
● Experience working with Infrastructure as Code (IaC) using Terraform.
● Hands-on experience with Docker, containerization, and cloud-native deployments.

Preferred Qualifications:
● AWS certifications (e.g., AWS Certified Solutions Architect or similar).
● Exposure to Agile/Scrum project methodologies.
● Familiarity with Kubernetes, advanced networking, and cloud security practices.
● Experience managing or collaborating with onshore/offshore teams.

Soft Skills:
● Excellent communication and stakeholder management.
● Strong leadership and problem-solving abilities.
● Team player with a collaborative mindset.
● High ownership and accountability in delivering quality outcomes.

Why Join IGNEK?
● Work on exciting, large-scale digital transformation projects.
● Be part of a people-centric, innovation-driven culture.
● A flexible work environment and opportunities for continuous learning.

How to Apply: Please send your resume and a cover letter detailing your experience to hr@ignek.com
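
To illustrate the Kinesis integration mentioned in the requirements, here is a minimal boto3 producer sketch; the region, stream name, and event shape are hypothetical.

import json
import boto3

kinesis = boto3.client("kinesis", region_name="ap-south-1")  # region is an assumption

# Push one hypothetical event onto a stream for downstream Spark/Lambda consumers.
event = {"order_id": 123, "amount": 49.9}
kinesis.put_record(
    StreamName="orders-stream",           # hypothetical stream name
    Data=json.dumps(event).encode(),
    PartitionKey=str(event["order_id"]),  # controls shard assignment
)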

Posted 14 hours ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Hiring: Chief Technology Officer (CTO)
Location: Pune, India | Type: Full-Time

We are a fast-growing AdTech company seeking a visionary Chief Technology Officer (CTO) to lead our technology strategy and drive innovation across our platforms. This is a unique opportunity to shape the future of programmatic advertising and real-time bidding (RTB) by leading high-impact engineering and product teams. As CTO, you will define and execute the technology roadmap, oversee the architecture and scaling of RTB, DSP, and SSP systems, and drive advancements in AI/ML, big data, and real-time analytics. You’ll collaborate closely with leadership and cross-functional teams to deliver scalable, secure, and high-performance tech solutions.

Requirements: 10+ years of experience in software engineering, with 5+ years in leadership roles. Deep expertise in AdTech (RTB, DSP, SSP, header bidding, OpenRTB). Strong background in AI/ML, cloud platforms (AWS/GCP), and big data technologies (Kafka, Spark, Hadoop). Proven track record in building scalable backend systems and leading agile teams.

Why Join Us: Competitive compensation with equity options. Lead the tech vision of a high-growth company. Work on cutting-edge products with global impact.

Posted 14 hours ago

Apply

10.0 years

0 Lacs

Barmer, Rajasthan, India

On-site


Attention Rig Drillers! Flash Tech Consulting is on the hunt for a seasoned Rig Driller for our onshore operations. This is your gateway to an exciting opportunity where your expertise drives excellence in the field.

What We’re Looking For:
• 10+ years of experience, with 4-5 years as a driller: show us your deep-rooted skills and a robust track record in drilling.
• Proven land rig experience: you've mastered the challenges of onshore drilling.
• High-pressure and high-temperature expertise: you know how to safely and efficiently handle the toughest well conditions.
• Cutting-edge technology savvy: experience with the Amphion Cyber Chair and NOV operating system is a must.

How to Apply:
1. Prepare your resume: highlight your drilling accomplishments and technical expertise.
2. Detail your experience: explain how you’ve handled high-pressure and high-temperature wells on land rigs.
3. Showcase your tech know-how: let us know about your experience with industry-leading tools like the Amphion Cyber Chair and NOV operating system.

Step up and seize this opportunity to be a part of our dynamic team. Apply now and take your career to new heights!

#India #FlashTechConsulting #Hiring #RigDriller #Driller #onshore #ApplyNow

Elevate your journey in the drilling world—your expertise is the spark that ignites our future!

Posted 14 hours ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Our technology services client is seeking multiple Azure Data & Analytics Engineers to join their team on a contract basis. These positions offer strong potential for conversion to full-time employment upon completion of the initial contract period. Further details about the role are below.

Role: Azure Data & Analytics Engineer
Mandatory Skills: Agile methodologies, Python, Databricks, Azure Cloud, Data Factory, data validations
Experience: 5-8 years
Location: PAN India
Notice Period: 0-15 days

Job Description: 5 years of software solution development using an agile, DevOps, product model, including designing, developing, and implementing large-scale applications or data engineering solutions. 5+ years of data analytics experience using SQL. 5+ years of full-stack development experience, preferably in Azure. 5+ years of cloud development (preferably Microsoft Azure), including Azure Event Hub, Azure Data Factory, Azure Functions, ADX, ASA, Azure Databricks, Azure DevOps, Azure Blob Storage, Azure Power Apps, and Power BI. 1+ years of FastAPI experience is a plus. Airline industry experience is a plus.

Skills, Licenses & Certifications: Expertise with the Azure technology stack for data management and data ingestion, capture, processing, curation, and creating consumption layers. Azure Development Track certification (preferred). Spark certification (preferred).

If you are interested, share your updated resume with sushmitha.r@s3staff.com
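
As a sketch of the "data validations" skill in a Databricks context, here is a minimal reconciliation check between a landed source and its curated target; the paths and key column are hypothetical.

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # predefined in a Databricks notebook

# Hypothetical datasets landed by an Azure Data Factory copy activity.
source = spark.read.parquet("/mnt/raw/customers")
target = spark.read.format("delta").load("/mnt/curated/customers")

# Reconciliation checks: row counts match and the business key stays unique.
assert source.count() == target.count(), "row count mismatch between source and target"
assert target.count() == target.select("customer_id").distinct().count(), "duplicate keys in target"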

Posted 15 hours ago

Apply

6.0 years

0 Lacs

India

Remote


Job Title: Azure Data Engineer (6 Years Experience)
Location: Remote
Employment Type: Full-time
Experience Required: 6 years

Job Summary: We are seeking an experienced Azure Data Engineer to join our data team. The ideal candidate will have strong expertise in designing and implementing scalable data solutions on Microsoft Azure, with a solid foundation in data integration, data warehousing, and ETL processes.

Key Responsibilities: Design, build, and manage scalable data pipelines and data integration solutions in Azure. Develop and optimize data lake and data warehouse solutions using Azure Data Lake, Azure Synapse, and Azure SQL. Create ETL/ELT processes using Azure Data Factory. Implement data modeling and data architecture best practices. Collaborate with data analysts, data scientists, and business stakeholders to deliver reliable and efficient data solutions. Monitor and troubleshoot data pipelines for performance and reliability. Ensure data security, privacy, and compliance with organizational standards. Automate data workflows and optimize performance using cloud-native tools.

Required Skills: 6 years of experience in data engineering roles. Strong experience with Azure Data Factory, Azure Synapse Analytics, Azure Data Lake, and Azure SQL. Proficiency in SQL, T-SQL, and scripting languages (Python or PowerShell). Hands-on experience with data ingestion, transformation, and orchestration. Good understanding of data modeling (dimensional, star/snowflake schema). Experience with version control (Git) and CI/CD in data projects. Familiarity with Azure DevOps and monitoring tools.

Preferred Skills: Experience with Databricks or Spark on Azure. Knowledge of data governance and metadata management tools. Understanding of big data technologies and architecture. Microsoft Certified: Azure Data Engineer Associate (preferred).

Posted 15 hours ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles And Responsibilities: Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack. Provide forward-thinking solutions in the data engineering and analytics space. Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix them. Work with the business to understand reporting-layer needs and develop data models to fulfill them. Help junior team members resolve issues and technical challenges. Drive technical discussions with the client architect and team members. Orchestrate the data pipelines in a scheduler via Airflow.

Skills And Qualifications: Bachelor's and/or master's degree in computer science or equivalent experience. Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of Data Management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture. Should have hands-on experience in SQL, Python and Spark (PySpark). Candidates must have experience in the AWS/Azure stack. Desirable to have ETL with batch and streaming (Kinesis). Experience in building ETL/data warehouse transformation processes. Experience with Apache Kafka for streaming/event-based data. Experience with other open-source big data products, e.g. Hadoop (incl. Hive, Pig, Impala). Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j). Experience working with structured and unstructured data, including imaging and geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects. Should have experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with high attention to detail.

Mandatory Skills: Python/PySpark/Spark with Azure/AWS Databricks

Skills: neo4j, pig, mongodb, pl/sql, architect, terraform, hadoop, pyspark, impala, apache kafka, adfs, etl, data warehouse, spark, azure, databricks, rdbms, cassandra, aws, unix shell scripting, circleci, python, azure synapse, hive, git, kinesis, sql

Posted 15 hours ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Role: Senior Databricks Engineer / Databricks Technical Lead / Data Architect
Location: Bangalore, Chennai, Delhi, Pune, Kolkata

Primary Roles And Responsibilities: Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack. Provide forward-thinking solutions in the data engineering and analytics space. Collaborate with DW/BI leads to understand new ETL pipeline development requirements. Triage issues to find gaps in existing pipelines and fix them. Work with the business to understand reporting-layer needs and develop data models to fulfill them. Help junior team members resolve issues and technical challenges. Drive technical discussions with the client architect and team members. Orchestrate the data pipelines in a scheduler via Airflow.

Skills And Qualifications: Bachelor's and/or master's degree in computer science or equivalent experience. Must have 6+ years of total IT experience and 3+ years' experience in data warehouse/ETL projects. Deep understanding of Star and Snowflake dimensional modelling. Strong knowledge of Data Management principles. Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture. Should have hands-on experience in SQL, Python and Spark (PySpark). Candidates must have experience in the AWS/Azure stack. Desirable to have ETL with batch and streaming (Kinesis). Experience in building ETL/data warehouse transformation processes. Experience with Apache Kafka for streaming/event-based data. Experience with other open-source big data products, e.g. Hadoop (incl. Hive, Pig, Impala). Experience with open-source non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j). Experience working with structured and unstructured data, including imaging and geospatial data. Experience working in a DevOps environment with tools such as Terraform, CircleCI and Git. Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting. Databricks Certified Data Engineer Associate/Professional certification (desirable). Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects. Should have experience working in Agile methodology. Strong verbal and written communication skills. Strong analytical and problem-solving skills with high attention to detail.

Mandatory Skills: Python/PySpark/Spark with Azure/AWS Databricks

Skills: neo4j, pig, mongodb, pl/sql, architect, terraform, hadoop, pyspark, impala, apache kafka, adfs, etl, data warehouse, spark, azure, databricks, rdbms, cassandra, aws, unix shell scripting, circleci, python, azure synapse, hive, git, kinesis, sql

Posted 15 hours ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Greetings from Infosys BPM Ltd.

Exclusive women's walk-in drive. We are hiring for Content and Technical Writer, ETL DB Testing, ETL Testing Automation, .NET, and Python Developer skills. Please walk in for an interview on 20th June 2025 at our Chennai location.

Note: Please carry a copy of this email to the venue and make sure you register your application before attending the walk-in. Please use the link below to apply and register your application, and mention your Candidate ID on top of your resume.

https://career.infosys.com/jobdesc?jobReferenceCode=PROGEN-HRODIRECT-215140

Interview details:
Interview date: 20th June 2025
Interview time: 10 AM till 1 PM
Interview venue: TP 1/1, Central Avenue Techno Park, SEZ, Mahindra World City, Paranur, Tamil Nadu

Please find the job descriptions below for your reference. All roles are work from office with rotational shifts; a minimum of 2 years of project experience is mandatory.

Job Description: Content and Technical Writer
Develop high-quality technical documents, including user manuals, guides, and release notes. Collaborate with cross-functional teams to gather requirements and create accurate documentation. Conduct functional testing and manual testing to ensure compliance with FDA regulations. Ensure adherence to ISO standards and maintain a clean, organized document management system. Strong understanding of the infrastructure domain. A technical writer who can convert complex technical concepts into easy-to-consume documents for the targeted audience, and who can also mentor the team in technical writing.

Job Description: ETL DB Testing
Strong experience in ETL testing, data warehousing, and business intelligence. Strong proficiency in SQL. Experience with ETL tools (e.g., Informatica, Talend, AWS Glue, Azure Data Factory). Solid understanding of data warehousing concepts, database systems, and quality assurance. Experience with test planning, test case development, and test execution. Experience writing complex SQL queries and using SQL tools is a must, with exposure to various data analytical functions. Familiarity with defect tracking tools (e.g., Jira). Experience with cloud platforms like AWS, Azure, or GCP is a plus. Experience with Python or other scripting languages for test automation is a plus. Experience with data quality tools is a plus. Experience in testing large datasets. Experience in agile development is a must. Understanding of Oracle Database and UNIX/VMC systems is a must.

Job Description: ETL Testing Automation
Strong experience in ETL testing and automation. Strong proficiency in SQL and experience with relational databases (e.g., Oracle, MySQL, PostgreSQL, SQL Server). Experience with ETL tools and technologies (e.g., Informatica, Talend, DataStage, Apache Spark). Hands-on experience in developing and maintaining test automation frameworks. Proficiency in at least one programming language (e.g., Python, Java). Experience with test automation tools (e.g., Selenium, PyTest, JUnit). Strong understanding of data warehousing concepts and methodologies. Experience with CI/CD pipelines and version control systems (e.g., Git). Experience with cloud-based data warehouses like Snowflake, Redshift, or BigQuery is a plus. Experience with data quality tools is a plus.
Job Description: .NET
Should have worked on .NET development/implementation/support projects. Must have experience in .NET, ASP.NET MVC, C#, WPF, WCF, SQL Server, Azure. Must have experience in web services, Web API, REST services, HTML, CSS3. Understand architecture requirements and ensure effective design, development, validation, and support activities.

REGISTRATION PROCESS: The Candidate ID & SHL Test (AMCAT ID) are mandatory to attend the interview. Please follow the instructions below to successfully complete the registration. (Talents without registration & assessment will not be allowed for the interview.)

Candidate ID registration process:
STEP 1: Visit: https://career.infosys.com/joblist
STEP 2: Click on "Register", provide the required details, and submit.
STEP 3: Once submitted, your Candidate ID (100XXXXXXXX) will be generated.
STEP 4: The Candidate ID will be shared to the registered email ID.

SHL Test (AMCAT ID) registration process: This assessment is proctored, and talent gets evaluated on basic analytics, English comprehension, and writex (email writing).
STEP 1: Visit: https://apc01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fautologin-talentcentral.shl.com%2F%3Flink%3Dhttps%3A%2F%2Famcatglobal.aspiringminds.com%2F%3Fdata%3DJTdCJTIybG9naW4lMjIlM0ElN0IlMjJsYW5ndWFnZSUyMiUzQSUyMmVuLVVTJTIyJTJDJTIyaXNBdXRvbG9naW4lMjIlM0ExJTJDJTIycGFydG5lcklkJTIyJTNBJTIyNDE4MjQlMjIlMkMlMjJhdXRoa2V5JTIyJTNBJTIyWm1abFpUazFPV1JsTnpJeU1HVTFObU5qWWpRNU5HWTFOVEU1Wm1JeE16TSUzRCUyMiUyQyUyMnVzZXJuYW1lJTIyJTNBJTIydXNlcm5hbWVfc3E5QmgxSWI5NEVmQkkzN2UlMjIlMkMlMjJwYXNzd29yZCUyMiUzQSUyMnBhc3N3b3JkJTIyJTJDJTIycmV0dXJuVXJsJTIyJTNBJTIyJTIyJTdEJTJDJTIycmVnaW9uJTIyJTNBJTIyVVMlMjIlN0Q%3D%26apn%3Dcom.shl.talentcentral%26ibi%3Dcom.shl.talentcentral%26isi%3D1551117793%26efr%3D1&data=05%7C02%7Comar.muqtar%40infosys.com%7Ca7ffe71a4fe4404f3dac08dca01c0bb3%7C63ce7d592f3e42cda8ccbe764cff5eb6%7C0%7C0%7C638561289526257677%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C0%7C%7C%7C&sdata=s28G3ArC9nR5S7J4j%2FV1ZujEnmYCbysbYke41r5svPw%3D&reserved=0
STEP 2: Click on "Start new test" and follow the instructions to complete the assessment.
STEP 3: Once completed, please make a note of the AMCAT ID (access your AMCAT ID by clicking the 3 dots on the top right corner of the screen).

NOTE: During registration, you'll be asked to provide the following information:
Personal details: name, email address, mobile number, PAN number.
Availability: acknowledgement of work schedule preferences (shifts, work from office, rotational weekends, 24/7 availability, transport boundary) and reason for career change.
Employment details: current notice period and total annual compensation (CTC) in the format 390000 - 4 LPA (example).
Candidate information: 10-digit Candidate ID starting with 100XXXXXXX, gender, source (e.g., vendor name, Naukri/LinkedIn/Foundit, or direct), and location.
Interview mode: walk-in.

Attempt all questions in the SHL assessment app. The assessment is proctored, so choose a quiet environment. Use a headset or Bluetooth headphones for clear communication. A passing score is required for further interview rounds. Five or more toggles, multiple faces detected, face not detected, or any malpractice will be considered grounds for rejection. Once you've finished, submit the assessment and make a note of the AMCAT ID (15 digits) used for the assessment.

Documents to carry: Please have a note of your Candidate ID & AMCAT ID along with your registered email ID.
Please do not carry laptops/cameras to the venue as these will not be allowed due to security restrictions. Please carry 2 sets of your updated resume/CV (hard copy). Please carry original ID proof for security clearance; an original government ID card is a must. Please carry an individual headphone/Bluetooth headset for the interview.

Regards,
Infosys BPM Recruitment team.

Posted 15 hours ago

Apply

5.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site


Job Description: We are seeking an experienced Social Media Marketing Manager to lead and execute our social media strategies across various platforms. This role will focus on creating engaging content, growing kPaisa’s social media presence, and building meaningful connections with our audience to drive brand awareness, user acquisition, and customer retention.

Roles and Responsibilities: Develop and implement social media strategies to increase brand visibility and drive user engagement. Create, curate, and manage content across all social media platforms (Facebook, Instagram, Twitter, LinkedIn, YouTube, etc.). Collaborate with the marketing team to align social media campaigns with broader marketing goals and objectives. Monitor social media trends and adapt strategies to stay ahead of competitors and emerging trends. Engage with followers and respond to comments, messages, and mentions to maintain a positive brand image. Analyze social media performance using analytics tools and provide regular reports on key metrics (reach, engagement, conversion). Work closely with influencers and partners for collaborations and cross-promotions. Run paid social media campaigns to drive traffic, increase brand awareness, and boost conversions.

Skills Required: 2–5 years of experience in social media marketing or content management, ideally in fintech or BFSI. Strong understanding of social media platforms, their algorithms, and trends. Excellent written and verbal communication skills. Experience with social media management tools (e.g., Hootsuite, Buffer, Sprout Social) and analytics tools (e.g., Google Analytics, Facebook Insights). Ability to create visually appealing content and work with design tools (e.g., Canva, Adobe Spark). Knowledge of paid social media advertising and running campaigns on Facebook Ads, LinkedIn Ads, etc. Strong creative mindset with an ability to think outside the box and engage audiences in a crowded digital space. Good project management skills with an ability to multitask and meet deadlines.

What we have to offer: Flexible work hours. First-hand fintech development opportunity. Meritocracy-driven, candid startup culture.

Posted 15 hours ago

Apply

0.0 - 2.0 years

0 Lacs

Gurugram, Haryana

On-site


Job Overview We are looking for a dynamic and innovative Full Stack Data Scientist with 2+ years of experience who excels in end-to-end data science solutions. The ideal candidate is a tech-savvy professional passionate about leveraging data to solve complex problems, develop predictive models, and drive business impact in the MarTech domain. Key Responsibilities 1. Data Engineering & Preprocessing Collect, clean, and preprocess structured and unstructured data from various sources. Perform advanced feature engineering, outlier detection, and data transformation. Collaborate with data engineers to ensure seamless data pipeline development. 2. Machine Learning Model Development Design, train, and validate machine learning models (supervised, unsupervised, deep learning). Optimize models for business KPIs such as accuracy, recall, and precision. Innovate with advanced algorithms tailored to marketing technologies. 3. Full Stack Development Build production-grade APIs for model deployment using frameworks like Flask, FastAPI, or Django. Develop scalable and modular code for data processing and ML integration. 4. Deployment & Operationalization Deploy models on cloud platforms (AWS, Azure, or GCP) using tools like Docker and Kubernetes. Implement continuous monitoring, logging, and retraining strategies for deployed models. 5. Insight Visualization & Communication Create visually compelling dashboards and reports using Tableau, Power BI, or similar tools. Present insights and actionable recommendations to stakeholders effectively. 6. Collaboration & Teamwork Work closely with marketing analysts, product managers, and engineering teams to solve business challenges. Foster a collaborative environment that encourages innovation and shared learning. 7. Continuous Learning & Innovation Stay updated on the latest trends in AI/ML, especially in marketing automation and analytics. Identify new opportunities for leveraging data science in MarTech solutions. Qualifications Educational Background Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, Mathematics, or a related field. Technical Skills Programming Languages: Python (must-have), R, or Julia; familiarity with Java or C++ is a plus. ML Frameworks: TensorFlow, PyTorch, Scikit-learn, or XGBoost. Big Data Tools: Spark, Hadoop, or Kafka. Cloud Platforms: AWS, Azure, or GCP for model deployment and data pipelines. Databases: Expertise in SQL and NoSQL (e.g., MongoDB, Cassandra). Visualization: Mastery of Tableau, Power BI, Plotly, or D3.js. Version Control: Proficiency with Git for collaborative coding. Experience 2+ years of hands-on experience in data science, machine learning, and software engineering. Proven expertise in deploying machine learning models in production environments. Experience in handling large datasets and implementing big data technologies. Soft Skills Strong problem-solving and analytical thinking. Excellent communication and storytelling skills for technical and non-technical audiences. Ability to work collaboratively in diverse and cross-functional teams. Preferred Qualifications Experience with Natural Language Processing (NLP) and Computer Vision (CV). Familiarity with CI/CD pipelines and DevOps for ML workflows. Exposure to Agile project management methodologies. Why Join Us? Opportunity to work on innovative projects with cutting-edge technologies. Collaborative and inclusive work environment that values creativity and growth. 
If you're passionate about turning data into actionable insights and driving impactful business decisions, we’d love to hear from you! Job Types: Full-time, Permanent Pay: ₹1,000,000.00 - ₹1,200,000.00 per year Benefits: Flexible schedule Health insurance Life insurance Paid sick time Paid time off Provident Fund Schedule: Day shift Fixed shift Monday to Friday Experience: Data science: 2 years (Required) Location: Gurugram, Haryana (Preferred) Work Location: In person
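
To illustrate the model-deployment side of the role (building production APIs with FastAPI, one of the frameworks named above), here is a minimal serving sketch; the feature schema and scoring rule are placeholders rather than a real trained model.

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Features(BaseModel):
    clicks: int
    sessions: int

@app.post("/predict")
def predict(f: Features) -> dict:
    # Stand-in scoring rule; a real service would load and call a trained model.
    score = 0.3 * f.clicks + 0.7 * f.sessions
    return {"score": score}

# Run with: uvicorn app:app --reload  (assuming this file is saved as app.py)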

Posted 15 hours ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Location: Bangalore & Gurugram
Years of experience: 7+

We are seeking a talented Data Engineer with strong expertise in Databricks, specifically in Unity Catalog, PySpark, and SQL, to join our data team. You’ll play a key role in building secure, scalable data pipelines and implementing robust data governance strategies using Unity Catalog.

Key Responsibilities: Design and implement ETL/ELT pipelines using Databricks and PySpark. Work with Unity Catalog to manage data governance, access controls, lineage, and auditing across data assets. Develop high-performance SQL queries and optimize Spark jobs. Collaborate with data scientists, analysts, and business stakeholders to understand data needs. Ensure data quality and compliance across all stages of the data lifecycle. Implement best practices for data security and lineage within the Databricks ecosystem. Participate in CI/CD, version control, and testing practices for data pipelines.

Required Skills: Proven experience with Databricks and Unity Catalog (data permissions, lineage, audits). Strong hands-on skills with PySpark and Spark SQL. Solid experience writing and optimizing complex SQL queries. Familiarity with Delta Lake, data lakehouse architecture, and data partitioning. Experience with cloud platforms like Azure or AWS. Understanding of data governance, RBAC, and data security standards.

Preferred Qualifications: Databricks Certified Data Engineer Associate or Professional. Experience with tools like Airflow, Git, Azure Data Factory, or dbt. Exposure to streaming data and real-time processing. Knowledge of DevOps practices for data engineering.

Interested candidates can submit their details at https://forms.office.com/r/g2h52X7Bt9
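
For a taste of the Unity Catalog permission management mentioned above, a minimal sketch run from PySpark; it assumes a Databricks notebook with Unity Catalog enabled (where `spark` is predefined), and the catalog, schema, table, and group names are hypothetical.

# Grant a group read access under Unity Catalog's three-level namespace.
spark.sql("GRANT USE SCHEMA ON SCHEMA main.analytics TO `data_analysts`")
spark.sql("GRANT SELECT ON TABLE main.analytics.orders TO `data_analysts`")

# Access and lineage can then be audited from Catalog Explorer or system tables.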

Posted 16 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Coupa makes margins multiply through its community-generated AI and industry-leading total spend management platform for businesses large and small. Coupa AI is informed by trillions of dollars of direct and indirect spend data across a global network of 10M+ buyers and suppliers. We empower you with the ability to predict, prescribe, and automate smarter, more profitable business decisions to improve operating margins.

Why join Coupa?

🔹 Pioneering Technology: At Coupa, we're at the forefront of innovation, leveraging the latest technology to empower our customers with greater efficiency and visibility in their spend.
🔹 Collaborative Culture: We value collaboration and teamwork, and our culture is driven by transparency, openness, and a shared commitment to excellence.
🔹 Global Impact: Join a company where your work has a global, measurable impact on our clients, the business, and each other.

Learn more on the Life at Coupa blog and hear from our employees about their experiences working at Coupa.

The Impact of a Lead AI Engineer at Coupa: XXXX

What you will do:

Collaborate closely with product managers, engineers, and cross-functional stakeholders to define metrics and outcomes for AI-driven product features and capabilities
Design, develop, and deploy sophisticated AI models, agents, and intelligent assistants leveraging Large Language Models (LLMs), covering both language processing and advanced reasoning tasks
Engineer robust, scalable, and reliable prompt-based solutions to integrate LLMs into production-grade SaaS applications
Implement experimentation frameworks, conduct rigorous evaluations, and iterate rapidly based on user feedback and data-driven insights
Establish automated monitoring and quality-assurance mechanisms for continuous evaluation and enhancement of model performance
Provide strategic vision and technical leadership on evolving trends and emerging methodologies within AI, with a particular focus on generative AI capabilities

What you will bring to Coupa:

Strong analytical and problem-solving skills focused on creating impactful AI-driven product experiences
Proven expertise in Python for developing, deploying, and optimizing complex AI models and pipelines
Extensive experience with Large Language Models (e.g., GPT-4o, Claude, LLaMA), including prompt engineering, fine-tuning, and performance evaluation
Solid understanding of machine learning paradigms (supervised, unsupervised, reinforcement learning) and practical experience deploying these methods at scale
Deep knowledge of reasoning frameworks, agent architectures, retrieval-augmented generation (RAG), and related AI techniques
Proficiency in modern data architectures and experience with distributed data frameworks (e.g., Spark, Hadoop)
Familiarity with cloud infrastructure (AWS, Azure, GCP) for deploying scalable AI/ML services
Ability to communicate complex technical concepts clearly to both technical and non-technical stakeholders
A strong drive to stay abreast of cutting-edge AI research and proactively integrate relevant innovations into production systems
5+ years of experience building, deploying, and managing AI/ML models in production environments
Educational background in Computer Science, Data Science, Statistics, Mathematics, or a related field, or equivalent professional experience
Experience with modern ML and AI development tools (e.g., Hugging Face, LangChain, PyTorch, TensorFlow) and visualization platforms (e.g., Streamlit, D3, Plotly, Matplotlib)

Coupa complies with relevant laws and regulations regarding equal opportunity and offers a welcoming and inclusive work environment. Decisions related to hiring, compensation, training, and performance evaluation are made fairly, and we provide equal employment opportunities to all qualified candidates and employees.

Please be advised that inquiries or resumes from recruiters will not be accepted.

By submitting your application, you acknowledge that you have read Coupa’s Privacy Policy and understand that Coupa receives/collects your application, including your personal data, for the purposes of managing Coupa's ongoing recruitment and placement activities, including for employment purposes in the event of a successful application and for notification of future job opportunities if you did not succeed the first time. You will find more details about how your application is processed, the purposes of processing, and how long we retain your application in our Privacy Policy.

Posted 16 hours ago

Apply

15.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

Job Title: Technical Architect - Data Governance & MDM
Experience: 15+ years
Location: Mumbai/Pune/Bangalore

Role Overview:
The Technical Architect specializing in Data Governance and Master Data Management (MDM) designs, implements, and optimizes enterprise data solutions. The jobholder has expertise in tools like Collibra, Informatica, InfoSphere, Reltio, and other MDM platforms, ensuring data quality, compliance, and governance across the organization.

Responsibilities:
Design and implement data governance frameworks and MDM solutions using tools like Collibra, Informatica, InfoSphere, and Reltio.
Develop and optimize strategies for data quality, metadata management, and data stewardship.
Collaborate with cross-functional teams to integrate MDM solutions with existing systems.
Establish best practices for data governance, security, and compliance.
Monitor and troubleshoot MDM environments for performance and reliability.
Provide technical leadership and guidance to data teams.
Stay updated on advancements in data governance and MDM technologies.

Key Technical Skills:
10+ years of experience working on DG/MDM projects
Strong grasp of data governance concepts
Hands-on experience with different DG tools/services
Hands-on experience with reference data and taxonomy
Strong understanding of data governance, data quality, data profiling, data standards, regulations, and security
Match-and-merge strategy design
Design and implementation of MDM architecture and data models
Use of Spark capabilities
Statistical techniques to derive meaning from vast enterprise-level data
Data visualization approaches for analyzing huge data sets
Good to have: knowledge of Python/R/Scala
Experience with DG on-premise and on-cloud
Understanding of the MDM Customer, Product, and Vendor domains and related artifacts
Experience working on proposals, customer workshops, assessments, etc. is preferred
Good communication and presentation skills are a must

Technology Stack: Collibra, IBM MDM, Reltio, InfoSphere

Eligibility Criteria:
15+ years of total experience.
Bachelor’s degree in Computer Science, Data Management, or a related field.
Proven experience as a Technical Architect in Data Governance and MDM.
Certifications in relevant MDM tools (e.g., Collibra Data Governance; Informatica, InfoSphere, or Reltio MDM).
Experience with cloud platforms like AWS, Azure, or GCP.
Proficiency in tools like Collibra, Informatica, InfoSphere, Reltio, and similar platforms.
Strong understanding of data modeling, ETL/ELT processes, and cloud integration.

Interested candidates can apply directly. Alternatively, you can also send your resume to ansari.m@atos.net

Posted 16 hours ago

Apply

3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

OLIVER+ is a global team of creative thinkers, tech-savvy trendsetters, and production pros specialising in film, CGI, automation, AI, motion design, and digital/print content. We partner with over 300 clients in 40+ countries and counting. Our focus is to connect clients with high-quality solutions, talent and ambitious opportunities worldwide. As a part of The Brandtech Group, we're at the forefront of leveraging cutting-edge AI technology to revolutionise how we create and deliver work. Our AI solutions enhance efficiency, spark creativity, and drive insightful decision-making, empowering our teams to produce innovative and impactful results.

Role: Project Manager (Creative Account Manager)
Location: Mumbai

A Little Bit About the Role:
The Project Manager is responsible for running creative and production processes and working alongside our existing talented Project Management team to drive a project from brief to delivery. Leading on projects while working with our dynamic teams across the business, our project managers successfully deliver projects following OLIVER+’s ways of working. We want a passionate, talented individual who can showcase their skills in managing multiple mid- to high-complexity projects. You should have meticulous attention to detail, understand the importance of the profitability of your projects for the agency, and have proven yourself as a safe pair of hands in the day-to-day running of multiple prestigious projects.

What you will be doing:

GENERAL TASKS AND RESPONSIBILITIES:
Bring integrated experience across various disciplines, which can include one or more of the following: Digital, Technology, Film, CGI, Motion Design and/or Print projects across different-sized accounts and different time zones, from initial brief to final delivery (based on experience)
Manage your team to deliver a wide range of deliverables, from email marketing to website content
Strive to follow and implement the defined project management and production processes within OLIVER+ and with partners
Handle multiple projects simultaneously and thrive in a fast-paced, deadline-driven environment
Face adversity, setbacks and negativity with a resilient mindset and attitude
Embody the company values and instil these behaviours in all team members
Drive continuous improvement through each step of the process and consult on process improvements
Close off projects to set standards/requirements

INITIATE & PLAN THE PROJECT:
Serve as the point of contact to receive new briefs and manage the process of transforming unclear briefs into well-prepared briefs where applicable
Manage the scoping, costing and planning of projects across different briefs
Work to the OLIVER+ Project Management Way of Working while executing projects
Identify stakeholders and create a communication plan to ensure each of them has access to the right level of information
Set deadlines with partners and challenge unrealistic timelines to ensure workload is managed based on creative processes
Create a project plan and identify key milestones for each of the projects you are assigned
Work closely with the Delivery Lead and Resource/Studio Manager to staff the project correctly based on the required deliverables and deadlines
Identify risks and possible issues and create risk registers
Work closely with the creative team to define a cost, scope and time plan for the projects
Prepare and run effective and structured client and team kick-off meetings
Create and manage the delivery of project documentation

MANAGE PROJECT EXECUTION, MONITORING AND CONTROL:
Coordinate development and delivery among various project participants and stakeholders
Prioritize and manage the workload of the project team
Act as partner liaison when required, presenting project scopes, cost estimates and timing plans
Build partner relationships and ensure their needs and requirements are addressed, while following the OLIVER+ ways of working
Monitor the progress of project delivery within scope and budget with the planned resources
Prepare status reports for stakeholders and actively control project progress using weekly action points
Follow the project management change-control process for any changes needed in scope, budget, timelines or resource requirements
Create and keep up to date all related project documentation and ensure compliance in the project management system
Proactively problem-solve, mitigate risks and plan for future issues
Be accountable for the financial profitability of the project and actively manage cost overruns and time logged daily
Ensure final deliverables are quality-driven and comply with the design and production requirements and expectations
Monitor utilization and output of the team

What you need to be great in this role:
Willingness to accept feedback and iterate on processes in a highly collaborative, low-ego team environment.
A process-driven, continuous-improvement mindset.
Curiosity, creativity, and ambition.
Attention to detail.
The ability to take a project from brief to completion; good communication, organizational, and time management skills are essential.
Impeccable problem-solving skills and a love for client satisfaction.
3+ years of experience and a proven track record of successfully managing projects from start to end.
Strong communication skills.
Strong organisational skills – able to multi-task and manage multiple projects with different deadlines at one time.
A good eye for detail and quality-control experience.
Software competency – Microsoft Word, PowerPoint, Excel, Zoom, Teams and OMG (Oliver Marketing Gateway internal approval system - training will be provided).

Req ID: 12838 #LI-PG1

A little bit about us:
We are OLIVER+ (previously known as MORE), part of OLIVER and the Inside Ideas Group. We’re a global collective of creatives, technologists and production experts who create and maintain world-class content in film, CGI, motion design, digital, print and tech. We connect opportunities for clients by providing high-quality solutions and capabilities to ambitious businesses all over the world.

You can find us here:
https://oliverplus.agency/
https://vimeo.com/oliverplus
https://www.linkedin.com/company/oliverplus/mycompany/
https://www.instagram.com/__oliver__plus/

Our values shape everything we do:
Be Ambitious to succeed
Be Imaginative to push the boundaries of what’s possible
Be Inspirational to do groundbreaking work
Be always learning and listening to understand
Be Results-focused to exceed expectations
Be actively pro-inclusive and anti-racist across our community, clients and creations

OLIVER is committed to advancing Diversity, Equity, and Inclusion (DEI). We actively work to create equal opportunities for everyone, and our DEI initiatives are woven into the fabric of our company. We've set ambitious environmental goals around sustainability and have committed to being net zero by 2030. We expect everyone to contribute to our mission, embedding sustainability into every department and through every stage of the project lifecycle.

Inside Ideas Group and its affiliates are equal opportunity employers committed to creating an inclusive working environment where all our employees are encouraged to reach their full potential, and individual differences are valued and respected. All [suitable] applicants shall be considered for employment without regard to race, ethnicity, religion, gender, sexual orientation, gender identity, age, neurodiversity, disability status, or any other characteristic protected by local laws.

Posted 16 hours ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

On-site

Linkedin logo

About Us:
Planet Spark is reshaping the EdTech landscape by equipping kids and young adults with future-ready skills like public speaking, and more. We're on a mission to spark curiosity, creativity, and confidence in learners worldwide. If you're passionate about meaningful impact, growth, and innovation—you're in the right place.

Location: Gurgaon (On-site)
Experience Level: Entry to Early Career (Freshers welcome!)
Shift Options: Domestic | Middle East | International
Working Days: 5 days/week (Wednesday & Thursday off) | Weekend availability required
Target Joiners: Any (Bachelor’s or Master’s)

🔥 What You'll Be Owning (Your Impact):
Lead Activation: Engage daily with high-intent leads through dynamic channels—calls, video consults, and more.
Sales Funnel Pro: Own the full sales journey—from first hello to successful enrollment.
Consultative Selling: Host personalized video consultations with parents/adult learners, pitch trial sessions, and resolve concerns with clarity and empathy.
Target Slayer: Consistently crush weekly revenue goals and contribute directly to Planet Spark’s growth engine.
Client Success: Ensure a smooth onboarding experience and transition for every new learner.
Upskill Mindset: Participate in hands-on training, mentorship, and feedback loops to constantly refine your game.

💡 Why Join Sales at Planet Spark?
Only Warm Leads: Skip the cold calls—our leads already know us and have completed a demo session.
High-Performance Culture: Be part of a fast-paced, energetic team that celebrates success and rewards hustle.
Career Fast-Track: Unlock rapid promotions, performance bonuses, and leadership paths.
Top-Notch Training: Experience immersive onboarding, live role-plays, and access to ongoing L&D programs.
Rewards & Recognition: Weekly shoutouts, cash bonuses, and exclusive events to celebrate your wins.
Make Real Impact: Help shape the minds of tomorrow while building a powerhouse career today.

🎯 What You Bring to the Table:
Communication Powerhouse: You can build trust and articulate ideas clearly in both spoken and written formats.
Sales-Driven: You know how to influence decisions, navigate objections, and close deals with confidence.
Empathy First: You genuinely care about clients’ goals and tailor your approach to meet them.
Goal-Oriented: You’re self-driven, proactive, and hungry for results.
Tech Fluent: Comfortable using CRMs, video platforms, and productivity tools.

✨ What’s in It for You?
💼 High-growth sales career with serious earning potential
🌱 Continuous upskilling in EdTech, sales, and communication
🧘 Supportive culture that values growth and well-being
🎯 Opportunity to work at the cutting edge of education innovation

Posted 16 hours ago

Apply

Exploring Spark Jobs in India

The demand for professionals with expertise in Spark is on the rise in India. Spark, an open-source distributed computing system, is widely used for big data processing and analytics. Job seekers in India looking to explore opportunities in Spark can find a variety of roles in different industries.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities have a high concentration of tech companies and startups actively hiring for Spark roles.

Average Salary Range

The average salary range for Spark professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-25 lakhs per annum

Salaries may vary based on the company, location, and specific job requirements.

Career Path

In the field of Spark, a typical career progression may look like:

  1. Junior Developer
  2. Senior Developer
  3. Tech Lead
  4. Architect

Advancing in this career path often requires gaining experience, acquiring additional skills, and taking on more responsibilities.

Related Skills

Apart from proficiency in Spark, professionals in this field are often expected to have knowledge or experience in:

  • Hadoop
  • Java or Scala programming
  • Data processing and analytics
  • SQL databases

Having a combination of these skills can make a candidate more competitive in the job market.
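To make that combination concrete, here is a minimal PySpark sketch that runs the same aggregation through both the DataFrame API and plain SQL, touching two of the related skills above. It assumes pyspark is installed locally (pip install pyspark); the column names and rows are invented purely for illustration.

  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = SparkSession.builder.appName("skills-demo").master("local[*]").getOrCreate()

  # A small in-memory DataFrame standing in for a real data source.
  df = spark.createDataFrame(
      [("Bangalore", 6.0), ("Pune", 5.5), ("Bangalore", 8.0), ("Chennai", 4.5)],
      ["city", "salary_lakhs"],
  )

  # The aggregation expressed through the DataFrame API...
  df.groupBy("city").agg(F.avg("salary_lakhs").alias("avg_salary")).show()

  # ...and the same result through plain SQL against a temporary view.
  df.createOrReplaceTempView("jobs")
  spark.sql("SELECT city, AVG(salary_lakhs) AS avg_salary FROM jobs GROUP BY city").show()

  spark.stop()

Being comfortable moving between the two styles is useful in practice, since teams often mix DataFrame code with SQL views in the same pipeline.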

Interview Questions

  • What is Apache Spark and how is it different from Hadoop? (basic)
  • Explain the difference between RDD, DataFrame, and Dataset in Spark. (medium)
  • How does Spark handle fault tolerance? (medium)
  • What is lazy evaluation in Spark? (basic)
  • Explain the concept of transformations and actions in Spark. (basic)
  • What are the different deployment modes in Spark? (medium)
  • How can you optimize the performance of a Spark job? (advanced)
  • What is the role of a Spark executor? (medium)
  • How does Spark handle memory management? (medium)
  • Explain the Spark shuffle operation. (medium)
  • What are the different types of joins in Spark? (medium)
  • How can you debug a Spark application? (medium)
  • Explain the concept of checkpointing in Spark. (medium)
  • What is lineage in Spark? (basic)
  • How can you monitor and manage a Spark application? (medium)
  • What is the significance of the Spark Driver in a Spark application? (medium)
  • How does Spark SQL differ from traditional SQL? (medium)
  • Explain the concept of broadcast variables in Spark. (medium)
  • What is the purpose of the SparkContext in Spark? (basic)
  • How does Spark handle data partitioning? (medium)
  • Explain the concept of window functions in Spark SQL. (advanced)
  • How can you handle skewed data in Spark? (advanced)
  • What is the use of accumulators in Spark? (advanced)
  • How can you schedule Spark jobs using Apache Oozie? (advanced)
  • Explain the process of Spark job submission and execution. (basic)
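Several of these questions (lazy evaluation, transformations vs. actions, broadcast variables) are easiest to answer with a small example in hand. Below is a minimal PySpark sketch illustrating those three concepts; it assumes a local pyspark installation, and all values are invented for illustration.

  from pyspark.sql import SparkSession

  spark = SparkSession.builder.appName("interview-prep").master("local[*]").getOrCreate()
  sc = spark.sparkContext

  # Transformations (filter, map) are lazy: these lines only build a lineage graph.
  numbers = sc.parallelize(range(1, 101))
  squared_evens = numbers.filter(lambda n: n % 2 == 0).map(lambda n: n * n)

  # An action (reduce) is what actually triggers computation on the executors.
  total = squared_evens.reduce(lambda a, b: a + b)
  print("Sum of squared evens:", total)

  # A broadcast variable ships a read-only lookup to each executor once,
  # instead of serializing it with every task.
  threshold = sc.broadcast({"cutoff": 2500})
  labels = squared_evens.map(lambda n: "big" if n > threshold.value["cutoff"] else "small")
  print(labels.countByValue())

  spark.stop()

In an interview, pointing out that nothing executes until reduce or countByValue runs is usually exactly what the lazy-evaluation question is probing for.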

Closing Remark

As you explore opportunities in Spark jobs in India, remember to prepare thoroughly for interviews and showcase your expertise confidently. With the right skills and knowledge, you can excel in this growing field and advance your career in the tech industry. Good luck with your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies