
7524 Spark Jobs - Page 41

JobPe aggregates job listings for easy access; you apply directly on the original job portal.

2.0 - 10.0 years

0 Lacs

Delhi, India

On-site

Source: LinkedIn

About Veersa
Veersa Technologies is a US-based IT services and AI enablement company founded in 2020, with a global delivery center in Noida (Sector 142). Founded by industry leaders, Veersa has recorded an impressive 85% YoY growth, has been profitable since inception, and has a team of almost 400 professionals that is growing rapidly.

Our Services Include
• Digital & Software Solutions: product development, legacy modernization, support
• Data Engineering & AI Analytics: predictive analytics, AI/ML use cases, data visualization
• Tools & Accelerators: AI/ML-embedded tools that integrate with client systems
• Tech Portfolio Assessment: TCO analysis, modernization roadmaps, etc.

Tech Stack
• AI/ML, IoT, Blockchain, MEAN/MERN stack, Python, GoLang, RoR, Java Spring Boot, Node.js
• Databases: PostgreSQL, MySQL, MS SQL, Oracle
• Cloud: AWS & Azure (serverless architecture)
Website: https://veersatech.com | LinkedIn: feel free to explore our company profile

About The Role
We are seeking a highly skilled and experienced Data Engineer / Lead Data Engineer to join our growing data team. This role is ideal for professionals with 2 to 10 years of experience in data engineering and a strong foundation in SQL, Databricks, Spark SQL, PySpark, and BI tools such as Power BI or Tableau. As a Data Engineer, you will be responsible for building scalable data pipelines, optimizing data processing workflows, and enabling insightful reporting and analytics across the organization.

Key Responsibilities
• Design and develop robust, scalable data pipelines using PySpark and Databricks.
• Write efficient SQL and Spark SQL queries for data transformation and analysis.
• Work closely with BI teams to enable reporting through Power BI or Tableau.
• Optimize the performance of big data workflows and ensure data quality.
• Collaborate with business and technical stakeholders to gather and translate data requirements.
• Implement best practices for data integration, processing, and governance.

Required Qualifications
• Bachelor's degree in Computer Science, Engineering, or a related field.
• 2–10 years of experience in data engineering or a similar role.
• Strong experience with SQL, Spark SQL, and PySpark.
• Hands-on experience with Databricks for big data processing.
• Proven experience with BI tools such as Power BI and/or Tableau.
• Strong understanding of data warehousing and ETL/ELT concepts.
• Good problem-solving skills and the ability to work in cross-functional teams.

Nice To Have
• Experience with cloud data platforms (Azure, AWS, or GCP).
• Familiarity with CI/CD pipelines and version control tools (e.g., Git).
• Understanding of data governance, security, and compliance standards.
• Exposure to data lake architectures and real-time streaming data pipelines.
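As an illustration only (not part of the listing), here is a minimal PySpark and Spark SQL sketch of the kind of pipeline this role describes: ingest raw data, transform it with a Spark SQL query, and publish a curated table for BI tools such as Power BI or Tableau. All paths, table names and columns are placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

# Ingest raw data (placeholder path and schema)
raw = spark.read.option("header", True).csv("/mnt/raw/orders.csv")
raw.createOrReplaceTempView("raw_orders")

# Transform with Spark SQL: aggregate daily revenue per order date
daily = spark.sql("""
    SELECT order_date,
           COUNT(DISTINCT order_id) AS orders,
           SUM(amount)              AS total_revenue
    FROM raw_orders
    GROUP BY order_date
""")

# Publish a curated, partitioned table for downstream reporting
daily.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/daily_revenue")
```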

Posted 5 days ago

Apply

10.0 - 15.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Source: LinkedIn

Excited to stay in the IT software product space and work with a team building cutting-edge products at the intersection of GenAI and data transformation? Our client is seeking a Data Management Lead to join their R&D team in Kolkata. The ideal candidate will have 10-15 years of experience in data management and will lead efforts to develop and maintain the product's data management modules.

Key Responsibilities:
• Drive and deliver the design, development, and deployment of the product's data management and storage modules.
• Oversee data architecture and data integration processes across modules.
• Enable real-time and batch ETL and ELT processes, ensuring data quality, performance, and optimal use of appropriate storage technologies.
• Ensure the platform's data management and storage modules enable customers to follow their data governance principles and best practices, including authorization, logging, and data security.
• Ensure the platform has consistent and secure data access controls, data masking, and encryption processes.
• Establish and develop/integrate a central technical and business metadata layer leveraging modern knowledge-graph and AI-based technologies.
• Lead a team of developers to deliver the above capabilities and features in a matrix environment.
• Partner with product management to ensure alignment with market and customer trends.
• Partner with QA to deliver industry-standard software quality.

Qualifications:
• Bachelor's or master's degree in computer science or a related field (BTech preferred).
• 10-15 years of experience in data management, with at least 10 years in a similar role.
• Proven experience leading teams and projects in data management.
• Strong proficiency in big data technologies (e.g., Spark, Kafka), SQL, and data warehousing solutions.
• Experience with ETL and ELT tools and processes.
• Proficiency in data modelling and data warehouse / data lake / data lakehouse concepts.
• Knowledge and experience of data quality concepts and metadata management.
• Understanding of data governance principles and regulations such as GDPR and HIPAA.
• Good understanding of product lifecycle and roadmap planning.
• Experience with large product or services companies is preferred.
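For context, a small hedged sketch of the column-masking responsibility mentioned above, written in PySpark since the role lists Spark; the table and column names are assumptions, not details from the listing.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pii_masking").getOrCreate()

# Placeholder source table containing personally identifiable columns
customers = spark.read.parquet("/data/staging/customers")

masked = (customers
          .withColumn("email_hash", F.sha2(F.col("email"), 256))  # one-way hash keeps joinability
          .withColumn("phone", F.lit("***MASKED***"))             # redact the raw value outright
          .drop("email"))

# Write the masked copy for downstream consumers
masked.write.mode("overwrite").parquet("/data/curated/customers_masked")
```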

Posted 5 days ago

Apply

12.0 - 18.0 years

0 Lacs

Tamil Nadu, India

Remote

Source: LinkedIn

Join us as we work to create a thriving ecosystem that delivers accessible, high-quality, and sustainable healthcare for all. This position requires expertise in designing, developing, debugging, and maintaining AI-powered applications and data engineering workflows for both local and cloud environments. The role involves working on large-scale projects, optimizing AI/ML pipelines, and ensuring scalable data infrastructure. As a PMTS, you will be responsible for integrating Generative AI (GenAI) capabilities, building data pipelines for AI model training, and deploying scalable AI-powered microservices. You will collaborate with AI/ML, Data Engineering, DevOps, and Product teams to deliver impactful solutions that enhance our products and services. Additionally, it would be desirable if the candidate has experience in retrieval-augmented generation (RAG), fine-tuning pre-trained LLMs, AI model evaluation, data pipeline automation, and optimizing cloud-based AI deployments. Responsibilities AI-Powered Software Development & API Integration Develop AI-driven applications, microservices, and automation workflows using FastAPI, Flask, or Django, ensuring cloud-native deployment and performance optimization. Integrate OpenAI APIs (GPT models, Embeddings, Function Calling) and Retrieval-Augmented Generation (RAG) techniques to enhance AI-powered document retrieval, classification, and decision-making. Data Engineering & AI Model Performance Optimization Design, build, and optimize scalable data pipelines for AI/ML workflows using Pandas, PySpark, and Dask, integrating data sources such as Kafka, AWS S3, Azure Data Lake, and Snowflake. Enhance AI model inference efficiency by implementing vector retrieval using FAISS, Pinecone, or ChromaDB, and optimize API latency with tuning techniques (temperature, top-k sampling, max tokens settings). Microservices, APIs & Security Develop scalable RESTful APIs for AI models and data services, ensuring integration with internal and external systems while securing API endpoints using OAuth, JWT, and API Key Authentication. Implement AI-powered logging, observability, and monitoring to track data pipelines, model drift, and inference accuracy, ensuring compliance with AI governance and security best practices. AI & Data Engineering Collaboration Work with AI/ML, Data Engineering, and DevOps teams to optimize AI model deployments, data pipelines, and real-time/batch processing for AI-driven solutions. Engage in Agile ceremonies, backlog refinement, and collaborative problem-solving to scale AI-powered workflows in areas like fraud detection, claims processing, and intelligent automation. Cross-Functional Coordination and Communication Collaborate with Product, UX, and Compliance teams to align AI-powered features with user needs, security policies, and regulatory frameworks (HIPAA, GDPR, SOC2). Ensure seamless integration of structured and unstructured data sources (SQL, NoSQL, vector databases) to improve AI model accuracy and retrieval efficiency. Mentorship & Knowledge Sharing Mentor junior engineers on AI model integration, API development, and scalable data engineering best practices, and conduct knowledge-sharing sessions. Education & Experience Required 12-18 years of experience in software engineering or AI/ML development, preferably in AI-driven solutions. Hands-on experience with Agile development, SDLC, CI/CD pipelines, and AI model deployment lifecycles. 
Bachelor’s Degree or equivalent in Computer Science, Engineering, Data Science, or a related field. Proficiency in full-stack development with expertise in Python (preferred for AI), Java Experience with structured & unstructured data: SQL (PostgreSQL, MySQL, SQL Server) NoSQL (OpenSearch, Redis, Elasticsearch) Vector Databases (FAISS, Pinecone, ChromaDB) Cloud & AI Infrastructure AWS: Lambda, SageMaker, ECS, S3 Azure: Azure OpenAI, ML Studio GenAI Frameworks & Tools: OpenAI API, Hugging Face Transformers, LangChain, LlamaIndex, AutoGPT, CrewAI. Experience in LLM deployment, retrieval-augmented generation (RAG), and AI search optimization. Proficiency in AI model evaluation (BLEU, ROUGE, BERT Score, cosine similarity) and responsible AI deployment. Strong problem-solving skills, AI ethics awareness, and the ability to collaborate across AI, DevOps, and data engineering teams. Curiosity and eagerness to explore new AI models, tools, and best practices for scalable GenAI adoption. About Athenahealth Here’s our vision: To create a thriving ecosystem that delivers accessible, high-quality, and sustainable healthcare for all. What’s unique about our locations? From an historic, 19th century arsenal to a converted, landmark power plant, all of athenahealth’s offices were carefully chosen to represent our innovative spirit and promote the most positive and productive work environment for our teams. Our 10 offices across the United States and India — plus numerous remote employees — all work to modernize the healthcare experience, together. Our Company Culture Might Be Our Best Feature. We don't take ourselves too seriously. But our work? That’s another story. athenahealth develops and implements products and services that support US healthcare: It’s our chance to create healthier futures for ourselves, for our family and friends, for everyone. Our vibrant and talented employees — or athenistas, as we call ourselves — spark the innovation and passion needed to accomplish our goal. We continue to expand our workforce with amazing people who bring diverse backgrounds, experiences, and perspectives at every level, and foster an environment where every athenista feels comfortable bringing their best selves to work. Our size makes a difference, too: We are small enough that your individual contributions will stand out — but large enough to grow your career with our resources and established business stability. Giving back is integral to our culture. Our athenaGives platform strives to support food security, expand access to high-quality healthcare for all, and support STEM education to develop providers and technologists who will provide access to high-quality healthcare for all in the future. As part of the evolution of athenahealth’s Corporate Social Responsibility (CSR) program, we’ve selected nonprofit partners that align with our purpose and let us foster long-term partnerships for charitable giving, employee volunteerism, insight sharing, collaboration, and cross-team engagement. What can we do for you? Along with health and financial benefits, athenistas enjoy perks specific to each location, including commuter support, employee assistance programs, tuition assistance, employee resource groups, and collaborative workspaces — some offices even welcome dogs. In addition to our traditional benefits and perks, we sponsor events throughout the year, including book clubs, external speakers, and hackathons. 
And we provide athenistas with a company culture based on learning, the support of an engaged team, and an inclusive environment where all employees are valued. We also encourage a better work-life balance for athenistas with our flexibility. While we know in-office collaboration is critical to our vision, we recognize that not all work needs to be done within an office environment, full-time. With consistent communication and digital collaboration tools, athenahealth enables employees to find a balance that feels fulfilling and productive for each individual situation.
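The listing above mentions retrieval-augmented generation with vector stores such as FAISS, Pinecone or ChromaDB. Purely as a hedged illustration, here is a minimal FAISS similarity-search sketch; the embedding dimension and the random vectors are stand-ins for real embeddings produced by a model.

```python
import faiss
import numpy as np

dim = 384                                                   # assumed embedding size
doc_vectors = np.random.rand(1000, dim).astype("float32")   # stand-in document embeddings

index = faiss.IndexFlatIP(dim)          # inner-product index (cosine-like after normalization)
faiss.normalize_L2(doc_vectors)
index.add(doc_vectors)

query = np.random.rand(1, dim).astype("float32")            # stand-in query embedding
faiss.normalize_L2(query)
scores, ids = index.search(query, 5)                        # retrieve the 5 most similar documents
print(ids[0], scores[0])
```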

Posted 5 days ago

Apply

0 years

0 Lacs

India

On-site

Source: LinkedIn

Unessa Foundation is a purpose-driven movement uplifting underserved communities through unity, compassion, and empowerment. Rooted in empathy and inclusion, we strive to close opportunity gaps, restore dignity, and spark lasting change—one life at a time. Join us in shaping a more just, connected world.

The Role
As a Campaign Ambassador at Unessa Foundation, you will support our outreach efforts by helping spread awareness and contributing to ongoing fundraising activities through your personal and social networks. You will receive guidance on how to represent the cause effectively and will have the opportunity to be part of a mission-driven campaign.

Ideal Profile
• You have a working knowledge of fundraising and outreach, with strong communication and persuasion skills
• You are a strong networker and relationship builder
• You pay strong attention to detail and deliver work of a high standard
• You enjoy finding creative solutions to problems

What's on Offer?
• Opportunity to make a positive impact
• Flexible working options
• Work alongside and learn from best-in-class talent

Posted 5 days ago

Apply

7.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Source: LinkedIn

ECI is the leading global provider of managed services, cybersecurity, and business transformation for mid-market financial services organizations across the globe. From its unmatched range of services, ECI provides stability, security and improved business performance, freeing clients from technology concerns and enabling them to focus on running their businesses. More than 1,000 customers worldwide with over $3 trillion of assets under management put their trust in ECI. At ECI, we believe success is driven by passion and purpose. Our passion for technology is only surpassed by our commitment to empowering our employees around the world.

The Opportunity: ECI has an exciting opportunity for a Cloud Data Engineer. This full-time position is open to an experienced Senior Data Engineer who will support several of our clients' systems. Client satisfaction is our primary objective; all available positions are customer facing, requiring excellent communication and people skills. A positive attitude, rigorous work habits and professionalism in the workplace are a must. Fluency in English, both written and verbal, is required. This is an onsite role.

What you will do:
• A senior cloud data engineer with 7+ years of experience.
• Strong knowledge and hands-on experience with Azure data services such as Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, Azure Data Lake, Logic Apps, Apache Spark, Snowflake data warehouse, and Azure Fabric. Good to have: Azure Databricks, Azure Cosmos DB, Azure AI, etc.
• Must have experience in developing cloud-based applications; should be able to analyze problems and provide solutions.
• Experience in designing, implementing, and managing data warehouse solutions using Azure Synapse Analytics or similar technologies.
• Experience in migrating data from on-premises to the cloud.
• Proficiency in data modeling techniques and experience in designing and implementing complex data models.
• Experience in designing and developing ETL/ELT processes to move data between systems and transform data for analytics.
• Strong programming skills in languages such as SQL, Python, or Scala, with experience in developing and maintaining data pipelines.
• Experience in at least one reporting tool such as Power BI or Tableau.
• Ability to work effectively in a team environment and communicate complex technical concepts to non-technical stakeholders.
• Experience in managing and optimizing databases, including performance tuning, troubleshooting, and capacity planning.
• Understand business requirements and convert them into technical designs for implementation; perform analysis, then develop and test code.
• Design and develop cloud-based applications using Python on a serverless framework.
• Strong communication, analytical, and troubleshooting skills.
• Create, maintain and enhance applications.
• Work independently as an individual contributor with minimal or no help.
• Follow Agile methodology (SCRUM).

Who you are:
• Experience in developing cloud-based data applications.
• Hands-on experience with Azure data services, data warehousing, ETL, etc.
• Understanding of cloud architecture principles and best practices, including scalability, high availability, disaster recovery, and cost optimization, with a focus on designing data solutions for the cloud.
• Experience in developing pipelines using ADF and Synapse.
• Hands-on experience in migrating data from on-premises to the cloud.
• Strong experience in writing complex SQL scripts and transformations.
• Able to analyze problems and provide solutions.
• Knowledge of CI/CD pipelines is a plus.
• Knowledge of Python and API Gateway is an added advantage.

Bonus (nice to have): Product Management/BA experience.

ECI's culture is all about connection - connection with our clients, our technology and most importantly with each other. In addition to working with an amazing team around the world, ECI also offers a competitive compensation package and so much more! If you believe you would be a great fit and are ready for your best job ever, we would like to hear from you!

Love Your Job, Share Your Technology Passion, Create Your Future Here!
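As a rough, non-authoritative sketch of the on-premises-to-cloud migration work described above: read a SQL Server table over JDBC with Spark and land it as Parquet in a data lake. The connection string, credentials and storage path are placeholders, and the appropriate JDBC driver is assumed to be available on the cluster.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("onprem_to_cloud").getOrCreate()

# Read from the on-premises database (placeholder host, table and credentials)
orders = (spark.read.format("jdbc")
          .option("url", "jdbc:sqlserver://onprem-host:1433;databaseName=sales")
          .option("dbtable", "dbo.orders")
          .option("user", "etl_user")
          .option("password", "***")
          .load())

# Land the extract in cloud storage as Parquet (illustrative ADLS path)
(orders.write
       .mode("overwrite")
       .parquet("abfss://raw@examplelake.dfs.core.windows.net/sales/orders"))
```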

Posted 5 days ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Pune

Work from Office

Source: Naukri

What You'll Do
Avalara, Inc. is the leading provider of cloud-based software that delivers a broad array of compliance solutions related to sales tax and other transactional taxes. We are building cloud-based tax compliance solutions to handle every transaction in the world. Every transaction you make, physical or digital, has a unique and nuanced tax calculation that accompanies it. We do those, and we want to do all of them. Avalara is building the global cloud compliance platform, and the Build and Deployment Tooling Team enables the development of this platform. Our engineering teams are diverse in their engineering practices, culture, and background. We create the systems that allow them to produce quality products at an increasing pace. As a member of the team, you will take part in architecting the tooling that lowers the barriers for development. You will report to the Manager, Site Reliability Engineering.

This might be a good fit for you if:
• Helping people do their best resonates with you
• you love platform engineering
• you want to build cool things with cool people
• you love building high-impact tools and software which everyone depends on
• you love automating everything!

What Your Responsibilities Will Be
Some areas of work are:
• Create tools that smooth the journey from idea to running in production
• Learn and promote best practices related to the build, test and deployment of software

What You'll Need to be Successful
Qualifications
• Software Engineering: Understand software engineering fundamentals and have experience developing software among a team of engineers, including testing practice.
• Build Automation: Experience getting artifacts in many languages packaged and tested so that they can be trusted to go into production. Automatically.
• Release Automation: Experience getting artifacts running in production. Automatically.
• Observability: Experience developing service level indicators and goals, instrumenting software, and building meaningful alerts.
• Troubleshooting: Experience tracking down technical causes of distributed software issues.
• Containers/Container Orchestration Systems: An understanding of how to manage container-based systems, especially on Kubernetes.
• Artificial Intelligence: A grounding in the infrastructure for, and the use of, agentic systems.
• Infrastructure-as-Code: Experience deploying and maintaining infrastructure-as-code tools such as Terraform and Pulumi.
• Technical Writing: We will need to build documentation and diagrams for other engineering teams; write technical documents that people love and adore.
• Customer Satisfaction: Experience ensuring that code meets all functionality and acceptance criteria for customer satisfaction (our customers are other engineering teams and Avalara customers).
• GO: Our tooling is developed in GO.
• Distributed Computing: Experience architecting distributed services across regions and clouds.
• GitLab: Experience working with, managing, and deploying.
• Artifactory: Experience working with, managing, and deploying.
• Open Source: Build side projects or contribute to other open-source projects.

Experience
• Minimum 8 years of experience in a SaaS environment
• Bachelor's degree in computer science or equivalent
• Participate in an on-call rotation
• Experience with a data warehouse like Snowflake, Redshift, or Spark

Posted 5 days ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Position Overview: We are seeking a skilled Big Data Developer to join our growing delivery team, with a dual focus on hands-on project support and mentoring junior engineers. This role is ideal for a developer who not only thrives in a technical, fast-paced environment but is also passionate about coaching and developing the next generation of talent. You will work on live client projects, provide technical support, contribute to solution delivery, and serve as a go-to technical mentor for less experienced team members.

Key Responsibilities:
• Perform hands-on Big Data development work, including coding, testing, troubleshooting, and deploying solutions.
• Support ongoing client projects, addressing technical challenges and ensuring smooth delivery.
• Collaborate with junior engineers to guide them on coding standards, best practices, debugging, and project execution.
• Review code and provide feedback to junior engineers to maintain high-quality and scalable solutions.
• Assist in designing and implementing solutions using Hadoop, Spark, Hive, HDFS, and Kafka.
• Lead by example in object-oriented development, particularly using Scala and Java.
• Translate complex requirements into clear, actionable technical tasks for the team.
• Contribute to the development of ETL processes for integrating data from various sources.
• Document technical approaches, best practices, and workflows for knowledge sharing within the team.

Required Skills and Qualifications:
• 8+ years of professional experience in Big Data development and engineering.
• Strong hands-on expertise with Hadoop, Hive, HDFS, Apache Spark, and Kafka.
• Solid object-oriented development experience with Scala and Java.
• Strong SQL skills with experience working with large data sets.
• Practical experience designing, installing, configuring, and supporting Big Data clusters.
• Deep understanding of ETL processes and data integration strategies.
• Proven experience mentoring or supporting junior engineers in a team setting.
• Strong problem-solving, troubleshooting, and analytical skills.
• Excellent communication and interpersonal skills.

Preferred Qualifications:
• Professional certifications in Big Data technologies (Cloudera, Databricks, AWS Big Data Specialty, etc.).
• Experience with cloud Big Data platforms (AWS EMR, Azure HDInsight, or GCP Dataproc).
• Exposure to Agile or DevOps practices in Big Data project environments.

What We Offer:
• Opportunity to work on challenging, high-impact Big Data projects.
• Leadership role in shaping and mentoring the next generation of engineers.
• Supportive and collaborative team culture.
• Flexible working environment.
• Competitive compensation and professional growth opportunities.
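Since the stack above centres on Spark and Kafka, here is a minimal, assumed sketch of the usual ingestion pattern: Spark Structured Streaming reading a Kafka topic and writing it to a landing area. The broker address, topic and paths are invented, and the spark-sql-kafka connector is assumed to be available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka_ingest").getOrCreate()

# Subscribe to a Kafka topic (placeholder broker and topic)
events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "clickstream")
          .load())

# Kafka delivers binary key/value columns; cast them to strings for downstream parsing
parsed = events.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    "timestamp")

# Append micro-batches to a landing zone, with checkpointing for fault tolerance
query = (parsed.writeStream
         .format("parquet")
         .option("path", "/data/bronze/clickstream")
         .option("checkpointLocation", "/chk/clickstream")
         .trigger(processingTime="1 minute")
         .start())
query.awaitTermination()
```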

Posted 5 days ago

Apply

15.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Source: LinkedIn

Your Team Responsibilities
MSCI has an immediate opening in one of our fastest growing product lines. As a Lead Architect within Sustainability and Climate, you are an integral part of a team that works to develop high-quality architecture solutions for various software applications on modern cloud-based technologies. As a core technical contributor, you are responsible for conducting critical architecture solutions across multiple technical areas to support project goals. The systems under your responsibility will be amongst the most mission critical systems of MSCI. They require strong technology expertise and a strong sense of enterprise system design, state-of-the-art scalability and reliability, but also innovation. Your ability to take technology decisions in a consistent framework to support the growth of our company and products, lead the various software implementations in close partnership with global leaders and multiple product organizations, and drive technology innovation will be the key measures of your success in our dynamic and rapidly growing environment. At MSCI, you will be operating in a culture where we value merit and track record. You will own the full life-cycle of the technology services and provide management, technical and people leadership in the design, development, quality assurance and maintenance of our production systems, making sure we continue to scale our great franchise.

Your Skills And Experience That Will Help You Excel
• Prior senior software architecture roles
• Demonstrated proficiency in programming languages such as Python/Java/Scala and knowledge of SQL and NoSQL databases
• Drive the development of conceptual, logical, and physical data models aligned with business requirements
• Lead the implementation and optimization of data technologies, including Apache Spark
• Experience with one of the table formats, such as Delta or Iceberg
• Strong hands-on experience in data architecture, database design, and data modeling
• Proven experience as a Data Platform Architect or in a similar role, with expertise in Airflow, Databricks, Snowflake, Collibra, and Dremio
• Experience with cloud platforms such as AWS, Azure, or Google Cloud
• Ability to dive into details; a hands-on technologist with strong core computer science fundamentals
• Strong preference for financial services experience
• Proven leadership of large-scale distributed software teams that have delivered great products on deadline
• Experience in a modern iterative software development methodology
• Experience with globally distributed teams and business partners
• Experience in building and maintaining applications that are mission critical for customers
• M.S. in Computer Science, Management Information Systems or a related engineering field
• 15+ years of software engineering experience
• Demonstrated consensus builder and collegial peer

About MSCI
What we offer you
• Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
• Flexible working arrangements, advanced technology, and collaborative workspaces.
• A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
• A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
• Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
• Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
• We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com
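The role above asks for experience with table formats such as Delta or Iceberg. As a hedged sketch only, this shows a Delta Lake write followed by a time-travel read in PySpark; it assumes the delta-spark package is installed and uses throwaway paths and data.

```python
from pyspark.sql import SparkSession

# Delta requires these two session configs (and the delta-spark package on the classpath)
spark = (SparkSession.builder.appName("delta_demo")
         .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
         .config("spark.sql.catalog.spark_catalog",
                 "org.apache.spark.sql.delta.catalog.DeltaCatalog")
         .getOrCreate())

df = spark.createDataFrame([(1, "AAPL"), (2, "MSFT")], ["id", "ticker"])
df.write.format("delta").mode("overwrite").save("/tmp/holdings")   # version 0 of the table

# Time travel: read the table as of an earlier version
v0 = spark.read.format("delta").option("versionAsOf", 0).load("/tmp/holdings")
v0.show()
```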

Posted 5 days ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Company Overview Viraaj HR Solutions is a dynamic recruitment agency dedicated to connecting talented individuals with leading organizations across India. We focus on delivering exceptional talent solutions while fostering a culture of collaboration, integrity, and excellence. Our mission is to empower businesses by providing them with the right skill sets to succeed in their industry, while promoting growth and enhancing employee satisfaction. We pride ourselves on our professional approach to hiring and talent management. Job Title: AWS Data Engineer Location: On-site in India We are seeking a skilled AWS Data Engineer to join our team. In this role, you will be responsible for designing and implementing data solutions on AWS infrastructure. You will work closely with cross-functional teams to ensure data integrity and availability, while also developing and optimizing data pipelines to support analytics and reporting needs. Role Responsibilities Design, develop and implement scalable data architectures on AWS. Develop ETL processes for data ingestion from various sources. Optimize AWS data pipelines for performance and cost efficiency. Collaborate with data analysts to understand data requirements. Create and maintain data models for efficient query performance. Conduct data quality assessments and implement necessary improvements. Implement data security and compliance measures on AWS. Maintain documentation for all systems and processes. Monitor and troubleshoot data-related issues on AWS. Integrate third-party services and APIs for data retrieval. Train and mentor junior data engineers. Participate in code reviews and maintain software engineering best practices. Stay up-to-date with AWS features and best practices in data engineering. Contribute to refining data engineering standards and processes. Support migration of legacy data systems to AWS. Qualifications Bachelor’s degree in Computer Science or related field. 3+ years of experience in data engineering or similar roles. Extensive experience with AWS services (S3, Redshift, Lambda, Glue). Strong proficiency in SQL and relational databases. Experience with Python and/or Java for data processing. Knowledge of Big Data technologies like Hadoop or Spark. Familiarity with data warehousing concepts and design. Experience in data modeling and database design. Strong analytical and problem-solving skills. Ability to work collaboratively in a team environment. Excellent communication skills, both verbal and written. Experience with analytics tools and reporting. Understanding of data privacy and security regulations. Hands-on experience with CI/CD processes. Ability to adapt to fast-paced environments. Certifications in AWS (e.g., AWS Certified Data Analytics) are a plus. Skills: data modeling,java,data analysis,scala,aws,ci/cd,python,hadoop,sql proficiency,problem solving,data engineering,sql,big data technologies,data warehousing,etl,spark Show more Show less
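To make the AWS pipeline work above concrete, here is an illustrative PySpark job of the sort that might run on Glue or EMR: read raw JSON from S3, de-duplicate, and write partitioned Parquet back to S3. The bucket names and the event_id key column are assumptions, not details from the listing.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3_etl").getOrCreate()

# Read raw events from S3 (placeholder bucket; s3:// paths resolve on Glue/EMR)
raw = spark.read.json("s3://example-raw-bucket/events/")

curated = (raw
           .dropDuplicates(["event_id"])             # assumed business key
           .withColumn("load_date", F.current_date()))

# Write the curated layer back to S3, partitioned for efficient querying
(curated.write
        .mode("append")
        .partitionBy("load_date")
        .parquet("s3://example-curated-bucket/events/"))
```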

Posted 5 days ago

Apply

5.0 - 9.0 years

19 - 23 Lacs

Mumbai

Work from Office

Source: Naukri

Overview
MSCI has an immediate opening in one of our fastest growing product lines. As a Lead Architect within Sustainability and Climate, you are an integral part of a team that works to develop high-quality architecture solutions for various software applications on modern cloud-based technologies. As a core technical contributor, you are responsible for conducting critical architecture solutions across multiple technical areas to support project goals. The systems under your responsibility will be amongst the most mission critical systems of MSCI. They require strong technology expertise and a strong sense of enterprise system design, state-of-the-art scalability and reliability, but also innovation. Your ability to take technology decisions in a consistent framework to support the growth of our company and products, lead the various software implementations in close partnership with global leaders and multiple product organizations, and drive technology innovation will be the key measures of your success in our dynamic and rapidly growing environment. At MSCI, you will be operating in a culture where we value merit and track record. You will own the full life-cycle of the technology services and provide management, technical and people leadership in the design, development, quality assurance and maintenance of our production systems, making sure we continue to scale our great franchise.

Responsibilities
• Engage technical teams and business stakeholders to discuss and propose technical approaches to meet current and future needs
• Define the technical target state of the product and drive achievement of the strategy
• As the Lead Architect, lead the design, development, and maintenance of our data architecture, ensuring scalability, efficiency, and reliability
• Create and maintain comprehensive documentation for the architecture, processes, and best practices, including Architecture Decision Records (ADRs)
• Evaluate recommendations and provide feedback on new technologies
• Develop secure and high-quality production code, and review and debug code written by others
• Identify opportunities to eliminate or automate remediation of recurring issues to improve the overall operational stability of software applications and systems
• Collaborate with a cross-functional team to draft, implement and adapt the overall architecture of our products and support infrastructure, in conjunction with software development managers and product management teams
• Stay abreast of new technologies and issues in the software-as-a-service industry, including current technologies, platforms, standards and methodologies
• Be actively engaged in setting technology standards that impact the company and its offerings
• Ensure the knowledge sharing of engineering best practices across departments, and develop and monitor technical standards to ensure adherence to them

Qualifications
• Prior senior software architecture roles
• Demonstrated proficiency in programming languages such as Python/Java/Scala and knowledge of SQL and NoSQL databases
• Drive the development of conceptual, logical, and physical data models aligned with business requirements
• Lead the implementation and optimization of data technologies, including Apache Spark
• Experience with one of the table formats, such as Delta or Iceberg
• Strong hands-on experience in data architecture, database design, and data modeling
• Proven experience as a Data Platform Architect or in a similar role, with expertise in Airflow, Databricks, Snowflake, Collibra, and Dremio
• Experience with cloud platforms such as AWS, Azure, or Google Cloud
• Ability to dive into details; a hands-on technologist with strong core computer science fundamentals
• Strong preference for financial services experience
• Proven leadership of large-scale distributed software teams that have delivered great products on deadline
• Experience in a modern iterative software development methodology
• Experience with globally distributed teams and business partners
• Experience in building and maintaining applications that are mission critical for customers
• M.S. in Computer Science, Management Information Systems or a related engineering field
• 15+ years of software engineering experience
• Demonstrated consensus builder and collegial peer

What we offer you
• Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
• Flexible working arrangements, advanced technology, and collaborative workspaces.
• A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
• A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
• Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
• Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
• We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities.
If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 5 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Company Overview Viraaj HR Solutions is a dynamic recruitment agency dedicated to connecting talented individuals with leading organizations across India. We focus on delivering exceptional talent solutions while fostering a culture of collaboration, integrity, and excellence. Our mission is to empower businesses by providing them with the right skill sets to succeed in their industry, while promoting growth and enhancing employee satisfaction. We pride ourselves on our professional approach to hiring and talent management. Job Title: AWS Data Engineer Location: On-site in India We are seeking a skilled AWS Data Engineer to join our team. In this role, you will be responsible for designing and implementing data solutions on AWS infrastructure. You will work closely with cross-functional teams to ensure data integrity and availability, while also developing and optimizing data pipelines to support analytics and reporting needs. Role Responsibilities Design, develop and implement scalable data architectures on AWS. Develop ETL processes for data ingestion from various sources. Optimize AWS data pipelines for performance and cost efficiency. Collaborate with data analysts to understand data requirements. Create and maintain data models for efficient query performance. Conduct data quality assessments and implement necessary improvements. Implement data security and compliance measures on AWS. Maintain documentation for all systems and processes. Monitor and troubleshoot data-related issues on AWS. Integrate third-party services and APIs for data retrieval. Train and mentor junior data engineers. Participate in code reviews and maintain software engineering best practices. Stay up-to-date with AWS features and best practices in data engineering. Contribute to refining data engineering standards and processes. Support migration of legacy data systems to AWS. Qualifications Bachelor’s degree in Computer Science or related field. 3+ years of experience in data engineering or similar roles. Extensive experience with AWS services (S3, Redshift, Lambda, Glue). Strong proficiency in SQL and relational databases. Experience with Python and/or Java for data processing. Knowledge of Big Data technologies like Hadoop or Spark. Familiarity with data warehousing concepts and design. Experience in data modeling and database design. Strong analytical and problem-solving skills. Ability to work collaboratively in a team environment. Excellent communication skills, both verbal and written. Experience with analytics tools and reporting. Understanding of data privacy and security regulations. Hands-on experience with CI/CD processes. Ability to adapt to fast-paced environments. Certifications in AWS (e.g., AWS Certified Data Analytics) are a plus. Skills: data modeling,java,data analysis,scala,aws,ci/cd,python,hadoop,sql proficiency,problem solving,data engineering,sql,big data technologies,data warehousing,etl,spark Show more Show less

Posted 5 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Company Overview Viraaj HR Solutions is a dynamic recruitment agency dedicated to connecting talented individuals with leading organizations across India. We focus on delivering exceptional talent solutions while fostering a culture of collaboration, integrity, and excellence. Our mission is to empower businesses by providing them with the right skill sets to succeed in their industry, while promoting growth and enhancing employee satisfaction. We pride ourselves on our professional approach to hiring and talent management. Job Title: AWS Data Engineer Location: On-site in India We are seeking a skilled AWS Data Engineer to join our team. In this role, you will be responsible for designing and implementing data solutions on AWS infrastructure. You will work closely with cross-functional teams to ensure data integrity and availability, while also developing and optimizing data pipelines to support analytics and reporting needs. Role Responsibilities Design, develop and implement scalable data architectures on AWS. Develop ETL processes for data ingestion from various sources. Optimize AWS data pipelines for performance and cost efficiency. Collaborate with data analysts to understand data requirements. Create and maintain data models for efficient query performance. Conduct data quality assessments and implement necessary improvements. Implement data security and compliance measures on AWS. Maintain documentation for all systems and processes. Monitor and troubleshoot data-related issues on AWS. Integrate third-party services and APIs for data retrieval. Train and mentor junior data engineers. Participate in code reviews and maintain software engineering best practices. Stay up-to-date with AWS features and best practices in data engineering. Contribute to refining data engineering standards and processes. Support migration of legacy data systems to AWS. Qualifications Bachelor’s degree in Computer Science or related field. 3+ years of experience in data engineering or similar roles. Extensive experience with AWS services (S3, Redshift, Lambda, Glue). Strong proficiency in SQL and relational databases. Experience with Python and/or Java for data processing. Knowledge of Big Data technologies like Hadoop or Spark. Familiarity with data warehousing concepts and design. Experience in data modeling and database design. Strong analytical and problem-solving skills. Ability to work collaboratively in a team environment. Excellent communication skills, both verbal and written. Experience with analytics tools and reporting. Understanding of data privacy and security regulations. Hands-on experience with CI/CD processes. Ability to adapt to fast-paced environments. Certifications in AWS (e.g., AWS Certified Data Analytics) are a plus. Skills: data modeling,java,data analysis,scala,aws,ci/cd,python,hadoop,sql proficiency,problem solving,data engineering,sql,big data technologies,data warehousing,etl,spark Show more Show less

Posted 5 days ago

Apply

5.0 - 7.0 years

18 - 20 Lacs

Pune

Work from Office

Source: Naukri

Required skills
• Experience in the Data Warehousing domain
• Mandatory: experience and deep knowledge of Spark, Kafka, open table formats, NoSQL and SQL databases
• Experience in Kubernetes
• Hands-on experience in Java, microservices and K8s
• Should have hands-on coding experience

Considered a plus:
• Monitoring tools like Prometheus and Grafana
• Working with Copilot, writing prompts to enhance efficiency during development

Posted 5 days ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

We're Hiring: Coding Teachers for K-12 at PopSkool

Are you passionate about teaching kids how to code? Do you believe that platforms like Python, Scratch, Minecraft, and Roblox can spark creativity, problem-solving, and real-world skills in young minds? Join PopSkool, an innovative online learning platform bringing playful, hands-on education to students around the world.

In addition to using Easy Apply on LinkedIn, please fill in this form: https://forms.gle/E6PsdBiwAVvSjG9n7

What We're Looking For:
• Experience teaching Python, Scratch, and other programming languages to K-12 students
• Passion for working with kids and making learning fun and engaging
• Willingness to upskill in Minecraft and Roblox as teaching tools
• Strong communication skills with excellent spoken English
• Prior online teaching experience is a plus
• Must be based in India and available to teach during US time zones
• Laptop with minimum 4GB RAM and a graphics card required

Why Join Us:
• Teach students globally with 100% remote, flexible hours
• Earn ₹600/hour for group classes and ₹450/hour for 1:1 sessions, with performance-based bonuses
• Join a fast-growing edtech platform with global partnerships
• Co-create classes and curriculum with autonomy and creativity
• Get access to training, support, and a vibrant educator community

We're on a mission to reimagine how coding and technology are taught—and we're looking for educators who are excited to be part of that journey.

#Hiring #CodingTeacher #Python #Scratch #OnlineTeaching #RemoteJobs #EdTech #Minecraft #Roblox #PopSkool

Posted 5 days ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

Lead the movement for mental clarity, reflection, and purpose-driven student life. Are you someone who believes that clarity, consistency, and emotional strength should matter just as much as grades and placements? Do you want to spark meaningful conversations on mental fitness and goal alignment within your college community? Then E-Journaling invites you to be part of something transformative, join us as a Campus Ambassador. As part of India’s first structured journaling initiative at the national level, backed by scientific research, we are building a network of Student Reflection Leaders committed to promoting emotional well-being and purposeful living. About the Program : The E-Journaling Campus Ambassador Program is a certified, remote leadership opportunity for college students who want to champion mindful habits, personal growth, and clarity on campus. As a Campus Ambassador, you will: Promote mental clarity and emotional strength through online and on-campus engagement Organize journaling challenges, reflection circles, and awareness campaigns Inspire peers to explore the power of writing as a tool for self-discovery and life design What’s in It for You? Certificate of Excellence Resume-worthy title: “Student Reflection Lead” Free access to the E-Journaling 1-on-1 mentorship from mental wellness professionals Be featured in our national magazine Join a growing network of growth-minded student leaders across India Your Role at a Glance Share journaling challenges/prompts on social and WhatsApp groups Encourage students to begin their reflection journey Organize one journaling circle or activity at your campus Submit one short journal story or feedback during your term We’ll support you every step of the way, with ready-made templates, training, design assets, and personal guidance. Apply Now: This is your chance to inspire real change, starting with your campus. Selected students will be contacted via email for onboarding. Because the students who reflect today... are the leaders who build tomorrow. For any inquiries, please send us a message or contact us at hr@ejournaling.com. Show more Show less

Posted 5 days ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

The Applications Development Senior Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. Responsibilities: Conduct tasks related to feasibility studies, time and cost estimates, IT planning, risk technology, applications development, model development, and establish and implement new or revised applications systems and programs to meet specific business needs or user areas Monitor and control all phases of development process and analysis, design, construction, testing, and implementation as well as provide user and operational support on applications to business users Utilize in-depth specialty knowledge of applications development to analyze complex problems/issues, provide evaluation of business process, system process, and industry standards, and make evaluative judgement Recommend and develop security measures in post implementation analysis of business usage to ensure successful system design and functionality Consult with users/clients and other technology groups on issues, recommend advanced programming solutions, and install and assist customer exposure systems Ensure essential procedures are followed and help define operating standards and processes Serve as advisor or coach to new or lower level analysts Has the ability to operate with a limited level of direct supervision. Can exercise independence of judgement and autonomy. Acts as SME to senior stakeholders and /or other team members. Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Required Qualifications 8+ years of experience as a Software Engineer/Developer using Java Multiple years of experience with software engineering best practices (unit testing, automation, design patterns, peer review, etc.) Clear understanding of Data Structures and Object Oriented Principles using Java Multiple years of experience with Cloud-native development and Container Orchestration tools (Serverless, Docker, Kubernetes, OpenShift, etc.). Multiple years of experience on Service Oriented and MicroServices architectures, including REST implementations Multiple years of experience with frameworks like Spring Boot. Exposure to Continuous Integration and Continuous Delivery (CI/CD) pipelines, either on-premise or public cloud (i.e., Tekton, Harness, CircleCI, Cloudbees Jenkins, etc.) Multiple years of experience with agile and iterative software delivery Preferred Qualifications Exposure to architecture experience in building horizontally scalable, highly available, highly resilient, and low latency applications Exposure to Cloud infrastructure both on-premise and public cloud (i.e., OpenShift, AWS, etc.) Exposure to event-driven design and architecture (i.e., Kafka, Spark Flink, etc.) Exposure to API Management tools Exposure to Infrastructure as Code tools (i.e., Terraform, Cloudformation, etc.) 
Experience mentoring and providing technical leadership for teams of 5 or more developers Exposure to database concepts (RDBMS, NoSQL) and web-based technologies (Angular/React) is a plus Education: Bachelor’s degree/University degree or equivalent experience This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required. ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster. Show more Show less

Posted 5 days ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

The Role The Data Engineer is accountable for developing high-quality data products to support the Bank’s regulatory requirements and data-driven decision making. A Data Engineer will serve as an example to other team members, work closely with customers, and remove or escalate roadblocks. By applying their knowledge of data architecture standards, data warehousing, data structures, and business intelligence they will contribute to business outcomes on an agile team. Responsibilities Developing and supporting scalable, extensible, and highly available data solutions Deliver on critical business priorities while ensuring alignment with the wider architectural vision Identify and help address potential risks in the data supply chain Follow and contribute to technical standards Design and develop analytical data models Required Qualifications & Work Experience First Class Degree in Engineering/Technology (4-year graduate course) 9 to 11 years’ experience implementing data-intensive solutions using agile methodologies Experience of relational databases and using SQL for data querying, transformation and manipulation Experience of modelling data for analytical consumers Ability to automate and streamline the build, test and deployment of data pipelines Experience in cloud-native technologies and patterns A passion for learning new technologies, and a desire for personal growth, through self-study, formal classes, or on-the-job training Excellent communication and problem-solving skills An inclination to mentor; an ability to lead and deliver medium-sized components independently Technical Skills (Must Have) ETL: Hands-on experience of building data pipelines. Proficiency in two or more data integration platforms such as Ab Initio, Apache Spark, Talend and Informatica Big Data : Experience of ‘big data’ platforms such as Hadoop, Hive or Snowflake for data storage and processing Data Warehousing & Database Management : Expertise around Data Warehousing concepts, Relational (Oracle, MSSQL, MySQL) and NoSQL (MongoDB, DynamoDB) database design Data Modeling & Design : Good exposure to data modeling techniques; design, optimization and maintenance of data models and data structures Languages : Proficient in one or more programming languages commonly used in data engineering such as Python, Java or Scala DevOps : Exposure to concepts and enablers - CI/CD platforms, version control, automated quality control management Data Governance: A strong grasp of principles and practice including data quality, security, privacy and compliance Technical Skills (Valuable) Ab Initio : Experience developing Co>Op graphs; ability to tune for performance. Demonstrable knowledge across the full suite of Ab Initio toolsets e.g., GDE, Express>IT, Data Profiler and Conduct>IT, Control>Center, Continuous>Flows Cloud : Good exposure to public cloud data platforms such as S3, Snowflake, Redshift, Databricks, BigQuery, etc. Demonstrable understanding of underlying architectures and trade-offs Data Quality & Controls : Exposure to data validation, cleansing, enrichment and data controls Containerization : Fair understanding of containerization platforms like Docker, Kubernetes File Formats : Exposure to working with Event/File/Table Formats such as Avro, Parquet, Protobuf, Iceberg, Delta Others : Experience of using a Job scheduler e.g., Autosys. Exposure to Business Intelligence tools e.g., Tableau, Power BI Certification on any one or more of the above topics would be an advantage.
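For context on the kind of batch pipeline work this role describes, below is a minimal PySpark sketch (read, transform, write partitioned Parquet). The paths, column names, and aggregation logic are hypothetical placeholders, not taken from the posting.

    from pyspark.sql import SparkSession, functions as F

    # Minimal batch pipeline sketch: ingest raw trade data, derive a small
    # analytical model, and persist it as partitioned Parquet.
    spark = SparkSession.builder.appName("trades-daily-etl").getOrCreate()

    raw = (spark.read
           .option("header", True)
           .option("inferSchema", True)
           .csv("s3a://raw-zone/trades/2024-06-01/"))  # placeholder path

    daily_positions = (raw
        .filter(F.col("status") == "SETTLED")
        .withColumn("trade_date", F.to_date("trade_timestamp"))
        .groupBy("trade_date", "account_id", "instrument_id")
        .agg(F.sum("quantity").alias("net_quantity"),
             F.sum(F.col("quantity") * F.col("price")).alias("gross_value")))

    (daily_positions.write
     .mode("overwrite")
     .partitionBy("trade_date")
     .parquet("s3a://curated-zone/daily_positions/"))  # placeholder sink

    spark.stop()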
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Digital Software Engineering ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster. Show more Show less

Posted 5 days ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Company Overview Viraaj HR Solutions is a dynamic recruitment agency dedicated to connecting talented individuals with leading organizations across India. We focus on delivering exceptional talent solutions while fostering a culture of collaboration, integrity, and excellence. Our mission is to empower businesses by providing them with the right skill sets to succeed in their industry, while promoting growth and enhancing employee satisfaction. We pride ourselves on our professional approach to hiring and talent management. Job Title: AWS Data Engineer Location: On-site in India We are seeking a skilled AWS Data Engineer to join our team. In this role, you will be responsible for designing and implementing data solutions on AWS infrastructure. You will work closely with cross-functional teams to ensure data integrity and availability, while also developing and optimizing data pipelines to support analytics and reporting needs. Role Responsibilities Design, develop and implement scalable data architectures on AWS. Develop ETL processes for data ingestion from various sources. Optimize AWS data pipelines for performance and cost efficiency. Collaborate with data analysts to understand data requirements. Create and maintain data models for efficient query performance. Conduct data quality assessments and implement necessary improvements. Implement data security and compliance measures on AWS. Maintain documentation for all systems and processes. Monitor and troubleshoot data-related issues on AWS. Integrate third-party services and APIs for data retrieval. Train and mentor junior data engineers. Participate in code reviews and maintain software engineering best practices. Stay up-to-date with AWS features and best practices in data engineering. Contribute to refining data engineering standards and processes. Support migration of legacy data systems to AWS. Qualifications Bachelor’s degree in Computer Science or related field. 3+ years of experience in data engineering or similar roles. Extensive experience with AWS services (S3, Redshift, Lambda, Glue). Strong proficiency in SQL and relational databases. Experience with Python and/or Java for data processing. Knowledge of Big Data technologies like Hadoop or Spark. Familiarity with data warehousing concepts and design. Experience in data modeling and database design. Strong analytical and problem-solving skills. Ability to work collaboratively in a team environment. Excellent communication skills, both verbal and written. Experience with analytics tools and reporting. Understanding of data privacy and security regulations. Hands-on experience with CI/CD processes. Ability to adapt to fast-paced environments. Certifications in AWS (e.g., AWS Certified Data Analytics) are a plus. Skills: data modeling,java,data analysis,scala,aws,ci/cd,python,hadoop,sql proficiency,problem solving,data engineering,sql,big data technologies,data warehousing,etl,spark Show more Show less
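As one illustration of the "data quality assessments" responsibility listed above, here is a small, hedged PySpark sketch that profiles null counts and duplicate keys before a load; the dataset path, key column, and thresholds would be project-specific and are placeholders here.

    from pyspark.sql import SparkSession, functions as F

    # Hypothetical quality checks run before loading a curated table.
    spark = SparkSession.builder.appName("dq-checks").getOrCreate()

    df = spark.read.parquet("s3a://staging-zone/customers/")  # placeholder path
    total_rows = df.count()

    # Null counts per column.
    null_counts = df.select(
        [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
    ).collect()[0].asDict()

    # Duplicate primary keys (assumes a 'customer_id' key column).
    duplicate_keys = (df.groupBy("customer_id")
                        .count()
                        .filter(F.col("count") > 1)
                        .count())

    print(f"rows={total_rows}, duplicate_keys={duplicate_keys}")
    print(f"null counts: {null_counts}")

    # A real pipeline might fail the job or quarantine records when thresholds
    # are exceeded; the actions taken would depend on the project.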

Posted 5 days ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

We are currently hiring with one of the top healthcare tech companies for an AI/ML Software Engineering role at the Noida location. Looking for candidates from top product companies with a product engineering background. Job Description As an AI/ML Software Engineer, you will contribute towards the development and execution of the vision for enabling solutions. You will utilize your software engineering skills to ensure technical tasks are properly detailed, designed and executed per existing standards. You are expected to participate in low-level design activities, break work down into smaller technical tasks, and mentor juniors in execution. Responsibilities Contribute code towards software development of the AI Studio Enterprise Platform. Design, implement, test, deploy and maintain innovative software solutions to transform service performance, durability, cost, and security. Use software engineering best practices to ensure a high standard of quality for all of the team’s deliverables. Write high-quality distributed system software. Decompose complex problems into simple, straightforward solutions. Be flexible, adapting to meet the needs of the team, project, or product. Solicit differing views and be willing to change your mind as you learn more. Required Qualifications Bachelor's degree in computer science or related field. 3+ years of industry experience as a Software Engineer, Software Developer, or AI/ML Engineer. 3+ years of experience with demonstrable proficiency in programming languages such as Python or Java. 3+ years of experience working across the full software development life cycle, exhibiting high quality of work, e.g., via coding standards, code reviews, source control management, build process, testing, and operations. 1+ years of experience with AI/ML concepts and knowledge of CI/CD and MLOps practices. 1+ years of experience with cloud platforms and services (Azure, AWS, GCP, etc.). 1+ years of experience with design activities (architecture, design patterns, reliability, and scaling). Ability to work collaboratively in a team environment and contribute to a positive work culture. Preferred Qualifications Exposure to DevOps practices, including CI/CD, containerization (Docker, Kubernetes), and infrastructure as code. Higher-level understanding of AI/ML use cases and production implementation. Experience with AI/ML frameworks such as TensorFlow, PyTorch, or Keras; or distributed computing frameworks such as Apache Spark. Show more Show less

Posted 5 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Company Overview Viraaj HR Solutions is a dynamic recruitment agency dedicated to connecting talented individuals with leading organizations across India. We focus on delivering exceptional talent solutions while fostering a culture of collaboration, integrity, and excellence. Our mission is to empower businesses by providing them with the right skill sets to succeed in their industry, while promoting growth and enhancing employee satisfaction. We pride ourselves on our professional approach to hiring and talent management. Job Title: AWS Data Engineer Location: On-site in India We are seeking a skilled AWS Data Engineer to join our team. In this role, you will be responsible for designing and implementing data solutions on AWS infrastructure. You will work closely with cross-functional teams to ensure data integrity and availability, while also developing and optimizing data pipelines to support analytics and reporting needs. Role Responsibilities Design, develop and implement scalable data architectures on AWS. Develop ETL processes for data ingestion from various sources. Optimize AWS data pipelines for performance and cost efficiency. Collaborate with data analysts to understand data requirements. Create and maintain data models for efficient query performance. Conduct data quality assessments and implement necessary improvements. Implement data security and compliance measures on AWS. Maintain documentation for all systems and processes. Monitor and troubleshoot data-related issues on AWS. Integrate third-party services and APIs for data retrieval. Train and mentor junior data engineers. Participate in code reviews and maintain software engineering best practices. Stay up-to-date with AWS features and best practices in data engineering. Contribute to refining data engineering standards and processes. Support migration of legacy data systems to AWS. Qualifications Bachelor’s degree in Computer Science or related field. 3+ years of experience in data engineering or similar roles. Extensive experience with AWS services (S3, Redshift, Lambda, Glue). Strong proficiency in SQL and relational databases. Experience with Python and/or Java for data processing. Knowledge of Big Data technologies like Hadoop or Spark. Familiarity with data warehousing concepts and design. Experience in data modeling and database design. Strong analytical and problem-solving skills. Ability to work collaboratively in a team environment. Excellent communication skills, both verbal and written. Experience with analytics tools and reporting. Understanding of data privacy and security regulations. Hands-on experience with CI/CD processes. Ability to adapt to fast-paced environments. Certifications in AWS (e.g., AWS Certified Data Analytics) are a plus. Skills: data modeling,java,data analysis,scala,aws,ci/cd,python,hadoop,sql proficiency,problem solving,data engineering,sql,big data technologies,data warehousing,etl,spark Show more Show less

Posted 5 days ago

Apply

5.0 - 9.0 years

20 - 27 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Naukri logo

Designing and implementing data processing systems using Microsoft Fabric, Azure Data Analytics, Databricks and other distributed frameworks (e.g., Hadoop, Spark, Snowflake, Airflow). Writing efficient and scalable code to process, transform, and clean large volumes of structured and unstructured data. Designing data pipelines: the Snowflake Data Cloud ingests data through pipelines fed from sources such as databases, cloud storage, or streaming platforms, and a Snowflake Data Engineer designs, builds, and fine-tunes these pipelines to ensure all data is loaded into Snowflake correctly.
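A hedged sketch of the pipeline pattern described above: Spark prepares and stages data as Parquet in cloud storage, and Snowflake then ingests the staged files with a COPY INTO statement. The stage name, table, paths, and execution mechanism are hypothetical; the exact connector or Snowpipe setup would depend on the project.

    from pyspark.sql import SparkSession, functions as F

    # 1) Transform raw events with Spark and stage them as Parquet in cloud
    #    storage that a Snowflake external stage points at (placeholder paths).
    spark = SparkSession.builder.appName("events-to-snowflake").getOrCreate()

    events = spark.read.json("s3a://raw-zone/events/2024-06-01/")
    cleaned = (events
               .dropDuplicates(["event_id"])
               .withColumn("event_date", F.to_date("event_ts")))
    cleaned.write.mode("overwrite").parquet("s3a://snowflake-stage-bucket/events/")

    # 2) Snowflake side: ingest the staged files. This SQL would typically run
    #    via the Snowflake connector, Snowpipe, or an orchestrator task.
    copy_stmt = """
    COPY INTO analytics.public.events
    FROM @events_stage/events/
    FILE_FORMAT = (TYPE = PARQUET)
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
    """
    print(copy_stmt)  # illustrative only; execution depends on the environment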

Posted 5 days ago

Apply

8.0 - 13.0 years

22 - 27 Lacs

Mumbai, Pune

Work from Office

Naukri logo

Key Responsibilities: Design, build, and maintain CI/CD pipelines for ML model training, validation, and deployment Automate and optimize ML workflows, including data ingestion, feature engineering, model training, and monitoring Deploy, monitor, and manage LLMs and other ML models in production (on-premises and/or cloud) Implement model versioning, reproducibility, and governance best practices Collaborate with data scientists, ML engineers, and software engineers to streamline the end-to-end ML lifecycle Ensure security, compliance, and scalability of ML/LLM infrastructure Troubleshoot and resolve issues related to ML model deployment and serving Evaluate and integrate new MLOps/LLMOps tools and technologies Mentor junior engineers and contribute to best-practices documentation Required Skills & Qualifications: 8+ years of experience in DevOps, with at least 3 years in MLOps/LLMOps Strong experience with cloud platforms (AWS, Azure, GCP) and container orchestration (Kubernetes, Docker) Proficient in CI/CD tools (Jenkins, GitHub Actions, GitLab CI, etc.) Hands-on experience deploying and managing different types of AI models (e.g., OpenAI, Hugging Face, custom models) used for developing solutions Experience with model serving tools such as TGI, vLLM, BentoML, etc. Solid scripting and programming skills (Python, Bash, etc.) Familiarity with monitoring/logging tools (Prometheus, Grafana, ELK stack) Strong understanding of security and compliance in ML environments Preferred Skills: Knowledge of model explainability, drift detection, and model monitoring Familiarity with data engineering tools (Spark, Kafka, etc.) Knowledge of data privacy, security, and compliance in AI systems Strong communication skills to effectively collaborate with various stakeholders Critical thinking and problem-solving skills are essential Proven ability to lead and manage projects with cross-functional teams
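One concrete way to approach the model versioning and reproducibility responsibilities above is experiment tracking. The sketch below uses MLflow purely as an example (the posting does not name a specific tool); the experiment name, parameters, and model are placeholders.

    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Hypothetical experiment: log parameters, metrics, and the model artifact
    # so every training run is versioned and reproducible.
    mlflow.set_experiment("demo-classifier")

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    with mlflow.start_run():
        params = {"n_estimators": 100, "max_depth": 5}
        model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)
        acc = accuracy_score(y_test, model.predict(X_test))

        mlflow.log_params(params)
        mlflow.log_metric("accuracy", acc)
        mlflow.sklearn.log_model(model, "model")  # versioned artifact per run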

Posted 5 days ago

Apply

8.0 - 13.0 years

20 - 27 Lacs

Mumbai, Pune

Work from Office

Naukri logo

Key Responsibilities: Lead the design and implementation of AI/ML solutions across various business domains. Architect end-to-end GenAI solutions, ensuring scalability, reliability, and security. Collaborate with data scientists, engineers, and product managers to translate business requirements into technical solutions. Evaluate and select appropriate AI frameworks and tools, and build the core AI framework. Ensure best practices in software engineering, code quality, and system security. Oversee the integration of AI models into existing software products and cloud environments. Mentor and guide development teams on AI/ML best practices and software architecture. Stay updated with the latest advancements in AI, ML, and software engineering. Required Qualifications: 8+ years of experience in software development, with at least 3 years in AI/ML solution architecture. Strong programming skills in Python, Java, or similar languages. Hands-on experience with AI/ML frameworks (TensorFlow, PyTorch, Scikit-learn, etc.). Solid understanding of software architecture patterns, APIs, and microservices. Proven track record of delivering complex AI projects from concept to production. Preferred Skills: Experience with MLOps, CI/CD for ML, and containerization (Docker, Kubernetes). Familiarity with data engineering tools (Spark, Kafka, Airflow). Knowledge of data privacy, security, and compliance in AI systems. Strong communication skills to effectively collaborate with various stakeholders. Critical thinking and problem-solving skills are essential. Proven ability to lead and manage projects with cross-functional teams.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Linkedin logo

Join Us as a Learning Facilitator at Sparkling Mindz Global School Where children (and adults) learn to think, feel, question, dream, and change the world. Who We Are In a world that’s evolving faster than ever, we believe schools shouldn’t just prepare children for exams—they should prepare them for life . At Sparkling Mindz , we reimagine learning to make it inspiring, engaging, and empowering . We are a purpose-driven, Reggio-Emilia inspired, game-based school committed to nurturing children as thinkers, doers, dreamers, and changemakers. Who You Are You're not just looking for another job. You're looking for a mission. You believe: That children are capable and deserve voice, choice, and challenge. That learning should spark joy , curiosity, and meaning—for everyone involved. That the classroom is not a performance stage, but a space for co-creation . That you're still learning, and you want to grow in an ecosystem that walks its talk on empathy, creativity, collaboration, and purpose. If this sounds like you, read on. We’d love to meet you. The Role: Learning Facilitator Subjects : English, Hindi, Kannada, Science, Social Studies Grades : Primary & Middle School Location : Kannuru, Hennur-Bagalur Road, North Bangalore Type : Full-Time Timings : Mon–Fri (8 AM–5.30 PM) + alternate Saturdays What You'll Do You will: Plan child-centric learning experiences aligned with our pedagogy Design lessons, games, and provocations that make children wonder, connect, and explore Facilitate immersive sessions with openness to children's ideas and feedback Reflect, document, and co-learn from children's observations Collaborate across disciplines to deliver connected, purpose-based learning Communicate with parents regularly as learning partners Innovate and iterate—because no plan is ever perfect, and every child is different Participate in team meetings, training sessions, and reflection cycles What We're Looking For Qualifications : Graduate/Postgraduate in any discipline (B.Ed optional but valuable if paired with mindset and skills) Experience : 3–5 years working with children or in creative/educational fields English Language Arts Deep understanding of language acquisition, comprehension, and expression Ability to guide students in reading deeply, writing purposefully, and speaking confidently Experience designing reading journeys, author studies, creative writing provocations, and literature circles Comfort with integrating vocabulary, grammar, and structure meaningfully—not in isolation Mindset : Eager to learn and grow Comfortable with uncertainty and innovation Driven by empathy, purpose , and playful rigor Skills : Strong written & spoken communication Creativity, collaboration, and self-leadership Tech fluency (Word, PPT, Excel) Bonus if you have : A background in psychology, facilitation, creative writing, or public speaking Experience in game design, curriculum development, or changemaker education How to Apply We care deeply about fit. Please send: Your resume A short cover letter answering: Why do you want to work at Sparkling Mindz? What makes you a great fit for this role? 📧 Email us at: contact (at) sparklingmindz.in Why Join Us? Be part of a pioneering school redefining education Work with a team of passionate, mission-driven changemakers Experience growth as a facilitator and human being Learn through continuous mentorship, training, and reflection Help children grow into confident, ethical, purposeful leaders of tomorrow Show more Show less

Posted 5 days ago

Apply

6.0 - 8.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Flashed JC 72927, enable some profiles today. Experience: 6-8 years Band: B3 Location: Bangalore, Chennai & Hyderabad Below is the JD: 6+ years of hands-on experience in software development or data science Support the company's commitment to protect the integrity and confidentiality of systems and data. Experience building an E2E analytics platform using streaming, graph and big data platforms Experience with graph-based data workflows and working with graph analytics Extensive hands-on experience in designing, developing, and maintaining software frameworks using Kafka, Spark, Neo4j and TigerGraph DB Hands-on experience in Java, Scala or Python Design, build and deploy streaming and batch data pipelines capable of processing and storing large datasets quickly and reliably using Kafka, Spark and YARN Experience managing and leading small development teams in an Agile environment Drive and maintain a culture of quality, innovation and experimentation Collaborate with product teams, data analysts and data scientists to design and build data-forward solutions Provide the prescriptive point-solution architectures and guide the descriptive architectures within assigned modules Own technical decisions for the solution and application developers in the creation of architectural decisions and artifacts Manage day-to-day technology architecture decisions for a limited number of assigned modules, including deciding on the best path to achieve requirements and schedules. Own the quality of modules being delivered; ensure proper testing and validation processes are followed. Ensure the point-solution architectures are in line with the enterprise strategies and principles Review technical designs to ensure that they are consistent with defined architecture principles, standards and best practices. Accountable for the availability, stability, scalability, security, and recoverability enabled by the designs Ability to clearly communicate with team & stakeholders Skills: Neo4j, TigerGraph DB
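A minimal, hedged sketch of the Kafka-to-Spark streaming pattern this role describes, using Spark Structured Streaming. The broker address, topic, event schema, and sink paths are hypothetical placeholders.

    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StringType, StructField, StructType, TimestampType

    # Hypothetical event schema carried as JSON in the Kafka message value.
    schema = StructType([
        StructField("user_id", StringType()),
        StructField("action", StringType()),
        StructField("event_ts", TimestampType()),
    ])

    # Requires the spark-sql-kafka package on the classpath.
    spark = SparkSession.builder.appName("kafka-events-stream").getOrCreate()

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder
           .option("subscribe", "user-events")                 # placeholder topic
           .load())

    events = (raw.selectExpr("CAST(value AS STRING) AS json")
                 .select(F.from_json("json", schema).alias("e"))
                 .select("e.*"))

    # Write the parsed stream to Parquet with checkpointing for fault tolerance.
    query = (events.writeStream
             .format("parquet")
             .option("path", "hdfs:///data/user_events/")           # placeholder sink
             .option("checkpointLocation", "hdfs:///chk/user_events/")
             .trigger(processingTime="1 minute")
             .start())

    query.awaitTermination()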

Posted 5 days ago

Apply

Exploring Spark Jobs in India

The demand for professionals with expertise in Spark is on the rise in India. Spark, an open-source distributed computing system, is widely used for big data processing and analytics. Job seekers in India looking to explore opportunities in Spark can find a variety of roles in different industries.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities have a high concentration of tech companies and startups actively hiring for Spark roles.

Average Salary Range

The average salary range for Spark professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-25 lakhs per annum

Salaries may vary based on the company, location, and specific job requirements.

Career Path

In the field of Spark, a typical career progression may look like:

  1. Junior Developer
  2. Senior Developer
  3. Tech Lead
  4. Architect

Advancing in this career path often requires gaining experience, acquiring additional skills, and taking on more responsibilities.

Related Skills

Apart from proficiency in Spark, professionals in this field are often expected to have knowledge or experience in:

  • Hadoop
  • Java or Scala programming
  • Data processing and analytics
  • SQL databases

Having a combination of these skills can make a candidate more competitive in the job market.

Interview Questions

  • What is Apache Spark and how is it different from Hadoop? (basic)
  • Explain the difference between RDD, DataFrame, and Dataset in Spark. (medium)
  • How does Spark handle fault tolerance? (medium)
  • What is lazy evaluation in Spark? (basic)
  • Explain the concept of transformations and actions in Spark. (basic; see the sketch after this list)
  • What are the different deployment modes in Spark? (medium)
  • How can you optimize the performance of a Spark job? (advanced)
  • What is the role of a Spark executor? (medium)
  • How does Spark handle memory management? (medium)
  • Explain the Spark shuffle operation. (medium)
  • What are the different types of joins in Spark? (medium)
  • How can you debug a Spark application? (medium)
  • Explain the concept of checkpointing in Spark. (medium)
  • What is lineage in Spark? (basic)
  • How can you monitor and manage a Spark application? (medium)
  • What is the significance of the Spark Driver in a Spark application? (medium)
  • How does Spark SQL differ from traditional SQL? (medium)
  • Explain the concept of broadcast variables in Spark. (medium)
  • What is the purpose of the SparkContext in Spark? (basic)
  • How does Spark handle data partitioning? (medium)
  • Explain the concept of window functions in Spark SQL. (advanced)
  • How can you handle skewed data in Spark? (advanced)
  • What is the use of accumulators in Spark? (advanced)
  • How can you schedule Spark jobs using Apache Oozie? (advanced)
  • Explain the process of Spark job submission and execution. (basic)
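To ground a few of the basic questions above (lazy evaluation, transformations vs. actions), here is a short, hedged PySpark sketch; the numbers and expressions are arbitrary examples.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("concepts-demo").getOrCreate()

    df = spark.range(1, 1_000_001)  # DataFrame with ids 1..1,000,000

    # Transformations are lazy: filter/selectExpr only build a logical plan,
    # nothing is computed yet.
    evens = df.filter(df.id % 2 == 0)
    squares = evens.selectExpr("id", "id * id AS square")

    # Actions trigger execution of the whole plan on the executors.
    print(squares.count())   # action: 500000
    print(squares.take(3))   # action: first few rows

    spark.stop()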

Closing Remark

As you explore opportunities in Spark jobs in India, remember to prepare thoroughly for interviews and showcase your expertise confidently. With the right skills and knowledge, you can excel in this growing field and advance your career in the tech industry. Good luck with your job search!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies