187 Apache Pig Jobs - Page 3

JobPe aggregates results for easy application access, but you apply directly on the original job portal.

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

About Us: BCE Global Tech is a dynamic and innovative company dedicated to pushing the boundaries of technology. We are on a mission to modernize global connectivity, one connection at a time. Our goal is to build the highway to the future of communications, media, and entertainment, emerging as a powerhouse within the technology landscape in India. We bring ambitions to life through design thinking that bridges the gaps between people, devices, and beyond, fostering unprecedented customer satisfaction through technology. At BCE Global Tech, we are guided by our core values of innovation, customer-centricity, and a commitment to progress. We harness cutting-edge technology to deliver business outcomes with positive societal impact. Our team of thought leaders is pioneering advancements in 5G, MEC, IoT, and cloud-native architecture. We offer continuous learning opportunities, innovative projects, and a collaborative work environment that empowers our employees to grow and succeed.

Responsibilities: Lead the migration of data pipelines from Hadoop to Google Cloud Platform (GCP). Design, develop, and maintain data workflows using Airflow and custom flow solutions. Implement infrastructure as code using Terraform. Develop and optimize data processing applications using Java Spark or Python Spark. Utilize Cloud Run and Cloud Functions for serverless computing. Manage containerized applications using Docker. Understand and enhance existing Hadoop pipelines. Write and execute unit tests to ensure code quality. Deploy data engineering solutions in production environments. Craft and optimize SQL queries for data manipulation and analysis.

Requirements: 7-8 years of experience in data engineering or related fields. Proven experience migrating Hadoop pipelines to GCP. Proficiency in Airflow and custom flow solutions. Strong knowledge of Terraform for infrastructure management. Expertise in Java Spark or Python Spark. Experience with Cloud Run and Cloud Functions. Experience with Dataflow, Dataproc, and Cloud Monitoring tools in GCP. Familiarity with Docker for container management. Solid understanding of Hadoop pipelines. Ability to write and execute unit tests. Experience with deployments in production environments. Strong SQL query skills.

Skills: Excellent teamwork and collaboration abilities. Quick learner with a proactive attitude. Strong problem-solving skills and attention to detail. Ability to work independently and as part of a team. Effective communication skills.

Why Join Us: Opportunity to work with cutting-edge technologies. Collaborative and supportive work environment. Competitive salary and benefits. Career growth and development opportunities.
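
To make the Airflow-plus-GCP workflow this role describes more concrete, here is a minimal illustrative sketch (not BCE Global Tech's actual pipeline): a daily DAG that submits a PySpark job to a Dataproc cluster through the gcloud CLI. The bucket, cluster, script, and region names are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch: orchestrate a daily PySpark job on GCP Dataproc.
# Bucket, cluster, script, and region values below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="hadoop_to_gcp_pipeline",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the PySpark transformation to Dataproc via the gcloud CLI.
    submit_pyspark = BashOperator(
        task_id="submit_pyspark_job",
        bash_command=(
            "gcloud dataproc jobs submit pyspark "
            "gs://example-bucket/jobs/transform.py "
            "--cluster=example-cluster --region=asia-south1"
        ),
    )
```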

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact. We are seeking skilled Scala Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects. Key Responsibilities: Write clean, efficient code for data processing and transformation. Debug and resolve technical issues. Evaluate and review code to ensure quality and compliance. Required Qualifications: 1+ year of Scala development experience. Strong expertise in Scala programming and functional programming concepts. Experience with frameworks like Akka or Spark. Should be skilled in building scalable, distributed systems and big data applications. Why Join Us: Competitive pay (₹1000/hour). Flexible hours. Remote opportunity. NOTE: Pay will vary by project and typically is up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process. Shape the future of AI with Soul AI!

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact. We are seeking skilled Scala Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects. Key Responsibilities: Write clean, efficient code for data processing and transformation. Debug and resolve technical issues. Evaluate and review code to ensure quality and compliance. Required Qualifications: 1+ year of Scala development experience. Strong expertise in Scala programming and functional programming concepts. Experience with frameworks like Akka or Spark. Should be skilled in building scalable, distributed systems and big data applications. Why Join Us: Competitive pay (₹1000/hour). Flexible hours. Remote opportunity. NOTE: Pay will vary by project and typically is up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process. Shape the future of AI with Soul AI!

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Chennai

Work from Office

About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact. We are seeking skilled Scala Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects. Key Responsibilities: Write clean, efficient code for data processing and transformation. Debug and resolve technical issues. Evaluate and review code to ensure quality and compliance. Required Qualifications: 1+ year of Scala development experience. Strong expertise in Scala programming and functional programming concepts. Experience with frameworks like Akka or Spark. Should be skilled in building scalable, distributed systems and big data applications. Why Join Us: Competitive pay (₹1000/hour). Flexible hours. Remote opportunity. NOTE: Pay will vary by project and typically is up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process. Shape the future of AI with Soul AI!

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Pune

Work from Office

About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact. We are seeking skilled Scala Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects. Key Responsibilities: Write clean, efficient code for data processing and transformation. Debug and resolve technical issues. Evaluate and review code to ensure quality and compliance. Required Qualifications: 1+ year of Scala development experience. Strong expertise in Scala programming and functional programming concepts. Experience with frameworks like Akka or Spark. Should be skilled in building scalable, distributed systems and big data applications. Why Join Us: Competitive pay (₹1000/hour). Flexible hours. Remote opportunity. NOTE: Pay will vary by project and typically is up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process. Shape the future of AI with Soul AI!

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Mumbai

Work from Office

About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact. We are seeking skilled Scala Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects. Key Responsibilities: Write clean, efficient code for data processing and transformation. Debug and resolve technical issues. Evaluate and review code to ensure quality and compliance. Required Qualifications: 1+ year of Scala development experience. Strong expertise in Scala programming and functional programming concepts. Experience with frameworks like Akka or Spark. Should be skilled in building scalable, distributed systems and big data applications. Why Join Us: Competitive pay (₹1000/hour). Flexible hours. Remote opportunity. NOTE: Pay will vary by project and typically is up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process. Shape the future of AI with Soul AI!

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Kolkata

Work from Office

About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact. We are seeking skilled Scala Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects. Key Responsibilities: Write clean, efficient code for data processing and transformation. Debug and resolve technical issues. Evaluate and review code to ensure quality and compliance. Required Qualifications: 1+ year of Scala development experience. Strong expertise in Scala programming and functional programming concepts. Experience with frameworks like Akka or Spark. Should be skilled in building scalable, distributed systems and big data applications. Why Join Us: Competitive pay (₹1000/hour). Flexible hours. Remote opportunity. NOTE: Pay will vary by project and typically is up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process. Shape the future of AI with Soul AI!

Posted 2 weeks ago

Apply

5.0 years

7 Lacs

Hyderabad

Work from Office

Design, implement, and optimize Big Data solutions using Hadoop and Scala. You will manage data processing pipelines, ensure data integrity, and perform data analysis. Expertise in Hadoop ecosystem, Scala programming, and data modeling is essential for this role.

Posted 2 weeks ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office

The Apache Spark, Digital :Scala role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Apache Spark, Digital :Scala domain.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

The Digital :BigData and Hadoop Ecosystems, Digital :PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital :BigData and Hadoop Ecosystems, Digital :PySpark domain.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Bengaluru

Work from Office

The Big Data (Scala, HIVE) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (Scala, HIVE) domain.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Bengaluru

Work from Office

The Digital :Scala role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital :Scala domain.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

The Big Data (PySpark, Hive) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (PySpark, Hive) domain.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Chennai

Work from Office

The Big Data (Scala, HIVE) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (Scala, HIVE) domain.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Chennai

Work from Office

The Big Data (PySpark, Python) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (PySpark, Python) domain.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

The Digital :BigData and Hadoop Ecosystems, Digital :Kafka role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital :BigData and Hadoop Ecosystems, Digital :Kafka domain.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Project description: Join our data engineering team to lead the design and implementation of advanced graph database solutions using Neo4j. This initiative supports the organization's mission to transform complex data relationships into actionable intelligence. You will play a critical role in architecting scalable graph-based systems, driving innovation in data connectivity, and empowering cross-functional teams with powerful tools for insight and decision-making. Responsibilities: Graph data modeling and implementation. Design and implement complex graph data models using Cypher and Neo4j best practices. Leverage APOC procedures, custom plugins, and advanced graph algorithms to solve domain-specific problems. Oversee integration of Neo4j with other enterprise systems, microservices, and data platforms. Develop and maintain APIs and services in Java, Python, or JavaScript to interact with the graph database. Mentor junior developers and review code to maintain high-quality standards. Establish guidelines for performance tuning, scalability, security, and disaster recovery in Neo4j environments. Work with data scientists, analysts, and business stakeholders to translate complex requirements into graph-based solutions. Skills (must have): 12+ years in software/data engineering, with at least 3-5 years of hands-on experience with Neo4j. Lead the technical strategy, architecture, and delivery of Neo4j-based solutions. Design, model, and implement complex graph data structures using Cypher and Neo4j best practices. Guide the integration of Neo4j with other data platforms and microservices. Collaborate with cross-functional teams to understand business needs and translate them into graph-based models. Mentor junior developers and ensure code quality through reviews and best practices. Define and enforce performance tuning, security standards, and disaster recovery strategies for Neo4j. Stay up to date with emerging technologies in the graph database and data engineering space. Strong proficiency in the Cypher query language, graph modeling, and data visualization tools (e.g., Bloom, Neo4j Browser). Solid background in Java, Python, or JavaScript and experience integrating Neo4j with these languages. Experience with APOC procedures, Neo4j plugins, and query optimization. Familiarity with cloud platforms (AWS) and containerization tools (Docker, Kubernetes). Proven experience leading engineering teams or projects. Excellent problem-solving and communication skills. Nice to have: N/A. Other languages: English (C1 Advanced). Seniority: Senior.
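
As a rough illustration of the Cypher-from-Python integration this role describes (not the employer's code), the sketch below uses the official neo4j Python driver. The connection details and the Person/Company data model are assumptions made for the example.

```python
# Illustrative sketch: run a Cypher query from Python with the neo4j driver.
# URI, credentials, and the Person/Company graph model are hypothetical.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def coworkers_of(tx, name):
    # Find people who work at the same company as the given person.
    result = tx.run(
        "MATCH (p:Person {name: $name})-[:WORKS_AT]->(c:Company)"
        "<-[:WORKS_AT]-(colleague:Person) "
        "RETURN colleague.name AS name",
        name=name,
    )
    return [record["name"] for record in result]

with driver.session() as session:
    print(session.execute_read(coworkers_of, "Asha"))

driver.close()
```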

Posted 2 weeks ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Bengaluru

Work from Office

As a BigData Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the clients' needs. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Big Data development with Hadoop, Hive, Spark, and PySpark, plus strong SQL. Ability to incorporate a variety of statistical and machine learning techniques. Basic understanding of cloud platforms (AWS, Azure, etc.). Ability to use programming languages like Java, Python, Scala, etc., to build pipelines to extract and transform data from a repository to a data consumer. Ability to use Extract, Transform, and Load (ETL) tools and/or data integration or federation tools to prepare and transform data as needed. Ability to use leading-edge tools such as Linux, SQL, Python, Spark, Hadoop, and Java. Preferred technical and professional experience: Basic understanding of or experience with predictive/prescriptive modeling. You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions.
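
For context on the source-to-target pipeline work mentioned above, here is a hedged PySpark sketch; the file path, column names, and Hive table are hypothetical and are not IBM systems.

```python
# Illustrative source-to-target pipeline sketch: read raw CSV, apply a simple
# transform, and load the result into a Hive table. All names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("batch_ingest_example")
    .enableHiveSupport()
    .getOrCreate()
)

# Source: raw CSV landed in HDFS (placeholder path).
raw = spark.read.option("header", True).csv("hdfs:///landing/orders/*.csv")

# Transform: type the amount column and drop obviously bad rows.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
)

# Target: overwrite a managed Hive table for downstream analysis.
clean.write.mode("overwrite").saveAsTable("analytics.orders_clean")
```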

Posted 2 weeks ago

Apply

11.0 - 15.0 years

50 - 100 Lacs

Hyderabad

Work from Office

Uber is looking for Staff Software Engineer - Data to join our dynamic team and embark on a rewarding career journey Liaising with coworkers and clients to elucidate the requirements for each task. Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed. Reformulating existing frameworks to optimize their functioning. Testing such structures to ensure that they are fit for use. Preparing raw data for manipulation by data scientists. Detecting and correcting errors in your work. Ensuring that your work remains backed up and readily accessible to relevant coworkers. Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

19 - 25 Lacs

Gurugram

Work from Office

Lead technology solution design and delivery. Create and maintain optimal data solutions architecture and AI models. Work with business partners to document complex company-wide acceptance test plans. Work concurrently on several projects, each with specific instructions that may differ. Assemble large, complex data sets that meet functional and non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud 'big data' technologies. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. Work with stakeholders including the Executive, Product, Data, and Design teams to assist with business-critical data insights and technical issues and to support their data infrastructure needs. Keep our data separated and secure across national boundaries through multiple data centers and cloud regions. Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader. Work with data and analytics experts to strive for greater functionality in our data systems. Build processes supporting data transformation, data structures, metadata, dependency, and workload management. Troubleshoot production support issues post release deployment and come up with solutions. Explain, socialize, and vet designs with internal and external stakeholders. Undergraduate degree or equivalent experience; undergraduate degree in Engineering or equivalent. Over 7 years of experience in Data Engineering and Advanced Analytics. Strong experience in building Generative AI-based solutions for data management (data pipelines, data standardization, data quality) and data analytics. Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases. Experience building and optimizing 'big data' data pipelines, architectures, and data sets. Experience in cloud technologies and Snowflake. Experience in Kafka development. Experience in Python/Java programming. Experience in creating business data models. Experience in report development and dashboarding. Strong experience in driving customer experience. Experience in working with agile teams. Experience in healthcare clinical domains. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. Strong analytic skills related to working with unstructured datasets. A successful history of manipulating, processing, and extracting value from large, disconnected datasets. Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores. Strong project management and organizational skills. Experience supporting and working with cross-functional teams in a dynamic environment.

Careers with Optum. Here's the idea. We built an entire organization around one giant objective: make the health system work better for everyone. So when it comes to how we use the world's large accumulation of health-related information, or guide health and lifestyle choices, or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve. Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. For you, that means working on high performance teams against sophisticated challenges that matter. Optum, incredible ideas in one incredible company and a singular opportunity to do your life's best work.SM Diversity creates a healthier atmosphere. UnitedHealth Group is an Equal Employment Opportunity/Affirmative Action employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law. UnitedHealth Group is a drug-free workplace. Candidates are required to pass a drug test before beginning employment.
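
As a small, hedged illustration of the Kafka development and Python skills listed above, the sketch below uses the kafka-python library; the broker address, topic name, and payload are made up for the example and do not describe Optum's systems.

```python
# Tiny Kafka producer/consumer sketch using kafka-python.
# Broker address, topic, and event payload are hypothetical placeholders.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
# Publish one illustrative event.
producer.send("member-events", {"member_id": 123, "event": "claim_submitted"})
producer.flush()

consumer = KafkaConsumer(
    "member-events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    consumer_timeout_ms=5000,   # stop polling after 5s of silence (demo only)
)
for message in consumer:
    print(message.value)
```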

Posted 3 weeks ago

Apply

4.0 - 8.0 years

12 - 17 Lacs

Hyderabad

Work from Office

Naukri logo

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data, and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits, and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities: Analyze and investigate. Provide explanations and interpretations within your area of expertise. Participate in the scrum process and deliver stories/features according to the schedule. Collaborate with the team, architects, and product stakeholders to understand the scope and design of a deliverable. Participate in product support activities as needed by the team. Understand the product architecture and features being built, and come up with product improvement ideas and POCs. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so. Required Qualifications: Undergraduate degree or equivalent experience. Proven experience using a Big Data tech stack. Sound knowledge of Java and the Spring framework, with good exposure to Spring Batch, Spring Data, Spring Web Services, and Python. Proficient with the Big Data ecosystem (Sqoop, Spark, Hadoop, Hive, HBase). Proficient with Unix/Linux ecosystems and shell scripting. Proven Java, Kafka, Spark, Big Data, and Azure skills, along with analytical and problem-solving skills. Proven solid analytical and communication skills. At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location, and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups, and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.

Posted 3 weeks ago

Apply

2.0 - 5.0 years

15 - 19 Lacs

Mumbai

Work from Office

Naukri logo

Overview: The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications. Responsibilities: As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats. We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark. Qualifications: Core Java, Spring Boot, Apache Spark, Spring Batch, Python. Exposure to SQL databases like Oracle, MySQL, and Microsoft SQL Server is a must. Any experience, knowledge, or certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have. Exposure to NoSQL databases like Neo4j or document databases is also good to have. What we offer you: Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
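
To illustrate the kind of normalization and quality checks this posting describes, here is a hedged PySpark sketch; the feed location, column names, and thresholds are assumptions for the example, not MSCI's actual checks.

```python
# Illustrative data-quality sketch: null-rate, duplicate, and range checks on a
# hypothetical vendor price feed. Paths, columns, and thresholds are made up.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("vendor_feed_qc").getOrCreate()

feed = spark.read.parquet("s3://example-bucket/vendor_prices/")  # placeholder path

total = feed.count()
null_isin = feed.filter(F.col("isin").isNull()).count()
dupes = total - feed.dropDuplicates(["isin", "price_date"]).count()
bad_prices = feed.filter((F.col("price") <= 0) | (F.col("price") > 1e6)).count()

print(f"rows={total}, null ISINs={null_isin}, duplicates={dupes}, bad prices={bad_prices}")

# A real pipeline would quarantine rows or fail the load when thresholds are breached.
if total and null_isin / total > 0.01:
    raise ValueError("ISIN null rate above 1% threshold")
```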

Posted 3 weeks ago

Apply

6.0 - 7.0 years

12 - 17 Lacs

Mumbai

Work from Office

Role Description: As a Scala Tech Lead, you will be a technical leader and mentor, guiding your team to deliver robust and scalable solutions. You will be responsible for setting technical direction, ensuring code quality, and fostering a collaborative and productive team environment. Your expertise in Scala and your ability to translate business requirements into technical solutions will be crucial for delivering successful projects. Responsibilities: - Understand and implement tactical or strategic solutions for given business problems. - Discuss business needs and technology requirements with stakeholders. - Define and derive strategic solutions and identify tactical solutions when necessary. - Write technical design and other solution documents per Agile (Scrum) standards. - Perform data analysis to aid development work and other business needs. - Develop high-quality Scala code that meets business requirements. - Perform unit testing of developed code using automated BDD test frameworks. - Participate in testing efforts to validate and approve technology solutions. - Follow MS standards for the adoption of automated release processes across environments. - Perform automated regression test case suites and support UAT of developed solutions. - Work using collaborative techniques with other FCT (Functional Core Technology) and NFRT (Non-Functional Requirements Team) teams. - Communicate effectively with stakeholders and team members. - Provide technical guidance and mentorship to team members. - Identify opportunities for process improvements and implement effective solutions. - Drive continuous improvement in code quality, development processes, and team performance. - Participate in post-mortem reviews and implement lessons learned. Qualifications: Experience: - [Number] years of experience in software development, with a focus on Scala. - Proven experience in leading and mentoring software development teams. - Experience in designing and implementing complex Scala-based solutions. - Strong proficiency in the Scala programming language. - Experience with functional programming concepts and libraries. - Knowledge of distributed systems and data processing technologies. - Experience with automated testing frameworks (BDD). - Familiarity with Agile (Scrum) methodologies. - Experience with CI/CD pipelines and DevOps practices. - Understanding of data analysis and database technologies.

Posted 3 weeks ago

Apply

1.0 - 4.0 years

1 - 5 Lacs

Mumbai

Work from Office

Location: Mumbai. Role Overview: As a Big Data Engineer, you'll design and build robust data pipelines on Cloudera using Spark (Scala/PySpark) for ingestion, transformation, and processing of high-volume data from banking systems. Key Responsibilities: Build scalable batch and real-time ETL pipelines using Spark and Hive. Integrate structured and unstructured data sources. Perform performance tuning and code optimization. Support orchestration and job scheduling (NiFi, Airflow). Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Proficiency in PySpark/Scala with Hive/Impala. Experience with data partitioning, bucketing, and optimization. Familiarity with Kafka, Iceberg, and NiFi is a must. Knowledge of banking or financial datasets is a plus.
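
A hedged sketch of the partitioning and bucketing optimization this posting asks about; the table names, columns, and bucket count are illustrative assumptions only, not the bank's actual schema.

```python
# Sketch: write transaction data partitioned by business date and bucketed by
# account for faster downstream joins. All table and column names are made up.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("partitioned_load_example")
    .enableHiveSupport()
    .getOrCreate()
)

txns = spark.table("staging.transactions_raw")   # placeholder source table

(
    txns.repartition("business_date")            # group writes by partition value
        .write.mode("overwrite")
        .partitionBy("business_date")            # directory-level partitioning
        .bucketBy(32, "account_id")              # bucketing to speed up joins on account_id
        .sortBy("account_id")
        .saveAsTable("curated.transactions")
)
```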

Posted 3 weeks ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Hyderabad

Work from Office

As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact. Responsibilities: Manage end-to-end feature development and resolve challenges faced in implementation. Learn new technologies and apply them to feature development within the provided time frame. Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Overall, more than 6 years of experience, with 4+ years of strong hands-on experience in Python and Spark. Strong technical abilities to understand, design, write, and debug applications in Python and PySpark. Good to have: hands-on experience with cloud technology (AWS/GCP/Azure); strong problem-solving skills. Preferred technical and professional experience: Hands-on experience with cloud technology (AWS/GCP/Azure).

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies