Home
Jobs

339 MapReduce Jobs - Page 7

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 5.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

The PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the PySpark domain.

Posted 2 weeks ago

Apply

1.0 - 3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express.

Functional Description:
American Express is on a journey to provide the world’s best customer experience every day. The GCS Product Analytics team plays a pivotal role within Global Commercial Services by developing cutting-edge data capabilities and leveraging advanced analytics to gain deep insights into client behavior. Our mission is to inform and shape product strategies and deliver connected, personalized experiences that foster deeper client engagement and drive sustainable, profitable growth. The Digital Measurement & Analytics team is part of the GCS Product Analytics team and delivers digital analytics and insights for the GCS suite of digital products & platforms. The team is responsible for innovating and transforming the process to measure and understand customer behavior towards our digital tools. The analytical work will uncover insights that drive GCS’ global digital strategy and optimize the customer experience. Through world-class innovation and advanced analytics, the team will create segmentations, develop KPIs, models and strategic analytics to solve key business opportunities. This will be achieved through close collaboration with the digital product teams, marketing, servicing and technologies.

Purpose of the Role:
Deliver actionable insights for GCS Digital Experiences by democratizing digital data, measuring product performance, and conducting customer behavior deep dives and go-to-market segmentations.
The role requires a high level of expertise in driving decisions backed by data insights, strategic and advanced analytics, and data techniques, and will drive improvements in generating data-driven, actionable strategies to enable business growth initiatives.

How will you make an impact in this role? The incumbent will be part of the Digital Measurement & Analytics team. They will apply advanced analytics to drive segmentations and develop KPIs, models, and strategic analytics that solve key business opportunities, in close collaboration with the digital product, marketing, servicing, technology, and field teams. They will design measurement frameworks, conduct behavioral deep dives using Amex closed-loop data to uncover product improvement opportunities, enable experimentation (A/B testing), work with leadership to define product strategy, and maintain product performance reports and dashboards. This role requires candidates with an analytical bent of mind and exceptional quantitative, problem-solving, and business storytelling skills.

Responsibilities (specific responsibilities may vary by team, but will involve aspects of the below):
Perform in-depth data analysis to deliver strategic priorities focused on the product roadmap for GCS Digital Experiences.
Define KPIs to measure the efficiency of digital channels/products and develop customer segmentation to drive adoption and engagement for AXP customers.
Power in-depth strategic analysis and provide analytical and decision support by mining digital activity data along with AXP closed-loop data.
Gain a deep functional understanding of the GCS digital channels over time and ensure analytical insights are relevant and actionable.
Flawlessly execute the development, validation, and implementation of statistical projects and automated reports.
Evaluate the business impact of different strategies/initiatives and generate insights and recommendations to fuel business growth.
Build partnerships with internal partners such as Product, Technologies, Field, Servicing, and Marketing to plan and prioritize initiatives.
Empower self-serve analytics by crafting automated dashboards and reports using the Adobe Analytics suite or Tableau.
Continuously broaden and strengthen knowledge of advanced analytical methods and tools to further evolve our analytical practices.

Minimum Qualifications:
1 to 3 years of relevant analytics experience.
Advanced degree in business administration, computer science, IT, or information management from premium institutes (preferred).
Strong analytical, strategic thought leadership and problem-solving skills, with the ability to solve unstructured and complex business problems.
Team player: able to collaborate with partners and team members to define key business objectives and align on solutions that drive actionable items.
Strong interpersonal, written, verbal communication, presentation, and storytelling skills, enabling effective interaction with business leaders and structured, compelling messages addressed to various levels within the organization.
Results-driven with strong project management skills; able to work on multiple priorities and stay on track to exceed team goals.
Passion for data science and machine learning: proven track record of independently developing novel analytical solutions that optimize business processes or product constructs.
Strong ability to drive results; self-starter. Experience in the digital domain preferred.
Technical Skills/Capabilities:
Data manipulation with large & complex data sets
Segmentation Analytics
Business Intelligence & Visualization
Machine Learning & AI
Statistics & Hypothesis Testing
Basic understanding of Agile product development

Knowledge of Platforms:
Big Data: Cornerstone, Hive, MapReduce
Digital Tracking: Omniture/Adobe Analytics, Clickstream
Visualization: Tableau

We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally:
Competitive base salaries
Bonus incentives
Support for financial well-being and retirement
Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location)
Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need
Generous paid parental leave policies (depending on your location)
Free access to global on-site wellness centers staffed with nurses and doctors (depending on location)
Free and confidential counseling support through our Healthy Minds program
Career development and training opportunities

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
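The posting above mentions enabling experimentation (A/B testing). As a hedged sketch of what that evaluation step can look like, the function below runs a two-proportion z-test comparing conversion rates between a control and a variant; the traffic and conversion numbers are purely illustrative, not from the listing.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference of two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical traffic split: variant B converts 5.5% vs control's 5.0%
z, p = two_proportion_ztest(conv_a=500, n_a=10000, conv_b=550, n_b=10000)
print(round(z, 3), round(p, 4))
```

With these toy numbers the lift is not significant at the usual 5% level, which is exactly the kind of call an experimentation analyst makes before shipping a variant.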

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Source: Naukri

The PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the PySpark domain.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Chennai

Work from Office

Source: Naukri

The Big Data (Scala, HIVE) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (Scala, HIVE) domain.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Chennai

Work from Office

Source: Naukri

The Big Data (PySpark, Python) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (PySpark, Python) domain.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Source: Naukri

Project description: Join our data engineering team to lead the design and implementation of advanced graph database solutions using Neo4j. This initiative supports the organization's mission to transform complex data relationships into actionable intelligence. You will play a critical role in architecting scalable graph-based systems, driving innovation in data connectivity, and empowering cross-functional teams with powerful tools for insight and decision-making.

Responsibilities (Graph Data Modeling & Implementation):
Design and implement complex graph data models using Cypher and Neo4j best practices.
Leverage APOC procedures, custom plugins, and advanced graph algorithms to solve domain-specific problems.
Oversee integration of Neo4j with other enterprise systems, microservices, and data platforms.
Develop and maintain APIs and services in Java, Python, or JavaScript to interact with the graph database.
Mentor junior developers and review code to maintain high-quality standards.
Establish guidelines for performance tuning, scalability, security, and disaster recovery in Neo4j environments.
Work with data scientists, analysts, and business stakeholders to translate complex requirements into graph-based solutions.

Skills (Must have):
12+ years in software/data engineering, with at least 3-5 years of hands-on experience with Neo4j.
Lead the technical strategy, architecture, and delivery of Neo4j-based solutions.
Design, model, and implement complex graph data structures using Cypher and Neo4j best practices.
Guide the integration of Neo4j with other data platforms and microservices.
Collaborate with cross-functional teams to understand business needs and translate them into graph-based models.
Mentor junior developers and ensure code quality through reviews and best practices.
Define and enforce performance tuning, security standards, and disaster recovery strategies for Neo4j.
Stay up-to-date with emerging technologies in the graph database and data engineering space.
Strong proficiency in the Cypher query language, graph modeling, and data visualization tools (e.g., Neo4j Bloom, Neo4j Browser). Solid background in Java, Python, or JavaScript and experience integrating Neo4j with these languages. Experience with APOC procedures, Neo4j plugins, and query optimization. Familiarity with cloud platforms (AWS) and containerization tools (Docker, Kubernetes). Proven experience leading engineering teams or projects. Excellent problem-solving and communication skills.

Nice to have: N/A
Languages: English (C1 Advanced)
Seniority: Senior
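The role above centers on graph data models and graph algorithms. As an illustrative sketch (plain Python, not Neo4j code), the breadth-first search below finds one shortest path in a toy adjacency-list graph, the same question Cypher's `shortestPath()` answers natively inside Neo4j; all node names are invented.

```python
from collections import deque

def shortest_path(graph, start, goal):
    """Breadth-first search returning one shortest path, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbour in graph.get(node, ()):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(path + [neighbour])
    return None

# Toy directed account graph; nodes and edges are purely illustrative
graph = {
    "alice": ["acct1"],
    "acct1": ["merchant", "acct2"],
    "acct2": ["bob"],
    "merchant": [],
    "bob": [],
}
print(shortest_path(graph, "alice", "bob"))
```

In a real deployment this traversal would be expressed declaratively, e.g. `MATCH p = shortestPath((a)-[*]-(b)) RETURN p`, and executed by the database rather than in application code.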

Posted 2 weeks ago

Apply

2.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

The Applications Development Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements.
Identify and analyze issues, make recommendations, and implement solutions.
Consult with users, clients, and other technology groups on issues, and recommend programming solutions and support customer exposure systems.
Apply fundamental knowledge of programming languages for design specifications.
Utilize knowledge of business processes, system processes, and industry standards to solve complex issues.
Analyze information and make evaluative judgements to recommend solutions and improvements.
Conduct testing and debugging, utilize script tools, and write basic code for design specifications.
Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures.
Develop working knowledge of Citi’s information systems, procedures, standards, client server application development, network operations, database administration, systems administration, data center operations, and PC-based applications.
Appropriately assess risk when business decisions are made, demonstrating consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency.

Skills and Qualifications:
2-5 years of relevant experience.
Proficient understanding of distributed computing principles. Proficient in SQL coding and tuning. Experience with Tableau, both Tableau Desktop and Tableau Server. Experience with integration of data from multiple data sources. Good knowledge of Big Data querying tools such as Hive and Impala. Knowledge of Hadoop v2, MapReduce, HDFS, PySpark, and Spark is a plus but not mandatory.

Education: Bachelor’s degree/University degree or equivalent experience.

Responsibilities:
Strong data analytics skills: executing SQL queries in Hive/Impala/Spark to analyze data and fix data issues as and when required.
Strong experience with data structures.
Developing advanced reporting, analytics, dashboards, and other business-intelligence solutions; creating visually appealing and interactive dashboards is a primary responsibility.
Connecting Tableau to different databases, ensuring data accuracy, and maintaining the integrity of data feeds; also responsible for cleansing and preparing data to be used effectively in Tableau for analysis and reporting.
Improving load times, enhancing responsiveness, and handling large datasets efficiently.
Setting up appropriate access controls, monitoring data usage, and protecting sensitive information.
Improving performance by tuning SQL queries.
Discovering areas of automation to make business processes more efficient.
Performing and documenting data analysis, data validation, and data mapping/design.

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
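Since the role stresses SQL coding and tuning, here is a minimal sketch of the idea using Python's built-in sqlite3 (standing in for the Hive/Impala engines the listing actually names): an index on the filtered column lets the planner search instead of scanning the whole table. The table and column names are invented.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO txns (region, amount) VALUES (?, ?)",
    [("APAC", 120.0), ("EMEA", 75.5), ("APAC", 60.25), ("NA", 300.0)],
)
# An index on the filter column lets the planner avoid a full table scan
conn.execute("CREATE INDEX idx_txns_region ON txns (region)")

total = conn.execute(
    "SELECT SUM(amount) FROM txns WHERE region = ?", ("APAC",)
).fetchone()[0]
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(amount) FROM txns WHERE region = 'APAC'"
).fetchall()
print(total)  # 180.25
print(plan)   # the plan's detail text mentions the index
```

The same habit, checking the query plan before and after adding an index or rewriting a predicate, is what "tuning SQL queries" usually means in practice, whatever the engine.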
Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Source: LinkedIn

Job Title: Data Scientist SDE-2, SDE-3 & SDE-4 (Staff Data Scientist)
Location: Noida & Bangalore
Job Type: Full Time

About us: PhysicsWallah is an Indian online education technology startup based in Delhi, originally created as a YouTube channel in 2014 by Mr. Alakh Pandey. We are the first company aiming to build an affordable online education platform for each Indian student who dreams of IIT & AIIMS but is unable to afford the existing offline/online education providers. We provide e-learning via our YouTube channel and the PhysicsWallah app/website, offering lectures for JEE Mains and Advanced, NEET, and board exams. We are India's most viewed educational channel on YouTube. YouTube Channel: https://youtube.com/c/PhysicsWallah

About the Role:
Qualification & Eligibility: Bachelor's or higher degree in a quantitative discipline (computer science, statistics, engineering, applied mathematics).
Working Experience:
SDE-2: 3 to 5 years
SDE-3: 5 to 7 years
SDE-4 (Staff): 7 to 10 years
Startup experience preferred; Edtech work experience is a bonus.

Roles & responsibilities:
Help teams understand what data science can do for them and set the right expectations.
Use deep learning, machine learning, and analytical methods to create scalable solutions for business problems.
Create innovative solutions and applications utilising advanced NLP algorithms/architectures, including (but not limited to) LLMs, for tasks such as text generation, summarization, translation, entity extraction and concept recognition, clustering, and more.
Contribute to the execution of our vision for NLP-based technology solutions using various NLP toolkits like Hugging Face, spaCy, CoreNLP, OpenNLP, etc.
Perform relevant data analysis and benchmark the NLP solutions to improve our offerings.
Be able to clearly communicate results and recommendations to various stakeholders.
Evaluate the effectiveness of the solutions and improve upon them in a continuous manner.
We expect candidates to have a mix of a strong technical background, the ability to understand the business implications of their work, and the ability to empathise with our users, working towards helping PhysicsWallah give them the best experience. Help and mentor junior members to become better data scientists.

Skill Sets:
Experience in building NLP (ML/DL) models.
Strong foundational knowledge of transformers (BERT, GPT, T5, etc.) and embeddings.
Expertise in SQL and Python is a must.
Hands-on experience with the latest GenAI models (GPT, Mistral, Falcon, and LLaMA) and approaches (RAG, LangChain, etc.).
Experience using machine learning (structured, text, audio/video data) libraries (preferably in Python) and deep learning frameworks (TensorFlow, PyTorch).

Good to have:
Foundational knowledge of any cloud (AWS/Azure/GCP).
Expertise in querying relational, non-relational, and graph databases.
Experience with big data technologies (Spark, MapReduce, Pig, and Hive).
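The posting mentions RAG-style approaches. As a toy sketch of the retrieval step only: the code below ranks documents against a query by cosine similarity. A real system would use learned transformer embeddings rather than raw word counts, and the sample documents are invented for illustration.

```python
import math
from collections import Counter

def vectorize(text):
    # Bag-of-words counts stand in for a learned embedding here
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query."""
    qv = vectorize(query)
    ranked = sorted(docs, key=lambda d: cosine(qv, vectorize(d)), reverse=True)
    return ranked[:k]

docs = [
    "Newton's laws of motion for JEE physics",
    "Organic chemistry reactions for NEET",
    "Probability and statistics for board exams",
]
print(retrieve("laws of motion physics", docs))
```

In a full RAG pipeline the retrieved passages would then be placed into the LLM prompt as grounding context; this sketch covers only the ranking step.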

Posted 2 weeks ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Source: Naukri

Big Data Engineer (Remote, Contract 6 Months+)

We are looking for a Senior Big Data Engineer with deep expertise in large-scale data processing technologies and frameworks. This is a remote, contract-based position suited for a data engineering expert with strong experience in the Big Data ecosystem, including Snowflake (Snowpark), Spark, MapReduce, Hadoop, and more.

#KeyResponsibilities
Design, develop, and maintain scalable data pipelines and big data solutions.
Implement data transformations using Spark, Snowflake (Snowpark), Pig, and Sqoop.
Process large data volumes from diverse sources using Hadoop ecosystem tools.
Build end-to-end data workflows for batch and streaming pipelines.
Optimize data storage and retrieval processes in HBase, Hive, and other NoSQL databases.
Collaborate with data scientists and business stakeholders to design robust data infrastructure.
Ensure data integrity, consistency, and security in line with organizational policies.
Troubleshoot and tune performance for distributed systems and applications.
#MustHaveSkills
Data Engineering / Big Data tools: Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase
Data ingestion & ETL, data pipeline design, distributed computing
Strong understanding of Big Data architectures and performance tuning
Hands-on experience with large-scale data storage and query optimization

#NiceToHave
Apache Airflow / Oozie experience
Knowledge of cloud platforms (AWS, Azure, or GCP)
Proficiency in Python or Scala
CI/CD and DevOps exposure

#ContractDetails
Role: Senior Big Data Engineer
Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
Duration: 6+ Months (Contract)
Apply via Email: navaneeta@suzva.com
Contact: 9032956160

#HowToApply
Send your updated resume with the subject: "Application for Remote Big Data Engineer Contract Role"
Include in your email: updated resume, current CTC, expected CTC, current location, notice period / availability
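For readers unfamiliar with the MapReduce model this listing asks for, the word-count sketch below mimics the map, shuffle, and reduce phases in a single process; on Hadoop each phase would run distributed across nodes, with the framework handling the shuffle. The input records are made up.

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Emit (word, 1) pairs, as a Hadoop mapper would
    return [(word, 1) for word in record.split()]

def shuffle(pairs):
    # Group values by key, as the framework's shuffle/sort step does
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Each reducer sums the values for its keys
    return {key: sum(values) for key, values in grouped.items()}

records = ["big data big pipelines", "data pipelines at scale"]
pairs = chain.from_iterable(map_phase(r) for r in records)
counts = reduce_phase(shuffle(pairs))
print(counts["big"], counts["data"], counts["pipelines"])  # 2 2 2
```

The same three-phase shape underlies Spark and Snowpark jobs too, which is why the skill transfers across the tools the posting lists.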

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Source: Naukri

Role: At Simplify Healthcare AI, we are focused on building industry-focused AI models. We are seeking a highly skilled and passionate Senior Data Scientist to join our dynamic team. In this role, you will be at the forefront of designing, developing, and implementing advanced machine-learning models that will have a direct impact on our products and offerings.

Responsibilities:
Data Preprocessing: Organize, categorize, and prep data for processing. Identify incomplete, incorrect, inaccurate or irrelevant portions of data and interpret, cleanse or replace/delete them as needed.
Advanced Data Analysis: Utilize advanced statistical techniques to analyze complex, large sets of data, and extract actionable insights that will help drive decision-making processes across various functions of the business.
Model Development: Develop, refine, and implement machine learning algorithms and statistical models that will help create data-driven solutions to business problems. Also responsible for validating and testing these models to ensure accuracy.
Collaboration: Work closely with the team of engineers and product designers to understand technical and business requirements, convert these to analytic solutions, and integrate them into our product suite.
Insight-Driven Business Decisions: Utilize and translate complex results from the data analyses and models into strategic recommendations for both technical and non-technical stakeholders.
Presentations: Represent the data science team in cross-functional teams to ensure correct and impactful data science principles are incorporated. Effectively present complex statistical concepts and results, as well as their implications, to a non-technical audience.
Mentoring and Leadership: Provide mentorship and guidance to junior data scientists in the team, educate them on best practices, and guide them in their career paths.

Required Skills:
Minimum of 5 (five) years of working experience as a data scientist.
Strong knowledge of machine learning models, data mining, and databases. Experience with programming languages such as Python, SQL, etc. Knowledge of distributed data/computing tools like MapReduce, Hadoop, Hive, and Spark. Exceptional problem-solving skills, attention to detail, and the ability to work as a team player. Strong communication skills, with the ability to present complex technical findings to non-technical stakeholders. Experience in mentoring junior data scientists or leading a data science team is preferred.
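The Data Preprocessing responsibility above (identify incomplete, incorrect, or irrelevant data and cleanse or delete it) can be sketched as a small record-cleaning pass. The field names and the drop-rather-than-impute policy are illustrative assumptions, not from the listing.

```python
def clean(rows, required=("id", "amount")):
    """Drop records missing required fields; coerce amount to float."""
    cleaned = []
    for row in rows:
        if any(row.get(field) in (None, "") for field in required):
            continue  # incomplete record: drop rather than guess
        try:
            row = {**row, "amount": float(row["amount"])}
        except (TypeError, ValueError):
            continue  # incorrect/garbled value: drop
        cleaned.append(row)
    return cleaned

raw = [
    {"id": 1, "amount": "19.99"},
    {"id": 2, "amount": ""},          # missing value
    {"id": None, "amount": "5.00"},   # missing key field
    {"id": 3, "amount": "oops"},      # unparseable
    {"id": 4, "amount": 7},
]
print(clean(raw))  # keeps ids 1 and 4
```

Whether to drop, impute, or flag bad records is a judgment call the posting leaves to the data scientist; this sketch shows only the simplest policy.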

Posted 3 weeks ago

Apply

11.0 - 15.0 years

50 - 100 Lacs

Hyderabad

Work from Office

Source: Naukri

Uber is looking for a Staff Software Engineer - Data to join our dynamic team and embark on a rewarding career journey.

Responsibilities:
Liaising with coworkers and clients to elucidate the requirements for each task.
Conceptualizing and generating infrastructure that allows big data to be accessed and analyzed.
Reformulating existing frameworks to optimize their functioning.
Testing such structures to ensure that they are fit for use.
Preparing raw data for manipulation by data scientists.
Detecting and correcting errors in your work.
Ensuring that your work remains backed up and readily accessible to relevant coworkers.
Remaining up-to-date with industry standards and technological advancements that will improve the quality of your outputs.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Description

YOUR IMPACT
Are you passionate about developing mission-critical, high quality software solutions, using cutting-edge technology, in a dynamic environment?

OUR IMPACT
We are Compliance Engineering, a global team of more than 300 engineers and scientists who work on the most complex, mission-critical problems. We build and operate a suite of platforms and applications that prevent, detect, and mitigate regulatory and reputational risk across the firm. We have access to the latest technology and to massive amounts of structured and unstructured data, and we leverage modern frameworks to build responsive and intuitive UX/UI and Big Data applications. Compliance Engineering is looking to fill several big data software engineering roles. Your first deliverable and success criteria will be the deployment, in 2025, of new complex data pipelines and surveillance models to detect inappropriate trading activity.

How You Will Fulfill Your Potential
As a member of our team, you will: partner globally with sponsors, users and engineering colleagues across multiple divisions to create end-to-end solutions; learn from experts; leverage various technologies including Java, Spark, Hadoop, Flink, MapReduce, HBase, JSON, Protobuf, Presto, Elastic Search, Kafka, and Kubernetes; be able to innovate and incubate new ideas; have an opportunity to work on a broad range of problems, including negotiating data contracts, capturing data quality metrics, processing large-scale data, and building surveillance detection models; and be involved in the full life cycle: defining, designing, implementing, testing, deploying, and maintaining software systems across our products.

Qualifications
A successful candidate will possess the following attributes: a Bachelor's or Master's degree in Computer Science, Computer Engineering, or a similar field of study; expertise in Java, as well as proficiency with databases and data manipulation.
Experience in end-to-end solutions, automated testing, and SDLC concepts. The ability (and tenacity) to clearly express ideas and arguments in meetings and on paper. Experience in some of the following is desired and can set you apart from other candidates: developing large-scale systems, such as MapReduce on Hadoop/HBase; data analysis using tools such as SQL, Spark SQL, and Zeppelin/Jupyter; API design, such as creating interconnected services; knowledge of the financial industry and compliance or risk functions; ability to influence stakeholders.

About Goldman Sachs
Goldman Sachs is a leading global investment banking, securities and investment management firm that provides a wide range of financial services to a substantial and diversified client base that includes corporations, financial institutions, governments and individuals. Founded in 1869, the firm is headquartered in New York and maintains offices in all major financial centers around the world.
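One of the problems listed above is capturing data quality metrics. A minimal sketch of one common metric, per-field completeness, is the share of records carrying a non-empty value for each field; the trade records and field names below are invented for illustration.

```python
def quality_metrics(records, fields):
    """Per-field completeness: fraction of records with a non-empty value."""
    total = len(records)
    return {
        field: sum(1 for r in records if r.get(field) not in (None, "")) / total
        for field in fields
    }

# Hypothetical trade records; blanks and None model missing feed data
trades = [
    {"trade_id": "T1", "desk": "rates", "notional": 1_000_000},
    {"trade_id": "T2", "desk": "",      "notional": 250_000},
    {"trade_id": "T3", "desk": "fx",    "notional": None},
    {"trade_id": "T4", "desk": "fx",    "notional": 500_000},
]
print(quality_metrics(trades, ["trade_id", "desk", "notional"]))
```

In a surveillance pipeline such metrics would be computed per feed per day and alerted on when completeness drops, since a silent data gap can hide exactly the trading activity the models are meant to catch.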

Posted 3 weeks ago

Apply

4.0 - 8.0 years

12 - 17 Lacs

Hyderabad

Work from Office

Source: Naukri

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities:
Analyzes and investigates; provides explanations and interpretations within area of expertise.
Participate in the scrum process and deliver stories/features according to the schedule.
Collaborate with the team, architects, and product stakeholders to understand the scope and design of a deliverable.
Participate in product support activities as needed by the team.
Understand product architecture and features being built, and come up with product improvement ideas and POCs.
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
Undergraduate degree or equivalent experience.
Proven experience using the Big Data tech stack.
Sound knowledge of Java and the Spring framework, with good exposure to Spring Batch, Spring Data, Spring Web Services, and Python.
Proficient with the Big Data ecosystem (Sqoop, Spark, Hadoop, Hive, HBase).
Proficient with Unix/Linux ecosystems and shell scripting.
Proven Java, Kafka, Spark, Big Data, and Azure skills, with solid analytical, communication, and problem-solving abilities.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

Overall Responsibilities:
Data Pipeline Development: Design, develop, and maintain highly scalable and optimized ETL pipelines using PySpark on the Cloudera Data Platform (CDP), ensuring data integrity and accuracy.
Data Ingestion: Implement and manage data ingestion processes from a variety of sources (e.g., relational databases, APIs, file systems) to the data lake or data warehouse on CDP.
Data Transformation and Processing: Use PySpark to process, cleanse, and transform large datasets into meaningful formats that support analytical needs and business requirements.
Performance Optimization: Conduct performance tuning of PySpark code and Cloudera components, optimizing resource utilization and reducing the runtime of ETL processes.
Data Quality and Validation: Implement data quality checks, monitoring, and validation routines to ensure data accuracy and reliability throughout the pipeline.
Automation and Orchestration: Automate data workflows using tools like Apache Oozie, Airflow, or similar orchestration tools within the Cloudera ecosystem.
Monitoring and Maintenance: Monitor pipeline performance, troubleshoot issues, and perform routine maintenance on the Cloudera Data Platform and associated data processes.
Collaboration: Work closely with other data engineers, analysts, product managers, and other stakeholders to understand data requirements and support data-driven initiatives.
Documentation: Maintain thorough documentation of data engineering processes, code, and pipeline configurations.

Category-wise Technical Skills:
PySpark: Advanced proficiency in PySpark, including working with RDDs, DataFrames, and optimization techniques.
Cloudera Data Platform: Strong experience with Cloudera Data Platform (CDP) components, including Cloudera Manager, Hive, Impala, HDFS, and HBase.
Data Warehousing: Knowledge of data warehousing concepts, ETL best practices, and experience with SQL-based tools (e.g., Hive, Impala).
Big Data Technologies: Familiarity with Hadoop, Kafka, and other distributed computing tools.
Orchestration and Scheduling: Experience with Apache Oozie, Airflow, or similar orchestration frameworks.
Scripting and Automation: Strong scripting skills in Linux.

Experience: 3+ years of experience as a Data Engineer, with a strong focus on PySpark and the Cloudera Data Platform. Proven track record of implementing data engineering best practices. Experience in data ingestion, transformation, and optimization on the Cloudera Data Platform.

Day-to-Day Activities: Design, develop, and maintain ETL pipelines using PySpark on CDP. Implement and manage data ingestion processes from various sources. Process, cleanse, and transform large datasets using PySpark. Conduct performance tuning and optimization of ETL processes. Implement data quality checks and validation routines. Automate data workflows using orchestration tools. Monitor pipeline performance and troubleshoot issues. Collaborate with team members to understand data requirements. Maintain documentation of data engineering processes and configurations.

Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field. Relevant certifications in PySpark and Cloudera technologies are a plus.

Soft Skills: Strong analytical and problem-solving skills. Excellent verbal and written communication abilities. Ability to work independently and collaboratively in a team environment. Attention to detail and commitment to data quality.
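The data quality checks and validation routines this role calls for can be sketched without a cluster. A minimal pure-Python illustration (the field names and rejection policy are assumptions; a real CDP pipeline would express the same checks as PySpark DataFrame operations):

```python
# Minimal sketch of a data-quality validation routine, assuming records
# arrive as dictionaries. Field names are illustrative only.

def validate_records(records, required_fields):
    """Split records into valid rows and rows failing basic null checks."""
    valid, rejected = [], []
    for row in records:
        # Reject rows with a missing or null required field
        if all(row.get(f) is not None for f in required_fields):
            valid.append(row)
        else:
            rejected.append(row)
    return valid, rejected

rows = [
    {"id": 1, "amount": 250.0},
    {"id": 2, "amount": None},   # fails the null check
]
good, bad = validate_records(rows, required_fields=["id", "amount"])
```

In a production pipeline the rejected rows would typically be written to a quarantine table and surfaced through monitoring rather than silently dropped.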

Posted 3 weeks ago

Apply

5.0 - 10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Linkedin logo

Job Requirements
Role/ Job Title: Data Engineer - Gen AI
Function/ Department: Data & Analytics
Place of Work: Mumbai

Job Purpose
The data engineer will work with our data scientists who are building generative AI solutions across text, audio, image, and tabular data. They will be responsible for the storage, retrieval, and augmentation of the large volumes of structured and unstructured data that our GenAI solutions consume.

Job & Responsibilities
Build data engineering pipelines focused on unstructured data. Conduct requirements gathering and project scoping sessions with subject matter experts, business users, and executive stakeholders to discover and define business data needs in GenAI. Design, build, and optimize the data architecture and extract, transform, and load (ETL) pipelines to make data accessible to data scientists and the products built by them. Work on the end-to-end data lifecycle, from data ingestion through data transformation to the data consumption layer, and be well versed with APIs and their usage. Drive the highest standards in data reliability, data integrity, and data governance, enabling accurate, consistent, and trustworthy data sets. A suitable candidate will also demonstrate experience with big data infrastructure, including MapReduce, Hive, HDFS, YARN, HBase, MongoDB, DynamoDB, etc. Create technical design documentation for projects and pipelines. Strong technical debugging skills; work with Git for code versioning.

Education Qualification
Graduation: Bachelor of Science (B.Sc) / Bachelor of Technology (B.Tech) / Bachelor of Computer Applications (BCA)
Post-Graduation: Master of Science (M.Sc) / Master of Technology (M.Tech) / Master of Computer Applications (MCA)
Experience Range: 5-10 years of relevant experience
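One recurring step in unstructured-data pipelines for GenAI like those described above is splitting documents into overlapping chunks before embedding and retrieval. A minimal sketch, assuming character-based windows (the chunk size and overlap values are illustrative assumptions):

```python
def chunk_text(text, chunk_size=200, overlap=50):
    """Split text into overlapping character windows for embedding.

    Overlap preserves context that would otherwise be cut at a
    chunk boundary; production pipelines often split on tokens or
    sentences instead of raw characters.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks

doc = "x" * 500
pieces = chunk_text(doc, chunk_size=200, overlap=50)
```

Each chunk would then be embedded and indexed; the overlap means the tail of one chunk repeats at the head of the next.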

Posted 3 weeks ago

Apply

2.0 - 5.0 years

15 - 19 Lacs

Mumbai

Work from Office

Naukri logo

Overview
The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications.

Responsibilities
As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats. We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark.

Qualifications
Core Java, Spring Boot, Apache Spark, Spring Batch, Python. Exposure to SQL databases such as Oracle, MySQL, and Microsoft SQL Server is a must. Experience, knowledge, or certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have. Exposure to NoSQL databases such as Neo4j or document databases is also good to have.

What we offer you
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. 
Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. 
MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 3 weeks ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

What you’ll do?
• Lead and mentor a team of data scientists/analysts.
• Provide analytical insights by analyzing various types of data, including mining our customer data, reviewing relevant cases/samples, and incorporating feedback from others.
• Work closely with business partners and stakeholders to determine how to design analysis, testing, and measurement approaches that will significantly improve our ability to understand and address emerging business issues.
• Produce intelligent, scalable, and automated solutions by leveraging data science skills.
• Work closely with technology teams on development of new capabilities to define requirements and priorities based on data analysis and business knowledge.
• Develop expertise in specific areas by leading analytical projects independently, while setting goals, providing benefit estimations, defining workflows, and coordinating timelines in advance.
• Provide updates to leadership, peers, and other stakeholders that simplify and clarify complex concepts and the results of analyses, with emphasis on actionable outcomes and impact to the business.

Who you need to be?
4+ years in advanced analytics, statistical modelling, and machine learning. Best-practice knowledge in credit risk, with a strong understanding of the full lifecycle from origination to debt collection. Well versed in ML algorithms, big data concepts, and cloud implementations. High proficiency in Python and SQL/NoSQL. Collections and digital channels experience a plus. Strong organizational skills and excellent follow-through. Outstanding written, verbal, and interpersonal communication skills. High emotional intelligence, a can-do mentality, and a creative approach to problem solving. Takes personal ownership; a self-starter with the ability to drive projects with minimal guidance and a focus on high-impact work. Learns continuously; seeks out knowledge, ideas, and feedback, and looks for opportunities to build own skills, knowledge, and expertise. Experience with big data and cloud computing, viz. Spark and Hadoop (MapReduce, Pig, Hive). Experience in risk and credit score domains preferred.

Posted 3 weeks ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

You Lead the Way. We’ve Got Your Back. With the right backing, people and businesses have the power to progress in incredible ways. When you join Team Amex, you become part of a global and diverse community of colleagues with an unwavering commitment to back our customers, communities and each other. Here, you’ll learn and grow as we help you create a career journey that’s unique and meaningful to you with benefits, programs, and flexibility that support you personally and professionally. At American Express, you’ll be recognized for your contributions, leadership, and impact—every colleague has the opportunity to share in the company’s success. Together, we’ll win as a team, striving to uphold our company values and powerful backing promise to provide the world’s best customer experience every day. And we’ll do it with the utmost integrity, and in an environment where everyone is seen, heard and feels like they belong. Join Team Amex and let's lead the way together. As part of our diverse tech team, you can architect, code and ship software that makes us an essential part of our customer's digital lives. Here, you can work alongside talented engineers in an open, supportive, inclusive environment where your voice is valued, and you make your own decisions on what tech to use to solve challenging problems. Amex offers a range of opportunities to work with the latest technologies and encourages you to back the broader engineering community through open source. And because we understand the importance of keeping your skills fresh and relevant, we give you dedicated time to invest in your professional development. Find your place in technology on #TeamAmex. Responsibilities include but are not limited to: Owns all technical aspects of software development for assigned applications. Performs hands-on architecture, design, and development of systems. 
Functions as a member of an agile team and helps drive consistent development practices with respect to tools, common components, and documentation. Typically spends 80% of time writing code and testing, and the remainder collaborating with stakeholders through ongoing product/platform releases. Develops a deep understanding of tie-ins with other Amex systems and platforms within the supported domains. Writes code and unit tests, works on API specs, automation, and conducts code reviews and testing. Performs ongoing refactoring of code, utilizes visualization and other techniques to fast-track concepts, and delivers continuous improvement. Identifies opportunities to adopt innovative technologies. Provides continuous support for ongoing application availability. Works closely with product owners on blueprints and annual planning of feature sets that impact multiple platforms and products. Works with product owners to prioritize features for ongoing sprints and manage a list of technical requirements based on industry trends, new technologies, known defects, and issues. 
Qualifications: Bachelor's degree in computer science, computer engineering, another technical discipline, or equivalent work experience. 5+ years of software development experience. Demonstrated experience with Agile or other rapid application development methods. Demonstrated experience with object-oriented design and coding. Demonstrated experience with these core technical skills (mandatory): Core Java, Spring Framework, Java EE; Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.); Spark; relational databases (Postgres / MySQL / DB2, etc.); cloud development (microservices); parallel and distributed (multi-tiered) systems; application design, software development, and automated testing. Demonstrated experience with these additional technical skills (nice to have): Unix / shell scripting; Python / Scala; message queuing, stream processing (Kafka); Elasticsearch; web services, open API development, and REST concepts; implementing integrated automated release management using tools/technologies/frameworks like Maven, Git, code/security review tools, Jenkins, automated testing, and JUnit. We back you with benefits that support your holistic well-being so you can be and deliver your best. 
This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial-well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Posted 3 weeks ago

Apply

6.0 - 7.0 years

12 - 17 Lacs

Mumbai

Work from Office

Naukri logo

Role Description: As a Scala Tech Lead, you will be a technical leader and mentor, guiding your team to deliver robust and scalable solutions. You will be responsible for setting technical direction, ensuring code quality, and fostering a collaborative and productive team environment. Your expertise in Scala and your ability to translate business requirements into technical solutions will be crucial for delivering successful projects. Responsibilities : - Understand and implement tactical or strategic solutions for given business problems. - Discuss business needs and technology requirements with stakeholders. - Define and derive strategic solutions and identify tactical solutions when necessary. - Write technical design and other solution documents per Agile (SCRUM) standards. - Perform data analysis to aid development work and other business needs. - Develop high-quality Scala code that meets business requirements. - Perform unit testing of developed code using automated BDD test frameworks. - Participate in testing efforts to validate and approve technology solutions. - Follow MS standards for the adoption of automated release processes across environments. - Run automated regression test suites and support UAT of developed solutions. - Work collaboratively with other FCT (Functional Core Technology) and NFRT (Non-Functional Requirements Team) teams. - Communicate effectively with stakeholders and team members. - Provide technical guidance and mentorship to team members. - Identify opportunities for process improvements and implement effective solutions. - Drive continuous improvement in code quality, development processes, and team performance. - Participate in post-mortem reviews and implement lessons learned. Qualifications : Experience : - [Number] years of experience in software development, with a focus on Scala. - Proven experience in leading and mentoring software development teams. 
- Experience in designing and implementing complex Scala-based solutions. - Strong proficiency in the Scala programming language. - Experience with functional programming concepts and libraries. - Knowledge of distributed systems and data processing technologies. - Experience with automated testing frameworks (BDD). - Familiarity with Agile (SCRUM) methodologies. - Experience with CI/CD pipelines and DevOps practices. - Understanding of data analysis and database technologies.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Naukri logo

Job Opening: Big Data Engineer (Remote, Contract 6 Months+) Location: Remote | Contract Duration: 6+ Months | Domain: Big Data Stack We are looking for a Senior Big Data Engineer with deep expertise in large-scale data processing technologies and frameworks. This is a remote, contract-based position suited for a data engineering expert with strong experience in the Big Data ecosystem, including Snowflake (Snowpark), Spark, MapReduce, Hadoop, and more. #KeyResponsibilities Design, develop, and maintain scalable data pipelines and big data solutions. Implement data transformations using Spark, Snowflake (Snowpark), Pig, and Sqoop. Process large data volumes from diverse sources using Hadoop ecosystem tools. Build end-to-end data workflows for batch and streaming pipelines. Optimize data storage and retrieval processes in HBase, Hive, and other NoSQL databases. Collaborate with data scientists and business stakeholders to design robust data infrastructure. Ensure data integrity, consistency, and security in line with organizational policies. Troubleshoot and tune performance for distributed systems and applications. 
#MustHaveSkills in Data Engineering / Big Data Tools: Snowflake (Snowpark), Spark, MapReduce, Hadoop, Sqoop, Pig, HBase Data Ingestion & ETL, Data Pipeline Design, Distributed Computing Strong understanding of Big Data architectures & performance tuning Hands-on experience with large-scale data storage and query optimization #NiceToHave Apache Airflow / Oozie experience Knowledge of cloud platforms (AWS, Azure, or GCP) Proficiency in Python or Scala CI/CD and DevOps exposure #ContractDetails Role: Senior Big Data Engineer Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote Duration: 6+ Months (Contract) Apply via Email: navaneeta@suzva.com Contact: 9032956160 #HowToApply Send your updated resume with the subject: "Application for Remote Big Data Engineer Contract Role" Include in your email: Updated Resume Current CTC Expected CTC Current Location Notice Period / Availability
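The MapReduce model named in the stack above can be illustrated in miniature: a pure-Python word count with explicit map, shuffle, and reduce phases (Hadoop distributes each phase across nodes; this sketch runs them in one process):

```python
from collections import defaultdict

def map_phase(lines):
    """Emit (word, 1) pairs - the mapper in a classic word count."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle_phase(pairs):
    """Group values by key, as the framework does between map and reduce."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Sum the counts per word - the reducer."""
    return {word: sum(counts) for word, counts in grouped.items()}

lines = ["big data big pipelines", "data pipelines"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
# counts == {"big": 2, "data": 2, "pipelines": 2}
```

Spark's `rdd.flatMap(...).reduceByKey(...)` expresses the same three phases; the shuffle is what makes the model expensive at scale, which is why partitioning strategy matters.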

Posted 3 weeks ago

Apply

11.0 - 20.0 years

30 - 45 Lacs

Hyderabad, Bengaluru

Hybrid

Naukri logo

We are currently seeking a Senior Principal Consultant in our Technical Services department within the Oracle NetSuite Customer Success Consulting team. This position requires heavy interaction with customers, other NetSuite application developers, NetSuite implementation teams, and NetSuite partners. This individual will be a part of the team that scopes, designs, develops, and deploys custom scripts, integrations, and workflow solutions for our customers. Job description displayed in the job posting: An experienced consulting professional who has a broad understanding of solutions, industry best practices, and multiple business processes or technology designs within a product/technology family. Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment, in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. Typical Workload: Each day can be very different. It is a great job for people looking to step out of normal routines. We have a broad range of responsibilities. The typical workload breaks down into these rough percentages: • 30% scripting, QA, creating and executing test plans. • 30% customer facing, walking through use cases, reviewing test plans, and sometimes just being brought in to solve problems. • 15% attending internal meetings, including knowledge transfers, mentoring sessions for less experienced resources, and other strategic initiatives. • 10% integration consulting with customer and third-party systems. • 10% executive updates, documentation, and overall project management. 
• 5% data migration Responsibilities include: Track and report project progress to appropriate parties using NetSuite and Jira • Assist in defining custom scripts on NetSuite's SuiteCloud platform • Collaborate with other NetSuite consultants to validate business/technical requirements through interview and analysis • Produce system design documents and participate in technical walkthroughs • Lead technical work streams, design scripts, or validate scripts, coordinate with other developers, Quality Assurance (QA) and deployment activities • Conduct code reviews • Conduct user acceptance testing for complex solutions • Assist in development, QA, and deployment processes as necessary to meet project requirements • Ability to work in a global team environment • Mentor less experienced consultants Preferred Qualifications/Skills include: • 10+ years of NetSuite or other ERP / CRM solutions. NetSuite highly preferred. • A degree in mathematics, computer science or engineering • NetSuite SuiteCloud development/design/testing/code review experience, including 3rd-party integration • Experience leading technical work streams including other developers and global delivery teams • Exposure to system architecture, object-oriented design, and web frameworks and patterns strongly preferred • Ability to author detailed documents capturing workflow processes, use cases, exception handling, and test cases • Consulting role experience • Software development lifecycle (SDLC) methodology knowledge and use • Software development (JavaScript preferred) • Proficiency in error resolution, error handling and debugging • Experience with IDEs (WebStorm preferred), source control systems (Git preferred), unit-testing tools and defect management tools • Experience with XML/XSL and web services (SOAP, WSDL, REST, JSON) • Experience developing web applications using JSP/Servlets, DHTML, and JavaScript • Experience with Jira • Strong interpersonal and communication skills

Posted 3 weeks ago

Apply

6.0 - 11.0 years

25 - 40 Lacs

Hyderabad, Bengaluru

Hybrid

Naukri logo

We are currently seeking a Senior Principal Consultant in our Technical Services department within the Oracle NetSuite Customer Success Consulting team. This position requires heavy interaction with customers, other NetSuite application developers, NetSuite implementation teams, and NetSuite partners. This individual will be a part of the team that scopes, designs, develops, and deploys custom scripts, integrations, and workflow solutions for our customers. Job description displayed in the job posting: An experienced consulting professional who has a broad understanding of solutions, industry best practices, and multiple business processes or technology designs within a product/technology family. Operates independently to provide quality work products to an engagement. Performs varied and complex duties and tasks that need independent judgment, in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices. Typical Workload: Each day can be very different. It is a great job for people looking to step out of normal routines. We have a broad range of responsibilities. The typical workload breaks down into these rough percentages: • 30% scripting, QA, creating and executing test plans. • 30% customer facing, walking through use cases, reviewing test plans, and sometimes just being brought in to solve problems. • 15% attending internal meetings, including knowledge transfers, mentoring sessions for less experienced resources, and other strategic initiatives. • 10% integration consulting with customer and third-party systems. • 10% executive updates, documentation, and overall project management. 
• 5% data migration Responsibilities include: Track and report project progress to appropriate parties using NetSuite and Jira • Assist in defining custom scripts on NetSuite's SuiteCloud platform • Collaborate with other NetSuite consultants to validate business/technical requirements through interview and analysis • Produce system design documents and participate in technical walkthroughs • Lead technical work streams, design scripts, or validate scripts, coordinate with other developers, Quality Assurance (QA) and deployment activities • Conduct code reviews • Conduct user acceptance testing for complex solutions • Assist in development, QA, and deployment processes as necessary to meet project requirements • Ability to work in a global team environment • Mentor less experienced consultants Preferred Qualifications/Skills include: • 6+ years of NetSuite or other ERP / CRM solutions. NetSuite highly preferred. • A degree in mathematics, computer science or engineering • NetSuite SuiteCloud development/design/testing/code review experience, including 3rd-party integration • Experience leading technical work streams including other developers and global delivery teams • Exposure to system architecture, object-oriented design, and web frameworks and patterns strongly preferred • Ability to author detailed documents capturing workflow processes, use cases, exception handling, and test cases • Consulting role experience • Software development lifecycle (SDLC) methodology knowledge and use • Software development (JavaScript preferred) • Proficiency in error resolution, error handling and debugging • Experience with IDEs (WebStorm preferred), source control systems (Git preferred), unit-testing tools and defect management tools • Experience with XML/XSL and web services (SOAP, WSDL, REST, JSON) • Experience developing web applications using JSP/Servlets, DHTML, and JavaScript • Experience with Jira • Strong interpersonal and communication skills

Posted 3 weeks ago

Apply

1.0 - 4.0 years

1 - 5 Lacs

Mumbai

Work from Office

Naukri logo

Location: Mumbai
Role Overview: As a Big Data Engineer, you'll design and build robust data pipelines on Cloudera using Spark (Scala/PySpark) for ingestion, transformation, and processing of high-volume data from banking systems.
Key Responsibilities: Build scalable batch and real-time ETL pipelines using Spark and Hive. Integrate structured and unstructured data sources. Perform performance tuning and code optimization. Support orchestration and job scheduling (NiFi, Airflow).
Required education: Bachelor's Degree. Preferred education: Master's Degree.
Required technical and professional expertise: Proficiency in PySpark/Scala with Hive/Impala. Experience with data partitioning, bucketing, and optimization. Familiarity with Kafka, Iceberg, and NiFi is a must. Knowledge of banking or financial datasets is a plus.
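Bucketing, listed in the skills above, assigns rows to a fixed number of buckets by hashing a key so that equal keys always co-locate, which lets joins and aggregations on that key avoid a full shuffle. A minimal sketch of the idea in plain Python (Spark and Hive use their own hash functions, so these bucket IDs are illustrative only):

```python
import hashlib

def bucket_id(key, num_buckets=8):
    """Deterministically map a key to one of num_buckets buckets.

    Uses MD5 rather than Python's built-in hash() so results are
    stable across processes - the property bucketing relies on.
    """
    digest = hashlib.md5(str(key).encode("utf-8")).hexdigest()
    return int(digest, 16) % num_buckets

# The same key always lands in the same bucket, regardless of
# when or where it is hashed.
ids = [bucket_id("ACCT-1001") for _ in range(3)]
```

In Spark the equivalent is `df.write.bucketBy(8, "account_id")`; the engine then knows two bucketed tables sharing the same key and bucket count can be joined bucket-by-bucket.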

Posted 3 weeks ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Hyderabad

Work from Office

Naukri logo

As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world’s technology leader. Come to IBM and make a global impact. Responsibilities: Manage end-to-end feature development and resolve challenges faced in implementing it. Learn new technologies and apply them in feature development within the time frame provided. Manage debugging, root cause analysis, and fixing of issues reported on the Content Management back-end software system. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Overall, more than 6 years of experience, with 4+ years of strong hands-on experience in Python and Spark. Strong technical ability to understand, design, write, and debug applications in Python and PySpark. Strong problem-solving skills. Preferred technical and professional experience: Good to have: hands-on experience with cloud technology (AWS/GCP/Azure).

Posted 3 weeks ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Linkedin logo

Our Purpose Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we’re helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart and accessible. Our technology and innovation, partnerships and networks combine to deliver a unique set of products and services that help people, businesses and governments realize their greatest potential. Title And Summary Senior Analyst, Data Strategy Overview The Data Quality team in the Data Strategy & Management organization (Chief Data Office) is responsible for developing and driving Mastercard’s core data management program and broadening its data quality efforts. This team ensures that Mastercard maintains and increases the value of Mastercard’s data assets as we enable new forms of payment and new players in the ecosystem, and expand the collection and use of data to support new lines of business. The Enterprise Data Quality team is responsible for ensuring data is of the highest quality and is fit for purpose to support business and analytic uses. The team works to identify and prioritize opportunities for data quality improvement, develops strategic mitigation plans, and coordinates remediation activities with MC Tech and the business owner. Role: Support the processes for improving and expanding merchant data, including address standardization, geocoding, and incorporating new merchant data sources to assist in improvements. Assess quality issues in merchant data, transactional data, and other critical data sets. Support internal and external feedback loops to improve data submitted through the transaction data stream. Frame and present data challenges in a manner suitable for product and business understanding. 
- Provide subject matter expertise on merchant data for the organization's product development efforts.
- Coordinate with MC Tech to develop DQ remediation requirements for core systems, the data warehouse and other critical applications; manage the corresponding remediation projects to ensure successful implementation.
- Lead organization-wide awareness and communication of data quality initiatives and remediation activities, ensuring seamless implementation.
- Coordinate with critical vendors to manage project timelines and achieve quality deliverables.
- Develop and implement with MC Tech data pipelines that extract, transform, and load data into an information product that supports organizational strategic goals.
- Implement new technologies and frameworks as per project requirements.

All About You
- Hands-on experience managing technology projects, with a demonstrated ability to understand complex data and technology initiatives.
- Ability to lead and influence others to advance deliverables.
- Understanding of emerging technologies including, but not limited to, cloud architecture, machine learning/AI and Big Data infrastructure.
- Data architecture experience and experience building data models.
- Experience deploying and working with big data technologies like Hadoop, Spark, and Sqoop.
- Experience with streaming frameworks like Kafka and Axon, and pipelines like NiFi.
- Proficiency in OO programming (Python, Java/Spring Boot/J2EE, and Scala).
- Experience with the Hadoop ecosystem (HDFS, YARN, MapReduce, Spark, Hive, Impala).
- Experience with Linux, the Unix command line, Unix shell scripting, SQL and any scripting language.
- Experience with data visualization tools such as Tableau, Domo, and/or Power BI is a plus.
- Experience presenting data findings in a readable, insight-driven format; experience building support decks.
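The role above centers on address standardization and assessing quality issues in merchant data. As a hedged illustration of what such checks can look like, here is a minimal pure-Python sketch; the normalization rules, field names, and the ISO-code check are all illustrative assumptions, not Mastercard's actual logic.

```python
import re

def standardize_address(raw: str) -> str:
    """Tiny address normalizer: collapse whitespace, uppercase, and
    expand a couple of common abbreviations. Rules are illustrative only."""
    addr = re.sub(r"\s+", " ", raw).strip().upper()
    for short, full in {" ST.": " STREET", " AVE ": " AVENUE "}.items():
        addr = addr.replace(short, full)
    return addr

def quality_issues(record: dict) -> list:
    """Flag basic merchant-record problems (missing name, bad country code).
    The 3-letter country-code rule is an assumed example check."""
    issues = []
    if not record.get("merchant_name"):
        issues.append("missing merchant_name")
    if not re.fullmatch(r"[A-Z]{3}", record.get("country", "")):
        issues.append("country is not a 3-letter code")
    return issues

print(standardize_address("12  Main St.,  Springfield"))
print(quality_issues({"merchant_name": "", "country": "US"}))
```

In a production pipeline these checks would run at scale (e.g. as Spark UDFs over the transaction stream) and feed the remediation feedback loop the posting describes.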
Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization; therefore, every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-249888
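The posting also calls for pipelines that extract, transform, and load data into an information product. A minimal sketch of that shape, using only the Python standard library (SQLite stands in for the warehouse; in practice the posting's stack would use Spark or NiFi, and all names here are illustrative):

```python
import csv
import io
import sqlite3

# Extract: read raw CSV (here from an in-memory string; a real pipeline
# would pull from HDFS, Kafka, or an upstream feed).
RAW = "merchant,amount\nAcme,10.5\nGlobex,20.0\n"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: cast types and derive an uppercase merchant key.
    return [{"merchant": r["merchant"].upper(), "amount": float(r["amount"])}
            for r in rows]

def load(rows, conn):
    # Load: write into a target table and report how many rows landed.
    conn.execute("CREATE TABLE IF NOT EXISTS sales (merchant TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (:merchant, :amount)", rows)
    return conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]

conn = sqlite3.connect(":memory:")
print(load(transform(extract(RAW)), conn))  # → 2
```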

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies