4.0 - 9.0 years
5 - 8 Lacs
Gurugram
Work from Office
RARR Technologies is looking for a HADOOP ADMIN to join our dynamic team and embark on a rewarding career journey. Responsible for managing day-to-day administrative tasks and providing support to employees, customers, and visitors.

Responsibilities:
1. Manage incoming and outgoing mail, packages, and deliveries.
2. Maintain office supplies and equipment, and ensure that they are in good working order.
3. Coordinate scheduling and meetings, and make arrangements for travel and accommodations as needed.
4. Greet and assist visitors, and answer and direct phone calls as needed.

Requirements:
1. Experience in an administrative support role, with a track record of delivering high-quality work.
2. Excellent organizational and time-management skills.
3. Strong communication and interpersonal skills, with the ability to interact effectively with employees, customers, and visitors.
4. Proficiency with Microsoft Office and other common office software, including email and calendar applications.
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
About Us
BCE Global Tech is a dynamic and innovative company dedicated to pushing the boundaries of technology. We are on a mission to modernize global connectivity, one connection at a time. Our goal is to build the highway to the future of communications, media, and entertainment, emerging as a powerhouse within the technology landscape in India. We bring ambitions to life through design thinking that bridges the gaps between people, devices, and beyond, fostering unprecedented customer satisfaction through technology. At BCE Global Tech, we are guided by our core values of innovation, customer-centricity, and a commitment to progress. We harness cutting-edge technology to provide business outcomes with positive societal impact. Our team of thought leaders is pioneering advancements in 5G, MEC, IoT, and cloud-native architecture. We offer continuous learning opportunities, innovative projects, and a collaborative work environment that empowers our employees to grow and succeed.

Responsibilities
- Lead the migration of data pipelines from Hadoop to Google Cloud Platform (GCP)
- Design, develop, and maintain data workflows using Airflow and custom flow solutions
- Implement infrastructure as code using Terraform
- Develop and optimize data processing applications using Java Spark or Python Spark
- Utilize Cloud Run and Cloud Functions for serverless computing
- Manage containerized applications using Docker
- Understand and enhance existing Hadoop pipelines
- Write and execute unit tests to ensure code quality
- Deploy data engineering solutions in production environments
- Craft and optimize SQL queries for data manipulation and analysis

Requirements
- 7-8 years of experience in data engineering or related fields
- Proven experience migrating Hadoop pipelines to GCP
- Proficiency in Airflow and custom flow solutions
- Strong knowledge of Terraform for infrastructure management
- Expertise in Java Spark or Python Spark
- Experience with Cloud Run and Cloud Functions
- Experience with Dataflow, Dataproc, and Cloud Monitoring tools in GCP
- Familiarity with Docker for container management
- Solid understanding of Hadoop pipelines
- Ability to write and execute unit tests
- Experience with deployments in production environments
- Strong SQL query skills

Skills
- Excellent teamwork and collaboration abilities
- Quick learner with a proactive attitude
- Strong problem-solving skills and attention to detail
- Ability to work independently and as part of a team
- Effective communication skills

Why Join Us
- Opportunity to work with cutting-edge technologies
- Collaborative and supportive work environment
- Competitive salary and benefits
- Career growth and development opportunities
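The posting above asks candidates to write and execute unit tests for pipeline code. As a generic illustration of that practice (the transform, its column names, and the sample records are hypothetical, not taken from the posting), a self-contained Python sketch might look like:

```python
# Minimal sketch of unit-testing a pipeline transformation in plain Python.
# The transform and its field names are hypothetical illustrations.

def normalize_amounts(rows):
    """Convert amount strings like '1,200.50' to floats, dropping bad rows."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({**row, "amount": float(row["amount"].replace(",", ""))})
        except (KeyError, ValueError):
            continue  # skip malformed records instead of failing the batch
    return cleaned

def test_normalize_amounts():
    rows = [{"id": 1, "amount": "1,200.50"},
            {"id": 2, "amount": "oops"}]
    out = normalize_amounts(rows)
    assert out == [{"id": 1, "amount": 1200.50}]

test_normalize_amounts()
```

The same pattern (pure transformation function plus small deterministic test) carries over to Spark jobs, where the logic is factored out of the job driver so it can be tested without a cluster.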
Posted 2 months ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Title: ETL Tester

About Us
Capco, a Wipro company, is a global technology and management consulting firm. We were awarded Consultancy of the Year at the British Bank Awards and ranked among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a presence in 32 cities across the globe, we support 100+ clients across the banking, financial, and energy sectors. We are recognized for our deep transformation execution and delivery.

WHY JOIN CAPCO?
You will work on engaging projects with the largest international and local banks, insurance companies, payment service providers, and other key players in the industry: projects that will transform the financial services industry.

MAKE AN IMPACT
Innovative thinking, delivery excellence, and thought leadership help our clients transform their business. Together with our clients and industry partners, we deliver disruptive work that is changing energy and financial services.

#BEYOURSELFATWORK
Capco has a tolerant, open culture that values diversity, inclusivity, and creativity.

CAREER ADVANCEMENT
With no forced hierarchy at Capco, everyone has the opportunity to grow as we grow, taking their career into their own hands.

DIVERSITY & INCLUSION
We believe that diversity of people and perspective gives us a competitive advantage.

Job Title: ETL Tester
Location: Pune
Experience: 5+ years

Key Responsibilities:
- Extensive experience in validating ETL processes, ensuring accurate data extraction, transformation, and loading across multiple environments.
- Proficient in Java programming, with the ability to understand and write Java code when required.
- Advanced skills in SQL for data validation, querying databases, and ensuring data consistency and integrity throughout the ETL process.
- Expertise in utilizing Unix commands to manage test environments, handle file systems, and execute system-level tasks.
- Proficient in creating shell scripts to automate testing processes, enhancing productivity and reducing manual intervention.
- Ensure that data transformations and loads are accurate, with strong attention to identifying and resolving discrepancies in the ETL process.
- Focus on automating repetitive tasks and optimizing testing workflows to increase overall testing efficiency.
- Write and execute automated test scripts using Java to ensure the quality and functionality of ETL solutions.
- Utilize Unix commands and shell scripting to automate repetitive tasks and manage system processes.
- Collaborate with cross-functional teams, including data engineers, developers, and business analysts, to ensure the ETL processes meet business requirements.
- Ensure that data transformations, integrations, and pipelines are robust, secure, and efficient.
- Troubleshoot data discrepancies and perform root cause analysis for failed data loads.
- Create comprehensive test cases, execute them, and document test results for all data flows.
- Actively participate in the continuous improvement of ETL testing processes and methodologies.
- Experience with version control systems (e.g., Git) and integrating testing into CI/CD pipelines.

Tools & Technologies (Good to Have):
- Experience with Hadoop ecosystem tools such as HDFS, MapReduce, Hive, and Spark for handling large-scale data processing and storage.
- Knowledge of NiFi for automating data flows, transforming data, and integrating different systems seamlessly.
- Experience with tools like Postman, SoapUI, or RestAssured to validate REST and SOAP APIs, ensuring correct data exchange and handling of errors.
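The SQL-based validation work this role describes often reduces to reconciliation queries: confirm that row counts and key aggregates match between source and target after a load. A minimal illustration using Python's built-in sqlite3 (the table and column names are made up for the sketch; this is not Capco's tooling):

```python
import sqlite3

# Illustrative ETL reconciliation check: compare row counts and a column
# aggregate between a "source" and a "target" table after a load.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10.0), (2, 20.5)])
cur.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10.0), (2, 20.5)])

def reconcile(cur, source, target):
    """Return True when row counts and SUM(amount) match between two tables."""
    counts, sums = [], []
    for table in (source, target):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
        n, s = cur.fetchone()
        counts.append(n)
        sums.append(s)
    return counts[0] == counts[1] and abs(sums[0] - sums[1]) < 1e-9

assert reconcile(cur, "src", "tgt")  # the load above is consistent
```

In practice the same shape of check runs against the real source and target databases, and a failed reconciliation triggers the root-cause analysis the posting mentions.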
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Bengaluru
Work from Office
About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact. We are seeking skilled Scala Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance

Required Qualifications:
- 1+ year of Scala development experience
- Strong expertise in Scala programming and functional programming concepts
- Experience with frameworks like Akka or Spark
- Skilled in building scalable, distributed systems and big data applications

Why Join Us:
- Competitive pay (₹1000/hour)
- Flexible hours
- Remote opportunity

NOTE: Pay will vary by project and typically is up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process.

Shape the future of AI with Soul AI!
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Hyderabad
Work from Office
About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact. We are seeking skilled Scala Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance

Required Qualifications:
- 1+ year of Scala development experience
- Strong expertise in Scala programming and functional programming concepts
- Experience with frameworks like Akka or Spark
- Skilled in building scalable, distributed systems and big data applications

Why Join Us:
- Competitive pay (₹1000/hour)
- Flexible hours
- Remote opportunity

NOTE: Pay will vary by project and typically is up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process.

Shape the future of AI with Soul AI!
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Chennai
Work from Office
About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact. We are seeking skilled Scala Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance

Required Qualifications:
- 1+ year of Scala development experience
- Strong expertise in Scala programming and functional programming concepts
- Experience with frameworks like Akka or Spark
- Skilled in building scalable, distributed systems and big data applications

Why Join Us:
- Competitive pay (₹1000/hour)
- Flexible hours
- Remote opportunity

NOTE: Pay will vary by project and typically is up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process.

Shape the future of AI with Soul AI!
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Pune
Work from Office
About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact. We are seeking skilled Scala Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance

Required Qualifications:
- 1+ year of Scala development experience
- Strong expertise in Scala programming and functional programming concepts
- Experience with frameworks like Akka or Spark
- Skilled in building scalable, distributed systems and big data applications

Why Join Us:
- Competitive pay (₹1000/hour)
- Flexible hours
- Remote opportunity

NOTE: Pay will vary by project and typically is up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process.

Shape the future of AI with Soul AI!
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Mumbai
Work from Office
About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact. We are seeking skilled Scala Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance

Required Qualifications:
- 1+ year of Scala development experience
- Strong expertise in Scala programming and functional programming concepts
- Experience with frameworks like Akka or Spark
- Skilled in building scalable, distributed systems and big data applications

Why Join Us:
- Competitive pay (₹1000/hour)
- Flexible hours
- Remote opportunity

NOTE: Pay will vary by project and typically is up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process.

Shape the future of AI with Soul AI!
Posted 2 months ago
4.0 - 8.0 years
6 - 10 Lacs
Kolkata
Work from Office
About Us: Soul AI is a pioneering company founded by IIT Bombay and IIM Ahmedabad alumni, with a strong founding team from IITs, NITs, and BITS. We specialize in delivering high-quality human-curated data, AI-first scaled operations services, and more. Based in SF and Hyderabad, we are a young, fast-moving team on a mission to build AI for Good, driving innovation and positive societal impact. We are seeking skilled Scala Developers with a minimum of 1 year of development experience to join us as freelancers and contribute to impactful projects.

Key Responsibilities:
- Write clean, efficient code for data processing and transformation
- Debug and resolve technical issues
- Evaluate and review code to ensure quality and compliance

Required Qualifications:
- 1+ year of Scala development experience
- Strong expertise in Scala programming and functional programming concepts
- Experience with frameworks like Akka or Spark
- Skilled in building scalable, distributed systems and big data applications

Why Join Us:
- Competitive pay (₹1000/hour)
- Flexible hours
- Remote opportunity

NOTE: Pay will vary by project and typically is up to Rs 1000 per hour (if you work an average of 3 hours every day, that could be as high as Rs 90K per month) once you clear our screening process.

Shape the future of AI with Soul AI!
Posted 2 months ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Our Purpose
Mastercard powers economies and empowers people in 200+ countries and territories worldwide. Together with our customers, we're helping build a sustainable economy where everyone can prosper. We support a wide range of digital payments choices, making transactions secure, simple, smart, and accessible. Our technology and innovation, partnerships, and networks combine to deliver a unique set of products and services that help people, businesses, and governments realize their greatest potential.

Title and Summary
Lead Software Engineer - Java/Scala Development, Hadoop, Spark

Overview
As a Lead Software Engineer in the Loyalty Rewards and Segments Organization, you will be responsible for designing, developing, testing, and delivering software frameworks in the areas of event-driven architecture and zero trust for use in large-scale distributed systems. Loyalty Rewards and Segments is an organisation within Mastercard that provides an end-to-end loyalty management solution for banks, merchants, and fintechs. The ideal candidate for this role will have a strong background in software design, development, and testing, with a passion for technology and software development. They will be highly motivated, intellectually curious, and analytical, with a desire to continuously learn and improve. As a member of the Loyalty Rewards and Segments team, you will have the opportunity to work on cutting-edge technologies and collaborate with cross-functional teams to deliver software frameworks that meet the needs of Mastercard's customers.

Role / Key Responsibilities
- Lead the technical direction, architecture, design, and engineering practices.
- Prototype and prove concepts for new technologies, application frameworks, and design patterns to improve software development practices.
- Design and develop software frameworks using industry-standard best practices and methodologies.
- Write efficient and maintainable code that meets feature specifications.
- Debug and troubleshoot code to resolve issues and improve performance.
- Validate software functionality, including performance, reliability, and security.
- Collaborate with cross-functional teams to architect and deliver new services.
- Participate in code reviews to ensure code quality and consistency.
- Document software design, development, and testing processes.
- Balance trade-offs between competing interests with judgment and experience.
- Identify synergies and reuse opportunities across teams and programs.

Key Expectations
- Focus on individual and team objectives as an active participant in the Agile/Scrum development process, completing assignments on time, with the necessary quality, and in accordance with the project timeline.
- Continuously learn and keep up to date with the latest software development technologies and methodologies.
- Communicate effectively and professionally with team members and stakeholders.
- Proactively identify opportunities for process improvements and efficiency gains.
- Demonstrate a commitment to quality, best practices, and continuous improvement.

All About You
- Current, deep, hands-on software engineering experience in architecture, design, and implementation of large-scale distributed systems.
- Rich experience and deep knowledge of event-driven architecture is a must; zero trust architecture expertise is highly desirable.
- Proficiency in Java, Scala, and SQL (Oracle, Postgres, H2, Hive, and HBase), and in building pipelines.
- Expertise and deep understanding of the Hadoop ecosystem, including HDFS, YARN, and MapReduce; tools like Hive, Pig, and Flume; data processing frameworks like Spark; cloud platforms; orchestration tools such as Apache NiFi and Airflow; and Apache Kafka.
- Expertise in web applications (Spring Boot, Angular, Java, PCF), web services (REST/OAuth), and tools (Sonar, Splunk, Dynatrace) is a must.
- Expertise in SQL, Oracle, and Postgres.
- Experience with XP, TDD, and BDD in the software development process.
- Familiarity with secure coding standards (e.g., OWASP, CWE, SEI CERT) and vulnerability management.
- Strong understanding of software engineering principles, design patterns, and best practices.
- Excellent analytical and problem-solving skills, and experience working in an Agile environment.
- Strong verbal and written communication to demo features to product owners; strong leadership qualities to mentor and support junior team members; proactive, with the initiative to take development work from inception to implementation.
- Passion for technology and software development, with a strong desire to continuously learn and improve.
- Comfortable taking thoughtful risks and acquiring expertise as needed.
- Able to foster a comfortable environment for tough technical discussions where everyone can be heard.

Corporate Security Responsibility
All activities involving access to Mastercard assets, information, and networks come with an inherent risk to the organization, and therefore it is expected that every person working for, or on behalf of, Mastercard is responsible for information security and must:
- Abide by Mastercard's security policies and practices;
- Ensure the confidentiality and integrity of the information being accessed;
- Report any suspected information security violation or breach; and
- Complete all periodic mandatory security trainings in accordance with Mastercard's guidelines.

R-246306
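The posting above centers on event-driven architecture. As a generic illustration of the style (an in-memory publish/subscribe bus; this is not Mastercard's framework, and the topic and event fields are invented for the sketch):

```python
from collections import defaultdict

# Minimal in-memory event bus illustrating the event-driven style:
# producers publish to topics, decoupled consumers react via handlers.
class EventBus:
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._handlers[topic]:
            handler(event)

bus = EventBus()
seen = []
bus.subscribe("points.awarded", seen.append)  # hypothetical loyalty topic
bus.publish("points.awarded", {"member": "m-1", "points": 250})
assert seen == [{"member": "m-1", "points": 250}]
```

In a production system the bus would be a durable broker such as Apache Kafka (named in the posting), which adds persistence, partitioning, and consumer groups on top of this same publish/subscribe contract.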
Posted 2 months ago
0 years
0 Lacs
Pune, Maharashtra, India
On-site
Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio.

Your Role and Responsibilities
The developer leads cloud application development and deployment. A developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security, using automation and configuration management tools.

Preferred Education
Master's Degree

Required Technical and Professional Expertise
- Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns.
- Strong knowledge of ORM tools like Hibernate or JPA and Java-based microservices frameworks; hands-on experience with Spring Boot microservices.
- Strong knowledge of microservice logging, monitoring, debugging, and testing; in-depth knowledge of relational databases (e.g., MySQL).
- Experience with container platforms such as Docker and Kubernetes; experience with messaging platforms such as Kafka or IBM MQ; good understanding of Test-Driven Development.
- Familiarity with Ant, Maven, or other build automation frameworks; good knowledge of basic UNIX commands.

Preferred Technical and Professional Experience
- Experience in concurrent design and multi-threading.

Primary Skills:
- Core Java, Spring Boot, Java2/EE, Microservices
- Hadoop Ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.)
- Spark

Good to have: Python
Posted 2 months ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
LinkedIn is the world’s largest professional network, built to create economic opportunity for every member of the global workforce. Our products help people make powerful connections, discover exciting opportunities, build necessary skills, and gain valuable insights every day. We’re also committed to providing transformational opportunities for our own employees by investing in their growth. We aspire to create a culture that’s built on trust, care, inclusion, and fun, where everyone can succeed. Join us to transform the way the world works. This role will be based in Bangalore, India. At LinkedIn, we trust each other to do our best work where it works best for us and our teams. This role offers a hybrid work option, meaning you can both work from home and commute to a LinkedIn office, depending on what’s best for you and when it is important for your team to be together. As part of our world-class software engineering team, you will be charged with building the next-generation infrastructure and platforms for LinkedIn, including but not limited to: an application and service delivery platform, massively scalable data storage and replication systems, cutting-edge search platform, best-in-class AI platform, experimentation platform, privacy and compliance platform etc. You will work and learn among the best, putting to use your passion for distributed technologies and algorithms, API design and systems-design, and your passion for writing code that performs at an extreme scale. LinkedIn has already pioneered well-known open-source infrastructure projects like Apache Kafka, Pinot, Azkaban, Samza, Venice, Datahub, Feather, etc. We also work with industry standard open source infrastructure products like Kubernetes, GRPC and GraphQL - come join our infrastructure teams and share the knowledge with a broader community while making a real impact within our company. 
Responsibilities
- You will design, build, and operate one of the online data infra platforms that power all of LinkedIn's core applications.
- You will participate in design and code reviews to maintain our high development standards.
- You will partner with peers, leads, and internal customers to define scope, prioritize, and build impactful features at a high velocity.
- You will mentor other engineers and will help build a fast-growing team.
- You will work closely with the open-source community to participate in and influence cutting-edge open-source projects.

Basic Qualifications
- BA/BS degree in Computer Science or a related technical discipline, or related practical experience
- 5+ years of industry experience in software design, development, and algorithm-related solutions
- 5+ years of experience programming in object-oriented languages such as Java, Python, or Go, and/or functional languages such as Scala or other relevant coding languages
- Hands-on experience developing distributed systems, large-scale systems, databases, and/or backend APIs

Preferred Qualifications
- Experience with the Hadoop (or similar) ecosystem (MapReduce, YARN, HDFS, Hive, Spark, Presto)
- Experience with industry or open-source projects and/or academic research in data management, relational databases, and/or large-data, parallel, and distributed systems
- Experience with open-source project management and governance
- Experience with cloud computing (e.g., Azure) is a plus

Suggested Skills:
- Distributed systems
- Backend Systems Infrastructure
- Java

You will Benefit from our Culture: We strongly believe in the well-being of our employees and their families. That is why we offer generous health and wellness programs and time away for employees of all levels.

India Disability Policy
LinkedIn is an equal employment opportunity employer offering opportunities to all job seekers, including individuals with disabilities.
For more information on our equal opportunity policy, please visit https://legal.linkedin.com/content/dam/legal/Policy_India_EqualOppPWD_9-12-2023.pdf

Global Data Privacy Notice for Job Candidates
This document provides transparency around the way in which LinkedIn handles the personal data of employees and job applicants: https://legal.linkedin.com/candidate-portal
Posted 2 months ago
5.0 years
7 Lacs
Hyderabad
Work from Office
Design, implement, and optimize Big Data solutions using Hadoop and Scala. You will manage data processing pipelines, ensure data integrity, and perform data analysis. Expertise in the Hadoop ecosystem, Scala programming, and data modeling is essential for this role.
Posted 2 months ago
2.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Work from Office
The Apache Spark, Digital :Scala role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Apache Spark, Digital :Scala domain.
Posted 2 months ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
The Digital :BigData and Hadoop Ecosystems, Digital :PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Digital :BigData and Hadoop Ecosystems, Digital :PySpark domain.
Posted 2 months ago
2.0 - 4.0 years
4 - 6 Lacs
Bengaluru
Work from Office
The Big Data (Scala, HIVE) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (Scala, HIVE) domain.
Posted 2 months ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
The Big Data (PySpark, Hive) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (PySpark, Hive) domain.
Posted 2 months ago
2.0 - 5.0 years
4 - 7 Lacs
Bengaluru
Work from Office
The PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the PySpark domain.
Posted 2 months ago
175.0 years
0 Lacs
Gurugram, Haryana, India
On-site
At American Express, our culture is built on a 175-year history of innovation, shared values and Leadership Behaviors, and an unwavering commitment to back our customers, communities, and colleagues. As part of Team Amex, you'll experience this powerful backing with comprehensive support for your holistic well-being and many opportunities to learn new skills, develop as a leader, and grow your career. Here, your voice and ideas matter, your work makes an impact, and together, you will help us define the future of American Express. Functional Description: American Express is on a journey to provide the world’s best customer experience every day. The GCS Product Analytics team plays a pivotal role within Global Commercial Services by developing cutting-edge data capabilities and leveraging advanced analytics to gain deep insights into client behavior. Our mission is to inform and shape product strategies and deliver connected, personalized experiences that foster deeper client engagement and drive sustainable, profitable growth. The Digital Measurement & Analytics team is a part of GCS Product Analytics team and delivers digital analytics and insights for the GCS suite of digital products & platforms. The team is responsible for innovating and transforming the process to measure and understand the customer behavior towards our digital tools. The analytical work will uncover insights that will drive GCS’ global digital strategy and optimize the customer experience. Through world class innovation and advanced analytics, the team will create segmentations, develop KPIs, models and strategic analytics to solve key business opportunities. This will be achieved through a close collaboration with the digital product teams, marketing, servicing and technologies. Purpose of the Role Deliver actionable insights for GCS Digital Experiences by democratizing digital data, measuring product performance, conducting customer behavior deep dives and go-to-market segmentations. 
The role requires exhibiting a high level of expertise in driving decisions backed by data insights, strategic and advanced analytics, and data techniques. They will drive improvements in generating data driven actionable strategies to enable business growth initiatives. How will you make an impact in this role? In this role, the incumbent will be part of Digital Measurement & Analytics team. They will apply advanced analytics to drive segmentations, develop KPIs, models and strategic analytics to solve key business opportunities. This will be achieved through a close collaboration with the digital product teams, marketing, servicing, technologies, and field teams. They will– design measurement framework, conduct behavioral deep dives using Amex closed loop data to uncover product improvement opportunities, enable experimentation (AB Testing), work with leadership to define product strategy and maintain product performance reports/dashboards. This role requires candidates with analytical bent of mind and exceptional quantitative, problem-solving, and business story-telling skills. Responsibilities: Specific job responsibilities may vary as per the team responsibilities, but will involve aspects of the below: Perform in-depth data analysis to deliver strategic priorities focused on the product roadmap for GCS Digital Experiences Define KPIs to measure the efficiency of digital channels/products and develop customer segmentation to drive “adoption and engagement” for AXP customers. Power in-depth strategic analysis and provide analytical and decision support by mining digital activity data along with AXP closed loop data. Gain deep functional understanding of the GCS digital channels over time and ensure analytical insights are relevant and actionable. Flawless execution of the development, validation and implementation of statistical projects and automated reports. 
Evaluate the business impact of different strategies/initiatives and generate insights and recommendations to fuel business growth. Build partnerships with internal partners such as Product, Technologies, Field, Servicing and Marketing to plan and prioritize various initiatives. Empower self-serve by crafting automated dashboards and reports using the Adobe Analytics suite or Tableau. Continuously broaden and strengthen knowledge of advanced analytical methods and tools to further evolve our analytical practices. Minimum Qualifications: 1 to 3 years of relevant analytics experience. Advanced degree in business administration, computer science, IT or information management from premium institutes (preferred). Strong analytical, strategic thought leadership and problem-solving skills with the ability to solve unstructured and complex business problems. Team player: Able to collaborate with partners and team members to define key business objectives, and to align on solutions that drive actionable items. Strong interpersonal, written, verbal communication, presentation, and storytelling skills, enabling the ability to interact effectively with business leaders and to present structured and compelling messages to various levels within the organization. Results driven with strong project management skills, the ability to work on multiple priorities, and staying on track to exceed team goals. Passion for data science and machine learning: Proven track record of independently developing novel analytical solutions optimizing business processes or product constructs. Strong ability to drive results; self-starter. Experience in the digital domain preferred.
Technical Skills/Capabilities: Data manipulation – large & complex data sets Segmentation Analytics Business Intelligence & Visualization Machine Learning & AI Statistics & Hypothesis Testing Basic understanding of Agile product development Knowledge of Platforms: Big Data – Cornerstone, Hive, MapReduce Digital Tracking – Omniture/Adobe Analytics, Clickstream Visualization – Tableau We back you with benefits that support your holistic well-being so you can be and deliver your best. This means caring for you and your loved ones' physical, financial, and mental health, as well as providing the flexibility you need to thrive personally and professionally: Competitive base salaries Bonus incentives Support for financial well-being and retirement Comprehensive medical, dental, vision, life insurance, and disability benefits (depending on location) Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need Generous paid parental leave policies (depending on your location) Free access to global on-site wellness centers staffed with nurses and doctors (depending on location) Free and confidential counseling support through our Healthy Minds program Career development and training opportunities American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran status, disability status, age, or any other status protected by law. Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.
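The experimentation (A/B testing) and hypothesis-testing work mentioned in this posting can be sketched with a standard two-proportion z-test; this is a minimal stdlib-only illustration, and the conversion counts below are invented, not from the posting:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an A/B experiment.

    conv_* are conversion counts, n_* are sample sizes per variant.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B converts 2.6% vs 2.0% for A.
z, p = two_proportion_z_test(conv_a=200, n_a=10_000, conv_b=260, n_b=10_000)
```

With these illustrative numbers the lift is significant at the usual 5% level; real experiments would also account for sample-size planning and multiple testing.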
Posted 2 months ago
2.0 - 5.0 years
4 - 7 Lacs
Hyderabad
Work from Office
The PySpark role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the PySpark domain.
Posted 2 months ago
2.0 - 4.0 years
4 - 6 Lacs
Chennai
Work from Office
The Big Data (Scala, HIVE) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (Scala, HIVE) domain.
Posted 2 months ago
2.0 - 5.0 years
4 - 7 Lacs
Chennai
Work from Office
The Big Data (PySpark, Python) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the Big Data (PySpark, Python) domain.
Posted 2 months ago
3.0 - 5.0 years
11 - 15 Lacs
Bengaluru
Work from Office
Project description Join our data engineering team to lead the design and implementation of advanced graph database solutions using Neo4j. This initiative supports the organization's mission to transform complex data relationships into actionable intelligence. You will play a critical role in architecting scalable graph-based systems, driving innovation in data connectivity, and empowering cross-functional teams with powerful tools for insight and decision-making. Responsibilities Graph Data Modeling & Implementation. Design and implement complex graph data models using Cypher and Neo4j best practices. Leverage APOC procedures, custom plugins, and advanced graph algorithms to solve domain-specific problems. Oversee integration of Neo4j with other enterprise systems, microservices, and data platforms. Develop and maintain APIs and services in Java, Python, or JavaScript to interact with the graph database. Mentor junior developers and review code to maintain high-quality standards. Establish guidelines for performance tuning, scalability, security, and disaster recovery in Neo4j environments. Work with data scientists, analysts, and business stakeholders to translate complex requirements into graph-based solutions. Skills Must have 12+ years in software/data engineering, with at least 3-5 years hands-on experience with Neo4j. Lead the technical strategy, architecture, and delivery of Neo4j-based solutions. Design, model, and implement complex graph data structures using Cypher and Neo4j best practices. Guide the integration of Neo4j with other data platforms and microservices. Collaborate with cross-functional teams to understand business needs and translate them into graph-based models. Mentor junior developers and ensure code quality through reviews and best practices. Define and enforce performance tuning, security standards, and disaster recovery strategies for Neo4j. Stay up-to-date with emerging technologies in the graph database and data engineering space. 
Strong proficiency in the Cypher query language, graph modeling, and data visualization tools (e.g., Bloom, Neo4j Browser). Solid background in Java, Python, or JavaScript and experience integrating Neo4j with these languages. Experience with APOC procedures, Neo4j plugins, and query optimization. Familiarity with cloud platforms (AWS) and containerization tools (Docker, Kubernetes). Proven experience leading engineering teams or projects. Excellent problem-solving and communication skills. Nice to have: N/A Other Languages: English (C1 Advanced) Seniority: Senior
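As a minimal sketch of the Cypher graph modeling and Python integration this role describes: the Customer/Account model, connection URI, and credentials below are hypothetical placeholders, and running against a live database requires a Neo4j server plus the official `neo4j` Python driver.

```python
# Hypothetical Cypher statements for a simple (Customer)-[:OWNS]->(Account) model.
# MERGE makes the writes idempotent: re-running them will not duplicate nodes.
CREATE_CUSTOMER = (
    "MERGE (c:Customer {id: $id}) "
    "SET c.name = $name"
)
LINK_ACCOUNT = (
    "MATCH (c:Customer {id: $cid}) "
    "MERGE (a:Account {number: $acct}) "
    "MERGE (c)-[:OWNS]->(a)"
)

def load_example_graph():
    # Requires a running Neo4j instance; URI and auth are placeholders.
    from neo4j import GraphDatabase  # pip install neo4j
    driver = GraphDatabase.driver("bolt://localhost:7687",
                                  auth=("neo4j", "password"))
    with driver.session() as session:
        session.run(CREATE_CUSTOMER, id=1, name="Acme Corp")
        session.run(LINK_ACCOUNT, cid=1, acct="IN-001")
    driver.close()
```

Parameterized queries (`$id`, `$name`, …) are the standard way to let Neo4j cache query plans and avoid injection, which matters for the query-optimization duties listed above.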
Posted 2 months ago
2.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
The Applications Development Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities. Responsibilities: Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas to identify and define necessary system enhancements. Identify and analyze issues, make recommendations, and implement solutions. Consult with users, clients, and other technology groups on issues, and recommend programming solutions and support customer exposure systems. Apply fundamental knowledge of programming languages for design specifications. Utilize knowledge of business processes, system processes, and industry standards to solve complex issues. Analyze information and make evaluative judgements to recommend solutions and improvements. Conduct testing and debugging, utilize script tools, and write basic code for design specifications. Assess applicability of similar experiences and evaluate options under circumstances not covered by procedures. Develop working knowledge of Citi’s information systems, procedures, standards, client server application development, network operations, database administration, systems administration, data center operations, and PC-based applications. Appropriately assess risk when business decisions are made, demonstrating consideration for the firm's reputation and safeguarding Citigroup, its clients, and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency. Skills And Qualifications 2-5 years of relevant experience. 
Proficient understanding of distributed computing principles. Proficient in SQL coding and tuning. Experience with Tableau software, both Tableau Desktop and Tableau Server. Experience with integration of data from multiple data sources. Good knowledge of Big Data querying tools, such as Hive and Impala. Knowledge of Hadoop v2, MapReduce, HDFS, PySpark, and Spark is a plus but not mandatory. Education: Bachelor’s degree/University degree or equivalent experience Responsibilities Strong in data analytics, executing SQL queries in Hive/Impala/Spark to analyze data and fix data issues as and when required. Strong experience with data structures. Developing advanced reporting, analytics, dashboards, and other business-intelligence solutions. Creating visually appealing and interactive dashboards is a primary responsibility. Connecting Tableau to different databases, ensuring data accuracy, and maintaining the integrity of data feeds; also responsible for cleansing and preparing data to be effectively used in Tableau for analysis and reporting. Responsibilities include improving load times, enhancing responsiveness, and handling large datasets efficiently. Setting up appropriate access controls, monitoring data usage, and protecting sensitive information. Improving performance by tuning SQL queries. Discovering areas of automation for making the business process more efficient. Performing and documenting data analysis, data validation, and data mapping/design. This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
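The data-validation and aggregation work described above can be sketched in plain SQL; in this stdlib-only illustration SQLite stands in for a Hive/Impala source, and the table and columns are hypothetical:

```python
import sqlite3

# In-memory table standing in for a Hive/Impala transactions source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (id INTEGER, amount REAL, region TEXT)")
conn.executemany(
    "INSERT INTO txns VALUES (?, ?, ?)",
    [(1, 120.0, "NORTH"), (2, None, "SOUTH"), (3, 75.5, "NORTH")],
)

# Validation check: count rows with missing amounts before they reach a dashboard.
null_amounts = conn.execute(
    "SELECT COUNT(*) FROM txns WHERE amount IS NULL"
).fetchone()[0]

# Aggregation of the kind a Tableau extract might be built from.
by_region = dict(conn.execute(
    "SELECT region, COUNT(*) FROM txns GROUP BY region"
).fetchall())
```

On a real Hive/Impala warehouse the same checks would run at scale, and the aggregate would typically feed a published Tableau data source rather than a Python dict.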
------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
Posted 2 months ago
3.0 - 5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Title: Data Scientist SDE-2, SDE-3 & SDE-4 (Staff Data Scientist) Location: Noida & Bangalore Job Type: Full Time About us: PhysicsWallah is an Indian online education technology startup based in Delhi, originally created as a YouTube channel in 2014 by Mr. Alakh Pandey. We are the first company aiming to build an affordable online education platform for each Indian student who dreams of IIT & AIIMS but is unable to afford the existing offline/online education providers. We provide e-learning via our YouTube channel and the PhysicsWallah app/website, offering lectures for JEE Mains and Advanced, NEET, and board exams. We are India’s most-viewed educational channel on YouTube. YouTube Channel- https://youtube.com/c/PhysicsWallah About the Role: Qualification & Eligibility: Bachelor's or higher degree in a quantitative discipline (computer science, statistics, engineering, applied mathematics) Working Experience: SDE-2: 3 to 5 Years SDE-3: 5 to 7 Years SDE-4 (Staff): 7 to 10 Years Startup experience preferred; Edtech work experience is a bonus. Roles & responsibilities: Help teams understand what data science can do for them and set the right expectations. Use deep learning, machine learning, and analytical methods to create scalable solutions for business problems. Create innovative solutions and applications utilising advanced NLP algorithms/architectures including (but not limited to) LLMs for tasks such as text generation, summarization, translation, entity extraction and concept recognition, clustering, and more. Contribute to the execution of our vision for NLP-based technology solutions using various NLP toolkits like Huggingface, Spacy, CoreNLP, OpenNLP, etc. Perform relevant data analysis and benchmark the NLP solutions to improve our offerings. Be able to clearly communicate results and recommendations to various stakeholders. Evaluate the effectiveness of the solutions and improve upon them in a continuous manner.
We expect candidates to have a mix of a strong technical background, the ability to understand the business implications of their work, and the ability to empathise with our users and work towards helping PhysicsWallah give them the best experience. Help and mentor junior members to become better data scientists. Skill Sets: Experience in building NLP (ML/DL) models. Strong foundational knowledge of transformers (BERT, GPT, T5, etc.) and embeddings. Expertise in SQL and Python is a must. Hands-on experience with the latest GenAI models (GPT, Mistral, Falcon, and LLaMA) and approaches (RAG, LangChain, etc.). Experience using machine learning libraries for structured, text, and audio/video data (preferably in Python) and deep learning frameworks (TensorFlow, PyTorch). Good to have: Foundational knowledge of any cloud (AWS/Azure/GCP). Expertise in querying relational, non-relational, and graph databases. Experience in big data technologies (Spark, MapReduce, Pig, and Hive).
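To illustrate the retrieval step of the RAG approach this posting mentions, here is a toy stdlib-only sketch that ranks documents by bag-of-words cosine similarity; a production pipeline would use dense transformer embeddings instead, and the documents below are invented examples:

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the top-k documents most similar to the query."""
    qv = Counter(query.lower().split())
    ranked = sorted(
        docs,
        key=lambda d: cosine(qv, Counter(d.lower().split())),
        reverse=True,
    )
    return ranked[:k]

docs = [
    "projectile motion equations for JEE physics",
    "organic chemistry reaction mechanisms for NEET",
]
top = retrieve("physics motion problems", docs)
```

In a full RAG system the retrieved passages would then be injected into the LLM prompt as grounding context before generation.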
Posted 2 months ago